Risk assessment of micronutrients


Toxicology Letters 180 (2008) 123–130


A.G. Renwick (a,*), R. Walker (b)

(a) School of Medicine, Faculty of Medicine, Health and Life Sciences, University of Southampton, Southampton SO16 7PX, UK
(b) School of Biological Sciences, Faculty of Health and Medical Sciences, University of Surrey, Guildford, Surrey GU2 7XH, UK
(*) Corresponding author. E-mail address: [email protected] (A.G. Renwick).

Article history: Available online 28 May 2008

Keywords: Micronutrients; Risk assessment; Upper intake level; Risk–benefit analysis

Abstract

Risk assessment of micronutrients has to take into account two different intake–response relationships: the risk of deficiency, which decreases with increase in intake, and the risk of toxicity, which increases with increase in intake. The available databases on micronutrients tend to focus on benefits at low intakes, and there are usually few reliable data on hazard identification and characterisation at high intakes. Application of the usual default uncertainty factors for species differences, human variability and database inadequacy could result in "recommended" upper intake levels that would cause deficiency. There have been a number of comprehensive reviews that have used low, and largely arbitrary, uncertainty factors to establish tolerable upper intake levels for vitamins and minerals. A recent FAO/WHO Workshop developed a structured approach to the application of a single composite uncertainty factor. Risk–benefit approaches have been developed recently that balance the risk of toxicity against the risk of deficiency, and offer the potential for more scientifically based methods.

1. Introduction

Dose–response relationships have been central to risk assessment since Paracelsus noted that "All things are toxic and there is nothing without toxic properties. It is only the dose which makes something a poison". Established approaches exist for the risk assessment of man-made non-nutrient chemicals, based on the assumption that all such chemicals are toxic; high doses are given in order to reveal possible hazards, and risk assessment is based on "moving down" the dose–response relationship to define a level of intake that does not produce any measurable identified hazard. Testing guidelines have been developed to ensure that the data are of sufficient quality for hazard identification and dose–response assessment. Risk management frequently involves some form of approval process, and the establishment of control measures to limit the extent of human exposure and to ensure that there is negligible risk.

Humans are also exposed to many naturally occurring non-nutrient toxicants, such as mycotoxins and biologically active plant alkaloids, which require risk assessment. Although there is not usually a "sponsor" for such compounds, toxicological studies are frequently available and the risk assessment approach adopted is similar to that taken for man-made chemicals, with the aim of defining a level of human exposure with negligible risk.


In both cases it is considered that there is no benefit to the exposed individual and any uncertainties are taken into account by the use of large uncertainty factors; for example, a 100-fold uncertainty factor is normally used to convert a no-observed-adverse-effect-level (NOAEL) in an animal study into a level of human exposure with negligible risk (see below).

In contrast, micronutrients are essential for human health and health concerns usually relate to the possibility that the levels of intake may not be adequate for nutritional purposes. In consequence, there is a general (public) perception that vitamins and minerals are safe, i.e. that intakes above nutritionally adequate levels are without risk and that there is no need to "move up" the dose–response relationship from "safe dietary intakes" in order to consider the hazards and risks associated with higher levels of intake. However, the wide availability of supplements containing levels of vitamins and minerals that may be up to 100-fold higher than the recommended daily intake level (EVM, 2003) means that formal risk assessments are essential. Evaluations of the safety of high doses of vitamins and minerals have been undertaken in the USA (Institute of Medicine (IOM), 1997, 1998, 2000, 2002), the European Union (European Food Safety Authority (EFSA), 2006), the UK (Expert Group on Vitamins and Minerals (EVM), 2003) and by the Nordic Council (NORD, 1995). In addition, proposals have been made by associations of manufacturers and producers of vitamins and vitamin supplements (e.g. the Council for Responsible Nutrition (1997) and the European Federation of Health Product Manufacturers Associations (1997)).


Fig. 1. Dose–response relationships relevant to the risk assessment of a vitamin or mineral. The hypothetical dose–response data are presented as cumulative distributions of the incidences, with similar slopes (which may not apply—see later). The acceptable range of intake should take into account any uncertainties inherent in the use of the dose–response data for toxicity in setting a safe upper human intake. The “acceptable range” may be wide (e.g. ascorbic acid) or very narrow (e.g. fluoride); in some cases the nutrient needs of one section of the population may overlap the adverse effects in another (e.g. iodine) (adapted from Renwick, 2006).

2. Risk assessment issues relevant to micronutrients

There are two dose–response relationships for micronutrients which need to be considered in establishing safe levels of human intake (Fig. 1).

Adverse effects would arise at low and inadequate intakes due to deficiency; such effects would decrease in incidence with increase in intake. Conversely, adverse effects could arise at high intakes due to toxicity; such effects would increase in incidence with an increase in dose.

Table 1. Typical differences between micronutrients and non-nutrient chemicals

Data on benefits
  Animal data
    Micronutrients: mechanistic/metabolic data often available
    Non-nutrient chemicals (a): not relevant—no direct benefit to the exposed individual
  Human data
    Micronutrients: data used to establish RNI
    Non-nutrient chemicals: not relevant—no direct benefit to the exposed individual

Data on adverse effects at low intakes
  Animal data
    Micronutrients: mechanistic data sometimes available
    Non-nutrient chemicals: low-dose endocrine effects are currently a controversial issue
  Human data
    Micronutrients: data used to establish RNI
    Non-nutrient chemicals: human intakes would be set at less than any dose producing adverse effects

Data on adverse effects at high intakes
  Animal data
    Micronutrients: animal toxicity studies are only rarely performed, usually in response to a suspected adverse effect detected in human studies
    Non-nutrient chemicals: studies are performed to identify any hazard and to define the dose–response relationship according to approved guidelines (b); the studies involve different durations of intake, and all major organ systems are assessed without any preconception of the hazard(s) that may be detected; the resulting data are used for establishing safe human intakes
  Human data
    Micronutrients: adverse effects are sometimes identified from observational epidemiology studies and associated follow-up research
    Non-nutrient chemicals: pre-approval data from human research studies are usually supportive and define tolerability

Metabolic and kinetic aspects
  Absorption
    Micronutrients: carrier-facilitated uptake at low concentrations; passive diffusion at high concentrations
    Non-nutrient chemicals: passive diffusion of the lipid-soluble form
  Transport in blood
    Micronutrients: free or bound to specific carrier proteins, such as transferrin
    Non-nutrient chemicals: free or bound to plasma proteins, such as albumin
  Entry into tissues
    Micronutrients: carrier-facilitated transport at low concentrations; passive diffusion at high concentrations
    Non-nutrient chemicals: passive diffusion of the lipid-soluble form
  Metabolism
    Micronutrients: usually highly specific enzymes, often of low capacity (readily saturable); less specific pathways utilised at high concentrations
    Non-nutrient chemicals: usually enzymes of low specificity but high capacity
  Renal excretion
    Micronutrients: glomerular filtration + saturable re-uptake transporters (to maintain constant blood levels) + passive reabsorption + tubular secretion
    Non-nutrient chemicals: glomerular filtration + passive reabsorption + tubular secretion

RNI: reference nutrient intake.
(a) The text applies to low molecular weight chemicals that undergo a formal risk assessment, such as food additives and pesticides.
(b) The extent and nature of toxicity testing necessary for risk assessment purposes have been established by international guidelines, for example by the OECD (Organisation for Economic Cooperation and Development).
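The saturable, carrier-mediated handling summarised in Table 1 is one reason why the relationship between intake and body burden can become non-linear at high intakes (see the discussion of homeostatic mechanisms below). A toy sketch of this idea, with purely illustrative parameter values that are not taken from the paper:

    def absorbed(intake: float,
                 vmax: float = 2.0,      # carrier capacity (mg/day), illustrative
                 km: float = 1.0,        # intake at half-saturation (mg/day), illustrative
                 passive_fraction: float = 0.05) -> float:
        """Daily amount absorbed (mg) for a given daily intake (mg)."""
        carrier = vmax * intake / (km + intake)     # saturable, carrier-mediated uptake
        passive = passive_fraction * intake         # non-saturable passive diffusion
        return carrier + passive

    for intake in (0.5, 1, 2, 5, 10, 50, 100):
        print(intake, round(absorbed(intake), 2))
    # Absorption rises steeply while the carrier dominates; once it saturates,
    # only the small passive component responds to further increases in intake,
    # so the body burden no longer tracks intake linearly.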


Both the incidence and the severity of effects, due to either deficiency or toxicity, may change with intake level. The relative positions of the dose–response relationships vary widely between different vitamins and minerals. In most cases the adverse effects associated with micronutrient deficiency and toxicity are unrelated, and different adverse effects, with different health implications, are produced at very low and very high intakes. The range of acceptable intakes would extend from the recommended daily intake for normal health (called the Reference Nutrient Intake in the UK and the Population Reference Intake by the EFSA) to a safe upper level based on the toxicological profile at high intakes and taking into account any uncertainties in the database (called the Safe Upper Intake in the UK and the Tolerable Upper Intake Level by the EFSA).

The approach adopted for non-nutrient chemicals, of identifying a NOAEL from animal toxicity studies and dividing it by a large uncertainty factor, cannot be applied to micronutrients. Important differences between non-nutrient chemicals and micronutrients are given in Table 1. Homeostatic mechanisms ensure that the amounts of many vitamins and minerals in the body are largely independent of intake at low levels. Homeostatic mechanisms can be overwhelmed at very high intakes, and this may give non-linear relationships between intake and body burden. The recent FAO/WHO (2006) paper (see below) indicated that the "range of acceptable intakes" shown in Fig. 1 might arise from micronutrient-specific homeostatic mechanisms, such as the control of absorption, excretion, or liver storage. Experimental data supporting this assumption are lacking because the intakes associated with saturation of homeostatic mechanisms have only rarely been defined; establishing tolerable upper levels on the basis of the homeostatic range is not a practical proposition at the present time. Toxicity data on micronutrients are seldom of a quality suitable for risk assessment, and there are more uncertainties associated with adverse effect data on micronutrients than with pre-approval data on non-nutrient chemicals.

2.1. Data available for hazard identification and characterisation associated with high intakes of micronutrients

For convenience, the data available on the toxicity of micronutrients may be divided into data from studies in humans and data from animal studies.

2.1.1. Data from studies in humans

Data on the adverse effects of high intakes of micronutrients may arise from clinical studies on nutritional need or on interactions between nutrients, clinical studies on therapeutic medical uses, epidemiological analyses and anecdotal case reports. In most cases human studies have not been designed to establish safety at high doses, but rather to establish benefit in relation to the prevention of deficiency using low doses, or to study a potential therapeutic benefit at much higher doses. Data on adverse effects are often reported in clinical trials for therapeutic indications, but such trials usually define the incidence of adverse effects at the therapeutic dose, which may greatly exceed nutritional intakes, even including supplements. Data on adverse effects (or lack of effects) at lower doses, which would not produce the therapeutic effect, are not usually reported, but such information would be extremely valuable in defining a tolerable upper intake level.
An additional problem is that clinical trials of therapeutic benefit are often performed in highly selected patient groups, giving problems of extrapolation to the general population. Epidemiology studies on relationships between nutrient intakes and disease often suffer from confounding variables (for example co-exposure to selenium and fluoride), and causation using recognised criteria


(Bradford-Hill, 1965) is rarely established. Anecdotal case reports are useful for hazard identification, but frequently involve "abuse" dosages and cannot be used to define the incidence of the adverse effect at different levels of intake in the general population.

2.1.2. Data from studies in animals

In many cases hazard identification is based on animal experimentation, and in some cases the recognised hazards have subsequently been investigated in human volunteers. However, for ethical reasons, such investigations are usually limited and restricted to reversible effects. A defined set of studies is necessary for establishing daily intakes of chemicals, such as pesticides and food additives, that would be without significant adverse health effects (see Table 1); the studies have to be performed using agreed protocols and to the established quality criteria of good laboratory practice, such as the OECD guidelines. In contrast, animal studies on vitamins and minerals are usually hypothesis driven and limited in the range of tissues and effects studied. Adverse effects are sometimes reported at high intakes, but the dose–response relationships are studied only rarely, and the NOAEL is seldom identified. Such limited data would not be considered adequate to approve the use of a pesticide or food additive, and additional data would be required prior to approval. Additional uncertainty factor(s) (see below) would be applied to derive a tolerable intake in the case of a food contaminant for which risk management advice is necessary. The absence of an adequate safety database for most vitamins and minerals means that the scientific basis of hazard characterisation is less secure than is usually the case, but the use of large uncertainty factors, as would traditionally be applied for non-essential compounds, could result in adverse effects due to a deficiency condition.

2.2. The use of uncertainty factors in hazard characterisation

Uncertainty factors allow for aspects of hazard characterisation for which there are no compound-specific data. There has been a long history of use of such factors (WHO, 1987) and their validity has been the subject of numerous reviews (listed in Renwick, 2006). By definition, uncertainty factors are not precise, and the values selected are usually one log unit (10) or a half-log unit (3). Uncertainty factors can be divided into those related to issues of extrapolation, those used because of database deficiencies, and those applied largely for risk management reasons. The range of typical default values that may be applied to non-nutrient chemicals is shown in Fig. 2. For compounds undergoing an approval process, such as additives and pesticides, the acceptable daily intake (ADI) is usually estimated by dividing a NOAEL from animal studies by a 100-fold uncertainty factor; this factor allows for undefined but expected differences between species (10-fold; animal to human) and between different human individuals (10-fold; human variability) (see Fig. 2). Although many of the uncertainty factors in Fig. 2 might appear unnecessary for an essential component of the human diet, they are relevant to establishing maximum safe intakes (tolerable upper levels) that could arise from the ingestion of supplements. Application of uncertainty factors provides a "margin of safety" between intakes by the general population and the intakes that cause adverse effects; the greater the uncertainty, the higher the margin of safety.
There is no reason why the safety margin available to the general population in relation to the intake of food supplements should be any different from that which would apply to other ingested compounds, provided that the maximum recommended intake was nutritionally adequate.
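The default derivation described above can be illustrated with a short calculation; the NOAEL and the 60 kg body weight are hypothetical values chosen only to show the arithmetic:

    # Sketch of the default derivation of an ADI for a non-nutrient chemical.
    # The NOAEL and body weight are illustrative values, not from any evaluation.
    noael_mg_per_kg = 50.0          # NOAEL from an animal study (mg/kg body weight per day)
    interspecies_factor = 10.0      # animal-to-human extrapolation
    intraspecies_factor = 10.0      # human variability

    adi_mg_per_kg = noael_mg_per_kg / (interspecies_factor * intraspecies_factor)
    print(adi_mg_per_kg)            # 0.5 mg/kg body weight per day

    # For an assumed 60 kg adult this corresponds to 30 mg/day; extra factors
    # (e.g. for a LOAEL instead of a NOAEL, or for database deficiencies)
    # would reduce it further.
    print(adi_mg_per_kg * 60)       # 30.0 mg/day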


Fig. 2. Typical uncertainty factors that may be applied in the risk assessment of non-nutrient chemicals (adapted from a scheme provided by the author (see http://ec.europa.eu/food/risk/session2 5 en.pdf) and published in FAO/WHO (2006) (Figs. A3–5)). ADI: acceptable daily intake; LOAEL: lowest-observed-adverse-effect-level; NOAEL: no-observed-adverse-effect-level; TDI: tolerable daily intake.

Many of the factors in Fig. 2 relate to the adequacy of the database available for risk assessment purposes; many of these are relevant to "supra-nutritional" intakes of vitamins and minerals, because micronutrients have seldom, if ever, been the subject of rigorous hazard identification and characterisation studies. A major problem faced by all attempts to establish tolerable upper levels of intake is that the strict application of the default uncertainty factors to micronutrients could result in "non-toxic" intakes that were associated with serious ill-health due to deficiency. For example, the extra factor for teratogenicity (Fig. 2), which is applied largely for risk management reasons, could be applied to vitamin A because it is a known animal and human teratogen at high intakes; but this could result in such low intakes that foetal malformation could arise from deficiency. Therefore some form of compromise is necessary.

Lower margins of safety are used in the case of pharmaceuticals, because there is direct benefit to the exposed individual. Although this logic could be applied to nutritional levels of intake, evidence of benefit at high supplemental intakes would be needed to justify a reduction in the uncertainty factor. A possible approach would be to use micronutrient-specific kinetic or dynamic data to adjust the usual 10-fold uncertainty factors for interspecies differences and human variability (Renwick, 1993). The International Programme on Chemical Safety (IPCS) has published guidance on the use of chemical-specific adjustment factors (IPCS, 2005) to replace the toxicokinetic or toxicodynamic components of the factors for interspecies differences or human variability. Although there are human data on vitamins and minerals, there are very few studies that define species differences or human variability in the plasma concentrations at high and potentially toxic intakes. In consequence, this approach remains a theoretical rather than a practical proposition.
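A minimal sketch of how a chemical-specific adjustment factor would replace one of the default subfactors, assuming the IPCS (2005) subdivision of the two 10-fold factors (4.0 for kinetics and 2.5 for dynamics between species; 3.16 and 3.16 within humans); the measured kinetic ratio used here is hypothetical:

    # IPCS (2005) default subdivision of the 10-fold uncertainty factors.
    INTERSPECIES_KINETICS = 4.0
    INTERSPECIES_DYNAMICS = 2.5
    HUMAN_KINETICS = 3.16
    HUMAN_DYNAMICS = 3.16

    default_factor = (INTERSPECIES_KINETICS * INTERSPECIES_DYNAMICS *
                      HUMAN_KINETICS * HUMAN_DYNAMICS)        # ~100

    # Hypothetical compound-specific data: measured animal-to-human kinetic
    # differences (e.g. a plasma AUC ratio of 2) replace the default 4.0.
    measured_kinetic_ratio = 2.0
    adjusted_factor = (measured_kinetic_ratio * INTERSPECIES_DYNAMICS *
                       HUMAN_KINETICS * HUMAN_DYNAMICS)       # ~50

    print(round(default_factor), round(adjusted_factor))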

2.3. Discrepancies between different risk assessments of micronutrients

Given the inadequacy of the available database on the adverse effects caused by high intakes of micronutrients, it is not surprising that the three major recent attempts to establish tolerable upper intakes (USA—IOM, 1997, 1998, 2000, 2002; EU—EFSA, 2006; UK—EVM, 2003) have resulted in different numerical values, even when based on the same adverse effect. The different issues outlined above are illustrated well by the example of vitamin B6 (pyridoxine). Pyridoxine is an essential cofactor for a number of enzymes involved in amino acid metabolism, and deficiency leads to retarded growth, acrodynia, alopecia and skeletal changes, as well as neurological effects such as seizures and convulsions. The daily requirement for pyridoxine is related to protein intake and is equivalent to approximately 2–3 mg/day (SCF, 1993). Pyridoxine has been promoted at daily intakes up to three orders of magnitude higher than this as a treatment for premenstrual and carpal tunnel syndromes. The main hazards identified at high doses of pyridoxine are neurotoxicity and neuronal degeneration, which were initially identified in animal studies and confirmed by anecdotal reports of neurotoxicity in humans at doses up to and exceeding 1 g/day.

The IOM in the USA set a tolerable upper intake level of 100 mg/day, based on the application of a twofold uncertainty factor to a NOAEL of 200 mg/day derived largely from two human studies (Bernstein and Lobitz, 1988; Del Tredici et al., 1985), supported by additional studies which had reported no neuropathy in hundreds of individuals given pyridoxine doses of 40–500 mg/day (Brush et al., 1988; Ellis, 1987; Mitwalli et al., 1984; Tolis et al., 1977). The UK Committee on Toxicity (COT, 1997) reviewed the same data and concluded that none of the human studies was suitable for setting an upper level, because they were not of sufficient duration to allow the development of vitamin B6-induced neuropathy. This group advised that intake should not exceed 10 mg/day, based on a lowest-observed-adverse-effect-level (LOAEL) from a study in dogs (50 mg/kg body weight) with a 3-fold uncertainty factor to allow for LOAEL to NOAEL extrapolation and a 100-fold factor to allow for species differences and human variability. This advice resulted in extensive, negative and emotional media reactions, suggesting it was an infringement of personal freedom and raising questions about the integrity of the members and chairman of the COT. The advice triggered the need for a comprehensive UK evaluation of the safety of all micronutrients, and the EVM was established.


Fig. 3. Decision tree for micronutrient risk assessment. Adapted from the scheme developed as part of the FOSIE project organised by ILSI Europe (Renwick et al., 2003). *A reversible arrow indicates that the deficiency in the database should be noted and then the subsequent steps evaluated. ADME: absorption, distribution, metabolism and excretion.

The EVM derived a safe upper level of 10 mg pyridoxine/day (EVM, 2003) using the same logic as the COT. At the same time, the European Scientific Committee on Food (SCF) was evaluating the safety of high intakes of micronutrients and set a tolerable upper intake level of 25 mg pyridoxine/day (EFSA, 2006), based on data from a controversial study (Dalton and Dalton, 1987) of adequate duration but unusual design, applying an uncertainty factor of 4 to allow for the fact that the intake was a possible effect level (twofold) and for deficiencies in the database (twofold).
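The different upper levels for pyridoxine can be reproduced arithmetically; the 60 kg body weight used to convert the dog LOAEL to a daily human intake is an assumption for illustration and is not stated in the text above:

    # IOM: human NOAEL of 200 mg/day with a twofold uncertainty factor.
    iom_ul = 200 / 2                          # 100 mg/day

    # COT/EVM: dog LOAEL of 50 mg/kg body weight/day, a 3-fold factor for
    # LOAEL-to-NOAEL extrapolation and a 100-fold factor for species differences
    # and human variability, scaled to an assumed 60 kg adult (assumption for
    # illustration only).
    cot_ul = 50 / (3 * 100) * 60              # 10 mg/day

    # SCF/EFSA: uncertainty factor of 4 (2 for a possible effect level, 2 for
    # database deficiencies) giving 25 mg/day, which implies a reference point
    # of about 100 mg/day from the Dalton and Dalton (1987) study.
    scf_ul = 25.0
    implied_reference_point = scf_ul * (2 * 2)

    print(iom_ul, cot_ul, implied_reference_point)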

3. Proposals to improve the quality of micronutrient risk assessment

3.1. Traditional risk assessment approaches

There have been two major reviews of how these approaches can be refined in order to optimise the use of the available toxicity data.

3.1.1. FOSIE

The risk assessment approaches used for different types of compound, including micronutrients, were considered under the FOSIE (Food Safety in Europe: Risk Assessment of Chemicals in the Food and Diet) project undertaken by ILSI (International Life Sciences Institute) Europe as part of an EU Fifth Framework Programme Concerted Action funded by the European Commission. The activity was run as individual theme groups (ITGs) with the final overview activity (ITG-G) chaired by Professor R. Kroes. The report of ITG-G (Renwick et al., 2003) included decision trees for low molecular weight chemicals, micronutrients, macronutrients and whole foods. The decision tree for micronutrients (see Fig. 3) provided a structured framework for the evaluation of available data, using traditional approaches to risk assessment.

3.1.2. FAO/WHO

In May 2005 the FAO and WHO organised a Technical Workshop on Nutrient Risk Assessment (FAO/WHO, 2006), which provided a comprehensive review of evaluation procedures for hazard identification and characterisation and for exposure assessment. A major aim was the development of transparent processes for data evaluation that show how uncertainties have been taken into account during the evaluation. The summary scheme for setting upper intake levels is given in Fig. 4. The scheme, which complements the ILSI decision tree, included a number of critical issues (identified by superscript letters in Fig. 4):

(a) Identify critical effect (for age/sex/lifestage subpopulation). This step is comparable to any risk assessment and involves analysis of all available data to identify the adverse effect (toxicity) that is produced at the lowest level of intake. Normally this would be based on data from studies in humans, and on data that are adequate for risk assessment purposes.


Fig. 4. Scheme for setting upper levels of intake for nutrients. Adapted from the scheme developed by a FAO/WHO Workshop (FAO/WHO, 2006). BI: benchmark intake (equivalent to a benchmark dose; for example an intake giving a pre-specified (e.g. 5%) level of response in the study); LOAEL: lowest-observed-adverse-effect-level; NOAEL: no-observed-adverse-effect-level; UL: upper level of intake. a, b, c, d—see text for explanation.

(b) Quantitative adjustment of the NOAEL/LOAEL/Benchmark Intake (reference point). This step incorporates the objective correction of the reference point using specific data. Adjustment to the reference point could be made for species differences in kinetics or dynamics (if based on animal data—an approach analogous to the use of chemical-specific adjustment factors (IPCS, 2005)) or for differences in bioavailability (if based on a study using a modified form of the nutrient).

(c) Composite uncertainty factor. A composite factor was recommended that would take into account both toxicity at high doses and essentiality. The factor would allow for all uncertainties. Each source of uncertainty would be listed and graded according to its contribution to the overall uncertainty, and the basis of the composite uncertainty factor should be described clearly and explicitly. (This step avoids the multiplication of multiple uncertainty factors, which could result in an intake below the recommended nutritionally adequate intake. In effect, the multiplicative approach to uncertainty factors is replaced by an additive approach; a minimal numerical sketch of the difference is given after this list.)

(d) Adjustment for unstudied age/sex/lifestage subpopulations. This would include correction of a UL derived from adult data to allow for intakes by different age groups, both children and the elderly, or subpopulations with known differences in requirement or susceptibility.
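A minimal numerical sketch of the difference between multiplying individual uncertainty factors and using a single composite factor; all of the numbers are hypothetical, and in practice the composite value would be a matter of expert judgement rather than a formula:

    # Hypothetical reference point and graded sources of uncertainty.
    reference_point = 40.0            # mg/day (illustrative only)
    graded_factors = [2, 3, 2]        # e.g. study duration, LOAEL-to-NOAEL, database gaps

    # Traditional multiplicative treatment: 2 x 3 x 2 = 12.
    multiplicative = 1
    for f in graded_factors:
        multiplicative *= f

    # Composite treatment in the spirit of the FAO/WHO scheme: the sources are
    # listed and graded, and a single overall factor is selected and justified;
    # an additive reading of the same grades gives 2 + 3 + 2 = 7.
    composite = sum(graded_factors)

    print(round(reference_point / multiplicative, 1))   # ~3.3 mg/day
    print(round(reference_point / composite, 1))        # ~5.7 mg/day
    # The composite approach avoids compounding factors to the point where the
    # "tolerable" intake falls below the nutritionally adequate intake.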

3.2. Risk–benefit approaches

ILSI Europe established an Expert Group to evaluate the possibility of integrating the intake–response relationships for essentiality and toxicity into a single risk–benefit analysis. The reference nutrient intake is the intake 2 standard deviations above the estimated average requirement (EAR), which is the intake that is an indicator of adequacy for 50% of the individuals in a life stage or gender group (IOM, 1997). This method requires a coefficient of variation that reflects human variability, and values of 10% (IOM) and 15% (SCF, 1993) have been used. A similar approach was developed (Renwick et al., 2004) for the dose–response relationship for toxicity at high intakes; the use of comparable population distribution models allowed better comparison of the risks associated with deficiency and toxicity. The proposed model used coefficients of variation of 15% and 45% for human variability in susceptibility to deficiency and toxicity, respectively. The latter value was derived from data on human variability in the kinetics of therapeutic drugs; this was justified on the basis that micronutrient-specific pathways are likely to be saturated at high intakes, so that the kinetics would be more dependent on pathways with low specificity but high capacity, and also because this was the most comprehensive and suitable human database available (Dorne et al., 2005).


Fig. 5. Intake–response relationships for the incidences of the absence of benefit or the presence of toxicity (adapted from Renwick et al., 2004). The data are plotted assuming a log-normal distribution, and illustrate the influence of the use of sensitive biomarkers of effect. ED50: dose giving a 50% incidence; CV: coefficient of variation.

The model was based on log-normal distributions, rather than normal distributions, because these better reflected general human variability. A practical advantage of this approach is that the only information necessary is the EAR and the incidence of high-dose toxicity measured at a single level of intake; obviously, more reliable estimates would be obtained if response data were available for a number of different levels of intake. A graphical representation of the modelling is given in Fig. 5, which also shows how different incidences would be obtained for more sensitive biomarkers of either benefit or toxicity. It was proposed (Renwick et al., 2004) that advice to risk managers would be in tabulated form, giving the incidences of both inadequacy and toxicity at different intakes derived from these distributions; a minimal numerical sketch of the calculation is given at the end of this section. This approach was shown to be a workable method using data on selenium (Renwick et al., 2008), but it could not resolve problems due to the adequacy of the available data, or the need for risk managers to weigh the balance of potential health impacts when the incidences of benefit and toxicity differed in severity or nature.

Formal risk–benefit analyses include weighting factors for different types of adverse health effect; examples are disability-adjusted life-years (DALYs) and quality-adjusted life-years (QALYs). Hoekstra et al. (2008) used DALYs in a risk–benefit model that comprised hazard and benefit identification, hazard and benefit characterisation including dose–response functions, exposure assessment, and risk–benefit integration. They analysed the overall risk–benefit impact of the introduction of fortification of bread with folic acid and concluded that the results were critically dependent on assumptions made about the association between folate and colorectal cancer. This analysis highlighted, once again, that the methods available to analyse data are more advanced than the information fed into the models. In addition, DALYs are unlikely to be available for the biomarkers of effect that are being used increasingly to define intake–response relationships for both essentiality and toxicity.
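A minimal sketch of the distributional approach described above (Renwick et al., 2004), assuming log-normal intake–response relationships in which the ED50 is treated as the median of each distribution; the ED50 values below are illustrative and are not taken from the paper:

    from math import erf, log, sqrt

    def incidence(intake: float, ed50: float, cv: float) -> float:
        """Cumulative incidence at a given intake for a log-normal distribution
        with median ed50 and coefficient of variation cv."""
        sigma = sqrt(log(1 + cv ** 2))              # SD of ln(intake)
        return 0.5 * (1 + erf((log(intake) - log(ed50)) / (sigma * sqrt(2))))

    # Illustrative values only (mg/day).
    EAR, CV_DEFICIENCY = 1.0, 0.15                  # median requirement and its CV
    ED50_TOXICITY, CV_TOXICITY = 20.0, 0.45         # median toxic threshold and its CV

    # Reference nutrient intake: conventionally the EAR plus 2 standard deviations.
    rni = EAR * (1 + 2 * CV_DEFICIENCY)             # 1.3 mg/day with a 15% CV

    # Tabulated advice of the kind proposed for risk managers: the incidence of
    # inadequacy (requirement not met) and of toxicity at a range of intakes.
    for intake in (1.0, rni, 2.0, 5.0, 10.0, 20.0):
        inadequacy = 1 - incidence(intake, EAR, CV_DEFICIENCY)
        toxicity = incidence(intake, ED50_TOXICITY, CV_TOXICITY)
        print(round(intake, 1), round(inadequacy, 4), round(toxicity, 4))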

4. Conclusions

Unlike food additives and pesticides, there is no requirement for the formal toxicity testing of high doses of micronutrients, but the availability of high-dose supplements containing vitamins and minerals makes such data increasingly important. However, even if high quality data were available, application of the usual risk assessment procedures, with large uncertainty factors, could decrease the health of the population by the generation of an "acceptable intake" of vitamins and minerals that caused deficiency. In consequence, considerable recent attention has also focussed on better methods for risk assessment of high intakes, and for integrating data on essentiality and toxicity. Recent advances in risk assessment methods for micronutrients will not prove to be of practical benefit until there are better dose–response data on benefits and the adverse effects at high intakes.

Conflict of interest statement

AGR was a member of the UK-EVM, the SCF/EFSA Working Group, the ILSI-Europe Expert Group and attended the FAO/WHO Workshop, but has no other interests. RW was a member of the SCF/EFSA Working Group.

References

Bernstein, A.L., Lobitz, C.S., 1988. A clinical and electrophysiologic study of the treatment of painful diabetic neuropathies with pyridoxine. In: Clinical and Physiological Applications of Vitamin B-6. Alan R Liss Inc, pp. 415–423.
Bradford-Hill, A., 1965. The environment and disease: association or causation? Proc. R. Soc. Med. 58, 295–300.
Brush, M.G., Bennett, T., Hansen, K., 1988. Pyridoxine in the treatment of premenstrual syndrome: a retrospective survey in 630 patients. Br. J. Clin. Prac. 42, 448–452.
COT, 1997. Committee on Toxicity of Chemicals in Food, Consumer Products and the Environment. Annual Report. Department of Health, London.
Council for Responsible Nutrition, 1997. Vitamin and Mineral Safety. Council for Responsible Nutrition, Washington, DC (see http://www.crnusa.org/safetypdfs/000CRNSafetyExecSumm.pdf).


Dalton, K., Dalton, M.J.T., 1987. Characteristics of pyridoxine overdose neuropathy syndrome. Acta Neurol. Scand. 76, 8–11.
Del Tredici, A.M., Bernstein, A.L., Chinn, K., 1985. Carpal tunnel syndrome and vitamin B-6 therapy. In: Vitamin B-6: Its Role in Health and Disease. Alan R Liss Inc, pp. 459–462.
Dorne, J.L.C.M., Walton, K., Renwick, A.G., 2005. Human variability in xenobiotic metabolism and pathway-related uncertainty factors for chemical risk assessment: a review. Food Chem. Toxicol. 43, 203–216.
Ellis, J.M., 1987. Treatment of carpal tunnel syndrome with vitamin B6. Southern Med. J. 80, 882–884.
EFSA, 2006. Tolerable upper intake levels for vitamins and minerals. Scientific Committee on Food. Scientific Panel on Dietetic Products, Nutrition and Allergies. European Food Safety Authority. ISBN 92-9199-014-0.
European Federation of Health Product Manufacturers Associations, 1997. Vitamins and Minerals. A Scientific Evaluation of the Range of Safe Intakes. The Council of Responsible Nutrition, Surrey, United Kingdom.
EVM, 2003. Expert Group on Vitamins and Minerals. Safe Upper Levels for Vitamins and Minerals. Published by the Food Standards Agency of the United Kingdom. ISBN 1-904026-11-7.
FAO/WHO, 2006. A model for establishing upper levels of intake for nutrients and related substances. Report of a Joint FAO/WHO Technical Workshop on Nutrient Risk Assessment, Geneva, 2005. World Health Organisation. ISBN 92 4 159418 7.
Hoekstra, J., Verkaik-Kloosterman, J., Rompelberg, C., van Kranen, H., Zeilmaker, M., Verhagen, H., de Jong, N., 2008. Integrated risk–benefit analyses: method development with folic acid as example. Food Chem. Toxicol. 46, 893–909.
IOM, 1997. Institute of Medicine. Dietary Reference Intakes for Calcium, Phosphorus, Magnesium, Vitamin D, and Fluoride. http://books.nap.edu/openbook.php?isbn=0309063507 (checked April 2008).
IOM, 1998. Institute of Medicine. Dietary Reference Intakes for Thiamin, Riboflavin, Niacin, Vitamin B6, Folate, Vitamin B12, Pantothenic Acid, Biotin, and Choline. http://books.nap.edu/openbook.php?isbn=0309065542 (checked April 2008).
IOM, 2000. Institute of Medicine. Dietary Reference Intakes for Vitamin C, Vitamin E, Selenium, and Carotenoids. http://www.nap.edu/openbook.php?isbn=0309069351 (checked April 2008).

IOM, 2002. Institute of Medicine. Dietary Reference Intakes for Vitamin A, Vitamin K, Arsenic, Boron, Chromium, Copper, Iodine, Iron, Manganese, Molybdenum, Nickel, Silicon, Vanadium, and Zinc. http://books.nap.edu/openbook.php?isbn=0309072794 (checked April 2008).
IPCS, 2005. International Programme on Chemical Safety: Chemical-specific adjustment factors for interspecies differences and human variability. World Health Organisation, Geneva. ISBN 92 4 154678 6.
Mitwalli, A., Blair, G., Oreopoulos, D.G., 1984. Safety of intermediate doses of pyridoxine. Can. Med. Assoc. J. 131, 14.
NORD, 1995. Nordic Council. Risk Evaluation of Essential Trace Elements—Essential versus toxic levels of intake. Nordic Council of Ministers, Copenhagen.
Renwick, A.G., 1993. Data-derived safety factors for the evaluation of food additives and environmental contaminants. Food Addit. Contam. 10, 275–305.
Renwick, A.G., Barlow, S.M., Hertz-Picciotto, I., Boobis, A.R., Dybing, E., Edler, L., Eisenbrand, G., Greig, J.B., Kleiner, J., Lambe, J., Müller, D.J.G., Smith, M.R., Tritscher, A., Tuijtelaars, S., van den Brandt, P.A., Walker, R., Kroes, R., 2003. Risk characterization of chemicals in food and diet. Food Chem. Toxicol. 41, 1211–1271.
Renwick, A.G., Flynn, A., Fletcher, R.J., Müller, D.J.G., Tuijtelaars, S., Verhagen, H., 2004. Risk–benefit analysis of micronutrients. Food Chem. Toxicol. 42, 1903–1922.
Renwick, A.G., 2006. Toxicology of micronutrients: adverse effects and uncertainty. J. Nutr. 136, 493S–501S.
Renwick, A.G., Dragsted, L.O., Fletcher, R.J., Flynn, A., Scott, J.M., Tuijtelaars, S., 2008. Minimising the population risk of micronutrient deficiency and over-consumption: a new approach using selenium as an example. Eur. J. Nutr. 47, 17–25.
SCF, 1993. Reports of the Scientific Committee for Food on Nutrient and Energy Intakes for the EC, 31st Series. Official Publication of the European Community.
Tolis, G., Laliberté, R., Guyda, H., Naftolin, F., 1977. Ineffectiveness of pyridoxine (B6) to alter secretion of growth hormone and prolactin and absence of therapeutic effects on galactorrhea–amenorrhea syndromes. J. Clin. Endocrinol. Metab. 44, 1197–1199.
WHO, 1987. Environmental Health Criteria No. 70. Principles for the Safety Assessment of Food Additives and Contaminants in Food. World Health Organisation, Geneva.