was associated with impaired resistance to subsequent parasitaemia. The COI was estimated from the frequencies of the MSP1 Block 2 K1 and Mad20 alleles (the COIKM) and then classified according to the presence or absence of the MSP1 Block 2 RO33 allele. Whereas the COIKM was positively associated with malaria transmission and age, it was negatively associated with resistance to subsequent infections with parasitaemias of >500 parasites µl−1. Independent of the COIKM, the presence of the RO33 allele in previous infections was associated with resistance to subsequent infections. An interesting finding was that children who were homozygous (SS) or heterozygous (AS) for the sickle cell haemoglobin gene were more resistant to high parasitaemias than children without the gene (AA), although the mean COIKM did not differ significantly between these groups of children.
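For readers unfamiliar with the measure, the COI of a blood sample is commonly scored as the number of genetically distinct parasite clones detected concurrently. As a minimal, illustrative sketch of how a composite measure such as the COIKM might be constructed (a plausible reading of the approach, not necessarily the exact estimator used by Branch et al.):

\[ \mathrm{COI}_{KM} \approx n_{\mathrm{K1}} + n_{\mathrm{Mad20}} \]

where $n_{\mathrm{K1}}$ and $n_{\mathrm{Mad20}}$ are the numbers of distinct MSP1 Block 2 alleles of the K1 and Mad20 allelic families detected in the sample, with the comparatively conserved RO33 family scored separately as present or absent.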
The results of the study could have a significant public health impact because, contrary to common belief, they suggest that protective immunity against high parasitaemia and disease would develop faster under intervention strategies that reduce the COI – something that should be tested in future epidemiological and intervention studies in the field.
1 Branch, O.H. et al. (2001) Plasmodium falciparum genotypes, low complexity of infection, and resistance to subsequent malaria in participants in the Asembo Bay Cohort Project. Infect. Immun. 69, 7783–7792
Richard Reithinger
[email protected]
Unexpected consequences of vaccination against parasites
The idea that pathogens might evolve in response to the selection pressure imposed by a vaccination programme has been around for a while [1]. A major concern is ‘escape mutants’: variants expressing epitopes that vaccinated individuals fail to recognize. This applies particularly to fast-evolving pathogens such as HIV, but it could also be relevant for parasites such as malaria parasites and trypanosomes, which exhibit innate antigenic diversity. Other possible evolutionary responses to vaccination programmes have previously received less attention, but a recent study has now changed this [2]. Gandon et al. used mathematical models to consider what selection pressures vaccines with particular properties might impose on human malaria parasites. They then considered how the parasite might respond to this selection in terms of its ‘virulence’, quantified here as the increase in the mortality rate of infected individuals. The good news is that, in some circumstances, the parasite is expected to evolve lower virulence, thus enhancing the long-term impact of a vaccination programme in reducing total malaria mortality. The not-so-good news is that, in other circumstances, the parasite is expected to evolve higher
virulence, which compromises the effectiveness of vaccination, although overall mortality can still be reduced. The bad news is that, in particular circumstances, the parasite evolves even higher virulence and the net effect is that vaccination increases overall mortality. It is important to understand how such a worrying outcome might arise. The problem largely concerns vaccines that protect against disease rather than against infection. Hosts given such a vaccine have reduced mortality and can transmit less malaria, but more virulent parasites are more successful in vaccinated hosts, so vaccination creates a selection pressure for higher virulence. A corollary is that infection of non-vaccinated hosts results in higher mortality than in the absence of vaccination. The models of Gandon et al. evaluate the balance between these competing epidemiological and evolutionary consequences of vaccination. They suggest that low coverage with a vaccine that provides partial protection against parasite growth within the host can result in a net increase in malaria mortality, given that higher parasite growth rates result in both increased transmissibility and
increased virulence. They also suggest that these ‘evolutionary’ effects could be seen over time scales of a few decades. To some extent, these findings depend on the various assumptions made by the models, but the underlying arguments are general, so the results are likely to be at least qualitatively robust. This paper is not an argument against developing malaria vaccines, but it is an argument for trying to understand exactly how candidate vaccines work, and how different aspects of parasite biology, such as transmissibility and virulence, are functionally related. It is also a welcome reminder of the value of applying evolutionary biological thinking to public health issues, warning us that short-term solutions can sometimes conceal long-term problems.
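The logic can be illustrated with a textbook virulence trade-off model (a deliberately simplified caricature under standard assumptions, not the actual equations of Gandon et al.). Approximate parasite fitness by its basic reproduction number

\[ R_0(\alpha) = \frac{\beta(\alpha)}{\mu + (1-r)\,\alpha + \nu} \]

where $\alpha$ is intrinsic virulence (parasite-induced host mortality), $\beta(\alpha)$ is a saturating transmission rate, $\mu$ is background host mortality, $\nu$ is the recovery rate and $r$ is the efficacy of an anti-disease vaccine that reduces parasite-induced mortality in vaccinated hosts. Taking, for example, $\beta(\alpha) = b\alpha/(c+\alpha)$, the virulence that maximizes $R_0$ is

\[ \alpha^{*} = \sqrt{\frac{c\,(\mu+\nu)}{1-r}} \]

which increases with vaccine efficacy $r$: parasites circulating in protected hosts can ‘afford’ a higher intrinsic virulence, and non-vaccinated hosts then bear the full cost of that higher $\alpha^{*}$.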
1 McLean, A.R. (1995) Vaccination, evolution and changes in the efficacy of vaccines: a theoretical framework. Proc. R. Soc. London B Biol. Sci. 266, 389–393
2 Gandon, S. et al. (2001) Imperfect vaccines and the evolution of pathogen virulence. Nature 414, 751–756
Mark Woolhouse
[email protected]
Are the costs correct?
The debate on whether insecticide-treated bednets (ITNs) or indoor residual house spraying (IRS) should be advocated as the means of vector control for malaria is heating up. This has been partly fuelled by the strong support given to the promotion of ITNs by the Roll Back Malaria initiative, with many countries where IRS has been the mainstay of vector control re-addressing their policy for malaria prevention. Data on the relative cost and effectiveness of these options will be instrumental in guiding this decision-making process. A recent example of such a cost-effectiveness analysis is the work by P. Kamolratanakul et al., in which lambdacyhalothrin-treated nets were compared with DDT spraying in a hyper-endemic malaria area in Thailand [1]. Using an operational framework, the authors showed that treated nets were more cost-effective than spraying, and recommended replacing IRS with ITNs in high-risk areas in Thailand. This analysis is likely to have important
implications for how vector control is undertaken in this country, but is it correct? In fact, the analysis showed that DDT spraying was marginally more effective than ITNs in reducing malaria episodes. ITNs were shown to be more cost-effective simply because they were cheaper. However, on careful reading it becomes evident that a rather important component of the ITN costs was not included: the price of the bednet itself. The authors’ argument for not including this was that a provider’s perspective was being taken, and the Ministry of Public Health would only provide free re-impregnation,
not free nets. Therefore, the cost analysis assesses only net re-impregnation. However, the effectiveness analysis evaluates treated nets, because households were provided with nets if they did not already have them. This means that the cost and effectiveness estimates for the ITN option are not comparable: either effectiveness should have been assessed in communities provided only with re-impregnation, or the costs need to include the cost of providing nets to those who did not have them.
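A hypothetical numerical illustration (the figures below are invented for the purpose of the argument and are not those of the Thai study) shows why the omission matters. The cost-effectiveness ratio of each option is

\[ \mathrm{CER} = \frac{\text{annualized cost per person protected}}{\text{malaria episodes averted per person per year}} \]

If re-impregnation alone were to cost, say, US$0.50 per person per year and DDT spraying US$1.50, with both options averting similar numbers of episodes, ITNs would appear roughly three times as cost-effective. If, however, supplying the nets themselves added an annualized US$1.50 per person, the ITN numerator would rise to US$2.00 and the ranking would reverse. The published comparison effectively used the first, smaller numerator while measuring effectiveness in households that had been given nets.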
Although there could be many other arguments against using DDT spraying, the data provided by Kamolratanakul et al. are not sufficient to dismiss this approach on cost-effectiveness grounds alone. Economic evaluations such as cost-effectiveness analysis are powerful tools for influencing policy makers. It is important that they are carefully scrutinized before being used as the foundation for a change in national policy.
1 Kamolratanakul, P. et al. (2001) Cost-effectiveness and sustainability of lambdacyhalothrin-treated mosquito nets in comparison to DDT spraying for malaria control in western Thailand. Am. J. Trop. Med. Hyg. 65, 279–284
Helen Guyatt
[email protected]
Haemoglobin C for supermen and women
Position 6 on the beta chain of haemoglobin holds a special place for malariologists. The substitution of glutamic acid by valine at this position leads to haemoglobin S, the best example of natural selection in humans, protecting against the risk of developing severe falciparum malaria as well as the risk of death from this disease. However, the situation is less clear for another variant, haemoglobin C, in which lysine rather than valine replaces glutamic acid at the same position. Although invasion of haemoglobin C-containing cells by the malaria parasite, Plasmodium falciparum, might be reduced, and parasite growth in these cells decreased, there are minimal epidemiological data supporting the hypothesis that haemoglobin C protects against malaria. Recently, Modiano et al. have shown in a large case-control study in the West African country of Burkina Faso that haemoglobin C
is associated with a 29% reduction in the risk of clinical malaria in heterozygotes and a startling 93% reduction in homozygotes. The study involved 4348 subjects, including 476 with uncomplicated and 359 with case-definition severe malaria [1]. This particular haemoglobinopathy is believed to be of limited disadvantage to the individual, and the gene for haemoglobin S is not present in the geographical areas covered by this study, where haemoglobin C is common. Hence, these data would support the notion that the gene for this haemoglobinopathy should increase slowly over time. All other things being equal (i.e. in the absence of malaria control), the gene for haemoglobin C should gradually move towards fixation (increasing in frequency and replacing the normal haemoglobin A gene) and exclude the haemoglobin S gene from this part of West Africa.
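The expected dynamics can be sketched with standard single-locus selection theory (an illustrative textbook model; equating protection from clinical malaria with fitness is, of course, a simplification). If $p$ is the frequency of the haemoglobin C allele and the genotype fitnesses are $w_{AA}$, $w_{AC}$ and $w_{CC}$, the allele frequency in the next generation is

\[ p' = \frac{p\,[\,p\,w_{CC} + (1-p)\,w_{AC}\,]}{\bar{w}}, \qquad \bar{w} = p^{2}w_{CC} + 2p(1-p)\,w_{AC} + (1-p)^{2}w_{AA} \]

With the ordering suggested by the Modiano data, $w_{CC} \ge w_{AC} \ge w_{AA}$, selection is directional and $p$ rises monotonically towards fixation. Haemoglobin S, by contrast, shows heterozygote advantage ($w_{AS} > w_{AA}$ but $w_{SS} \ll w_{AS}$ because of sickle cell disease), which maintains a balanced polymorphism at $\hat{p}_{S} = (w_{AS}-w_{AA})/(2w_{AS}-w_{AA}-w_{SS})$ rather than driving the allele to fixation.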
This argument sounds reasonable but, unfortunately, the authors did not include death as an end point, nor did the study detect a significant reduction in the occurrence of severe malaria in individuals possessing haemoglobin C. The authors suggest that the protection provided by the haemoglobin C homozygote is of the same order as that provided by the sickle cell heterozygote. They also argue that, because the site of their study was one of the major university teaching hospitals, they might have included the more severe uncomplicated malarias, thus reducing the likelihood of detecting significant differences between the uncomplicated and severe groups. We have yet to see this haemoglobinopathy increase in frequency in this West African niche. Time will tell.
1 Modiano, D. et al. (2001) Haemoglobin C protects against clinical Plasmodium falciparum malaria. Nature 414, 305–307
Geoffrey Pasvol
[email protected]
In Brief
Alan Guttmacher Institute wins Gates Foundation grant
The Alan Guttmacher Institute (AGI) has been awarded a US$7.5 million grant from the Bill and Melinda Gates Foundation for improving programs and policies regarding youth, HIV and AIDS in sub-Saharan Africa. Working with research partners in four countries, AGI will use these funds to perform a unique and detailed five-year investigation of why young people engage
in high-risk sexual behaviors, and why existing programs and policies may be falling short in helping to prevent the spread of infection. The grant is extremely timely because recent evidence shows that adolescents have a >35% chance of dying of AIDS in some southern and eastern African countries. AGI (www.guttmacher.org) is a non-profit organization focused on sexual and reproductive health research, policy analysis and public education, with offices in New York and Washington DC, USA. SHK
NKT cells offer all-round protection
The response of natural killer T (NKT) cells to infection with the protozoan parasite Trypanosoma cruzi can affect the level of parasitaemia, the tissue parasite burden, the intensity of chronic inflammatory responses and possibly the outcome of Chagas disease, according to a new study [(2002) Infect. Immun. 70, 36–48]. S. Kahn and colleagues studied the role of NKT cells in normal and