Distinguishing between response to HIV vaccine and response to HIV

THE LANCET

COMMENTARY

Distinguishing between response to HIV vaccine and response to HIV

See page 256

Demonstrating the presence of an infection has become easier with developments in antigen and genome detection assays, but the multiplicity of tests can, paradoxically, make it difficult to prove that an infection is absent. This difficulty is exemplified by HIV infection, which can be identified (in rising order of cost) by serology, antigen detection, DNA detection by PCR, RNA detection by RT-PCR or bDNA, and viral isolation. The specificity of each assay has been determined, but every repeat assay carries a small but cumulative risk of false positivity. In this issue of The Lancet, David Schwartz and colleagues report a false-positive RT-PCR result in a very particular circumstance, that of an HIV-vaccine clinical trial. This finding raises critical questions for the conduct of all phase-III HIV-vaccine trials, and for the design of vaccine constructs.

After 3 years in the scientific wilderness, phase III trials of putative HIV vaccines have been placed back on the agenda by President Clinton.1 There is now a consensus of sorts that, for a vaccine to be able to prevent HIV infection, it should induce both humoral and cellular immune responses.2,3 The former are defined in terms of neutralising antibodies, and the latter by HIV-specific, HLA-class-I-restricted CD8 cytotoxic T lymphocytes (CTL). The induction of the full neutralisation response is still a problem; vaccine-induced antiserum in animals or in human beings will neutralise T-cell-line-adapted (TCLA) HIV-1 isolates, sometimes to high titre, but will not neutralise primary or “field” isolates.
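The cumulative-risk point above can be made concrete with a short sketch. Assuming independent assays and a hypothetical per-assay specificity of 99.9% (a figure chosen for illustration, not taken from the commentary), the probability of at least one false positive across n repeat tests grows as 1 − s^n:

```python
# Illustrative sketch only: how a small per-assay false-positive rate
# compounds across repeated independent tests. The 0.999 specificity
# is a hypothetical value for illustration.

def cumulative_false_positive_risk(specificity: float, n_tests: int) -> float:
    """Probability of at least one false positive in n independent assays."""
    return 1.0 - specificity ** n_tests

# Risk rises steadily as tests are repeated:
for n in (1, 5, 20):
    risk = cumulative_false_positive_risk(0.999, n)
    print(f"{n:2d} tests -> risk {risk:.4f}")
```

Even at very high specificity per assay, repeated testing of uninfected (or vaccinated) individuals makes occasional false positives an expected event rather than a rarity.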
In natural HIV infection, by contrast, the antiserum induced will neutralise both field and TCLA isolates; indeed, human HIV-positive antiserum neutralises primary isolates from all HIV-1 genetic subtypes in a group-specific manner.4 Unfortunately, the group-specific epitope has not been defined, and humoral responses to this epitope have not yet been induced by any vaccine candidate. However, several vaccines now induce CTL responses to HIV; mostly, these vaccines are based on techniques that allow a degree of replication of the construct, leading to intracellular processing of the viral antigens, and hence the expression of peptide epitopes in association with HLA class I. These approaches include live viral vectors carrying HIV genes, vaccinia-gp160 and avipox-gp160, both used with a recombinant gp120 booster, and, most recently, naked DNA.5 Whereas the humoral response is directed at env, particularly gp120, CTL epitopes are commoner on gag, pol, and nef.6 Thus, vaccines to induce broad humoral and cellular immunity are likely to contain not just gp120 but also some, all, or more than p17, p24, gp41, nef, and pol.

In the light of the complex antigens likely to enter trial, the question of correctly identifying and discriminating those individuals who are immunised against HIV from those who actually acquire an HIV infection becomes critical. One of the earliest ethical concerns about HIV-vaccine trials was that those immunised might become HIV-1 “seropositive”, and hence suffer discrimination. When vaccines contained only gp120, this was a surmountable problem. HIV diagnosis is still largely determined by serology, both by screening and confirmatory ELISA assays and by western blot. Most ELISAs use the extraordinarily well-conserved sequence in gp41, the AVERY epitope, which is so immunodominant that almost all HIV-positive individuals produce binding antibodies.7 The problem of excluding this gp41 epitope from HIV vaccines has been recognised, and the epitope does not seem to be relevant for protective immunity. Currently, the recombinant monomeric gp120 vaccines exclude gp41, and the gp160 constructs omit the AVERY epitope. However, the elusive group-specific neutralisation epitope on gp160 will undoubtedly lead scientists to attempt to make oligomeric constructs, which will probably not be conformationally correct without this critical immunodominant region. The great majority of putative HIV vaccines to have entered clinical trial to date have been either gp120 or gp160 constructs, which give clear isolated bands on western blot. Newer constructs with multiple structural epitopes will lead to more complex western-blot bands, which will be practically indistinguishable from true infection.

Genome-based HIV detection at first sight seems an attractive confirmatory assay for distinguishing between vaccinees and infected individuals. The first choice should be DNA detection by PCR. Proviral DNA is invariably detectable in HIV-positive people, if sufficient cells are examined, even if the viral load is low; primers in pol and the long terminal repeats enable detection of almost all HIV-1 variants.
By contrast, HIV RNA may be present at too low a level for detection by RT-PCR, even with the most sensitive assay, and the commercial assays are calibrated against B-subtype sequences. Viral isolation is too insensitive and specialised to be considered for discriminatory purposes.

The problems encountered by Schwartz and colleagues began with a cluster of false-positive results by RT-PCR, a well-recognised phenomenon. However, the issues raised go beyond the type of PCR assay to be used for discrimination of vaccinees. The simplest screening assay to separate vaccine response from infection response is likely to remain a serological one, and the gp41 AVERY epitope is still the best option, since no other epitope anywhere in the virus is so universally recognised by infected individuals. As traditional western blots are not likely to be helpful discriminators with complex antigen vaccines, visual reading of bands in a line-immunoprecipitation-assay format to discriminate presence or absence of the gp41 epitope may be the cheapest and most effective solution. However, it is imperative that this immunodominant epitope be deleted from all current and future vaccine constructs. UNAIDS is well placed to secure international agreement on this issue, which must be in place before the start of phase III trials.

Jonathan Weber
Department of Communicable Diseases and Genitourinary Medicine, Imperial College School of Medicine at St Mary’s, London W2 1PG, UK

1 Office of the President. Commencement address at Morgan State University, May 18, 1997.
2 Bloom B. A perspective on AIDS vaccines. Science 1996; 272: 1888–90.
3 Haynes B, Pantaleo G, Fauci A. Towards an understanding of the correlates of protective immunity to HIV infection. Science 1996; 271: 324–28.
4 Weber JN, Fenyö E-M, Beddows S, Kaleebu P, Björndal A, and the WHO Networks for HIV Isolation & Characterisation. Neutralisation serotypes of human immunodeficiency virus type-1 field isolates are not predicted by genetic subtype. J Virol 1996; 70: 7827–32.
5 Boyer J, Ugen K, Wang B, et al. Protection of chimpanzees from high dose heterologous challenge by DNA vaccination. Nat Med 1997; 3: 526–32.
6 Rowland-Jones S, McMichael A. Role of CTL in HIV pathogenesis. Curr Opin Immunol 1995; 7: 448.
7 Weber J, Clapham PR, Weiss RA, et al. Human immunodeficiency virus infection in two cohorts of homosexual men: neutralising sera and associations of anti-gag antibody with prognosis. Lancet 1987; i: 119–22.

Vol 350 • July 26, 1997

Molecular staging of cancer: is it time?

See page 264

Occult locoregional spread of cancer is a common cause of local recurrence and an early indicator of subclinical micrometastatic dissemination. Accurate staging of primary tumours is therefore extremely important and may result in either more extensive primary therapy to reduce the risk of local recurrence or systemic adjuvant therapy with the aim of eliminating micrometastasis. Currently, staging is histopathological, relies on the identification of malignant cells at surgical margins or in regional lymph nodes, and is often inaccurate.

Cancer results from the progressive accumulation of mutations in genes of somatic cells. These mutations are powerful molecular markers that could potentially be used clinically for early cancer diagnosis, staging, improved ability to define prognosis, and better treatment selection.1 The first demonstrations came with reports of the detection, in the urine of patients with bladder cancer and in the stool of patients with colorectal cancer, of mutations identical to those present in the primary tumours.2,3 These findings have since been confirmed in several cancers by other groups. In squamous-cell carcinoma of the head and neck, a molecular assay enabled detection of small populations of infiltrating tumour cells harbouring mutations of the tumour-suppressor gene p53 specific for the primary tumour in surgical margins and cervical lymph nodes that had been assessed histopathologically as negative.4 More importantly, patients with “p53-mutation”-positive margins seemed to have a substantially increased risk of local recurrence.4

In the current issue of The Lancet, a study by Svetlana Pack and colleagues describes the detection of gene deletion in single metastatic tumour cells in lymph-node tissue by fluorescent in-situ hybridisation (FISH) from a patient with renal-cell carcinoma. The immediate implications seem clear: molecular assays can detect small clusters of cancer cells that would otherwise be missed by routine histopathology. This point raises a more relevant and far-reaching question for the clinician: will the better definition of locoregional and micrometastatic spread by molecular staging be an indication to change patient management? The answer will come from carefully designed clinical trials. In the case of head-and-neck cancer, a prospective multicentre trial is in progress in the USA to evaluate the efficacy of molecular staging and its impact on surgical practice. One of the human cancers likely to be studied in the near future is node-negative breast cancer. Could the detection of rare infiltrating cancer cells in axillary lymph nodes be an independent predictor of relapse and an indication for adjuvant therapy? And, if so, could the absence of such cells make adjuvant chemotherapy unnecessary? Time will tell.

Carlos Caldas
CRC Academic Department of Oncology, University of Cambridge School of Clinical Medicine, Cambridge CB2 2QQ, UK

1 Caldas C, Ponder BAJ. Cancer genes and molecular oncology in the clinic. Lancet 1997; 349 (SII): 16–18.
2 Sidransky D, Von Eschenbach A, Tsay YC, et al. Identification of p53 gene mutations in bladder cancers and urine samples. Science 1991; 252: 706–09.
3 Sidransky D, Tokino T, Hamilton SR, et al. Identification of ras oncogene mutations in the stool of patients with curable colorectal cancer. Science 1992; 256: 102–05.
4 Brennan JA, Mao L, Hruban RH, et al. Molecular assessment of histopathological staging in squamous-cell carcinoma of the head and neck. N Engl J Med 1995; 332: 429–35.

Explaining inequalities in coronary heart disease

See page 235

Social class is an important risk factor in determining health and illness. There is an inverse relation between social class and most health outcomes almost everywhere in the industrialised world and throughout the lifespan.1 Until recently, epidemiologists have not paid detailed attention to social class because it is not clear what can be done about it. People at the lowest social-class level are poor and have less adequate education, nutrition, and housing: all of these are interrelated and difficult to modify. It is not very helpful to identify a risk factor that, short of a revolution, seems difficult to change.

However, earlier reports from the Whitehall study had shown that people in the employment grade just below the top grade had higher rates of coronary heart disease than those at the very top: these people were not poor, nor were their education, nutrition, and housing inadequate. What could account for this disease gradient even in relatively well-off groups? One hypothesis was that those lower in the employment grades had progressively less control over the forces affecting their lives.2,3 Control has been studied in the workplace. The work of Karasek and Theorell4 and others showed that control in the workplace is an independent risk factor for coronary heart disease. In the Whitehall II cohort, Bosma and colleagues5 showed that, even after controlling for other