Technological Forecasting & Social Change 75 (2008) 905 – 932
Toxicogenomic predictive modeling: Emerging opportunities for more efficient drug discovery and development

Arsia Amir-Aslani ⁎

PRISM-Sorbonne, School of Management (UFR de Gestion 06), Université de Paris 1 Panthéon Sorbonne, 17, Rue de la Sorbonne, 75231 Paris Cedex 05, France

Received 26 March 2007; received in revised form 11 October 2007; accepted 12 October 2007
Abstract

Drug discovery companies are coming under increasing pressure to prove the long-term safety of their products more precisely, and to provide more data on them. As highlighted by Vioxx, for many drugs the existence of adverse drug reactions (ADRs) becomes apparent only once the compound has been extensively prescribed and a population base of considerable size has been exposed to the therapeutic agent. The ability to terminate the clinical development of a non-viable drug candidate as early as possible has a large financial impact for a pharmaceutical company. Knowledge regarding the interactions of chemicals, genes, and cell function can improve chemical risk analyses. These efforts will be aided by continued improvement and expansion of predictive toxicology, in combination with a range of mutually supportive technologies, to develop strategies that facilitate better and more focused decision-making throughout the drug discovery process. Failure to implement such an approach causes companies to withdraw drugs from development or from the market. This not only has human health consequences but also a negative economic impact on the industry. As such, one of the major challenges in drug discovery is to accurately predict which new drugs will be associated with a significant incidence of ADRs. The ability to produce information on potential toxicity early in the discovery phase will become the basis for judging whether a drug candidate merits further development.
© 2007 Elsevier Inc. All rights reserved.

Keywords: Drug discovery; R&D productivity; Toxicogenomics; Technology integration; Predictive modeling; Biotechnology; Technology roadmapping
⁎ Araxes Associates, 374 Rue de Vaugirard, 75015 Paris, France. E-mail address: [email protected].

doi:10.1016/j.techfore.2007.10.002
1. Introduction

Historically, competitive forces in the pharmaceutical industry have obliged companies to focus on blockbuster drugs (drugs with sales exceeding USD 1 billion) and to maximize profits through heavy marketing. This strategy has allowed companies to secure adequate resources to offset the cost of expensive research and development (R&D) programs. In effect, small-molecule drug discovery is an increasingly costly and time-consuming process. In 2003 it cost between USD 500 and 800 million to develop a new pharmaceutical product, and 75% of that figure represents risk in the form of products that fail [1,2]. Bains [3] has reported that these costs are shared fairly evenly between discovery and preclinical studies (39.7%) and clinical development (43.9%), with the remaining 6.4% devoted to the registration/approval phase. The main cause of this inefficiency is that almost half of all drug candidates fail during the development phase.

Beyond the costs and uncertainty of drug development, the length of time it takes for a drug candidate to reach the market is also of major concern to the drug discovery industry. The total development time for a successful drug candidate is around 12.5 years [3]. More specifically, major pharmaceutical companies spend on average between USD 40 and 60 million annually per compound under investigation, or between USD 109,000 and 175,000 per day, until that potential drug candidate reaches the market. Any technology capable of shortening the drug development time frame will therefore have a considerable impact on a company's R&D cost structure. Furthermore, any shortening of this time frame will help maximize profits by extending a product's life cycle through a longer commercialization period. An innovative therapeutic product is protected by patents for roughly 20 years, of which 12.5 years are economically wasted while the product progresses through development. This leaves the pharmaceutical company basically 7.5 years of commercialization. Considering that it takes about 3 years for a blockbuster drug to reach peak sales of over a billion dollars, the company is left with roughly 4.5 years to commercialize its therapeutic product aggressively. Consequently, for a drug generating a billion dollars annually, each extra day of commercialization secures about USD 2.7 million of sales (see the sketch below). Thus, by reducing the time it takes to bring an effective drug to market, a company can benefit greatly from an unchallenged market position and extend the period of patent-protected sales.

Failure of compounds in late preclinical development, in the clinic or, even worse, on the market represents a very important economic burden for the pharmaceutical industry. A major contributor to the high attrition rate in drug development continues to be unanticipated toxicity. As a result, the drug discovery industry is seeking sophisticated new approaches and technologies for the discovery and design of new drugs based on an improved understanding of the basic mechanisms of toxicity. New technologies are expected to come to the forefront of the industry to reduce development time and to increase the quality of new drugs considerably.
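To make the arithmetic above concrete, the following Python sketch computes the patent-protected commercialization window and the sales value of a single extra day for a blockbuster drug. The figures are the illustrative averages cited in the text, not exact values for any particular product.

```python
# Back-of-the-envelope economics of drug development time,
# using the averages cited above (illustrative, not exact).

PATENT_LIFE_YEARS = 20.0        # typical patent protection period
DEVELOPMENT_YEARS = 12.5        # average time to reach the market [3]
YEARS_TO_PEAK_SALES = 3.0       # ramp-up to peak sales
PEAK_ANNUAL_SALES_USD = 1.0e9   # blockbuster threshold

# Patent-protected time left once the drug is approved
commercial_window = PATENT_LIFE_YEARS - DEVELOPMENT_YEARS    # 7.5 years
aggressive_window = commercial_window - YEARS_TO_PEAK_SALES  # 4.5 years

# Value of one extra day of commercialization at peak sales
value_per_day = PEAK_ANNUAL_SALES_USD / 365                  # ~USD 2.7 million

print(f"Commercial window:    {commercial_window:.1f} years")
print(f"Window at peak sales: {aggressive_window:.1f} years")
print(f"Sales per extra day:  USD {value_per_day / 1e6:.1f} million")
```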
One of these technologies is toxicogenomics, which aims to achieve better risk assessment by helping to identify the safety profile of candidates earlier in the drug discovery process. In effect, by providing better methods for monitoring clinical trials, toxicogenomics will render the identification of lead compounds more efficient. With the combined attrition rate of new drug candidates in the pipeline and new drugs in the marketplace standing at 96%, the information that toxicogenomics can provide concerning toxic mechanisms, safety and efficacy will ultimately reduce the cost of lead drug candidates
and bring safer, more effective drugs to market. By streamlining the process of bringing a product to market, toxicogenomics presents enormous opportunities for the drug discovery and development process. Furthermore, toxicogenomics will help increase our understanding of the relationship between genetic variability and individual response to pharmaceutical agents. In effect, until now the blockbuster approach has been driven by mass-marketed products that were not differentiated across the patient population. With the help of novel technologies such as toxicogenomics it will become possible to stratify patient populations into particular subpopulations. This will ultimately result in a dramatic reshaping of market segments, changing the cost and outcome of clinical trials, creating opportunities for novel pricing strategies and producing more cost-effective drugs.

2. Drug discovery environment

Pharmaceutical companies are often engaged in a race to be the first to market with a specific type of drug for a particular disease. For example, Viagra monopolized the market for erectile dysfunction treatment for years, until Levitra and Cialis were approved. The race to shorten time-to-market has created a dangerous and highly uncertain environment. In effect, numerous clinical trials need to be completed for an average drug before approval is granted. However, limited studies (both in duration and in group size) are more likely to miss a particular side-effect or potential risk than studies conducted on large test groups over an extended period of time. Also, the institution at the US FDA of an industry-funded "fast track" drug approval process has created a situation where drugs are being moved faster towards the market. Fast-track approvals, which are usually based on short-term testing of small test groups, could potentially lead to disastrous consequences, particularly for drugs designed for chronic diseases.

Despite the complex nature of today's drugs, the Food and Drug Administration (FDA) is processing new drug applications at an increasing pace [4]. From 1993 to 1999 the FDA approved 232 drugs known as "new molecular entities," whereas during the previous 7 years it approved only 163. According to the Center for Medicines Research International, the FDA takes an average of 1.3 years to approve a new drug. As a consequence, the public may be exposed to a drug for years before ongoing long-term studies disclose its dangerous side-effects. Thus, on September 29, 2004, in the largest drug recall in history, Merck & Co. withdrew its popular arthritis drug Vioxx from the market, acknowledging that it caused an increased risk of stroke, heart attack and death. Since its approval in 1999, Vioxx had been used by two million people and earned Merck USD 2.5 billion in 2003 alone. The Vioxx case is the most visible and largest recall, yet several other drugs have been withdrawn from the market over safety issues since 1993 (Table 1). Moreover, since the late 1990s the number of drugs given a "black box" label (a warning of side-effects that could lead to death or serious injury) has increased considerably [5]. As highlighted by the Vioxx case, addressing adverse drug reactions (ADRs) is a major problem from a public health perspective as well as for the development of new medicines.
Lazarou et al. [6] have shown that ADRs rank between the fourth and sixth leading causes of death in the United States, accounting for more than 100,000 deaths in 1994. Furthermore, Lasser et al. [5] concluded that one in five new drugs has unrecognized ADRs that do not show up until after the drug has been approved. Their study analyzed 548 drugs approved from 1975 through 1999 and discovered that 56 of them were later given a serious side-effect warning or even taken off the market completely. The study focused specifically on "black box" warnings, which highlight the most serious side-effects and were added to the drugs' labels after release. If one of the more life-threatening side-effects is not detected prior to release, it can cause major problems and create a serious hazard for the general public once the drug is on the market.
Table 1
Major drug withdrawals between 1997 and 2005

| Drug | Date of approval | Date of withdrawal | Adverse drug reactions |
| --- | --- | --- | --- |
| Pondimin (Fenfluramine) | 1973 | 1997 | Risk of heart valve abnormalities |
| Redux (Dexfenfluramine) | 1996 | 1997 | Risk of heart valve abnormalities |
| Seldane (Terfenadine) | 1985 | 1998 | Risk of fatal heart rhythm abnormalities |
| Duract (Bromfenac) | 1997 | 1998 | Severe hepatic reactions, potentially fatal fulminant hepatitis and liver failure, with some cases requiring transplantation |
| Posicor (Mibefradil) | 1997 | 1998 | Heart conditions including arrhythmias and low blood pressure |
| Raxar (Grepafloxacin) | 1997 | 1999 | Severe cardiovascular events among patients |
| Hismanal (Astemizole) | 1998 | 1999 | Serious cardiac side-effects, involving changes in heart rhythm |
| Lotronex (Alosetron) | 2000 | 2000 | Intestinal damage resulting from ischemic colitis, severely obstructed or ruptured bowels, and death |
| Propulsid (Cisapride) | 1993 | 2000 | Heartbeat interruption and arrhythmia |
| Rezulin (Troglitazone) | 1997 | 2000 | Severe liver toxicity |
| Baycol (Cerivastatin) | 1997 | 2001 | Fatal rhabdomyolysis, a severe muscle adverse reaction |
| Raplon (Rapacuronium) | 1999 | 2001 | Inability to breathe normally that can lead to permanent injury or death |
| Vioxx (Rofecoxib) | 1999 | 2004 | Increased risk of heart attack and stroke |
| Palladone (Hydromorphone) | 2004 | 2005 | Potential fatalities when taken with alcohol |
| Bextra (Valdecoxib) | 2001 | 2005 | Increased risk of heart attack and stroke |

Because of human health and safety concerns, a long list of drugs has been withdrawn from the market.
The result can be the premature approval and marketing of dangerous drugs that are ultimately found to pose far greater risks than any benefit they may have had. Table 2 highlights a number of recent FDA warnings regarding marketed drugs that resulted in labelling changes about potential risks of adverse reactions in patient populations.

Table 2
Recent FDA warnings

| Drug | Marketed | FDA alert | Warning |
| --- | --- | --- | --- |
| Pioglitazone HCl | Actos, Actoplus Met, and Duetact | 8/2007 | May cause or exacerbate heart failure, particularly in certain patient populations |
| Rosiglitazone maleate | Avandia, Avandamet, and Avandaryl | 8/2007 | May cause or exacerbate heart failure, particularly in certain patient populations |
| Tiagabine hydrochloride | Gabitril | 2/2005 | Risk of seizures in patients without epilepsy being treated with this drug |
| Bevacizumab | Avastin | 4/2007 | Tracheoesophageal (TE) fistula formation in a recent clinical study in patients with limited-stage small cell lung cancer (SCLC) |
| Linezolid | Zyvox | 3/2007 | Patients treated with linezolid had a higher chance of death than did patients treated with any comparator antibiotic, and the chance of death was related to the type of organism causing the infection |
| Omalizumab | Xolair | 2/2007 | Serious (immediate and delayed) and life-threatening allergic reactions (anaphylaxis) after treatment, usually occurring within 2 h of receiving a Xolair subcutaneous injection |
Despite concern over such potential risks, the balance between the drugs' benefits and risks has supported their continued use. In such cases, when potential risks exist, target populations at risk should be identified in order to avoid the occurrence of life-threatening side-effects that would ultimately lead to drug withdrawals.

3. From risk averse to innovative and safe

The fierce nature of the competition has also flooded the market with multiple versions of the same class of drug. Products such as Vioxx, Celebrex, Bextra, Arcoxia, and Prexige are all COX-2 inhibitors. Similarly, Mevacor, Zocor, Pravachol and Lipitor are all members of the same HMG-CoA reductase inhibitor family of cholesterol-reducing drugs. Mevacor was FDA approved and released in the U.S. in September 1987, Pravachol in October 1991 and Zocor in December 1991. The cholesterol-lowering drugs that followed the first in this class, Mevacor, constitute follow-up drugs and have capitalized on its success in the marketplace. In other words, once the first breakthrough discovery of a new pharmacological activity for a new molecule is made, subsequent years see the emergence of a host of drug candidates from the same chemical class and possessing similar pharmacological profiles. Considering that the success rate in the discovery of new chemical entities with fundamentally new chemical and biological profiles of activity is very low, the development of such follow-up drugs has been motivated essentially by commercial considerations [7]. There has been growing criticism that too many molecules have been developed and approved with similar chemical structures and the same pharmacological profile, with very little to distinguish them from each other in terms of their therapeutic utility.

Furthermore, since the Vioxx withdrawal the FDA has been under pressure to raise its threshold for approving new drugs when safe, effective therapies already exist. The problems with Avandia have reinforced these pressures. Conventionally, the regulatory agencies are not obliged to consider better efficacy over existing drugs as a criterion for approval; rather, they require only the establishment of efficacy and safety of the new drug over a placebo. However, the demand for evidence-based therapeutics by various stakeholders has obliged the regulatory authorities to adopt a novel and more appropriate paradigm for drug approval. The risk/benefit analysis of Isentress (raltegravir), for example, led the Antiviral Drugs Advisory Committee of the FDA to vote unanimously to recommend accelerated FDA approval of the compound in combination with other antiretroviral therapy (ART) for the treatment of HIV infection in treatment-experienced patients with ongoing viral replication despite existing therapy. Even though the potential risks of Isentress include rash, liver injuries, muscle problems and cancer, the compound provides a new treatment option for combating the HIV epidemic. In effect, current drugs on the market attack two enzymes involved in the HIV life cycle, reverse transcriptase and protease; Isentress is the only drug to target the third enzyme, known as integrase. Also, safety data from controlled clinical trials have shown a potentially significant increase in the risk of heart attack and heart-related deaths in patients taking Avandia (rosiglitazone), a drug approved to treat type 2 diabetes.
However, according to the FDA (FDA News, May 21, 2007), other published and unpublished data from long-term clinical trials of Avandia provide contradictory evidence about the risks in patients treated with the drug. Since the drug was approved, the FDA has been monitoring several heart-related adverse events. The most recent labelling change for Avandia, in August 2007, included a new warning about a potential increase in heart attacks and heart-related chest pain in certain patient populations using the drug. This new warning was based on the results of a controlled clinical trial in patients with existing congestive heart failure [8]. Notably, the reported meta-analysis concluded that
rosiglitazone was associated with a significant increase in the risk of myocardial infarction and with an increase in the risk of death from cardiovascular causes. However, a panel of outside experts recommended to the FDA on July 30, 2007 that Avandia should be kept on the market despite concern over its heart risks, highlighting that the balance between the drug's benefits and risks supported its continued use in the United States.

In order to respond to such challenges, drug discovery companies have no alternative but to ensure that products under development offer a significant advantage and a safer profile than existing therapies, or are targeted at unmet medical needs. New technologies are now coming to the forefront of the industry to address such issues. However, much of the current pharmaceutical paradigm concentrates efforts on enhancing efficacy. This is inherently inefficient, since increased potency does not imply reduced toxicity. Indeed, more potent compounds might lead to more severe ADRs, contributing to the likelihood of failure due to toxicity. Thus, there is a need to integrate better toxicology testing into the earlier phases of the drug discovery process. Ideally, the toxicity of drug candidates would be detected at the time of lead selection, before expensive clinical testing begins. In effect, novel technologies such as toxicogenomics will help curb adverse health effects resulting from drug toxicity. This will lead to higher productivity: increasing attrition at the preclinical stage of drug discovery will help lower R&D costs and improve the outcome of clinical trials.

4. Biotechnology's contribution to the drug discovery process

Pharmaceutical companies are very much aware of the contribution of biotechnology companies in developing cost-effective and innovative drug candidates. In effect, efforts by biotechnology companies have resulted in a dramatic change in the drug discovery and development paradigm. As such, therapeutics derived from biotechnology have gained considerable advantages in an increasingly competitive market. Knowledge-based drug discovery is research that integrates data from genomics, proteomics and expression experiments to identify drug targets, create pathway and network models, and choose lead compounds. Advances in the understanding of the molecular interactions that underlie diseases and the availability of powerful informatics tools have spurred the growth of structure-based, or rational, drug design. Sophisticated new approaches and technologies in the discovery and design of new drugs are replacing the traditional methods of discovery and development. Gleevec, approved by the FDA in 2001, which inhibits the BCR-ABL kinase that causes chronic myelogenous leukemia (CML), is a prime example of such targeted agents. According to innovation.org (www.innovation.org) there are currently over 2300 novel drug candidates (both chemical and biological compounds) in development, the great majority of them being developed by early-stage biotechnology start-ups.
According to PhRMA, more than 300 drug candidates are currently in development to treat or prevent numerous rare diseases; 646 to treat patients with cancer; 146 for cardiovascular diseases and stroke; 240 for neurological conditions; 197 for mental health; more than 900 to treat the diseases of aging; and 77 to treat HIV/AIDS. This has resulted in a situation where biotechnology companies have become essentially the product innovators and pharmaceutical companies the market deliverers. At least in the short run, biotechnology companies are becoming suppliers of promising drug candidates to the pharmaceutical manufacturers. Even though many of the compounds currently in development will ultimately fail at the various stages of the drug development process, one cannot deny that developments in biotechnology have allowed
the drug discovery industry to move from serendipity-based research towards rational, evidence-based approaches.
5. Understanding genomics and its impact on drug discovery: a major scientific breakthrough

"The effort to decipher the human genome will be the scientific breakthrough of the century — perhaps of all time. We have a profound responsibility to ensure that the life-saving benefits of any cutting-edge research are available to all human beings." President Bill Clinton, Wednesday, March 14, 2000.

Ever since the early 1990s, scientific researchers using increasingly sophisticated techniques have been engaged in unraveling the vast and complex code of the human genome: the totality of genetic information to be found in the DNA sequences on the chromosomes in human cells. The science of genomics initially referred mainly to the study of the mammalian genome, specifically the mapping, sequencing, and analysis of its genes [9–11]. The scope soon evolved, focusing not just on the genes' structure but on their function as well. More recently, the scope of the term broadened further, toward the exploitation of genome knowledge, particularly for human therapeutics.

Prior to the revolution in genomics, the pharmaceutical industry was able to exploit about 500 possible drug targets. With the mapping of the human genome, the number of potential targets has risen dramatically [12]. Experts believe that knowledge derived from the sequencing of the human genome will yield 2000–5000 novel drug targets. This up-to-10-fold increase in the number of new targets will have tremendous implications for R&D. Notably, advances in the understanding of the molecular interactions that underlie diseases and the availability of powerful informatics tools have spurred the growth of structure-based, or rational, drug design, whereby drugs can be designed by analyzing the structure of the molecular target and its active site. However, although genetic engineering has unraveled many detailed disease mechanisms over the past two decades, the translation of this knowledge into profitable drug development has been painfully slow and burdened with many failures. This problem is now starting to be recognized by policy makers. The 2004 FDA white paper "Innovation or Stagnation" explicitly states that today's revolution in biomedical science has raised new hope for the prevention, treatment and cure of serious illnesses, but that there is a growing concern that many of the new basic discoveries may not quickly yield more effective, more affordable and safe medical products for patients (Fig. 1) [13]. In response, the FDA is advocating much greater emphasis on translational and critical path research focused on the clinical assessment of novel products. In effect, the sequencing of the human genome has not only considerably simplified the search for genes that predispose people to certain diseases but has also provided the drug discovery industry with a wide spectrum of new opportunities for the discovery of innovative drugs and improved treatments [14–16].

6. Genomics and toxicity: integrating toxicogenomics in drug discovery

To ensure product safety, regulatory agencies require classical animal toxicology studies to show that a product is safe enough for early human testing, and clinical studies to demonstrate safety for commercial distribution. Even so, these rigorous toxicology studies have not yet been able to avoid
Fig. 1. The technology maze of the drug discovery process. Sophisticated new approaches and technologies in the discovery and design of new drugs are replacing the traditional methods of discovery and development. Each technology is typically only part of the larger set of activities in the R&D value delivery system in which it participates. Interaction and integration across complementary technologies is critical to successful projects. Grey background: biology/robotics/informatics related; white background: chemistry/robotics/informatics related; dashed line: traditional approaches to toxicity.
missing safety problems during costly drug development that only show up during commercial distribution. There is clearly room for improvement of toxicology evaluation methods and procedures, including at these late stages of pharmaceutical development. Conventional methods for the evaluation of drug toxicity are not only time-consuming but also cost-intensive. Thus, employing technologies that improve the early identification of drugs likely to fail in clinical trials is an essential strategy for addressing high failure rates in the development stages. The result is a radical change in the lead compound selection process, with better use being made of information revealing pharmacological and genetic toxicity [17–20]. In effect, according to "Innovation or Stagnation: Challenge and Opportunity on the Critical Path to New Medical Products," a white paper published in 2004 by the FDA: "Despite some efforts to develop better methods, most of the tools used for toxicology and human safety testing are decades old. Although traditional animal toxicology has a good track record for ensuring the safety of clinical trial volunteers, it is laborious, time-consuming, requires large quantities of product, and may fail to predict the specific safety problem that ultimately halts development" [13].

Toxic effects are a main reason for compound failure, and there is a significant need to detect them as early as possible in development. A systematic approach based on the best available knowledge of the biological system is needed to capitalize on the individual advantages of the available technologies by combining speed and direction. With respect to drug discovery, the genomics revolution of recent years has provided the industry with unprecedented opportunities for the assessment of drug effects [10]. These developments, in conjunction with advances in high throughput screening techniques, suggest that there may never have been a brighter time for innovation in pharmaceutical research [21]. The science of
genomics has made it possible to establish links between adverse health effects resulting from drug toxicity and alterations in gene expression. The application of genomics to toxicology has thus led to the advent of toxicogenomics [22]. Applying toxicogenomics to the drug discovery process makes it possible to identify the biological mechanisms and pathways initiated upon exposure to pharmaceutical compounds. The discipline is particularly relevant to drug discovery because its combination of genomics and bioinformatics can lead to the identification and characterization of the mechanism of action of new drugs. One of the major objectives of toxicogenomics is to predict the long-term effects of compounds using short-term assays (Fig. 2).

7. Drug discovery process and the need for early toxicity identification

Pharmaceutical profiling is an emerging strategy in drug discovery because compound properties have a major effect on in vitro and in vivo pharmacology [23,24]. This has resulted in a strong need for appropriate biomarkers at the clinical end of the drug development process for patient population profiling.
Fig. 2. The current pharmaceutical paradigm, concentrating efforts on enhancing efficacy, is inherently inefficient, since increased potency does not imply reduced toxicity. Indeed, more potent compounds might lead to more severe ADRs, contributing to the likelihood of failure due to toxicity. In a resource-constrained environment it is essential to develop decision-making tools and portfolio optimization approaches capable of increasing the attrition rate prior to entering clinical development. The goal would be to develop robust and predictive biomarkers for adverse drug reactions in man that might occur during clinical development. More specifically, biomarkers may be used to help in lead compound selection, dose focusing, patient stratification and defining the mechanism of action of novel therapeutics. Validated biomarkers, especially surrogate endpoints, may have broader utility both in a regulatory context and in clinical practice.
The functional complexity of disease mechanisms calls for screens that can accurately identify specific, yet comprehensive, toxicity biomarkers. Combinatorial biomarkers provide a short-term answer to this problem. They are identified by systematic approaches, such as gene expression profiling, and represent a more holistic view of the organism, without requiring complete understanding and systematic modeling [25,26]. The ability to measure biological signals in patient samples may accurately assess or, even better, predict drug-induced toxicity [27]. Thus, the identification and validation of sensitive and specific toxicity biomarkers may not only define the mechanisms involved in toxicity, but also improve risk assessment, a fundamental process in drug development. Profiling data assist the diagnosis of compound performance at various barriers, assist prioritization and optimization, and highlight factors that affect development attrition. The application of this approach to compounds at different stages of preclinical testing would allow the elimination of unfavorable compounds early in development and increase the quality of lead selection. As highlighted earlier, any reduction in risk during the early phases of the process will have a multiplier effect on added value. First, it can improve the quality of drug development pipelines by providing more specific information on the mechanisms of drug pathologies, and providing it earlier in the discovery–development process. Second, it can improve the efficiency of the process, because toxicogenomic information complements the genomic target identification and characterization methods used in discovery and reduces the attrition of unfavorable compounds during drug development.

8. Drug development and the need for patient stratification

In many clinical trials a large number of patients are recruited in order to address the issue of interpatient variability. However, despite the inclusion of large numbers of patients in current trials, the characterization of rare ADRs presents a major challenge. Undertaking extensive safety tests in large and heterogeneous populations prior to market approval would significantly increase the time and cost of clinical evaluation and create a significant barrier to drug development. Patient stratification based on pharmacogenomic tests allows scientists to design trials around a genetically differentiated patient pool, using genomic biomarkers to predict the response of a group of individuals to a therapeutic. In undifferentiated patient pools, the number of non-responders can jeopardize a trial's endpoint, thereby possibly preventing the advancement of a therapeutic that works in a genetically responsive subpopulation. In effect, researchers have highlighted that Iressa (Gefitinib) has a profound impact in a small subset of patients (10 to 20% of those treated). The response of this small subgroup was most probably diluted out by the lack of response from the other patients [28], as the sketch below illustrates. Overexpression of epidermal growth factor receptor (EGFR) occurs in many types of cancer and has become a target for cancer therapy. The EGFR tyrosine kinase inhibitor (TKI) Iressa targets the tumor protein to treat non-small cell lung cancer (NSCLC). Specific mutations in the EGFR gene correlate with clinical response, and screening for these mutations has become increasingly integrated into clinical practice to identify those individuals who will most benefit from treatment.
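The dilution effect just described can be made concrete with a short calculation. The following Python sketch uses hypothetical numbers: a 15% responder fraction, in line with the 10–20% cited for Iressa, and assumed effect sizes that do not come from the study itself.

```python
# How a small responder subgroup is "diluted out" in an
# unstratified trial. All numbers are hypothetical.

responder_fraction = 0.15   # e.g. the 10-20% responder rate cited for Iressa [28]
effect_responders = 40.0    # assumed mean response (%) among responders
effect_nonresponders = 0.0  # assumed mean response (%) among non-responders

# Mean effect measured over the whole, undifferentiated trial population
observed = (responder_fraction * effect_responders
            + (1.0 - responder_fraction) * effect_nonresponders)

print(f"Effect in responsive subgroup: {effect_responders:.0f}%")
print(f"Effect observed in full trial: {observed:.0f}%")  # 6%: easily missed
```

Stratifying enrollment by a predictive biomarker recovers the full 40% signal in the responsive subpopulation, which is precisely why the trials discussed next built genomic tests into their designs.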
Similarly, using pharmacogenomic testing to predict HER-2 overexpression as a means to stratify breast cancer patients eventually paved the way to a successful NDA for Herceptin. Also, in a Phase III clinical trial designed to evaluate CML patient response to Gleevec, pharmacogenomic testing identified a 31-gene biomarker within the patient population that predicted clinical response with 94% accuracy [29]. Moreover, in a Phase II clinical trial designed to evaluate myeloma patient response to
Velcade, pharmacogenomic testing identified a 30-gene biomarker that predicted responders with 71% accuracy and non-responders with 84% accuracy [30].

Toxicogenomics and pharmacogenomics have led to several valid genetic tests for which the FDA provides clinical dosing recommendations (Table 3). Moreover, a list of DNA-based biomarkers of enzyme or transporter activity currently considered "exploratory" can be found in Huang and Lesko [31]. For some of these markers, the correlation between certain genotypes and enzyme or transporter activities has been observed in vitro only; for others, contradictory data have been published for different drugs regarding the correlation between SNP genotype or haplotype and the response measures. The complex interplay of genotypes of enzymes, transporters, and receptors, among other factors such as lifestyle, physiology and chronobiology, can have an equal or greater impact in predisposing individuals to adverse reactions (Fig. 3). Ideally, data generated from toxicogenomic and pharmacogenomic studies need to be complemented with other risk factors for the prediction of ADRs. These factors can affect the risk/benefit ratio for individual patients and need to be considered when evaluating the varied results of the many genotyping studies with small numbers of subjects. This will ultimately help examine how well therapeutic drugs will work in a real-world group of patients.

9. The need for technology roadmapping in an uncertain environment

The potential of new technologies to increase the speed of bringing a product to market presents enormous opportunities. However, speed does not necessarily guarantee quality or therapeutic and commercial success. The result has been an abundance of drug candidates without a marked increase in productivity, as the number of new active compounds launched each year by the industry has stayed roughly the same. Development programs that are not founded on the principles of quality and rigor are destined for delay, at best, and failure, at worst. Quality, not quantity, must permeate every element of drug development. Markets and technologies are changing rapidly, cost pressures are increasing, regulatory authorities are more demanding, and product life cycles and time-to-market are shrinking. Even with greater efficiency in lead identification and optimization, improvements will also need to be made in clinical development and overall speed to market. In this environment, drug discovery firms need to focus on their future markets and use strategic technology planning to stay ahead of the game. Gaining and sustaining a competitive advantage requires that a company understand the entire value delivery system, not just the portion of the value chain in which it participates. The management of competitive capabilities can no longer be discussed purely at an enterprise level based on a sole technology. The key challenge for firms in developing and sustaining competitive advantage in a complex business environment is direction, not speed. Technology roadmapping can help drug discovery companies better understand their markets and allow them to make informed technology investment decisions. In an accelerated drug development program, each essential element of drug development for the new agent needs to be addressed.
Technology roadmapping can assist drug discovery companies in identifying their future product, service and technology needs, and in evaluating and selecting the technology alternatives to meet them. It can ensure that technology providers have access to the critical capabilities needed to seize opportunities arising from major market developments. By providing strategies on when and how to access those technologies, a technology roadmap can help companies position themselves better for the future [32].
Table 3
The efficacy of toxicogenomics/pharmacogenomics tests in preventing ADRs

| Biomarker | Test utilization | Drug | Drugs associated with this biomarker | Label |
| --- | --- | --- | --- | --- |
| C-KIT expression | Information | Imatinib mesylate | Gastrointestinal stromal tumors with c-Kit expression | "In vitro, imatinib inhibits proliferation and induces apoptosis in gastrointestinal stromal tumor (GIST) cells, which express an activating c-kit mutation." "Gleevec is also indicated for the treatment of patients with Kit (CD117) positive unresectable and/or metastatic malignant gastrointestinal stromal tumors (GIST)." |
| CYP2C19 variants | Information | Voriconazole | Omeprazole, Pantoprazole, Esomeprazole, Diazepam, Nelfinavir, Rabeprazole | Poor metabolizers and extensive metabolizers with a genetic defect show altered drug exposure. |
| CYP2C9 variants | Information | Celecoxib | Warfarin | Poor metabolizers and extensive metabolizers with a genetic defect show altered drug exposure. |
| CYP2D6 variants | Information | Atomoxetine | Venlafaxine, Risperidone, Tiotropium bromide inhalation, Tamoxifen, Timolol maleate | People with reduced activity in this pathway have higher plasma concentrations of atomoxetine compared with people with normal activity. |
| DPD deficiency | Information | Capecitabine | Fluorouracil | Unexpected, severe toxicity (e.g., stomatitis, diarrhea, neutropenia and neurotoxicity) associated with 5-fluorouracil has been attributed to a deficiency of dihydropyrimidine dehydrogenase (DPD) activity. |
| EGFR expression with alternate context | Required | Cetuximab (colorectal cancer) | Gefitinib | Patients enrolled in the clinical studies were required to have immunohistochemical evidence of positive EGFR expression using the DakoCytomation EGFR pharmDx™ test kit. |
| Her2/neu overexpression | Required | Trastuzumab | | Detection of HER2 protein overexpression is necessary for selection of patients appropriate for Herceptin therapy. |
| Protein C deficiencies (hereditary or acquired) | Recommended | Warfarin | | Hereditary or acquired deficiencies of protein C or its cofactor, protein S, have been associated with tissue necrosis following warfarin administration. |
| TPMT variants | Recommended | Azathioprine | Thioguanine, Mercaptopurine | Thiopurine methyltransferase deficiency or lower activity due to mutation puts patients at increased risk of myelotoxicity. |
| UGT1A1 variants | Recommended | Irinotecan | | UGT1A1 mutation affects patients' exposure to the drug and hence their susceptibility to toxicity. Individuals who are homozygous for the UGT1A1⁎28 allele are at increased risk for neutropenia following initiation of Camptosar treatment. |

This table provides some examples of FDA-approved valid genomic biomarkers for approved drug labels (www.fda.gov). Most drugs have complex metabolic pathways, so that multiple variant alleles could be responsible for ADRs. The main drugs causing ADRs are metabolized by one or more cytochrome P450 enzymes (CYP family) with a high frequency of inactive alleles.
Fig. 3. Sources of individual variability following therapeutic drug exposure. Pharmacokinetics: absorption, metabolism, excretion, distribution; pharmacodynamics: receptors or proteins targeted by the therapeutic drug; preexisting pathologies: history of cardiovascular, renal or liver conditions, etc.; physiology: age, weight, gender; chronobiology: the study of cyclic events in living organisms (sleeping, eating, etc.). Incorporating such information in predictive databases might help implement and evaluate "real-life" trials that would better reflect the population that would ultimately use the therapeutic drug.
A technology roadmap process starts with the endpoint or vision clearly in mind and then traces alternative technology paths to achieve it. In order to achieve this endpoint, drug discovery companies ideally need a roadmapping tool to determine the technological processes and products required to fulfill future market demands. A major improvement needed in the drug development process is in the field of toxicology, which is where most developmental bottlenecks occur. The roadmapping process is unique in that it encourages firms, R&D organizations, governments and industries to develop a shared vision of the future and explore the opportunities and pathways to achieve it. The development of therapeutic compounds requires the implementation of technology roadmaps that are both forward and backward looking (Fig. 4). Backward roadmapping is needed because drugs are ultimately used within a given patient population and are thus market driven; it involves finding out how to reach a target set by the marketplace. At the same time, the development of innovative therapeutic compounds is largely technology-driven and is more likely to use forward roadmapping. Forward roadmapping is the process of building upon existing technologies until new targets appear; it aims to evaluate the potential of a given technology by considering the possibilities for satisfying future needs. The challenges of operating in a complex and uncertain environment often mean that no single firm or industry has the resources to develop the full spectrum of technologies required. Through roadmapping, companies will be in a position to develop creative solutions to the technology issues and research needs identified. By optimizing the sharing of technologies and knowledge, firms can leverage financial and intellectual resources to achieve market success [33]. Technology roadmapping will help a drug discovery company optimize its strategic planning and business development framework and provide a way to address technology gaps [34]. Furthermore, roadmapping provides managers with the comprehensive technology assessments required for a long-range perspective on future product needs. Technology platform companies are obliged to build advantageous positions based on value-added technologies. As such, roadmapping will help early-stage drug discovery companies increase efficiency considerably by providing direction within the technology maze of the drug discovery process. In effect, because of the distinct disciplines within the industry, the potential for knowledge acquisition
Fig. 4. Driving forces of drug discovery paradigm change. The converging forces of technology push and market pull.
from the development process is extremely high. The explosion of new discovery technologies necessitates the formation of complementary organizational structures.

10. The trend towards technology integration

The drug discovery industry's R&D productivity will languish far behind its potential unless companies are able to demonstrate how, and where in the drug discovery process, improvements are to be realized [35]. The success of a drug discovery company depends fundamentally on the quality and efficiency of its platform, at a time when the demand for services, innovative medicines and cutting-edge technologies is at its peak. Moreover, in just over a decade, building on major technological achievements in gene sequencing and biochip capacity, the race to sequence the genome generated data faster than expected. In reality, such achievements have generated a range of novel problems associated with information overload and statistical quality control, which have led to a more interdisciplinary approach, with an emphasis on speeding up processes and changing organizational structures to maintain output levels (Fig. 5). All organizations must make crucial choices on the activities, tools and advancement criteria for each stage [36]. Complementary and mutually supportive tools need to be utilized in an integrated manner to further enhance the identification of lead compounds. Such integrated approaches will help prioritize preclinical drug candidates. As a consequence, drug discovery companies are concentrating their efforts on approaches that raise the "probability of success" of lead candidates in an effort to decrease the rate of attrition in the development process.
Fig. 5. A technology roadmap process starts with the endpoint or vision clearly in mind and then traces the alternative technology paths to achieve it. The synergy between the various disciplines (chemistry, biology, informatics) offers value through an integrative approach. The result is a radical change in the lead compound selection process, with better use being made of information revealing pharmacological, toxicity and ADME properties. Any reduction in risk during the early phases of the process will have a multiplier effect on added value. Technologies such as toxicogenomics will help decrease the attrition rate during drug development caused by ADRs, while the application of pharmacogenomics will transform the mass-market paradigm into a tailored-market paradigm.
Interaction and integration across complementary technologies will be critical to successful projects (Fig. 6). Coordination of such inter-technology extensions at the level of the value chain represents a realistic vision of value creation planning within the drug discovery process; in the absence of mutually supportive technologies, enormous value gets lost. Implementing platforms that incorporate other value-adding technologies has become a prerequisite for success, and ignoring such issues will undoubtedly compromise a company's long-term success. Each technology is typically only part of the larger set of activities in the R&D value delivery system in which it participates [37]. But all those environments share common challenges: a novel base of technology and a complex context in which that technology must be applied. This combination of novelty and complexity makes a company's excellence in technology integration critical. The need has become even more acute for the technologies involved in the early phase of the drug discovery process. A lack of interaction and integration among the various technologies within the value chain will favor the development of many unpromising drug candidates. The positioning of a technology within the overall drug discovery value chain is unique. The ability to distinguish quickly between promising and unpromising projects is absolutely essential in today's resource-constrained environment. The ever-increasing complexity of today's research environment helps explain the stagnant productivity of drug research [11,20]. The increasing pace of biological discoveries has shifted the productivity bottleneck downstream in the drug development process.
Fig. 6. Competitive capabilities can no longer be discussed purely at an enterprise level based on a sole technology. The lack of interaction and integration among the various technologies within the value chain has favored the development of many unpromising projects. The concept of data integration should apply across the entire drug discovery value chain; otherwise, in the absence of mutually supportive technologies, enormous value gets lost when it comes to moving a compound from one stage to the next. Complementary and mutually supportive tools need to be utilized in an integrated manner to further enhance the identification of lead compounds. Such integrated approaches will help prioritize preclinical drug candidates.
Ultimately, the objective for a drug discovery company is not necessarily to provide definitive answers in the early preclinical stages; the definitive answer comes at the end of clinical trials. What a drug discovery company needs to do early on is a rank-ordering exercise, selecting the candidates with optimal efficacy and toxicology profiles to move into preclinical safety studies. The functional integration of a range of technologies and capabilities across an enterprise has become a priority for technology platform companies [37]. By generating complementary and mutually supportive data of different types on a drug candidate, companies will be able to lay the foundation for better decision-making and, ultimately, product portfolio management. Careful management and integration of new technologies with more efficient clinical development practices can dramatically reduce the time it takes a product to proceed from concept to launch in a cost-effective manner.

11. From early toxicology assays to predictive models

Information on the toxicity profiles of lead compounds has become critically important in guiding the selection of a drug candidate for development, and is obtained routinely at an early stage of preclinical evaluation. When studies revealed that poor toxicity profiles cause development attrition, organizations implemented rigorous testing during candidate selection to ensure that compounds with poor properties
did not advance [38]. The lack of integration of toxicogenomics among the various technologies within the value chain has favored the development of many unpromising projects. Researchers in the pharmaceutical and biotechnology industry have been developing tools over the years to maximize the efficacy of drugs while minimizing toxicity [39,40]. A decade ago, the proportion of drugs failing preclinically due to poor pharmacokinetics was upwards of 40%, but improved in vitro and animal models have reduced that rate to about 10%. Failures due to toxicology, however, are still in the 30% to 40% range, making toxicity the number one reason for preclinical attrition. This section discusses strategies and tools that enable researchers in the pharmaceutical industry to design drugs that are safer and more robust for humans. However, before any of the financial benefits of toxicity screening can be realized, a better understanding of the biological mechanisms of toxicity is needed. Indeed, the manifestations of toxicity must be characterized before toxic properties can be predicted. Toxicology encompasses many biological systems and processes, creating a need to develop high throughput screening technologies and to identify surrogate markers of toxicities. This is not to say that the accumulated insight into disease mechanisms has not been of enormous importance; it is just that a more comprehensive understanding of biological networks is needed for rational drug design to really succeed. Toxicogenomics has emerged as a method for detecting and predicting compounds with toxic liabilities at a very early stage and with unprecedented precision [41,42]. While it is impossible to predict how much toxicology screening will change the efficiency of drug discovery, a reasonable goal of predictive toxicology is to increase the success rate in early clinical development. Today's technology platform companies must develop the capability and expertise to provide services, such as early- or late-stage discovery and preclinical testing, that take into account the potential toxic profiles of compounds under investigation. Only by adding such a facet to their capabilities will drug discovery companies be able to contain further unwarranted development costs. A simple expected-cost comparison makes the economics of early attrition explicit (see the sketch below).
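The following Python sketch combines the cost split reported by Bains [3] with an assumed USD 800 million total development cost from the upper end of the range cited in the Introduction; the two failure scenarios are illustrative assumptions, not figures from the text.

```python
# Why "failing early" is cheaper: a minimal expected-cost sketch.
# Cost split per Bains [3]; total cost and failure scenarios are
# illustrative assumptions.

TOTAL_COST_USD = 800e6
discovery_preclinical = 0.397 * TOTAL_COST_USD  # 39.7% of total
clinical_development = 0.439 * TOTAL_COST_USD   # 43.9% of total
registration = 0.064 * TOTAL_COST_USD           #  6.4% of total

# A toxic compound killed at the end of preclinical testing only
# burns the discovery/preclinical spend...
cost_killed_preclinically = discovery_preclinical

# ...whereas one that fails late in clinical development burns the
# clinical spend as well.
cost_killed_in_clinic = discovery_preclinical + clinical_development

saving = cost_killed_in_clinic - cost_killed_preclinically
print(f"Cost if killed preclinically: USD {cost_killed_preclinically / 1e6:.0f}M")
print(f"Cost if killed in the clinic: USD {cost_killed_in_clinic / 1e6:.0f}M")
print(f"Saving from an early kill:    USD {saving / 1e6:.0f}M")
```

Under these assumptions, every toxic compound diverted before the clinic saves on the order of USD 350 million, which is the financial rationale behind the predictive approaches discussed below.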
Fig. 7. The evolution of toxicogenomics related technologies will ultimately lead to predictive toxicology. Each of the technologies and approaches mentioned here faces considerable challenges. Deriving accurate and consistent information from volumes of data is a daunting task that requires key refinements to transition arrays from the lab to the clinic. Part of the problem is a lack of appropriate standardization, such as with experimental standards, data standards, and QA/QC standards.
A multidisciplinary approach integrating bioinformatics, genomics and proteomics, chemistry, text mining, and mathematics needs to be utilized in order to provide the specific expertise needed to tackle immediate tasks and to achieve specific R&D objectives. There are still many gaps in our scientific understanding of toxicological mechanisms, and the lack of definitive high throughput screens forces most companies to utilize an array of approaches in order to cut the drug development costs caused by drug failure. Many companies have adopted in vitro, in vivo, and in silico tools, while others are trying to harness the potential of other approaches (Fig. 7).

11.1. In vitro testing

Working with in vitro systems early for toxicology profiling is important, since it is expensive not only to synthesize compounds but also to perform animal testing. Thus, developing in vitro systems for toxicology studies can save time and money. Molecules are screened in vitro at an early stage to avoid taking weak candidates into animal studies. Examining gene expression alterations in response to drugs can ultimately lead to an understanding of the underlying mechanisms of toxicity, which could be valuable for identifying potential safety issues early in the drug development process. By grouping the gene expression profiles of well-characterized model compounds and relating these changes to conventional indices of toxicity, a gene expression fingerprint can be generated and used to predict the toxicity of a drug candidate [41,42] (a minimal sketch of this approach appears at the end of Section 11). Such strategies provide a valid and useful way of evaluating and selecting compounds within the discovery pipeline and constitute a valid decision point in moving a drug candidate to the next stage. Once that is accomplished, in vivo studies can be conducted with the best candidates, with a higher chance of success.

In vitro techniques are still often of limited predictive use because they cannot accurately reflect the complex environments that drug candidates encounter in a living organism. Overall, gene expression-based profiling is powerful yet in its infancy; its potential has not yet been fully realized. Gene expression data enhanced by complete proteomic analysis will enable investigators to probe the complexities of normal genetic and toxic pathways. Subsequently, when combined with information on gene/protein groups, functional pathways and networks, and human genetic polymorphisms, these data will provide a more complete image of complex gene–environment interactions and human health risks. However, in vitro systems for toxicogenomic studies face several challenges. In effect, the predictive value of in vitro systems is highly dependent on the choice of model (primary cell/tissue cultures or cell lines). Different models used to assess a specific toxicity can produce substantial differences in the results and in their analysis, so that inappropriate selection of an in vitro model can lead to contradictory results. In addition, the local microenvironment of tissues and the complex interactions between adjacent tissues are difficult to model in vitro.

11.2. Animal models

While conventional cell-based assays can evaluate the potential effects of drugs in culture, they cannot accurately capture the metabolic complexities that affect drug efficacy or cause toxicity.
11.2. Animal models

While conventional cell-based assays can evaluate the potential effects of drugs in culture, they cannot accurately capture the metabolic complexities that affect drug efficacy or cause toxicity. Mammalian models are key to predictive toxicity, but they are also expensive, labour intensive and require large quantities of compound [43]. Therefore, there are still situations in which animal models will be needed.
However, results are obtained faster with toxicogenomic endpoints: gene expression changes can be detected after 1 to 3 days of treatment, whereas in conventional animal studies it takes 2 to 4 weeks for such changes to become apparent. When studying dosing in animals, the typical endpoints are a wide range of physiological alterations: serum chemistry, hematology, histopathology, body weight changes, and food intake. Recent progress in transgenic and knockout animal models has considerably increased the value of applying preclinical animal models in toxicogenomic studies. The use of models carrying specific human genetic characteristics of interest is crucial for gaining a better understanding of the mechanism of action of candidate drugs. Even though model systems such as zebrafish, roundworms and fruit flies are much less sophisticated organisms, they are easier to manipulate and more cost-effective for toxicogenomic studies. Despite promising concepts and studies [43], the application of preclinical animal models in drug development faces several challenges. There is obviously a major difference between dosing for 3 days and dosing for 2 weeks in terms of compound requirements and resources in general. More importantly, there are quantitative differences in dose–response relationships between animal models and humans. Although the biochemical and molecular pathways of different species show a certain degree of similarity, the biological response to drugs may well differ between species. It is therefore necessary to identify common interspecies biomarkers of toxicity that can be used to compare toxic responses (see the sketch below). The biological response to a given exposure may also differ qualitatively. It is thus important to predict the toxicity of candidate drugs across different species in order to minimize the risk of misinterpretation caused by species-specific differences in response to drug treatment.
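As a minimal illustration of the interspecies comparison just described, the sketch below uses an invented ortholog table and invented gene lists to intersect the genes responding to a compound in rat with their human orthologs, nominating candidate cross-species biomarkers. Every identifier here is hypothetical and chosen only for illustration.

# Hypothetical one-to-one rat-to-human ortholog mapping.
RAT_TO_HUMAN = {
    "Cyp1a1": "CYP1A1",
    "Gsta2":  "GSTA2",
    "Hmox1":  "HMOX1",
    "Mt1a":   "MT1A",
}

# Invented sets of genes responding to the same compound in each species.
rat_responders = {"Cyp1a1", "Gsta2", "Hmox1"}
human_responders = {"CYP1A1", "HMOX1", "TNF"}

def cross_species_biomarkers(rat_genes, human_genes, ortholog_map):
    """Return rat/human gene pairs that respond in both species:
    candidate interspecies biomarkers of toxicity."""
    return [(r, ortholog_map[r]) for r in rat_genes
            if r in ortholog_map and ortholog_map[r] in human_genes]

print(cross_species_biomarkers(rat_responders, human_responders, RAT_TO_HUMAN))
# e.g. [('Cyp1a1', 'CYP1A1'), ('Hmox1', 'HMOX1')] (order may vary over a set)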
11.3. Pharmacogenomics

Another major challenge for the drug discovery industry is the detection and prediction of idiosyncratic toxicity. Unexpected adverse drug reactions that occur sporadically, in a dose-independent fashion and independently of a drug's pharmacological properties, are referred to as idiosyncratic [44]. Many of these idiosyncratic reactions result from genetic variations (polymorphisms) in drug-metabolizing enzymes, antioxidant defences and the immune system, which can readily alter an individual's response to a drug (a simplified genotype-to-phenotype sketch follows at the end of this subsection). The objective is to identify the patient populations best able to respond to a treatment and then to identify and validate markers that can differentiate those patients. The ability to identify genetic polymorphisms is therefore not only critical for understanding the mechanisms behind metabolic activation of potentially toxic compounds, but also represents one of the major areas in which toxicogenomics can be successfully applied to drug development. Novel technologies such as pharmacogenomics will help identify susceptible patient populations. As the analysis of polymorphisms and genetic variability moves from the research laboratory into clinical practice and more routine medical use in the coming years, its applications for toxicity purposes will become very important [45]. The gradual shift of emphasis from therapeutics to genotyping screening assays has created a new dynamic of interaction between drug discovery and clinical laboratory companies. In effect, there are significant opportunities for many new assays, including tests that predict predisposition, monitor disease progress, assess disease severity, and predict response to therapy. However, to succeed in an environment of tailored products, drug discovery companies have to establish systematic linkages with diagnostic and genomics companies, as well as with regulatory authorities.
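One concrete, deliberately simplified example of polymorphism-based stratification (not taken from the source): the CYP2D6 gene, which metabolizes many common drugs, occurs as "star alleles" of differing enzymatic activity, and a patient's pair of alleles can be mapped to a predicted metabolizer phenotype before dosing. The activity scores below are a rough sketch for illustration, not clinical guidance.

# Simplified activity scores for a few CYP2D6 star alleles
# (illustrative values; real assignments are more nuanced).
ALLELE_ACTIVITY = {"*1": 1.0, "*2": 1.0, "*4": 0.0, "*10": 0.25, "*1xN": 2.0}

def metabolizer_phenotype(allele_a, allele_b):
    """Map a CYP2D6 diplotype to a coarse metabolizer phenotype."""
    score = ALLELE_ACTIVITY[allele_a] + ALLELE_ACTIVITY[allele_b]
    if score == 0:
        return "poor metabolizer"        # elevated toxicity risk at standard dose
    elif score < 1.25:
        return "intermediate metabolizer"
    elif score <= 2.0:
        return "extensive (normal) metabolizer"
    return "ultrarapid metabolizer"      # possible loss of efficacy, or prodrug toxicity

print(metabolizer_phenotype("*4", "*4"))    # -> poor metabolizer
print(metabolizer_phenotype("*1", "*1xN"))  # -> ultrarapid metabolizer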
11.4. Databases/toxicity models

The development of high-capacity gene arrays has provided a high-throughput method for toxicity screening based on the induction of genes associated with a toxic pathway. Many of the fundamental experiments undertaken are reductionist, such as making a single genetic perturbation and observing how it affects everything else [46]. This will help catalogue the genomic effects of drug and chemical treatments. The further creation of a large reference database of gene expression profiles induced by positive and negative control compounds for selected toxicological endpoints will make it possible to generate extensive pathway profiles that can be related to existing databases of toxicity pathways. The massive amount of genomics data generated by toxicogenomic studies is complicated and multivariate, and presents research scientists with a major challenge that has yet to be resolved. A comprehensive gene expression reference database and robust bioinformatics tools for data analysis will thus play an important role in the interpretation of toxicogenomics data. Ultimately, the goal of the new technologies is to generate information that can be compiled into a database and accessed for in silico prediction of toxicity [47–50]. Databases generated from such toxicity profiles can be used to assess other candidates' corresponding biological responses in order to provide a clearer picture of a compound's toxicological and pathological endpoints. There are two main drug development applications for chemical toxicology informatics, a discipline that correlates large sets of structural data with toxicological properties. First, it can be used to screen libraries virtually and prevent potentially toxic compounds from being chosen as potential products. Second, it can be used in lead optimization to eliminate the more toxic compounds. In addition, regulatory agencies can use predictive screening to help assess compound safety. Computer-based (in silico) prediction systems for toxicity based on the evaluation of quantitative structure–activity relationships (QSAR) are currently proposed as alternatives to animal experiments in toxicity testing. Harnessing the power of computer simulation and modelling of biological pathways can help decipher molecular mechanisms of toxicity. Analysis of structure/property relationships, enabling easy identification of the genes and proteins responsible for toxic effects, will help determine toxic effects across species. Such approaches will have the capacity to measure the effect of drug candidates on the expression of thousands of genes simultaneously, and such profiles provide a means of cataloguing a compound's toxicity profile [47]. The evaluation of structural analogy has found its way into the assessment of chemicals [48,49]; such analogy comparisons may lead to a reduced toxicological testing program and thus to a reduction in animal testing. One of the primary approaches currently being applied to decrease the attrition of new drugs is computational structure–toxicity ((Q)STR) modeling for the prediction of potential adverse effects (a minimal sketch follows below). The availability of high-quality structure–toxicity databases and the informatics tools to efficiently mine and visualize these data are two of the most important factors in successful (Q)STR development.
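To make the (Q)STR idea concrete, here is a minimal sketch assuming a tiny invented training set of molecular descriptors (molecular weight and logP) paired with binary toxicity labels; a logistic model is fitted by gradient descent and used to score a new structure. Real (Q)STR models use far richer descriptors, much larger databases, and rigorous external validation.

import math

# Invented training set: (molecular weight, logP) descriptors with
# binary toxicity outcomes (1 = toxic in the reference assay).
descriptors = [(320.0, 4.5), (410.0, 5.1), (180.0, 1.2), (250.0, 2.0)]
toxic       = [1, 1, 0, 0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_qstr(X, y, lr=0.01, epochs=5000):
    """Fit a logistic structure-toxicity model by stochastic gradient
    descent on z-scored descriptors."""
    means = [sum(col) / len(col) for col in zip(*X)]
    sds = [max((sum((v - m) ** 2 for v in col) / len(col)) ** 0.5, 1e-9)
           for col, m in zip(zip(*X), means)]
    Xs = [[(v - m) / s for v, m, s in zip(row, means, sds)] for row in X]
    w, b = [0.0] * len(Xs[0]), 0.0
    for _ in range(epochs):
        for row, target in zip(Xs, y):
            err = sigmoid(sum(wi * xi for wi, xi in zip(w, row)) + b) - target
            w = [wi - lr * err * xi for wi, xi in zip(w, row)]
            b -= lr * err
    return w, b, means, sds

def predict(model, row):
    """Score a new compound's descriptors against the fitted model."""
    w, b, means, sds = model
    xs = [(v - m) / s for v, m, s in zip(row, means, sds)]
    return sigmoid(sum(wi * xi for wi, xi in zip(w, xs)) + b)

model = fit_qstr(descriptors, toxic)
print(f"P(toxic) for MW=390, logP=4.8: {predict(model, (390.0, 4.8)):.2f}")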
11.5. Systems biology

Detailed quantitative models can additionally provide a mechanistic basis for the failure of certain treatment paradigms and can be used to explore alternative drug targets. The extent to which a model can be predictive increases as kinetic constraints are added to the system (see the sketch at the end of this subsection). The challenge for biology overall is to understand how organisms function. Pathway modeling is particularly promising, as its specificity and predictive power will only increase with the influx of more complete and accurate data. By discovering how function arises from dynamic interactions, systems biology will be able to provide a clear image of the missing links between molecules and physiology [51]. The knowledge thus obtained will help the scientific community understand how individual biological networks function, both independently and in relation to one another, within the cell or organism. Diseases such as obesity, diabetes, hypertension and cardiovascular disease could then be treated effectively through a systems approach. Systems biology will provide medicine with data in a wide range of forms: DNA sequences, RNA expression profiles, proteins, and metabolites [52]. The challenge will be to integrate and relate such measurements in order to construct models capable of reflecting the dynamic environment of the cell, tissue, organ, or organism. So far, systems biology approaches have served primarily as research tools.
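The following is a minimal sketch of what "adding kinetic constraints" to a pathway model can look like: an invented two-step metabolic pathway (parent drug to reactive metabolite to detoxified conjugate) expressed as mass-action ordinary differential equations and integrated numerically. The rate constants and species are illustrative only.

from scipy.integrate import odeint
import numpy as np

# Invented first-order rate constants (1/h) for a toy pathway:
# drug --k1--> reactive metabolite --k2--> detoxified conjugate
k1, k2 = 0.8, 0.3

def pathway(y, t):
    """Mass-action kinetics for the three species."""
    drug, metabolite, conjugate = y
    d_drug = -k1 * drug
    d_metabolite = k1 * drug - k2 * metabolite
    d_conjugate = k2 * metabolite
    return [d_drug, d_metabolite, d_conjugate]

t = np.linspace(0, 24, 97)               # 24 h at 15-min resolution
y = odeint(pathway, [1.0, 0.0, 0.0], t)  # start with a unit dose of drug

# The transient peak of the reactive metabolite is the quantity of
# toxicological interest in this toy model.
peak = y[:, 1].max()
print(f"peak reactive-metabolite level: {peak:.3f} at t = {t[y[:, 1].argmax()]:.2f} h")

Even this toy model illustrates the point in the text: once the rate constants (the kinetic constraints) are measured rather than guessed, the model makes a falsifiable prediction about when and how high the toxic intermediate peaks.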
11.6. Predictive modeling

A thorough understanding of risk factors based on known pharmacology, chemistry, drug metabolism, toxic mechanisms, and patient characteristics will facilitate key decisions in drug development. The ultimate goal is to be able to predict human metabolism, toxicity and pharmacokinetics based solely on the structural formula of novel compounds [53]. Unfortunately, no computational system is currently capable of accomplishing these tasks, and many practitioners are skeptical that one will emerge in the near future. Artificial intelligence tools need to work through all the predicted and observed relationships among a wide range of components and put them in context within the complex system; the resulting simulation models lay out cause-and-effect relationships. Such models need to provide an estimate of the probability that human populations carry unfavourable risk factors (to allow go/no-go decisions) and of the feasibility of identifying at-risk populations (to allow safe administration of the drug), both of which may be critical to the development of safe drugs (a minimal sketch follows below). Relevant risk factors include physiological conditions (age, gender, race, and disease state) and environmental conditions that can enhance toxicity, such as co-administered drugs or foods that lead to toxicity through pharmacokinetic or pharmacological interactions.
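As a minimal sketch of such a risk estimate, hypothetical and not from the source: a logistic combination of a few patient-level risk factors yields an individual probability of an adverse drug reaction (ADR), and the fraction of a simulated population above a threshold feeds a go/no-go rule. The coefficients and prevalences are invented for illustration.

import math
import random

# Invented logistic coefficients for patient-level ADR risk factors.
INTERCEPT = -4.0
COEF = {"age_over_65": 1.2, "renal_impairment": 1.5, "cyp_inhibitor_comed": 1.8}

def adr_probability(patient):
    """Logistic estimate of an individual's adverse-drug-reaction risk."""
    z = INTERCEPT + sum(COEF[k] for k, present in patient.items() if present)
    return 1.0 / (1.0 + math.exp(-z))

def go_no_go(population, risk_threshold=0.10, max_fraction=0.05):
    """'No go' if too large a fraction of the population exceeds the
    individual risk threshold."""
    at_risk = sum(adr_probability(p) > risk_threshold for p in population)
    return "go" if at_risk / len(population) <= max_fraction else "no go"

random.seed(0)
population = [{"age_over_65": random.random() < 0.2,
               "renal_impairment": random.random() < 0.1,
               "cyp_inhibitor_comed": random.random() < 0.15}
              for _ in range(10_000)]
print(go_no_go(population))

The same machinery, run with the risk threshold inverted, supports the second use named in the text: identifying the at-risk subpopulation that should be excluded or monitored so that the drug can be administered safely to everyone else.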
12. Toxicity risk assessment models

The ability to quickly distinguish between promising drug candidates is crucial in today's environment. At the Biopharm Summit 2003 ("Why drugs fail", San Francisco, October 29, 2003), speakers presented their perspectives on why the pharmaceutical industry is spending more on R&D but getting less in return. According to Mervyn Turner, Senior Vice-President of Worldwide Licensing and External Research at Merck, "we are generating more data but not being more successful". He further highlighted that "there is a zone of chaos" when in vivo work takes over: "we understand very poorly what goes on there. We can wander around in the zone for years. At Merck, 50 percent of molecules die from toxicity (due to unpredictable long-term safety in animals). One third die in man; our understanding is less profound than we thought". It must be appreciated at the outset, however, that it is not possible to predict categorically whether a new compound will be toxic. Computational methods are nevertheless able to predict individual toxic endpoints, which, when combined with information on likely usage (e.g. treatment duration, dosage and route of administration), can provide useful information to assist in risk assessment. New companies have entered the market for the development of toxicity risk assessment models, and the sector as a whole has begun to produce simple pathway models for toxicity as well as for the absorption, metabolism, and elimination of compounds (Table 4).
Table 4
Toxicity and drug response assessment models developed by biotechnology companies and public institutions. Each entry lists the company or public organization, its platform for prediction of toxicity, and the major endpoints predicted.

Logichem Ltd (OncoLogic™): Can analyze a chemical structure to determine the likelihood that it may cause cancer, by applying the rules of structure–activity relationship (SAR) analysis and incorporating knowledge of how chemicals cause cancer in animals and humans.

Compudrug (HazardExpert): A new artificial neural network based approach using atomic fragmental descriptors to categorize compounds according to their in vitro human cytotoxicity.

LHASA Ltd. (Vitic): A chemically intelligent database that can recognise and search for similarities in chemical structures. This makes it an essential research tool for toxicologists working in drug discovery or product safety; Vitic is especially useful in (quantitative) structure–activity relationship (QSAR) modelling.

Accelrys, Inc. (TOPKAT): Employs robust and cross-validated quantitative structure–toxicity relationship (QSTR) models for assessing various measures of toxicity. TOPKAT uses Kier & Hall electrotopological states (E-states), as well as shape, symmetry, molecular weight and logP, as descriptors to build statistically robust QSTR models for over 18 endpoints.

Multicase Inc. (MCASE, CASE, CASEOX): Designed for organizing biological/toxicological data obtained from the evaluation of diverse chemicals. These programs can automatically identify molecular substructures that have a high probability of being relevant or responsible for the observed biological activity of a learning set comprising a mix of active and inactive molecules of diverse composition. New, untested molecules can then be submitted to the program, and an expert prediction of the potential activity of the new molecule is obtained.

Leadscope Inc. (ToxScope): Correlates toxicity information with structural features of chemical libraries. This virtual decision-making software provides access to 150,000 chemical structures; the ToxScope databases encompass acute toxicity, hepatotoxicity, mutagenicity and carcinogenicity.

Pharma Algorithms Inc. (ToxFilter): Mammalian acute toxicity.

FDA, Informatics and Computational Safety Analysis Staff (ICSAS): Develops rules for quantifying toxicological and clinical endpoints, evaluates data mining and quantitative structure–activity relationship (QSAR) software, and develops toxicological and clinical effect prediction programs through collaborations with software companies.

US Environmental Protection Agency (DSSTox): A public data foundation for improved structure–activity and predictive toxicology capabilities.

Tripos (SYBYL program): A genetic algorithm-based conformational search tool for exploring the 3-D shapes that molecules attain. The GASP algorithm measures similarity and overlays molecules based on matches between their hydrogen bond donors/acceptors, hydrophobes, and other species. The company has also been using genetic algorithms on binding problems.

Cerep SA (BioPrint): Profiles the active ingredients of over 2500 marketed drugs, failed drugs, and reference compounds in a panel of more than 180 well-characterized in vitro assays, including a diverse selection of molecular targets (GPCRs and other receptors, ion channels and transporters, enzymes, kinases, etc.) as well as solution properties and in vitro ADMET properties. In addition, BioPrint includes a large in vivo dataset based on clinical information (therapeutic uses, adverse drug reactions, pharmacokinetics and drug–drug interactions) for nearly all active pharmaceutical ingredients included in BioPrint.

Genelogic, Inc. (ToxExpress, BioExpress): Gene expression profiles, curated references to public data, clinical information, and industry-leading quality control measures to reduce sample variability.

RxGen (PrimaTox™): Assessment of hepatotoxicity and other organ toxicities of a drug candidate, including traditional measures of toxicity and multi-omic analysis of organ responses. Results are compared to known toxicity responses within PrimaTox™ databases.

Iconix, acquired by Entelos (DrugMatrix™): An integrated database containing chemical structure, gene expression, clinical chemistry, hematology, organ weight, and histopathology data on 650 pharmaceuticals and environmental toxicants. Systematic mining of DrugMatrix has allowed the development of toxicity-associated gene expression classifiers that can predict the clinically observed adverse effects of structural analogs.

Archimedes (Archimedes Model): Virtual physiosimulations: mathematical equations that quantitatively characterize the dynamic relationships between components over time.

Galapagos (DrugStore, StARlite, SARfari, Admensa): Optimized for data mining and integration of third-party data; consists of expertly curated information used to identify and effectively prioritize compounds.

BioSeek (BioMAP®): Human primary cell-based assay systems engineered to replicate the intricate cell and pathway interactions present in human disease settings. Compounds tested in these systems induce specific patterns of changes (BioMAP profiles) that are compared to a large number of reference profiles in BioSeek's database using proprietary algorithms.

Entelos (PhysioLab): Provides integrated expertise in the areas of metabolism and cardiovascular function, systems immunology and inflammation, and respiratory function. Each PhysioLab system represents dynamic whole-organism physiology and is based on mathematical equations that quantitatively characterize the dynamic relationships between components over time.

University of Virginia (COXEN): An algorithm that uses comparative microarray data and the responses of a panel of 60 human cancer cell lines maintained by the NCI (the NCI-60 panel) to anticancer drugs to predict responses in other cell lines and in human tumors.
Moreover, the establishment of such structure–toxicity databases will allow companies to implement better decision-making processes for the development of safer products. As technologies become more robust and the quality of data increases, products that analyze cellular pathways and model networks, whole cells, and organ systems will undoubtedly play a more efficient and cost-effective role in drug discovery. The result will be a radical change in the lead compound selection process, with better use being made of information revealing pharmacological, toxicity and ADME properties. The reduction of risk during the early stages of the drug discovery process will have a multiplier effect on added value. Understanding the complexity of biological systems requires a broad perspective rather than a focus on any one predictive method in isolation. All expert systems for toxicity prediction are limited by the availability of toxicity data on which to develop, evaluate and validate models; indeed, the potential power of predictive toxicology is restricted by the paucity of publicly available, high-quality data for modelling.
Since each predictive toxicology technique presents its own distinct advantages and shortcomings, an obvious approach is to combine predictions from different models (a minimal sketch of such a consensus scheme follows at the end of this section). The predictive capacity of such models would be considerably increased by the flexibility to combine specific developmental endpoints into broader, biologically related systems. Ultimately, the various individual efforts, which address a range of toxicities (neurotoxicity, developmental toxicity, carcinogenicity, etc.) and a range of study types (in vivo, genomic, etc.), provide pieces of information that, when used together, can potentially solve the puzzle of predicting human responses to chemical exposures. By providing predictive databases, biotechnology companies will be able to streamline and enhance the drug discovery process, accelerating and expanding the early-stage characterization of lead drug candidates identified at the screening stage so that promising candidates can be selected more accurately. Computational approaches that predict toxicity from chemical structure can be used at a number of stages of drug development. Such techniques have the advantage of being faster, more cost-effective and less dependent on animal usage than in vivo toxicity testing. For example, since 1999 Cerep has secured numerous strategic collaborations with major pharmaceutical partners for its predictive database BioPrint™ (Pfizer, Servier, AstraZeneca, Lilly, Sanofi-Synthélabo, Aventis Pharma, Sepracor, Roche, Solvay, Bristol-Myers Squibb). In the future, the use of computer-aided toxicity prediction techniques must be tempered by a realization and appreciation of their limitations [54]. The reality is that there is no simple set of rules that will answer all questions concerning toxicity: a toxic event can be the culmination of an incredibly complex series of physical, (bio)chemical and physiological processes, and simplistic modelling of such phenomena will only result in trivial models of limited utility. A more holistic approach to toxicity prediction is required, one that may include an integrated assessment of ADME, route of exposure, dose, etc. [54]. The data used to make an informed assessment may come from diverse sources, including QSARs, expert systems and knowledge of the effects of 'similar' chemicals. Such a database will provide the capacity to relate specific changes in gene expression to specific adverse effects and to look for similar pathways in different organisms, offering an objective way of assessing surrogate systems for reporting or predicting potentially rare adverse effects in humans. While the potential of toxicogenomics for studying active substances is thus very high, the main challenge remains to ensure that only high-quality data are compiled and analyzed.
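The sketch below illustrates, under invented model names and outputs, one simple way to combine predictions from different models: each model reports a toxicity probability together with a self-assessed confidence, and a confidence-weighted average drives the final call. Real consensus schemes also weight models by their validated accuracy for the endpoint in question.

# Invented outputs from three hypothetical predictive-toxicology models:
# (probability the compound is toxic, model's confidence weight in [0, 1]).
predictions = {
    "qsar_model":       (0.82, 0.6),
    "expression_model": (0.55, 0.9),
    "expert_system":    (0.70, 0.4),
}

def consensus(predictions, threshold=0.5):
    """Confidence-weighted average of per-model toxicity probabilities."""
    total_weight = sum(w for _, w in predictions.values())
    score = sum(p * w for p, w in predictions.values()) / total_weight
    return score, ("flag as potentially toxic" if score >= threshold
                   else "advance with standard monitoring")

score, decision = consensus(predictions)
print(f"consensus P(toxic) = {score:.2f}: {decision}")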
13. Conclusion

The principles of toxicology originated in the 16th century with Paracelsus' statement [55] that "All things are poison and nothing is without poison. Solely the dose determines that a thing is not a poison". This laid the foundation for today's traditional practice of toxicology, in which the dose–response relationship is considered the most important data set from which toxicity is determined. Recent drug withdrawals and clinical development failures have called this paradigm into question and placed various stakeholders under enormous pressure to act by implementing novel ways of addressing the toxicity dilemma. Advances in biology, chemistry and robotics have allowed drug discovery companies to rapidly develop new approaches to toxicity and safety issues. Furthermore, the intense pressure to discover new therapeutics while controlling costs has led drug discovery companies to increase the efficiency of their drug discovery and development efforts. This process shift has allowed scientists to screen tens of thousands of compounds in a period of days instead of years.
However, this dramatic increase in speed has not been accompanied by comparable progress in direction, notably with respect to toxicity. As a result, the US Food and Drug Administration (FDA) is putting the drug discovery sector under intense scrutiny and pressure to increase openness through more responsible marketing, and is pushing the emphasis back onto R&D. The drug discovery industry therefore has no alternative but to become more proactive in its risk management. More realistic models that allow researchers to examine benefit–risk trade-offs quantitatively need to be developed. Toxicogenomics is one approach to more efficient drug discovery and development. Toxicogenomics is not, however, a single technology: one must possess expertise in a wide range of capabilities and techniques in order to maximize its utility. An enhanced understanding of how chemicals affect gene function may make the direct toxicity testing of new compounds assessed in the regulatory process cheaper and quicker to perform. More toxicity data would also support the application of structure–activity relationship analysis for extrapolating health impacts to related compounds when direct testing is not practical. While the idea that toxicogenomics can discriminate between toxic and non-toxic compounds is attractive, several challenges must be overcome before this technology will have any real impact on drug discovery and development. The current challenges in predictive toxicology are to expand existing safety databases, to develop new tools and increase the speed of data mining, to improve the predictability of toxicological properties, and to integrate that information into drug development. Toxicogenomics techniques are enabling the rapid and intelligent synthesis of novel compounds, fundamentally increasing our ability to create novel therapeutic compounds with desired qualities. If this is recognized early, toxicogenomics in association with other technologies can be harnessed to develop therapeutically preferred compounds.

14. Perspectives

Tremendous advances will undoubtedly occur in the next decade in treating disorders for which the scientific community has developed rudimentary models, such as diabetes and other metabolic diseases. As our knowledge of biological networks grows and allows better predictions, we will experience a shift from disease treatment to health maintenance and disease prevention. Ultimately, medicine will become personalized, with treatment tailored to each individual. Furthermore, the emergence of translational medicine will help bridge the information gap between discovery and development. By using tools such as biomarkers and by effectively capturing and mining clinical data, translational medicine will help identify patient sub-groups and find the optimal dosing regimens least likely to cause adverse drug effects. One likely outcome may be that drug discovery companies re-evaluate the aggressive marketing of new drugs as soon as they come to market. How will toxicogenomics evolve, and what will its final role in drug discovery be? Although the field of toxicogenomics is in its early stages and will require a long-term effort, it is already clear that the benefits of this methodical and comprehensive approach will be both extensive and exciting.
The breadth and scope of the knowledge derived from this effort will be used to understand the underlying mechanisms of disease caused by toxic chemicals, will serve drug discovery, and will also be a critical platform for developing interventional and remedial strategies to interrupt the disease process. Predictive toxicology in the 21st century will not be a single technology; it will be a family of related technologies in which parallel processing is applied to all types of toxicity problems. It is important
to recognize that few toxicogenomics efforts will evolve independently of other drug discovery technologies. Optimizing toxicogenomics approaches without regard to complementary processes and capabilities will do little to accelerate drug discovery. For example, toxicogenomics will be used in concert with molecular modeling to find compounds with specific desired properties. Perhaps the single most important factor that will affect the future of toxicogenomics is our understanding of molecular diversity itself. Because molecular structures and structure–property relationships are not yet well understood, toxicogenomics is also one of many tools that may advance our knowledge of how chemicals affect gene function. This will lead to the creation of libraries capable of providing enough knowledge to design intelligent therapeutic compounds for a specific target. The scientific community can also use libraries generated with toxicogenomics-derived information to learn more about how genes work by analyzing the impact of gene perturbations. Expanded knowledge of gene function could be used to develop products with desired features, and toxicogenomics could likewise be one of the technologies used to refine candidates into marketable products. Advances in these fields will ultimately pave the way towards predictive toxicity models. Unlike other technologies that have promised to revolutionize drug discovery, toxicogenomics is well placed to prove itself an efficient tool for drug identification, and its impact will continue to grow as more drug discovery groups fully integrate this technology into the core of their discovery efforts.

References

[1] J.A. DiMasi, R.W. Hansen, H.G. Grabowski, The price of innovation: new estimates of drug development costs, J. Health Econ. 22 (2003) 151–185.
[2] M. Dickson, J.P. Gagnon, Key factors in the rising cost of drug discovery and development, Nat. Rev. Drug Discov. 3 (2004) 417–429.
[3] W. Bains, Failure rates in drug discovery and development: will we ever get any better? Drug Discov. World Fall (2004) 9–18.
[4] Impact report, Tufts Center for the Study of Drug Development, 7 (2005), September/October 2005.
[5] K.E. Lasser, P.D. Allen, S.J. Woolhandler, D.U. Himmelstein, S.M. Wolfe, D.H. Bor, Timing of new black box warnings and withdrawals for prescription medications, JAMA 287 (2002) 2215–2220.
[6] J. Lazarou, B.H. Pomeranz, P.N. Corey, Incidence of adverse drug reactions in hospitalized patients: a meta-analysis of prospective studies, JAMA 279 (1998) 1200–1205.
[7] Public Citizen, Rx R&D Myths: the Case Against the Drug Industry's R&D "Scare Card", July 2001.
[8] S.E. Nissen, K. Wolski, Effect of Rosiglitazone on the risk of myocardial infarction and death from cardiovascular causes, New Engl. J. Med. 356 (2007) 2457–2471.
[9] T.B. Bumol, A.M. Watanabe, Genetic information, genomic technologies, and the future of drug discovery, JAMA 285 (2001) 551–555.
[10] A.L. Hopkins, C.R. Groom, The druggable genome, Nat. Rev. Drug Discov. 1 (2002) 727–730.
[11] J. Knowles, G. Gromo, Target selection in drug discovery, Nat. Rev. Drug Discov. 2 (2003) 63–69.
[12] J. Beattie, P. Ghazal, Post-genomic technologies: thinking beyond the hype, Drug Discov. Today 8 (2003) 909–910.
[13] FDA, Innovation & Stagnation: Challenge and Opportunity on the Critical Path to New Medical Products, March 2004, http://www.fda.gov/oc/initiatives/criticalpath/whitepaper.html.
[14] H. Kubinyi, Drug research: myths, hype and reality, Nat. Rev. Drug Discov. 2 (2003) 665–668.
[15] D. Noble, Will genomics revolutionize pharmaceutical R&D? Trends Biotechnol. 21 (2003) 333–337.
[16] Y. Baba, Development of novel biomedicine based on genome science, Eur. J. Pharm. Sci. 13 (2001) 3–4.
[17] C.A. Shillingford, C.W. Vose, Effective decision-making: progressing compounds through clinical development, Drug Discov. Today 6 (2001) 941–946.
[18] A.P. Li, Screening for human ADME/Tox drug properties in drug discovery, Drug Discov. Today 6 (2001) 357–366.
[19] A.-E.F. Nassar, A.M. Kamel, C. Clairmont, Improving the decision-making process in structural modification of drug candidates: reducing toxicity, Drug Discov. Today 9 (2004) 1055–1064.
[20] A. Oberemm, L. Onyon, U. Gundert-Remy, How can toxicogenomics inform risk assessment? Toxicol. Appl. Pharm. 207 (2005) S592–S598.
[21] B.A. Bunin, Increasing the efficiency of small-molecule drug discovery, Drug Discov. Today 8 (2003) 823–826.
[22] N.F. Neumann, F. Galvez, DNA microarrays and toxicogenomics: applications for ecotoxicology, Biotechnol. Adv. 20 (2002) 391–419.
[23] S. Whitebread, J. Hamon, D. Bojanic, L. Urban, In vitro safety pharmacology profiling: an essential tool for successful drug development, Drug Discov. Today 10 (2005) 1421–1433.
[24] A. Bugrim, T. Nikolskaya, Y. Nikolsky, Early prediction of drug metabolism and toxicity: systems biology approach and modeling, Drug Discov. Today 9 (2004) 127–135.
[25] L.R. Bandara, S. Kennedy, Toxicoproteomics: a new preclinical tool, Drug Discov. Today 7 (2002) 411–418.
[26] S. Ekins, Y. Nikolsky, T. Nikolskaya, Techniques: application of systems biology to absorption, distribution, metabolism, excretion and toxicity, Trends Pharmacol. Sci. 26 (2006) 202–209.
[27] R. Koop, Combinatorial biomarkers: from early toxicology assays to patient population profiling, Drug Discov. Today 10 (2005) 781–788.
[28] M. Fogarty, The reality of targeted therapies, Scientist 16 (2002) 35.
[29] L.A. McLean, I. Gathmann, R. Capdeville, M.H. Polymeropoulos, M. Dressman, Pharmacogenomic analysis of cytogenetic response in chronic myeloid leukemia patients treated with Imatinib, Clin. Cancer Res. 10 (2004) 155–165.
[30] J.O. Claudio, A.K. Stewart, Advances in myeloma genetics and prospects for pharmacogenomic testing in multiple myeloma, Am. J. Pharmacogenomics 5 (2005) 35–43.
[31] S.-M. Huang, L.J. Lesko, Correlation between genes, diseases and biopharmaceuticals, in: J. Knäblein, R.H. Müller (Eds.), Modern Biopharmaceuticals: Design, Development and Optimization; Application of Pharmacogenomics in Clinical Pharmacology, Volume I: Molecular Medicine, Wiley-VCH, 2005.
[32] M. Rinne, Technology roadmaps: infrastructure for innovation, Technol. Forecast. Soc. 71 (2004) 67–80.
[33] I.J. Petrick, A.E. Echols, Technology roadmapping in review: a tool for making sustainable new product development decisions, Technol. Forecast. Soc. 71 (2004) 81–100.
[34] R. Phaal, C. Farrukh, D. Probert, Customizing roadmapping, Res. Technol. Manag. (March/April 2004) 26–37.
[35] S. Subramaniam, Productivity and attrition: key challenges for biotech and pharma, Drug Discov. Today 8 (2003) 513–515.
[36] O. Gassmann, G. Reepmeyer, Organizing pharmaceutical innovation: from science-based knowledge creators to drug-oriented knowledge brokers, Creat. Innov. Manag. 14 (2005) 233–245.
[37] A. Amir-Aslani, S. Negassi, Is technology integration the solution to biotechnology's low research and development productivity? Technovation 26 (2006) 573–582.
[38] E.H. Kerns, L. Di, Pharmaceutical profiling in drug discovery, Drug Discov. Today 8 (2003) 316–323.
[39] R.M. Lawn, L.A. Lasky, Pharmaceutical biotechnology: the genomes are just the beginning, Curr. Opin. Biotech. 11 (2000) 579.
[40] K. Rasmussen, Creating more effective antidepressants: clues from the clinic, Drug Discov. Today 11 (2006) 623–631.
[41] G. Steiner, L. Suter, F. Boess, R. Gasser, M.C. de Vera, S. Albertini, S. Ruepp, Discriminating different classes of toxicants by transcript profiling, Environ. Health Perspect. 112 (2004) 1236–1248.
[42] M. McMillian, A. Nie, J.B. Parker, A. Leone, M. Kemmerer, S. Bryant, J. Herlich, L. Yieh, A. Bittner, X. Liu, J. Wan, M.D. Johnson, P. Lord, Drug-induced oxidative stress in rat liver from a toxicogenomics perspective, Toxicol. Appl. Pharm. 207 (2005) 171–178.
[43] P.M. Carroll, B. Dougherty, P. Ross-Macdonald, K. Browman, K. FitzGerald, Model systems in drug discovery: chemical genetics meets genomics, Pharmacol. Ther. 99 (2003) 183–220.
[44] D.P. Williams, B.K. Park, Idiosyncratic toxicity: the role of toxicophores and bioactivation, Drug Discov. Today 8 (2003) 1044–1050.
[45] A. Smart, P. Martin, The promise of pharmacogenetics: assessing the prospects for disease and patient stratification, Stud. Hist. Philos. Biol. Biomed. Sci. 37 (2006) 583–601.
[46] M.H.V. Van Regenmortel, Reductionism and complexity in molecular biology, EMBO Rep. 5 (2004) 1016–1020.
[47] J.D. Walker, L. Carlsen, E. Hulzebos, B. Simon-Hettich, Global government applications of analogues, SARs and QSARs to predict aquatic toxicity, chemical or physical properties, environmental fate parameters and health effects of organic chemicals, SAR QSAR Environ. Res. 13 (2002) 607–616.
[48] A.C. White, R.A. Mueller, R.H. Gallavan, S. Aaron, A.G. Wilson, A multiple in silico program approach for the prediction of mutagenicity from chemical structure, Mutat. Res. 539 (2003) 77–89.
[49] B. Simon-Hettich, A. Rothfuss, T. Steger-Hartmann, Use of computer-assisted prediction of toxic effects of chemical substances, Toxicology 224 (2006) 156–162.
[50] J.F. Allen, In silico veritas, EMBO Rep. 2 (2001) 542–544.
[51] M. Bogyo, B.F. Cravatt, Genomics and proteomics: from genes to function: advances in applications of chemical and systems biology, Curr. Opin. Chem. Biol. 11 (2007) 1–3.
[52] T. Werner, Regulatory networks: linking microarray data to systems biology, Mech. Ageing Dev. 128 (2007) 168–172.
[53] J. Maggioli, A. Hoover, L. Weng, Toxicogenomic analysis methods for predictive toxicology, J. Pharmacol. Toxicol. Methods 53 (2006) 31–37.
[54] M.T.D. Cronin, Computer-aided prediction of drug toxicity and metabolism, in: A. Hillisch, R. Hilgenfeld (Eds.), Modern Methods of Drug Discovery, Birkhäuser Verlag, Basel, 2003.
[55] J.F. Borzelleca, Paracelsus: herald of modern toxicology, Toxicol. Sci. 53 (2000) 2–4.

Dr Arsia Amir-Aslani is a former investment banker and was a senior marketing executive at a healthcare consultancy firm, where he provided consulting services to US and European life sciences companies. As an investment banker he headed the life science practice in the corporate finance department of Oddo Pinatton, a French investment banking group, where he concentrated on M&A and business development activities. Prior to founding Araxes Associates, an independent research firm, he was Vice-President for Corporate Development & Strategy at Crucell, a dual-listed (NASDAQ; Euronext Amsterdam) Dutch biotechnology company. He has more than 10 years of experience in the biotechnology sector from capital markets, consulting and industry perspectives. He holds a Ph.D. in Molecular and Structural Pharmacology from the University of Paris, Pierre et Marie Curie, as well as an M.Sc. in International Management from the University of Paris, Sorbonne. Furthermore, Dr Amir-Aslani is Adjunct Professor of Competitive Intelligence in the M.Sc. "Innovation and Management of Technology" at the University of Paris, Panthéon Sorbonne. He has numerous publications in both academic and practitioner journals. His interests include dynamic competitive strategy, valuation issues, portfolio & risk management, organization dynamics, scenario planning, real options, and strategic marketing.