Forum: Science & Society
Quality assurance mechanisms for the unregulated research environment
Denise Hanway Riedl and Michael K. Dunn
Ferring Research Institute, 4245 Sorrento Valley Blvd., San Diego, CA 92121, USA
Discussions on research quality and reproducibility are appearing in the pages of scientific journals with heightened significance and gaining media attention. Many institutions have developed guidelines to address the topic of quality in basic research, but questions remain about how best to implement and monitor compliance. Herein we present quality assurance (QA) mechanisms developed specifically for the unregulated discovery research environment, pre-empting growing concerns in both academia and industry about data-driven applications of biotechnology.

Introduction
Data gathered in pursuit of basic science can lead to understanding of disease mechanisms, identification of drug targets, and discovery of novel therapeutics. In a pharmaceutical setting, research data have tremendous bearing on a company's decision-making, resource utilization, and patent strength, with potential impact on product development and time-to-market. Discovery research, however, is not subject to the regulatory quality systems mandated at later stages of drug development, such as Good Laboratory Practices (GLP). Lack of reproducibility, the quality of published preclinical data, and the (mis)handling of data have received growing attention [1–3], highlighting a need for action; yet without formal requirements or industry-wide standards, the definition of quality in discovery research remains open to interpretation. Indeed, we found significant differences in the approach to data quality among our own researchers. Recognizing the need, we aspired to implement mechanisms that ensure data quality and integrity while avoiding the potential threat to scientific ingenuity posed by excessive regulation. Others had walked this road [4,5], but our discovery unit required a customized solution.

Balancing QA and scientific innovation
Ferring Research Institute is the flagship discovery research center for Ferring Pharmaceuticals, focusing on the discovery of therapeutic peptides and proteins.
Our research spans peptide drug design, pharmacology, biochemistry, and preclinical pharmacokinetics. It was essential that our quality policies were carefully designed and implemented to achieve our desired QA objectives [6]: to assure accuracy and validity of experimental data and reported results without disrupting the research process.

Corresponding author: Dunn, M.K. ([email protected]).
Keywords: data audit; discovery research; quality; integrity; traceability; electronic laboratory notebook.
Paramount to the success of our approach was avoiding arduous bureaucracy and needless rules, the risk being that rules can create barriers to breakthrough innovations and stifle creativity [7]. We first needed to assess at which research phase quality practices would yield the greatest benefit while allowing innovation and experimentation to proceed largely unimpeded. We determined that data in study reports used for patents, investigator brochures, and strategic decision-making held the highest impact; the quality of the data therein would be most critical, and our quality program should initially focus on the Lead Optimization phase (Figure 1). We began with a goal to eliminate preventable errors, oversights, or misinterpretations in data analysis and final reporting, developing a simple plan of auditing data to be compiled into study reports. Although limited to the later research stages, where quality measures might be better tolerated, we hoped the process would trickle down to earlier phases as scientists became more accustomed to quality practices.

As our subsequent research program progressed through Lead Optimization, we set out to perform internal data audits on key studies. Critical questions about this new process quickly arose and needed to be answered at the highest level: Where exactly were the data? Who should compile them, and in what format? Could QA be achieved without a complete revolution?

Assembling the quality infrastructure
Electronic laboratory notebook
Challenges in data gathering turned our strategy towards improved record keeping, facilitated by the introduction of an electronic laboratory notebook (ELN) to aid data capture and traceability. The ELN not only provided a secure environment for electronic data entry, but also ensured version control and electronic signing. We now had a platform for data storage and real-time monitoring of who, what, when, and how experiments were being documented.
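The record-keeping properties an ELN provides can be sketched in miniature. The following Python illustration is a hypothetical model only, not the API of any actual ELN product; the class and method names (`ElnEntry`, `sign`, `amend`) are invented for illustration of versioned, signed, tamper-evident entries:

```python
# Hypothetical sketch of an ELN-style record: entries are signed,
# and any later change creates a new version instead of overwriting,
# so the full history remains auditable. Illustrative names only.
import hashlib
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional, Tuple


@dataclass
class ElnEntry:
    author: str
    title: str
    content: str
    version: int = 1
    signed_by: Optional[str] = None
    # Prior (version, digest) pairs, preserved when an entry is amended.
    history: List[Tuple[int, str]] = field(default_factory=list)
    # Contemporaneous timestamp supports the "timeliness" expectation.
    created: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def digest(self) -> str:
        # A content hash makes later tampering detectable.
        payload = json.dumps(
            {"author": self.author, "title": self.title,
             "content": self.content, "version": self.version},
            sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def sign(self, signer: str) -> None:
        # Electronic signature locks the entry; further edits must
        # go through amend(), which preserves the signed version.
        self.signed_by = signer

    def amend(self, new_content: str) -> None:
        if self.signed_by is None:
            raise ValueError("unsigned entries can be edited directly")
        self.history.append((self.version, self.digest()))
        self.content = new_content
        self.version += 1
        self.signed_by = None  # the new version needs a fresh signature


entry = ElnEntry("D. Riedl", "Peptide stability assay", "pH 7.4, 37 C, n=3")
entry.sign("M. Dunn")
entry.amend("pH 7.4, 37 C, n=6 (repeated with more replicates)")
```

The amended entry is now version 2, unsigned, with version 1 and its digest retained in `history`, so a reviewer can reconstruct who changed what and when.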
Importantly, data retrieval and review occur in a searchable, nimble electronic environment, far more efficient than paper notebooks or electronic files floating around the IT infrastructure.

Launching the ELN was not a cure-all for data handling issues. There was some initial trepidation and a learning curve associated with the software: researchers were outside of their paper-notebook comfort zone. To mitigate these fears, we developed a training module and an ELN policy. Despite best efforts to prepare in advance, hands-on experience revealed unforeseen issues requiring vendor consultation, follow-up training, and a healthy dose of
Trends in Biotechnology October 2013, Vol. 31, No. 10
[Figure 1 graphic: the pipeline phases (New ideas, Feasibility, Lead optimization, Clinical candidate, Support for development) spanned by Innovation and Quality tracks; data uses include disease mechanism, target identification, patents, publications, strategic decision-making, due diligence, and investigator brochures.]
Figure 1. Balancing scientific innovation and data quality in unregulated research. The phases of the drug discovery pipeline are shown with examples of potential uses of research data. Although innovation (blue) and quality (green) are necessary in all stages of discovery research, there is a balance between when they are critical. Innovation is more crucial to success in the early phases, whereas quality and quality assurance (QA) become more important as programs approach decision-making stages. Indicated in red is the initiation point for the QA mechanisms described herein. The early phases of the pipeline remain unfettered to promote scientific creativity; data quality, integrity, and traceability are assured in the later phases.
patience. Over time, feedback from scientists and QA personnel led to modification of the ELN, the associated policy, and the training. Flexibility and willingness to adapt have been recurring themes in successfully implementing our QA practices.

Research–QA collaboration
Positioning the quality processes as a partnership rather than an external imposition is vital [8]. We view QA as an opportunity for scientific discussion rather than judgment: our auditors are scientists with discovery research experience whose mission is to complement the research team, bringing third-party objectivity to the data review and then offering solutions or suggestions as collaborative scientists. Although the QA process is not optional, research and QA are aligned towards the common goal of releasing only validated, verified data. We also engage a core group of researchers as ELN Power Users: expert ELN users who serve as a resource within their own departments and are the first line of feedback as beta testers for ELN software revisions. This two-way communication promotes joint ownership of the process. Importantly, senior management has allocated resources to the QA effort and mandates QA review before data release: no patent can be filed and no meeting abstract or publication submitted until the associated data have been reviewed. Management's commitment to quality practices ensures diligence in this area at all levels.

Internal data audit: assessing and monitoring quality
Consistent with some Good Practice (GxP) principles, but divergent from others, we developed our own quality audit process purpose-built for the research environment, representing a simplified version of that proposed by Volsen and Masson [8]. We defined audit standards that we refer to as the 'Four Pillars of Data Quality' (Box 1): (i) traceability, (ii) accuracy, (iii) clarity/completeness, and (iv) timeliness; all documents submitted for audit are examined for compliance with these basic concepts. This crucial aspect of our quality practices has been refined to add maximum value with minimal disturbance to research activities.

Box 1. Four Pillars of Data Quality
(i) Traceability: a complete description of the methods and materials used, and the raw data collected, are available and appropriately archived such that experiments are fully reconstructable and traceably follow the stated protocol or associated STM.
(ii) Accuracy: experimental methods and data analysis are consistent with sound scientific principles; appropriate controls, data quality assessment, and statistical analyses are performed; data transcription and calculations are correct; reported data correspond to the collected data; from a third-party perspective, the hypotheses and conclusions generated are consistent.
(iii) Clarity/completeness: documentation of procedures, data analysis, and any supporting material is comprehensive; experimental rationale, study design, methods, and results are clear and consistent.
(iv) Timeliness: records are created contemporaneously with research activities.

Again, the audit process is a joint effort between research and QA. Although the overall scope is program-wide, individual audits are performed for each key study. The study's lead scientist is responsible for initial compilation and review of the data and supporting documents, activities routinely performed in the course of final reporting, so the extra burden is minimal. The auditor then conducts an objective verification of the traceability, accuracy, clarity/completeness, and timeliness of all raw data, analyses, and documentation, including comparison of reported methods against collected data. The audit identifies editorial mistakes such as inconsistencies or errors, but the auditor also provides suggestions related to scientific content and clarity. The auditor's arm's-length feedback can help prevent potential misinterpretations, which is critical because the intended audience for the reported data is often outside the organization.

The auditor's findings and recommendations are provided in an audit report. The scientific lead works in collaboration with the research team to prioritize and address relevant audit findings. There is open discussion between the research team and the auditor to clarify findings and establish an action plan. Throughout the process, the scientists remain accountable for the research; the auditor's role is to provide independent feedback to be incorporated at the research team's discretion. Generally, an acceptable solution to address audit findings is obvious and easily implemented, but senior management is occasionally consulted to adjudicate complex issues. When the data are finalized, the audit is closed, an electronic archive of all study documents is created, and comprehensive written study reports are drafted.

In addition to study reporting, internal auditing is also utilized for standardized experimental protocols: the materials, methods, data analysis procedures, and data acceptability criteria for routinely performed assays are detailed in a standard test method (STM). In our model, STMs are established upon completion of assay development, before the production of 'reportable' data. The STM, validation data, and a representative ELN entry are audited before the STM is finalized. In this way, quality is assessed early, before Lead Optimization data are generated, preventing possible systemic errors before they occur.

Continuous improvement and impact
Cooperation in the QA policy process continues by soliciting researcher feedback on what is practical, what
is working, and what is not. Despite initial challenges, the QA process has collaboratively evolved to better meet the needs of all stakeholders. Surveys have confirmed a high level of satisfaction from both the scientific and QA perspectives. Internal data audit is now voluntarily sought out by researchers, and QA is generally viewed as an integral part of research rather than an external interrogation.

Ultimately, data integrity is a fundamental principle of scientific research. All our researchers receive timely training related to the ELN, internal data audit, and STMs, and the expectations are clear. We have seen measurable quality improvement, as judged by the number and severity of audit findings. Emphasis on training and on recognizing and rewarding compliance has promoted quality awareness and produced a long-term positive impact on our institute. We hope that the research appropriateness and collaborative nature of the QA processes described here can be appreciated as effective mechanisms for measuring, improving, and maintaining research data quality.

References
1 Begley, C.G. and Ellis, L.M. (2012) Drug development: raise standards for preclinical cancer research. Nature 483, 531–533
2 Prinz, F. et al. (2011) Believe it or not: how much can we rely on published data on potential drug targets? Nat. Rev. Drug Discov. 10, 712–713
3 Pollack, A. (2009) Biotech company fires chief and others over handling of data. The New York Times 28 September
4 Shamoo, A.E. (1991) Policies and quality assurance in the pharmaceutical industry. Account. Res. 1, 273–284
5 Volsen, S.G. et al. (2004) Quality: an old solution to new discovery dilemmas? Drug Discov. Today 9, 903–905
6 International Organization for Standardization (2005) ISO 9000:2005 Quality Management Systems. International Organization for Standardization
7 Bennani, Y.L. (2011) Drug discovery in the next decade: innovation needed ASAP. Drug Discov. Today 16, 779–792
8 Volsen, S.G. and Masson, M.M. (2009) A novel audit model for assessing quality in non-regulated research. Qual. Assur. J. 12, 57–63

0167-7799/$ – see front matter © 2013 Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.tibtech.2013.06.007