ORIGINAL SCIENTIFIC ARTICLES
Improved Surgeon Performance in Clinical Trials: An Analysis of Quality Assurance Audits from the American College of Surgeons Oncology Group

Y Nancy You, MD, MHSc, Lisa Jacobs, MD, FACS, Elizabeth D Martinez, BS, Susan C Budinger, BS, E Jean Wittlief, MA, Shelley K Myles, BScRN, David M Ota, MD, FACS

BACKGROUND: The American College of Surgeons Oncology Group (ACOSOG) represents an organized effort by surgeons to participate in clinical trials research. To assess the quality of trial conduct by surgeons on a national level and the feasibility of improvement through education, this study examined the findings of the Quality Assurance Audit Program of the ACOSOG over time.

STUDY DESIGN: Outcomes of 249 routine audits conducted from 2001 to 2004 were reviewed for major and minor deficiencies and overall performance (acceptable versus unacceptable) in compliance with regulatory requirements (REG) and patient case review (PCR).

RESULTS: From 2001 to 2004, the number of active trials increased. Major deficiencies in REG fell from 31% to 20% for IRB documentation (p = 0.002) and from 31% to 9% for informed consent (p < 0.001). The major deficiency rates in PCR decreased from 21% to 6% (patient consent), 16% to 7% (eligibility), 13% to 7% (treatment), 34% to 6% (outcomes), 6% to 1% (toxicity), and 16% to 3% (data). During 2001 to 2004, the overall acceptable performance rates were 82%, 72%, 84%, and 92%, respectively, in REG (p = 0.093), and improved significantly in PCR (47%, 55%, 77%, and 94%, respectively; p < 0.001). No difference in acceptable rates was detected between academic and community sites, for either REG (86% versus 76%, respectively; odds ratio: 1.91; 95% CI: 0.87 to 4.19) or PCR (63% versus 68%, respectively; odds ratio: 0.81; 95% CI: 0.42 to 1.53).

CONCLUSIONS: Despite initial deficiencies, surgical trials are now conducted with high standards nationwide. In response to educational programs, surgeon performance in clinical trials has measurably improved. Quality assurance audits have served both surveillance and educational roles. (J Am Coll Surg 2006;203:269–276. © 2006 by the American College of Surgeons)
Competing Interests Declared: None.

Supported by the American College of Surgeons Oncology Group Grant U10CA76001 from the National Cancer Institute. Dr You was supported by the Ruth L Kirschstein National Research Service Awards Training Grant T32CA101695 from the National Institutes of Health.

Received March 7, 2006; Revised May 17, 2006; Accepted May 23, 2006.

From the American College of Surgeons Oncology Group (ACOSOG), Durham, NC (You, Jacobs, Martinez, Budinger, Wittlief, Myles, Ota); the Departments of Surgery, Mayo Clinic, Rochester, MN (You); Johns Hopkins University, Baltimore, MD (Jacobs); and Duke University Medical Center, Durham, NC (Ota).

Correspondence address: David M Ota, MD, The American College of Surgeons Oncology Group, 2400 Pratt St, Terrace Level Room 0311, Durham, NC 27705.

ISSN 1072-7515/06/$32.00; doi:10.1016/j.jamcollsurg.2006.05.298

Abbreviations and Acronyms

ACOSOG = American College of Surgeons Oncology Group
CTMB = Clinical Trials Monitoring Branch
NCI = National Cancer Institute
OR = odds ratio
PCR = patient case review
PHARM = pharmacy operations
REG = research regulations

The American College of Surgeons Oncology Group (ACOSOG) represents a nationwide effort by general and specialty surgeons, from both academic medical centers and community practices, to collectively participate in clinical trials research.1 As one of the newest National Cancer Institute (NCI)-sponsored national oncology cooperative groups, the ACOSOG was established in 1996 with the mission to evaluate surgical management of malignant solid tumors in a multidisciplinary setting.1 Before participation in ACOSOG, many surgeons had been unfamiliar with the theory and practice of clinical trials: fewer than 20% of past randomized trials had compared surgical therapy as a treatment variable,2 and traditional surgical training emphasized basic science research rather than clinical trials research.3,4 Given this limited experience with surgical trials, it remained unclear whether surgeons could feasibly conduct clinical trials to high standards on a national level.

Prospective clinical trials provide the highest level of evidence and hold the most power to influence clinical practice.5 In surgical oncology, results of clinical trials have defined and revolutionized the optimal treatment strategies for many patients afflicted with malignancies. But for clinical trials to have such impact, the validity of the reported data and the integrity of the trial conduct must be assured.5-7 A quality assurance audit is a formal process by which the accuracy of the submitted trial data and the compliance of the trial investigator with the protocol are independently verified against the site medical records and regulatory documents.8 The critical importance of quality assurance has been well recognized by the NCI, the world's largest sponsor of trials of antineoplastic therapies. Since the early 1980s, the Clinical Trials Monitoring Branch (CTMB) of the NCI has instituted formal policies for monitoring all clinical trials conducted by NCI-sponsored cooperative groups.9,10 But the outcomes and impact of such quality assurance audit programs have not been widely documented.10-14

To assess the quality of conduct in surgical oncology trials nationwide and to benchmark progress over time, we reviewed the experience of the ACOSOG quality assurance audit program. We aimed to determine whether trial performance in regulatory requirements and patient case review has improved over time in response to quality assurance audits and other educational programs.

METHODS

The ACOSOG quality assurance audit program
The ACOSOG audit team conducts all audit activities of the ACOSOG and is led by one of the two dedicated staff auditors. One of the 15 surgical investigators on the ACOSOG Audit Committee, who volunteers time and services without additional compensation, may serve as an additional team member. Members of the CTMB or other federal agencies may attend any audit to monitor the work of ACOSOG auditors. Any investigator from the ACOSOG membership at large may sit in on an audit to learn the auditing process.15
Every institution with an ACOSOG protocol open and approved by its Institutional Review Board (IRB) is eligible to be audited. The key components of an ACOSOG on-site audit include the following:

1. All open protocols are assessed for compliance with research regulations (REG); specific subcomponents monitored include IRB documentation (all protocol-related regulatory activities are reviewed and approved by the institutional IRB) and informed consent documentation (all components required by the NCI9 are present and approved by the institutional IRB).
2. All open protocols involving investigational agents are reviewed for drug accountability in pharmacy operations (PHARM).
3. All protocols with active patient enrollment are evaluated for protocol compliance through patient case review (PCR); the six subcomponents specifically monitored are: informed consent (the process of informed consent was executed correctly), patient eligibility (subjects fulfill all protocol-specified eligibility criteria), treatment interventions (subjects have received protocol-specified interventions), disease outcomes and tumor response (reported outcomes were accurate and verified), toxicity (adverse events were faithfully reported without delay), and data quality (accurate, complete, and timely data submission).
4. An exit interview is conducted with the investigators to discuss audit findings and to plan for improvements.15
Routine on-site audits occur at least once every 3 years. Ten percent of all patients enrolled in a particular protocol since the previous audit, plus at least one additional unannounced patient case, are audited.15 The audit team reviews and verifies submitted data against source documents, including original IRB documents, informed consents, NCI Drug Accountability Record Forms, medical records (clinical notes, diagnostic test reports, laboratory data, procedural reports, and so on), research records signed and dated by the study personnel, and subject diaries or calendars. Deviations found are rated according to NCI guidelines9,15 as either a "major deficiency" (defined as a variance from protocol-specified procedures that makes the resulting data questionable) or a "minor (or lesser) deficiency" (defined as one judged not to have a significant impact on the outcomes or interpretation of the study). Additionally, based on predefined criteria from the NCI,9,15,16 each audit component (REG, PHARM, or PCR) is given an overall grade of "acceptable," "acceptable needing followup," or "unacceptable." For a grade of "acceptable needing followup" or "unacceptable," institutions must submit a written response with a corrective plan. A final audit report is filed with the CTMB Audit Information System database.

Table 1. Overall Activity of the American College of Surgeons Oncology Group (ACOSOG) Quality Assurance Audit Program, 2001 to 2004

Year    Routine audits, n   Protocols audited,* n   Distinct protocols audited, n   Patients audited, n   Distinct patients audited,† n
2001    66                  153                     7                               689                   657
2002    58                  127                     7                               321                   302
2003    45                  105                     14                              252                   239
2004    80                  311                     21                              343                   332
Total   249                 696                     —                               1,605                 1,530

*Protocols audited refers to the number of distinct protocols open per site multiplied by the total number of sites audited during the calendar year.
†Number of distinct patients may differ from the number of patients audited because of concurrent enrollment in pairs of ACOSOG companion trials.

Data source and definitions
The first patient enrollment to an active ACOSOG trial occurred in late 1998. Between 2001 and 2004, 249 routine on-site audits were conducted. Among 217 enrolling sites, 125 (58%) were categorized as "academic or teaching affiliated," 78 (36%) as "community," and 14 (6%) as "other" (including Veterans Affairs medical centers, army institutions, and so on), as identified by the ACOSOG membership database.1 Data from "other" sites were too scarce to allow for meaningful comparisons and were excluded from subsequent analyses. Outcomes of the REG and PCR components were extracted from the CTMB Audit Information System database. During 2001 to 2004, only 3 ACOSOG protocols involved the use of investigational agents and received an audit of the PHARM component. Because results of the PHARM audits were uniformly excellent, they were not analyzed further. For the REG and PCR components, the following outcomes were reviewed: the frequency and nature of the deficiencies (both major and minor) in each year from 2001 to 2004; the overall performance grading each year, dichotomized as acceptable (including both "acceptable" and "acceptable needing followup") versus unacceptable; and performance grading of "academic" (including both academic and teaching-affiliated) versus "community" enrolling sites.

Statistical analysis
To detect trends over time, major and minor deficiencies and overall performance grading (acceptable versus unacceptable) were compared across the four calendar years using the Cochran-Armitage trend test, with a two-sided statistical significance level of p < 0.05. Comparison of grading between academic versus community sites was conducted using chi-square tests with conditional maximal likelihood estimates; statistically significant odds ratios (OR) are associated with a 95% confidence interval (CI) exclusive of 1.0. All analyses were performed using the SAS Enterprise Guide software (version 3.0; SAS Institute).

RESULTS

From 2001 to 2004, the activities of the ACOSOG audit program have been robust in terms of both the number of protocols audited and the number of patient cases reviewed (Table 1). As the ACOSOG clinical trial portfolio matured, more active clinical trials were audited over time.

Table 2. Deficiencies (Major and Minor) Consistently Identified in Regulatory Compliance During All Years of the Study, 2001 to 2004

IRB documentation
  Lack of documentation of full IRB approval of a protocol amendment that affects more than minimal risk (major)
  Reapproval delayed ≥30 days and <1 year (major)
  Reapproval delayed <30 days (minor)
Informed consent documentation
  Lacking required full descriptions of risks and discomforts (major)
  Lacking required full description of involved research: purposes, duration of participation, description of procedures, identification of experimental procedures (major)
  Lacking required full disclosure of alternative procedures or treatments (major)
  Lacking required full description of the extent of confidentiality of records (major)
  Lacking the statement that new findings that may relate to subject's willingness to continue will be provided to subject (minor)
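The Cochran-Armitage trend test used above can be sketched in a few lines. This is an illustrative reimplementation, not the SAS procedure used in the study, and the yearly counts below are approximate reconstructions from the reported PCR acceptable rates (47%, 55%, 77%, 94%) rather than the actual study data.

```python
import math

def cochran_armitage_trend(successes, totals, scores=None):
    """Two-sided Cochran-Armitage test for a linear trend in proportions.

    successes[i] / totals[i] is the proportion in group i; scores default
    to 0, 1, 2, ... (equally spaced, e.g. consecutive calendar years).
    Returns the z statistic and a normal-approximation two-sided p-value.
    """
    k = len(successes)
    if scores is None:
        scores = list(range(k))
    n = sum(totals)
    p_bar = sum(successes) / n  # pooled proportion
    # Score-weighted deviation of observed successes from expectation
    t_stat = sum(s * r for s, r in zip(scores, successes)) \
        - p_bar * sum(s * m for s, m in zip(scores, totals))
    var = p_bar * (1 - p_bar) * (
        sum(s * s * m for s, m in zip(scores, totals))
        - sum(s * m for s, m in zip(scores, totals)) ** 2 / n
    )
    z = t_stat / math.sqrt(var)
    p_two_sided = math.erfc(abs(z) / math.sqrt(2))
    return z, p_two_sided

# Illustrative counts only: acceptable PCR gradings per year, reconstructed
# from the reported rates and approximate numbers of cases audited.
acceptable = [324, 176, 194, 322]
audited = [689, 320, 252, 343]
z, p = cochran_armitage_trend(acceptable, audited)
```

A strongly increasing trend such as this yields a large positive z and p far below 0.001, matching the direction of the reported result.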
Figure 1. Overall performance in regulatory compliance over time.
Figure 2. Overall performance in patient case review over time (*p < 0.001).
Regulatory compliance
Deficiencies consistently found in all 4 years of study are summarized in Table 2 and denoted as "major" or "minor." Those in IRB documentation mainly occurred during protocol amendments and renewals; those in informed consent documentation represented failures to include elements of an adequate consent document as defined by the NCI.9 From 2001 to 2004, the number of both major and minor deficiencies in IRB documentation decreased significantly (p = 0.002 for major and p = 0.004 for minor; Table 3), although the major deficiency rate remained considerable in 2004. Similarly, remarkable declines were observed in the numbers of both major and minor deficiencies in informed consent documentation over time (p < 0.001 for major and p < 0.001 for minor; Table 3). Among routine audits of REG compliance, the overall performance was acceptable in 82% (2001), 72% (2002), 84% (2003), and 92% (2004), but this trend did not reach statistical significance over time (p = 0.093; Fig. 1). Nonetheless, the proportion of "acceptable" audits (excluding "acceptable needing followup") increased significantly: 3%, 28%, 27%, and 48% from 2001 to 2004, respectively (p < 0.001; Fig. 1). Over the 4 years, the overall performance was acceptable in 86% of the audits conducted at academic institutions and 76% of those conducted at community sites (OR: 1.91; 95% CI: 0.9 to 3.9).

Patient case review
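The academic-versus-community comparison can be illustrated with a simple odds-ratio calculation. The counts below are hypothetical, chosen only to approximate the reported 86% versus 76% acceptable REG rates, and a Wald interval is used rather than the conditional maximum likelihood method reported in the paper, so the numbers differ slightly from the published estimate.

```python
import math

def odds_ratio_wald(a, b, c, d, z_crit=1.96):
    """Sample odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald 95% CI.

    a, b: acceptable / unacceptable audits at academic sites
    c, d: acceptable / unacceptable audits at community sites
    """
    or_hat = (a * d) / (b * c)
    # Standard error of log(OR) for the Wald interval
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_hat) - z_crit * se_log)
    hi = math.exp(math.log(or_hat) + z_crit * se_log)
    return or_hat, lo, hi

# Hypothetical counts approximating 86% (108/125) versus 76% (59/78)
or_hat, lo, hi = odds_ratio_wald(108, 17, 59, 19)
```

Because the resulting confidence interval spans 1.0, the difference between site types is not statistically significant, consistent with the study's conclusion.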
Deficiencies consistently found in each of the 6 subcomponents of PCR from 2001 to 2004 are summarized in Table 4. In all subcomponents, the proportion of patient cases with either major or minor deficiencies tended to decrease from 2001 to 2004 (Table 5). The decrease in major deficiencies over time was significant in informed consent (p < 0.001), patient eligibility (p < 0.001), disease outcomes/treatment response (p < 0.001), toxicity (p < 0.001), and data quality (p < 0.001). For minor deficiencies, time trends were significant in informed consent (p < 0.001), treatment interventions (p = 0.004), toxicity (p < 0.001), and data quality (p < 0.001). In 2001, the most frequent deficiencies in each of the
Table 3. Frequencies of Major and Minor Deficiencies in Regulatory Compliance

                                            Protocols audited
                                 2001 (n = 153)   2002 (n = 127)   2003 (n = 105)   2004 (n = 311)
Subcomponent                     n     %          n     %          n     %          n     %
IRB documentation
  Major deficiency               48    31         43    34         32    30         62    20
  Minor deficiency               13    9          12    9          10    10         8     3
Informed consent documentation
  Major deficiency               48    31         20    16         12    11         28    9
  Minor deficiency               50    33         26    21         9     6          9     3
Table 4. Deficiencies (Major and Minor) Consistently Identified in Patient Case Review During All Years of Study, 2001 to 2004

Informed consent
  Consent form missing (major)
  Consent form not signed and dated by the patient (major)
  Consent form does not contain all required signatures (major)
  Consent form signed after patient started on treatment (major)
  Consent form does not include updates or information required by IRB (major)
Patient eligibility
  Patient did not meet all eligibility criteria as specified by the protocol (major)
  Documentation missing and unable to confirm eligibility (major)
Treatment interventions
  Treatment doses incorrectly administered, calculated, or documented (major)
  Incorrect agent/treatment used (major)
  Additional treatment used that is not permitted by protocol (major)
Disease outcomes/tumor response
  Tumor measurement/evaluation of status or disease not performed according to protocol (major)
  Claimed response (partial response, complete response, and so on) cannot be verified (major)
Toxicity
  Recurrent over- or underreporting of toxicities (major)
Data quality
  Errors in submitted data: >50% error rate* (major)
  Delinquent data submission: >6-month delay* (major)

*As defined by the American College of Surgeons Oncology Group audit team.
6 subcomponents were: informed consent without all required signatures (11.5%); inability to confirm eligibility because of missing documents (10.2%); incorrectly administered or documented treatments (7.4%); inability to verify claimed tumor response (20.1%); failure to report toxicities (1.8%); and errors in submitted data (48.6%). The incidences of these deficiencies declined dramatically over time, such that the corresponding incidences in 2004 were: 0.9% (informed consent); 3.2% (eligibility); 4% (treatment); 1.1% (tumor response); 0% (toxicity); and 5% (data).

Overall performance in PCR was acceptable in 47%, 55%, 77%, and 94% of the patient cases audited each year from 2001 to 2004, respectively, demonstrating a significant improvement over time (p < 0.001; Fig. 2). When PCR outcomes were compared between academic and community sites, the proportion of acceptable audits did not differ significantly: 63% versus 68%, respectively (OR: 0.8; 95% CI: 0.42 to 1.53).

Table 5. Frequencies of Major and Minor Deficiencies in Patient Case Review

Subcomponent                       2001        2002        2003        2004
Informed consent
  Cases audited, n                 689         320         252         343
  Major deficiency, n (%)          142 (21)    63 (20)     41 (16)     19 (6)
  Minor deficiency, n (%)          73 (11)     80 (25)     18 (7)      4 (1)
Patient eligibility
  Cases audited, n                 689         316         252         343
  Major deficiency, n (%)          109 (16)    41 (13)     30 (12)     25 (7)
  Minor deficiency, n (%)          15 (2)      32 (10)     4 (2)       6 (2)
Treatment interventions
  Cases audited, n                 660         280         210         276
  Major deficiency, n (%)          85 (13)     80 (29)     56 (27)     20 (7)
  Minor deficiency, n (%)          45 (7)      19 (7)      10 (5)      8 (3)
Disease outcomes/tumor response
  Cases audited, n                 657         278         211         275
  Major deficiency, n (%)          225 (34)    89 (32)     37 (18)     16 (6)
  Minor deficiency, n (%)          20 (3)      26 (9)      15 (7)      5 (2)
Toxicity
  Cases audited, n                 655         278         210         275
  Major deficiency, n (%)          39 (6)      18 (6)      5 (2)       2 (1)
  Minor deficiency, n (%)          14 (2)      73 (24)     10 (5)      9 (3)
Data quality
  Cases audited, n                 664         299         219         278
  Major deficiency, n (%)          106 (16)    61 (20)     33 (15)     9 (3)
  Minor deficiency, n (%)          366 (55)    73 (24)     52 (24)     22 (8)

Total numbers of patient cases audited may differ across subcomponents because of missing data or a specific subcomponent not audited for a particular protocol.

DISCUSSION

Since the inception of ACOSOG, surgeons and their medical colleagues have enrolled more than 11,000 patients in clinical trials. This is the first study to examine the quality of the clinical trials research conducted by surgeons on a national level. We examined the outcomes of the quality assurance audit program of the ACOSOG and found that 1) despite substantial regulatory and patient case deviations during the initial years, deficiency rates declined and acceptable gradings increased measurably over time in both REG and PCR; and 2) these improvements occurred to a similar degree in both academic and community settings. Our findings suggest that surgeons now conduct clinical trials research with high standards and that quality assurance audit programs have played critical roles in both surveillance and education.

It had remained unproven whether the practice and methodologies of clinical trials can be learned and executed in all practice settings nationwide. The data from the ACOSOG audit program provide the most suitable evidence to investigate this question for several reasons. First, because clinical trials were novel to many surgical investigators during the formative years of ACOSOG, initial audit outcomes provide an informative assessment of deficiencies at baseline. Second, because any practicing surgeon, whether at academic/teaching institutions or in community/group practices, can enroll patients in ACOSOG trials, the audit outcomes reflect trial performance nationwide. Third, with the maturation of ACOSOG, surgical investigators have become knowledgeable in the methodologies of clinical trials, and audit outcomes measure the impact of these educational efforts. Educating surgeons about clinical trials has occurred through three main venues:

1. Hands-on learning: through active participation in ACOSOG trials, surgeon investigators have learned the practicalities of clinical trials research first-hand.
2. Formal didactic programs: during 6 ACOSOG semiannual meetings (2000 to 2002), the components of an on-site audit, assessments, and gradings were presented in 2 formal Audit Preparation Seminars and other training seminars by ACOSOG staff auditors and surgeon members of the Audit Committee. During the American College of Surgeons Clinical Congress in 2002, a 2-day postgraduate course entitled "The American College of Surgeons Oncology Group Clinical Trials" took place. A session dedicated to the ACOSOG audit program not only delineated the objectives of a quality assurance audit but also reviewed common deficiencies identified during the early experience of the group. Additionally, strategies for building a clinical team, developing protocols, organizing a tissue bank, and increasing accrual were discussed by surgeon investigators.
3. On-site audits: during these face-to-face sessions, auditors review and reinforce clinical trial research requirements. They bring to the site the best research practices they have gathered from other trial sites across the nation and help develop site-specific strategies to improve audit performance. This combination of individualized tutorial and site-tailored recommendations for future practice has constituted a highly effective educational tool, although the relative efficacies of these three educational venues have not been comparatively investigated.
Our finding that trial performance substantially improved over time, despite initially high rates of deficiencies, is consistent with reports from other national cooperative groups. A summary report of the 17 then-active cooperative groups revealed an initial deficiency rate of 45% in REG in 1982, which fell to 6% in 1984.10 The Cancer and Leukemia Group B (CALGB) reported that
deficiency rates in PCR decreased from 28% to 13.3% and 49.6% to 28.2% for main and affiliated institutions, respectively, over a 10-year period (1982 to 1992).13 And in melanoma trials of the European Organization for Research and Treatment of Cancer (EORTC), the frequency of protocol violations decreased from 28% to 11% from 1988 to 1992.16 Both these initial deficiency rates and the improvement over time are consistent with those observed in our study, suggesting that surgical trials can be conducted with performance standards comparable to those in other cooperative group trials. Despite the high rate of overall acceptable performance in REG, major deficiencies in IRB documentation remained in as many as 20% of the protocols audited in 2004. The most consistent deficiencies did not occur with the initial approval of the trial protocols but with protocol amendments or subsequent renewals. So, investigators must remain vigilant in their regulatory compliance throughout the entire lifetime of a trial protocol. Obtaining proper IRB documentation is not a one-time task performed during trial initiation but a continuing process. The care of patients enrolled in clinical trials differs from routine clinical care. Tasks unique to trial patients
include confirmation of subject eligibility, adherence to trial protocol, unbiased treatment assignment, and accurate and timely data reporting. These items are specifically audited during PCR. The dramatic improvements in PCR strongly suggest that good clinical trials research can be learned and executed. It is noteworthy that the most frequent deficiencies in subcomponents of PCR identified in 2001 all related to inadequate documentation. The dramatic decline in their incidences in 2004 can best be attributed to the audit program itself. In preparing for and responding to audits, investigators and research staff must review the standards of documentation and ensure meticulous and honest recordkeeping; auditors may also recommend sound documentation methods encountered during their audits of other investigators. The performances of community sites participating in ACOSOG trials were found to be as acceptable as those of academic institutions in both REG and PCR. Similar improvements in PCR for community institutions have been reported by other investigators: ineligible cases declined from 16% to 8%, treatment deficiencies from 62% to 19%, and unverifiable tumor response assessments from 17% to 10%.10,17 Although the categorization of sites into “academic” versus “community” reflects the unique membership structure of ACOSOG, we could not further characterize these sites in terms of experience level with clinical research, organization of the research infrastructure, and availability of financial and personnel resources. Findings of this study highlight an association between educational programs and improvements in trial performance over time. But the retrospective design of this study precludes a definitive attribution of causation. Additionally, this study could not fully capture all contributors to the observed improvement in trial performance. 
One key determinant may be earlier experience with clinical trials research, for both the surgeon investigator and the protocol office. First-hand investigator experience with clinical trials research is a prerequisite for fully appreciating the importance of a well-functioning protocol office. Such experience is also critical for discerning and recruiting support staff with appropriate qualifications and skills. Because surgeon investigators are ultimately held responsible for the conduct of the clinical trials, they must carefully balance delegation of duties with close supervision. The teamwork nature of clinical trials research cannot be overemphasized.

Another key contributor to audit outcomes may be the volume of the clinical trials practice. A detailed analysis of the complex volume-outcomes relationship in clinical trials research was beyond the scope of this study. Although sites or investigators enrolling more patients might be expected to perform better, caution must be raised against overwhelming the support structure and losing the ability to closely monitor research activities. In addition, accruing a high number of patients to a few trials may influence quality differently than accruing a few patients to a large number of trials.

Last, although the criteria for deficiencies and ratings have remained constant and the performance of the ACOSOG audit team continues to meet CTMB standards, auditor experience and subjectivity may contribute to observed differences over time; greater ability to detect deficiencies may be expected with increasing experience.

Clinical trials provide the highest level of evidence for the optimal practice of surgical oncology. Although clinical trials research was initially novel to many surgeons, dramatic improvement in trial conduct has occurred with time. Through practical experience, didactic courses, and quality assurance audits, the surgical trialists of ACOSOG have learned and now conduct clinical trials with high standards in both academic and community settings. Quality assurance audit programs have served both surveillance and educational roles, and their findings may warrant as much attention as trial outcomes.

Author Contributions
Study conception and design: You, Jacobs, Ota
Acquisition of data: Martinez, Budinger, Wittlief, Myles
Analysis and interpretation of data: You, Jacobs, Ota
Drafting of manuscript: You, Ota
Critical revision: You, Jacobs, Ota, Martinez, Budinger, Wittlief, Myles

REFERENCES

1. The American College of Surgeons Oncology Group. Available at: http://www.acosog.org. Accessed January 4, 2006.
2. McCulloch P, Taylor I, Sasako M, et al. Randomised trials in surgery: problems and possible solutions. BMJ 2002;324:1448–1451.
3. Posther KE, Wells SA Jr. The future of surgical research: the role of the American College of Surgeons Oncology Group. Eur J Surg Oncol 2005;31:695–701.
4. You YN, Hunt KK, Posner MC, et al. Operative trials: the opportunity beckons. An update on the American College of Surgeons Oncology Group. Surgery 2006;139:455–459.
5. Knatterud GL, Rockhold FW, George SL, et al. Guidelines for quality assurance in multicenter trials: a position paper. Control Clin Trials 1998;19:477–493.
6. Califf RM, Karnash SL, Woodlief LH. Developing systems for cost-effective auditing of clinical trials. Control Clin Trials 1997;18:651–660.
7. Morse MA, Califf RM, Sugarman J. Monitoring and ensuring safety during clinical research. JAMA 2001;285:1201–1205.
8. You YN, Jacobs L, Martinez E, Ota DM. The audit process and how to ensure a successful audit. In: Leong S, ed. Cancer clinical trials made user-friendly with proactive strategies. New York: Springer; in press.
9. Guidelines of the Clinical Trials Evaluation Program, National Cancer Institute: Monitoring of clinical trials. Available at: http://ctep.cancer.gov/monitoring. Accessed January 4, 2006.
10. Mauer JK, Hoth DF, Macfarlane DK, et al. Site visit monitoring program of the clinical cooperative groups: results of the first 3 years. Cancer Treat Rep 1985;69:1177–1187.
11. Christian MC, McCabe MS, Korn EL, et al. The National Cancer Institute audit of the National Surgical Adjuvant Breast and Bowel Project Protocol B-06. N Engl J Med 1995;333:1469–1474.
12. Sunderland M, Kuebler S, Weiss RB. Compliance with protocol: quality assurance (QA) data from the Southwestern Oncology Group (SWOG). Proc Am Soc Clin Oncol 1990;9:60.
13. Weiss RB, Vogelzang NJ, Peterson BA, et al. A successful system of scientific data audits for clinical trials: a report from the Cancer and Leukemia Group B. JAMA 1993;270:459–464.
14. Bourez RL, Rutgers EJ. The European Organization for Research and Treatment of Cancer (EORTC) Breast Cancer Group: quality control of surgical trials. Surg Oncol Clin N Am 2001;10:807–819, ix.
15. The American College of Surgeons Oncology Group. The audit manual; 2004. Available at: http://www.acosog.org. Accessed January 4, 2006.
16. Therasse P, Eggermont AM. Research and quality control in surgical oncology. Surg Oncol Clin N Am 2001;10:763–772, viii.
17. Hjorth M, Holmberg E, Rodjer S, et al. Patient accrual and quality of participation in a multicentre study on myeloma: a comparison between major and minor participating centres. Br J Haematol 1995;91:109–115.