Communicating Critical Test Results
Failure to Recognize and Act on Abnormal Test Results: The Case of Screening Bone Densitometry

Peter Cram, M.D., M.B.A.; Gary E. Rosenthal, M.D.; Robert Ohsfeldt, Ph.D.; Robert B. Wallace, M.D., M.S.; Janet Schlechte, M.D.; Gordon D. Schiff, M.D.

Article-at-a-Glance
Background: Failure to follow up on abnormal test results is common. A model was developed to capture the reasons why providers did not take action on abnormal test results.
Methods: A systematic review of the medical literature was conducted to identify why providers did not follow up on test results. The reasons were then synthesized to develop an operational model. The model was tested by reviewing electronic medical records of consecutive patients diagnosed with osteoporosis through a dual-energy x-ray absorptiometry (DXA) scan to determine whether: (1) the scan results had been reviewed; (2) therapy was recommended; and (3) if the scan results were not reviewed, why this occurred.
Results: Of the 48 newly diagnosed osteoporosis patients, 16 did not receive a recommendation to begin treatment. There was no evidence that the scan results were reviewed in 11 of the 16 cases (23% of all abnormal scans); the scan results of an additional 5 patients were reviewed, but no treatment was recommended.
Discussion and Conclusions: A clinically significant percentage of DXA scan results went unrecognized. As a long-term solution, direct patient notification could theoretically reduce the burden on providers, activate and empower patients, and create a back-up system for ensuring that patients are notified of their test results.

More than 10 years ago, Leape et al. published an article titled "Preventing Medical Injury," which ushered in the modern medical errors movement by claiming that a major challenge of improving quality was to learn from those responsible for medical errors rather than to punish them.1 One major contribution of this article was the creation of a classification system for medical errors that forced providers and managers to view medical errors in a more structured manner. Several years later, this classification system served as the framework for the Institute of Medicine publication To Err Is Human.2 Although Leape et al. identified 14 separate categories of medical error (for example, delay in diagnosis, failure to act on test results, equipment failure), much of the published medical literature evaluating such errors has focused on two specific categories: surgical/technical errors and errors related to drug administration.3,4
The medical literature contains relatively little exploration of errors resulting from the failure to recognize and act on (or follow up on) abnormal test results. Data from the risk management literature suggest that failure to follow up on abnormal test results is a common problem and a frequent cause of medical malpractice lawsuits.5,6 A survey conducted by Boohaker et al., which asked providers about their individual systems for following up on test results, also suggests that the problem is significant.7 Among other findings, 25% of respondents reported having no reliable method for
making sure that they received the results of all tests they had ordered, and 36% of providers reported that they did not always notify patients of abnormal test results. Studies by Pinckney et al. and Schiff et al. have used administrative data to demonstrate that existing systems result in substantial failure rates in managing test results. These studies showed that patients with abnormal mammograms, elevated thyroid stimulating hormone levels, and abnormal serum potassium levels failed to receive appropriate follow-up, potentially leading to life-threatening outcomes.8–10 However, because these studies relied on databases for evaluating provider follow-up of the abnormal test results, the investigators' ability to determine the reasons for lack of follow-up was limited. None of these studies was specifically designed to examine why patients with abnormal test results failed to receive appropriate follow-up care; understanding why appropriate actions are not taken is critical to designing effective interventions to improve care. Moreover, no formal operational model exists in the medical literature to guide clinicians and managers in assessing the causes and magnitude of this problem.
Therefore, two of the authors [P.C., G.E.R.] performed a structured review of the medical literature in an effort to identify all research studies that described reasons why providers do not follow up on abnormal test results. Next, we used the individual reasons identified in the literature review to create an operational model for evaluating the actions taken in response to abnormal test results. We then conducted a retrospective cohort study of consecutive patients diagnosed with osteoporosis on dual-energy x-ray absorptiometry (DXA) scanning to apply the model in an effort to do the following:
■ Understand the frequency with which appropriate action was not taken after osteoporosis was diagnosed
■ Understand why action was not taken
■ Generate insights about potential strategies for ensuring that abnormal tests receive appropriate follow-up
Methods

Model Development
We (all study authors) explicitly defined the underlying problem as the failure to take an appropriate action following an abnormal test for an individual patient. Next, we set out to identify all original research studies that reported the frequency with which, and the reasons why, providers did not follow up on abnormal test results. In May 2004, with the assistance of an experienced reference librarian, we performed an initial MEDLINE search but quickly determined that neither a single Medical Subject Heading (MeSH) term nor any combination of headings readily identified all relevant studies. We used our combined clinical and research experience to identify 10 common screening tests (for example, Pap smears, fecal occult blood testing, mammography) that require prompt provider follow-up. We then performed a comprehensive MEDLINE search to identify all articles involving these tests, and we combined the results of this search with articles indexed under the MeSH heading "medical errors." In addition, all studies containing the keywords "loss to follow-up" were reviewed. The literature search was supplemented by reviewing the bibliographies of relevant review articles and our personal files. This strategy resulted in the identification of 23 original publications and 2 abstracts addressing the frequency with which and/or reasons why providers did not follow up on test results. Six of the most relevant articles are summarized in Table 1.
Study Synthesis
Two study authors [P.C., G.E.R.] independently abstracted each article to identify specific causes cited for why the abnormal test results did not result in an action being taken. Duplicate and related causes were combined, resulting in a list that included 11 unique causes. These causes were then grouped into common domains, leading to the identification of the following four distinct first-order explanations for why a provider did not take action on an abnormal test result:
■ Test result is not correctly communicated to the responsible provider (Figure 1)
■ Result is communicated but never received/reviewed by the provider
■ Result is reviewed but no action is recommended
■ Action is recommended but not carried out
Each of these first-order causes was then linked back to the underlying second-order causes identified in the literature review to produce an operational model that could be linked to common steps in the delivery process, thus facilitating practical solutions that could be developed by individual health care systems. The operational model was designed to be generic—applicable to a broad array of test-disease combinations as well as a wide variety of practice settings with different types of medical records (for example, electronic versus paper) and different types of relationships between the ordering practitioner and the testing facility.

Table 1. Key Studies Identifying Reasons for Failed Follow-up of Abnormal Test Results

Source           | Test Studied            | Study Design         | Patient Population                | % of Non-follow-up | Causes of Failed Follow-up
Schiff, et al.9  | Serum potassium         | Retrospective cohort | Outpatients taking oral potassium | 2%                 | Results not reviewed/recognized; provider decision
Baig, et al.29   | Fecal occult blood test | Survey               | Adults screened for colon cancer  | 48%                | Results not received; patient preference
Klos, et al.30   | Fecal occult blood test | Retrospective cohort | Institutionalized elderly         | 50%                | Patient preference; results not reviewed/recognized
Leape, et al.31  | Various                 | Retrospective cohort | Hospitalized patients             | <1%                | Inadequate communication; failure to act on results
Marcus, et al.32 | Pap smear               | Randomized trial     | Women attending urban hospital    | 30%                | Patient not notified; patient financial constraints
Poon, et al.33   | Mammogram               | Prospective cohort   | Women with abnormal mammograms    | 36%                | Inadequate communication
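To make the structure of the operational model easier to see at a glance, the sketch below encodes the four first-order explanations, and a few illustrative second-order causes drawn from Table 1, as a small Python data structure. The groupings and the classify helper are illustrative assumptions, not the authors' instrument; the study itself used manual record review rather than any software.

```python
from enum import Enum
from typing import Optional

class FirstOrderCause(Enum):
    NOT_COMMUNICATED = "Result not correctly communicated to the responsible provider"
    NOT_REVIEWED = "Result communicated but never received/reviewed by the provider"
    NO_ACTION_RECOMMENDED = "Result reviewed but no action recommended"
    ACTION_NOT_CARRIED_OUT = "Action recommended but not carried out"

# Illustrative second-order causes taken from Table 1; this mapping is an
# assumption for demonstration, not the authors' full list of 11 causes.
SECOND_ORDER_CAUSES = {
    FirstOrderCause.NOT_COMMUNICATED: ["inadequate communication", "results not received"],
    FirstOrderCause.NOT_REVIEWED: ["results not reviewed/recognized"],
    FirstOrderCause.NO_ACTION_RECOMMENDED: ["provider decision"],
    FirstOrderCause.ACTION_NOT_CARRIED_OUT: ["patient not notified", "patient preference",
                                             "patient financial constraints"],
}

def classify(case: dict) -> Optional[FirstOrderCause]:
    """Walk the first-order explanations left to right (as in Figure 1) and return
    the first stage at which follow-up broke down, or None if follow-up succeeded."""
    if not case.get("result_communicated"):
        return FirstOrderCause.NOT_COMMUNICATED
    if not case.get("result_reviewed"):
        return FirstOrderCause.NOT_REVIEWED
    if not case.get("action_recommended"):
        return FirstOrderCause.NO_ACTION_RECOMMENDED
    if not case.get("action_completed"):
        return FirstOrderCause.ACTION_NOT_CARRIED_OUT
    return None  # appropriate action was taken
```

For example, classify({"result_communicated": True, "result_reviewed": False}) evaluates to FirstOrderCause.NOT_REVIEWED.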
Model Application
After developing the model, we sought to identify a disease-test pair with which to test the model and to examine whether it was comprehensive, practical, and potentially generalizable to the analysis of many different test results. We identified the following factors that would characterize a "good" candidate disease and test for study:
■ Commonness of the disease
■ Availability of the test and the frequency with which it is ordered
■ A clear definition of the presence or absence of disease on the test
■ An accepted course of action for individuals diagnosed with the disease by the screening test
On the basis of these criteria and our professional interests, we identified DXA scanning for osteoporosis screening as a good candidate for testing the model.
Osteoporosis affects approximately 16 million Americans, resulting in medical costs of $15 to $20 billion per year, but it can be readily identified through screening DXA scanning.11–14 For adults diagnosed with osteoporosis on a screening DXA scan, pharmacologic therapy is recommended as part of a comprehensive strategy aimed at reducing fracture risk.15–17

Study Site
The University of Iowa Hospitals and Clinics (UIHC) is an integrated health care delivery system consisting of a 700-bed tertiary care hospital, more than 30 specialty and subspecialty clinics, and numerous primary care clinics. The UIHC bone density center performs more than 3,000 DXA scans per year. Approximately 90% of patients receiving scans are referred from within the health care system, with the remainder referred from outside providers. The system has a well-developed electronic medical record (EMR) that has undergone progressive refinement since 1997. As of 2001, when the patient encounters reviewed for this project occurred, more than 95% of clinic notes, all laboratory results, and many imaging study results were recorded in the EMR. DXA scan results, however, were not automatically posted to the EMR. Rather, a paper report containing the scan result (but not including specific treatment recommendations) was generated by the bone density center and sent to the referring provider. The bone density center did not routinely provide scan results directly to patients. Thus, all treatment decisions were made at the individual provider's discretion, presumably after review of the DXA reports. This study protocol was reviewed and approved by the Institutional Review Board.

Figure 1. Operational Model for Evaluating Follow-up of Abnormal Test Results. Proceeding from left to right through the first-order explanations in the model demonstrates failure of the result management system progressively later in the result reporting process.
Subjects
All patients who underwent DXA scanning between January and May 2001 at UIHC were studied. The inclusion criterion was age older than 18 years at the time of the index scan. Patients with (1) a prior DXA scan, (2) an index DXA scan limited to the forearm, (3) referral for the scan from outside of UIHC, (4) terminal illness, and (5) ongoing pharmacologic treatment for reduced bone density at the time of the index scan (other than with calcium, vitamin D, or hormone replacement therapy) were excluded from this analysis.
Record Review Process
We reviewed medical records maintained in the DXA scanning center for each patient to assess inclusion/exclusion criteria and reviewed the DXA scan results for all patients meeting these criteria. Next, using the operational model as a guide, we reviewed the EMR and paper chart (when indicated) of patients with newly diagnosed osteoporosis on their DXA scan (T score ≤ −2.5) to determine the following:
■ Whether the DXA scan result was reviewed by any provider
■ Whether any therapy (that is, pharmaceutical or nutritional supplement) was prescribed or recommended to the patient in response to the new diagnosis of osteoporosis
■ If the scan results were not reviewed and/or therapy was not recommended, why this occurred
Any acknowledgement of the scan results in the medical record by any provider, or any recommendation to begin osteoporosis therapy, was considered evidence that the result of the scan had been reviewed. For patients with osteoporosis whose DXA scan results were explicitly acknowledged in the medical record, the chart was reviewed to determine what drug therapy, if any, the provider recommended during the subsequent six months. All recommendations of pharmaceutical or nutritional supplements expressly for the purpose of treating osteoporosis were recorded. Finally, for patients whose medical record did not provide any evidence that the DXA scan results were either reviewed or that therapy was recommended, the provider who ordered the test was contacted, notified of this abnormal result and potential oversight, and asked for additional information. Specifically, we inquired whether the provider may have reviewed the DXA scan showing osteoporosis but failed to document this in the medical record.

Results
Review of the DXA scan results for 428 patients resulted in the identification of 48 patients (11%) with newly diagnosed osteoporosis who met all inclusion criteria (Figure 2); 32 (67%) of these 48 patients received a recommendation to initiate therapy during the six months following their DXA scans, while 16 (33%) received no such recommendation. The specific treatment recommendations made to the 32 patients who received them are shown in Table 2.

Table 2. Treatment Recommendations for Patients Diagnosed with Osteoporosis

Medication          | Number of Patients (%)
Bisphosphonate      | 20 (62)
Calcitonin          | 1 (3)
Hormone replacement | 1 (3)
Raloxifene          | 2 (6)
Calcium alone       | 8 (25)

In 11 of the 16 cases where no treatment was recommended, the medical record review failed to demonstrate any evidence that the scan result was ever reviewed by any provider (Figure 2). We were unable to ascertain whether the failure occurred because the result was not correctly communicated to the provider (Figure 1, Box 1) or because the result had been communicated but not reviewed (Figure 1, Box 2). In each of these 11 cases, we contacted the responsible provider to notify him or her of the potential oversight—approximately 24 months after the scan had been performed. In no case did information received from the provider indicate that the medical record review findings were incorrect (that is, that the DXA scan had in fact been reviewed). An additional five patients had their DXA scan results reviewed by a provider but received no treatment recommendation; no justification for these decisions was identified during the medical record review. In none of the 16 cases did the medical record review provide sufficient information to allow for determination of the specific second-order causes of failure to follow up on the abnormal test results.

Figure 2. Results of Applying the Operational Model to Patients with Osteoporosis on DXA Scan. In 11 of the 16 cases where no treatment was recommended, the medical record review failed to demonstrate any evidence that the scan result was ever reviewed by any provider. DXA, dual-energy x-ray absorptiometry.

Discussion
This project has provided several useful insights into problems in the test result reporting process. First, using a systematic review of the medical literature, we have developed a comprehensive, practical, and potentially generalizable model for analyzing the processes of care that result from ordering a test. Second, this study adds to the evidence that failure to recognize and act on abnormal test results is an important and not infrequent medical error with a number of important ramifications. Third, application of the model highlights the strengths and limitations of applying the model in combination with medical record review to assess quality of care.

Model for Analyzing the Process of Care
This model provides researchers, clinicians, and managers with a tool for evaluating whether abnormal test results trigger a recommended therapeutic action and for classifying where in this process failures occur. The operational model highlights the fact that a relatively finite number of generic causes are likely to underlie failures in the result reporting system across a diverse spectrum of tests and diseases. In some cases, a solution may be as "easy" as making sure that a test result (for example, blood pressure measured by a nursing assistant) is recognized by the provider, but in other cases, it may be as complicated as educating providers about the importance of treatment or designing complex interfaces between electronic information systems from disparate vendors.
Frequent Failure to Act on Abnormal Test Results
This project adds to the growing evidence that failure to recognize and act on abnormal test results is a common medical error with potentially serious consequences for patients, providers, and the greater health care system. When a patient's abnormal test results are not reviewed, the patient loses the opportunity to initiate therapy. In the case of osteoporosis, this translates into the patient losing the opportunity to begin drug therapy, optimize calcium intake, and make lifestyle modifications (for example, stop smoking, increase exercise) to reduce the risk of subsequent fractures. In the case of other screening tests, this may translate into delays in diagnosing cancer (for example, mammography, Pap smears), missed opportunities for preventing heart disease (for example, cholesterol screening), or failure to recognize complications of pharmacotherapy (for example, elevated creatine phosphokinase with statin therapy or elevated liver enzymes with tuberculosis drugs). The most obvious consequence for providers and health systems is the legal liability that results from failure to follow up on abnormal test results. Such errors are also likely to erode patient trust in, and satisfaction with, the health care system. Moreover, if a test is performed but not reviewed, it can be considered an unnecessary expenditure. For example, in the case of DXA scanning, the Medicare payment is approximately $100.18 Extrapolating our findings to the 950,000 DXA scans performed on Medicare beneficiaries in 2002 would suggest that 24,500 patients with osteoporosis did not have their scans reviewed, resulting in $2,450,000 in excess Medicare payments. Even using conservative estimates, expanding such analyses across the total number of tests ordered in the United States each year has major financial implications.
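The extrapolation above can be reproduced approximately from the study's own numbers. The sketch below assumes that the published figures come from applying the unreviewed fraction observed in this study (11 of 428 scans) to the 950,000 Medicare scans and the roughly $100 payment per scan, with the result rounded to 24,500; this is an inference about the arithmetic, not a statement from the original article.

```python
# Rough reconstruction of the Medicare extrapolation (assumptions noted above).
scans_2002 = 950_000              # DXA scans for Medicare beneficiaries in 2002
unreviewed_fraction = 11 / 428    # unreviewed abnormal scans per scan in this study
payment_per_scan = 100            # approximate Medicare payment per DXA scan, in dollars

unreviewed_scans = scans_2002 * unreviewed_fraction    # about 24,400 scans
excess_payments = unreviewed_scans * payment_per_scan  # about $2.44 million
print(f"{unreviewed_scans:,.0f} unreviewed scans, ${excess_payments:,.0f} in excess payments")
```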
Model Combined with Medical Record Review
The current project also highlights both the advantages and the limitations of using medical record review to evaluate the follow-up of abnormal test results. The model pivots on searching for an expected action (for example, initiation of osteoporosis medications) that would result from recognition of a particular test result (for example, osteoporosis on DXA scan), permitting a simple two-stage screen to be performed, often electronically: first, identify patients with abnormal results and, second, link those patients to a medication or test-orders database. Patients with no evidence of appropriate action can then be identified and targeted for further scrutiny, including chart review. However, such a screening tool has inherent limitations. For example, such a screen would flag for further review many patients who underwent DXA scans but did not fill a prescription, refused pharmacotherapy, filled a prescription without using their prescription drug benefit, or had a medical contraindication to therapy. Although medical record review is superior to reliance on administrative data alone and can provide an additional degree of understanding, it is limited by the quality of the documentation available,19,20 as well as by the resources required for manual review. Our medical record review provided significant insight into why patients with osteoporosis identified on DXA scan do not receive treatment but was unable to differentiate between two of the first-order causes or to identify the more detailed second-order causes.
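A minimal sketch of the two-stage electronic screen described above follows. The table and column names (dxa_results, medication_orders, t_score, drug_class) are hypothetical; a real implementation would query the institution's EMR, claims, or pharmacy systems, and every patient the query flags would still require chart review for the legitimate exceptions noted above.

```python
import sqlite3

def patients_needing_review(conn: sqlite3.Connection) -> list:
    """Stage 1: find patients with osteoporosis on DXA (T score <= -2.5).
    Stage 2: drop anyone with an osteoporosis-related medication order in the
    six months after the scan; whoever remains is flagged for chart review."""
    query = """
        SELECT d.patient_id
        FROM dxa_results AS d
        WHERE d.t_score <= -2.5
          AND NOT EXISTS (
              SELECT 1
              FROM medication_orders AS m
              WHERE m.patient_id = d.patient_id
                AND m.drug_class IN ('bisphosphonate', 'calcitonin', 'raloxifene',
                                     'hormone replacement', 'calcium')
                AND m.order_date BETWEEN d.scan_date AND date(d.scan_date, '+6 months')
          )
    """
    return [row[0] for row in conn.execute(query)]
```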
Limitations
Even though the model has provided useful insights into problems related to test result reporting systems, several limitations are important to note. First, although the model was carefully derived from the medical literature, it has yet to be rigorously validated. Second, even though failures in the test result reporting system appear to be common in the current study, the frequency and causes of these failures are likely to differ both by test and by health system. This reinforces the importance of explicitly reviewing the performance of results reporting systems at the local level to ensure that institutions and office practices develop (and monitor) effective mechanisms for communicating test results to providers and patients. Third, although a significant percentage of DXA scan results were never recognized as abnormal, DXA results were not posted to the EMR at the time of this study; electronic posting has since been added. However, given that many practices lack any type of EMR, our results should be widely generalizable.21,22 It will be important to determine whether electronic posting results in a lower failure rate and to design electronic systems that make result reporting a fail-safe process. Fourth, we did not contact providers who had reviewed a DXA scan and recognized that it showed osteoporosis but did not recommend therapy, which reduces our ability to understand why such decisions were made. Finally, neither the model nor the current study directly addresses the fact that providers are obliged to notify patients of both abnormal and normal results.23
Conclusions
The lack of a single simple solution to the problem of provider failure to recognize and act on abnormal test results underscores the complexity of the problem.24 EMRs supplemented by decision support hold great promise, but such systems are not currently available to the majority of practices.25 A less comprehensive but potentially effective solution involves reporting test results directly to patients (either by mail or phone). A variety of commercial "mail-box" type solutions are being marketed that purportedly can perform this function.26 For example, timely written notification of mammography results is already mandated by the U.S. Food and Drug Administration.27 Direct patient notification could theoretically reduce the burden on providers, activate and empower patients, and create a back-up system for ensuring that patients are notified of their test results.28 In the meantime, clinicians must become more vigilant in ensuring that they have systems in place (and understand their potential weak spots) to receive, acknowledge, act on, track, and provide proper patient notification for all laboratory and radiology tests they order.

This study was funded in part by a new investigator award to Dr. Cram from the University of Iowa College of Medicine and an unrestricted grant from Procter & Gamble, Inc.
Peter Cram, M.D., M.B.A., is Assistant Professor, and Gary E. Rosenthal, M.D., is Professor, Division of General Internal Medicine, Department of Internal Medicine, University of Iowa College of Medicine, Iowa City, Iowa. Robert Ohsfeldt, Ph.D., is Professor, Department of Health Management and Policy, University of Iowa College of Public Health. Robert B. Wallace, M.D., M.S., is Professor, Department of Epidemiology, University of Iowa College of Public Health. Janet Schlechte, M.D., is Professor, Division of Endocrinology, Department of Internal Medicine, University of Iowa College of Medicine. Gordon D. Schiff, M.D., is Senior Attending Physician, Department of Medicine, Cook County (Stroger) Hospital, Chicago. Please address correspondence to Peter Cram, M.D., M.B.A., [email protected].
References
1. Leape L.L., et al.: Preventing medical injury. QRB Qual Rev Bull 19:144–149, May 1993.
2. Kohn L.T., et al.: To Err Is Human. Washington DC: National Academy Press, 2000.
3. Gawande A.A., et al.: Risk factors for retained instruments and sponges after surgery. N Engl J Med 348:229–235, Jan. 16, 2003.
4. Gandhi T.K., et al.: Adverse drug events in ambulatory care. N Engl J Med 348:1556–1564, Apr. 17, 2003.
5. Greenwald L.: Medical error in the physician office: An insurer's perspective. Med Health R I 10:312–315, Oct. 2000.
6. Lawrence J.: Do you always make sure patients get test results? Manag Care 5:37–41, Dec. 1996.
7. Boohaker E.A., et al.: Patient notification and follow-up of abnormal test results: A physician survey. Arch Intern Med 156:327–331, Feb. 12, 1996.
8. Pinckney R.G., et al.: Delay to biopsy after a positive mammogram. J Gen Intern Med 16:213, Apr. 2001.
9. Schiff G.D., et al.: Prescribing potassium despite hyperkalemia: Medication errors uncovered by linking laboratory and pharmacy information systems. Am J Med 109:494–497, Oct. 15, 2000.
10. Schiff G.D., et al.: Every system is perfectly designed to...missed diagnosis of hypothyroidism uncovered by linking lab and pharmacy data. J Gen Intern Med 18:295, Apr. 2003.
11. Nelson H.D., et al.: Screening for postmenopausal osteoporosis: A review of the evidence for the U.S. Preventive Services Task Force. Ann Intern Med 137:529–541, Sep. 17, 2002.
12. U.S. Preventive Services Task Force: Screening for osteoporosis in postmenopausal women: Recommendations and rationale. Ann Intern Med 137:526–528, Sep. 17, 2002.
13. Ray N.F., et al.: Medical expenditures for the treatment of osteoporotic fractures in the United States in 1995: Report from the National Osteoporosis Foundation. J Bone Miner Res 12:24–35, Jan. 1997.
14. Max W., et al.: The burden of osteoporosis in California, 1998. Osteoporos Int 13:493–500, Jun. 2002.
15. Cranney A., et al.: Meta-analyses of therapies for postmenopausal osteoporosis. IX: Summary of meta-analyses of therapies for postmenopausal osteoporosis. Endocr Rev 23:570–578, Aug. 2002.
16. Shea B., et al.: Meta-analyses of therapies for postmenopausal osteoporosis. VII: Meta-analysis of calcium supplementation for the prevention of postmenopausal osteoporosis. Endocr Rev 23:552–559, Aug. 2002.
17. Cranney A., et al.: Meta-analyses of therapies for postmenopausal osteoporosis. II: Meta-analysis of alendronate for the treatment of postmenopausal women. Endocr Rev 23:508–516, Aug. 2002.
18. Medicare: 2003 Medicare Fee Schedule. Centers for Medicare and Medicaid Services. http://www.noridianmedicare.com/provider/feeschedule (last accessed Nov. 8, 2004).
19. Kerr E.A., et al.: Avoiding pitfalls in chronic disease quality measurement: A case for the next generation of technical quality measures. Am J Manag Care 7:1033–1043, Nov. 2001.
20. Kerr E.A., et al.: Comparing clinical automated, medical record, and hybrid data sources for diabetes quality measures. Jt Comm J Qual Improv 28:555–565, Oct. 2002.
21. Jonietz E.: Paperless medicine. Technol Rev 106:59–63, Apr. 2003.
22. Casalino L., et al.: External incentives, information technology, and organized processes to improve health care quality for patients with chronic diseases. JAMA 289:434–441, Jan. 22, 2003.
23. Meza J.P., Webster D.S.: Patient preferences for laboratory test results notification. Am J Manag Care 6:1297–1300, Dec. 2000.
24. Mold J.W., Cacy D.S., Dalbir D.K.: Management of laboratory test results in family practice: An OKPRN study. J Fam Pract 49:709–715, Aug. 2000.
25. Poon E.G., et al.: Design and implementation of a comprehensive outpatient results manager. J Biomed Inform 36:80–91, Feb.–Apr. 2003.
26. Ridgeway N.A., et al.: An efficient technique for communicating reports of laboratory and radiographic studies to patients in a primary care practice. Am J Med 108:575–577, May 2000.
27. Priyanath A., et al.: Patient satisfaction with the communication of mammographic results before and after the Mammography Quality Standards Reauthorization Act of 1998. AJR Am J Roentgenol 178:451–456, Feb. 2002.
28. Cram P., et al.: Patient preference for being informed of their DXA scan results. J Clin Densitom 7:275–280, Fall 2004.
29. Baig N., et al.: Physician-reported reasons for limited follow-up of patients with a positive fecal occult blood test screening result. Am J Gastroenterol 98:2078–2081, Sep. 2003.
30. Klos S.E., et al.: The utilization of fecal occult blood testing in the institutionalized elderly. J Am Geriatr Soc 39:1169–1173, Dec. 1991.
31. Leape L.L., et al.: The nature of adverse events in hospitalized patients: Results of the Harvard Medical Practice Study II. N Engl J Med 324:377–384, Feb. 7, 1991.
32. Marcus A.C., et al.: Reducing loss-to-follow-up among women with abnormal Pap smears: Results from a randomized trial testing an intensive follow-up protocol and economic incentives. Med Care 36:397–410, Mar. 1998.
33. Poon E.G., et al.: Communication factors in the follow-up of abnormal mammograms. J Gen Intern Med 19:316–323, Apr. 2004.