Using the ACR Accreditation Process As a Quality Improvement Tool


CASE STUDY IN TRAINING AND EDUCATION

Kamila Nowak, MD, Voichita Bar-Ad, MD, Linda Ferguson, CRNP, John McAna, PhD, MA, Amy S. Harrison, MS, Yan Yu, PhD, MBA, Laura Doyle, MS

DESCRIPTION OF THE PROBLEM

The ACR accreditation process is a voluntary review that allows for an impartial peer evaluation of radiation oncology departments. Accreditation is a mechanism for departments to demonstrate to patients, providers, and payers their adherence to recognized standards of care and quality performance. Among the goals the ACR lists for the process are recognizing quality radiation oncology practices and recommending potential practice improvements based on nationally recognized standards. As part of the accreditation process, an onsite survey of departmental facilities, staff, and procedures is carried out by board-certified radiation oncologists and medical physicists. During the onsite survey, the external reviewers interview key personnel, assess documentation of departmental policies and procedures, audit the department's existing quality assurance and improvement program, and review patient records to assess adherence to nationally recognized clinical standards.

Although quality of care is a challenging metric to assess in radiation oncology, accreditation programs focus on components of a department's structure, process, and outcomes. Most quantifiable outcomes fall into the structure and process categories of quality assessment. Our department sought to quantify the quality improvement

projects implemented throughout the process of applying for ACR accreditation. Our study evaluates the influence of voluntary participation in the accreditation process at a medium-size academic facility.

In the months preceding the ACR accreditation survey at our institution, a multidisciplinary team of physicians and internal representatives from administration, billing, dosimetry, nursing, physics, and radiation therapy was assembled to analyze departmental clinical practices and policies and compare them with current ACR and American Society for Radiation Oncology (ASTRO) practice guidelines. In addition to the review of procedures, 40 charts were randomly selected for the purpose of conducting a self-audit. Parameters to include in the self-audit were selected from ACR and ASTRO practice guidelines [1-3]. During the internal audits, several documentation parameters were found to be less than 100% compliant, and a quality improvement plan was implemented to improve documentation of these noncompliant criteria before the onsite chart review conducted as part of the ACR accreditation application. Noncompliant documentation fell into several categories: data related to clinical and technical documentation; inefficiencies in quality review processes; inconsistencies in peer-review procedures; and inconsistent physician presence at procedures such as stereotactic radiosurgery (SRS).

© 2016 American College of Radiology 1546-1440/16/$36.00 · http://dx.doi.org/10.1016/j.jacr.2015.12.026
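The self-audit described above amounts to tallying, for each guideline parameter, the fraction of sampled charts in which it was documented. A minimal sketch of that tally follows; the chart records, parameter names, and function below are illustrative assumptions (the article does not describe the department's actual audit tooling), using only the Python standard library.

```python
# Hypothetical audit records: one dict per sampled chart, mapping each
# guideline parameter to whether it was documented in that chart.
charts = [
    {"stage": True,  "pathology": True,  "prior_rt": False},
    {"stage": True,  "pathology": False, "prior_rt": True},
    {"stage": True,  "pathology": True,  "prior_rt": True},
    {"stage": False, "pathology": True,  "prior_rt": False},
]

def compliance_rates(charts):
    """Percent of audited charts documenting each parameter."""
    params = charts[0].keys()
    return {p: 100.0 * sum(c[p] for c in charts) / len(charts)
            for p in params}

rates = compliance_rates(charts)
# Parameters below 100% compliance become QI targets, as in the article.
noncompliant = sorted(p for p, r in rates.items() if r < 100.0)
print(rates, noncompliant)
```

In the study itself, the same tally was run over 40 randomly selected charts against the ACR-ASTRO parameters, and every parameter under 100% compliance fed the improvement plan.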

WHAT WE DID

In the pre-accreditation self-audit, the departmental multidisciplinary team identified several areas for improvement: clinical and physics documentation; delays in processes such as physician review and approval of in vivo dosimetry and physics end-of-treatment chart reviews; inconsistent physician presence at SRS cases; and lack of consistency in the peer-review process between the main campus and affiliate campuses. After this initial self-audit, a multistep improvement process was implemented to address the identified deviations from recommended practice guidelines. This process involved several changes, including institution of a "no-fly" policy for physician presence at SRS cases, and adaptations to electronic medical record templates to force completion of certain ACR-ASTRO guideline documentation items. Specifically, forcing functions were added to the history and physical template to ensure proper documentation of parameters that directly influence treatment decisions, including stage, pathology, pretreatment Karnofsky Performance Status (KPS), and prior radiation treatment. Primary care physician and referring doctor information


Table 1. Changes instituted in documentation to increase compliance

On-treatment notes: include physical exam; radiation dose tracking; treatment changes tracking.
End-of-treatment notes: document area treated; document radiation dose, fractionation, and energy; document treatment dates; document on-treatment toxicities.
Follow-up notes: record additional patient providers' contact information to allow for ongoing communication.
Radiation prescriptions: add prescription templates with standardized energy, fractionation, etc.

was changed to a required data entry item, to ensure appropriate communication with the patient's other physicians. The informed consent templates were modified to document discussion of risks and benefits of treatment, potential complications, treatment alternatives, and whether patient questions were answered. These parameters were all part of the clinician-patient discussion, but they were inadequately captured in the patient chart to indicate patient understanding of treatment options. Additional documentation template modifications were completed to increase compliance with parameters

Table 2. Documentation parameters analyzed before and after the ACR accreditation process

History and physical documentation: stage; pathology report; history of present illness; past medical history; review of systems; family history; social history; smoking/risk factors; alcohol use; pre-treatment KPS; physical exam; prior radiation treatment; communicate information with PCP; communicate information with referring physicians.
Informed consent documentation: risks; complications; benefits; treatment alternatives; questions answered.
On-treatment evaluation documentation: weekly exam; progress/tolerance; accumulated dose; treatment plan change; treatment breaks; other clinical issues.
End-of-treatment documentation: end-of-treatment note present; area treated; dose; energy; treatment dates; number of fractions and treatment dates; final treatment status; treatment tolerance; tumor response; follow-up plan; communicate with PCP.
Follow-up documentation: first follow-up; ongoing follow-ups; ongoing correspondence with PCP regarding patient status.

KPS = Karnofsky Performance Status; PCP = primary care physician.


before, during, and after treatment (Table 1). Existing tools within the departmental electronic medical record were used to adapt workflows and encourage timely reviews of quality records. Peer-review processes were redesigned to document the weekly quality assurance conference, which reviews all patients starting treatment and the details of their plans, and a weekly treatment chart review was created to assess compliance with timely documentation.

Separate from the process of accreditation, the departmental quality improvement committee was interested in exploring the extended retention value of the improvement projects implemented for the department chart review before and after accreditation. Approximately 12 months after receiving ACR accreditation status, 53 patient charts were selected and each chart was reviewed, in a self-audit process, for compliance with the same subset of ACR requirements previously analyzed. The charts audited during the pre-accreditation internal review were then compared with the charts analyzed 12 months after the ACR accreditation process. A total of 39 documentation parameters, based on the ACR-ASTRO practice parameters, were analyzed in the comparison (Table 2). Statistical analysis software (SAS Institute, Inc, Cary, North Carolina) was used to analyze the collected data. A two-sided Fisher exact test was used to compare the two data groups, before and after accreditation. A P value of <.05 was considered statistically significant.
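The study's analysis was run in SAS, but the two-sided Fisher exact test it applied is straightforward to sketch from first principles. The stdlib-only Python below is an illustrative reimplementation, not the authors' code; the example counts are reconstructed from the reported sample sizes (40 pre-accreditation charts, 53 post-accreditation charts) and the "Prior RT" row of Table 3 (62.5% vs 100% compliance, i.e., 25/40 vs 53/53).

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test on a 2x2 table.

    Rows are audit periods, columns are outcomes:
        [[a, b],   # pre-QI:  a compliant, b noncompliant charts
         [c, d]]   # post-QI: c compliant, d noncompliant charts

    The p-value sums the hypergeometric probabilities of every table
    with the same margins whose probability does not exceed that of
    the observed table (the convention SAS also uses).
    """
    n = a + b + c + d
    row1 = a + b          # pre-QI chart count
    col1 = a + c          # total compliant charts

    def p_table(x):
        # Probability that the pre-QI group holds x of the col1
        # compliant charts, given fixed row and column totals.
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo = max(0, col1 - (n - row1))   # smallest feasible (1,1) cell
    hi = min(row1, col1)             # largest feasible (1,1) cell
    eps = 1e-12                      # tolerance for floating-point ties
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + eps)

# "Prior RT" row of Table 3: 25/40 compliant pre-QI vs 53/53 post-QI.
p = fisher_exact_two_sided(25, 15, 53, 0)
print(f"Prior RT: p = {p:.2g}")  # falls below .001, consistent with Table 3
```

The two-sided convention matters here: summing only one tail would understate the evidence threshold, and different packages disagree on tie handling, which is why the small `eps` tolerance is included.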

OUTCOMES

The documentation parameters previously discussed were compared before and 12 months after ACR accreditation to examine compliance with accepted practice guidelines. The comparison revealed a significant improvement in compliance


Table 3. Documentation parameter improvement after QI implementation process and ACR accreditation

Documentation Parameter                 Pre-QI (%)   Post-QI (%)   P Value
Consent
  Risks                                     80            98          .005
  Complications                             82.5          98          .019
  Benefits                                  77.5          98          .002
  Alternatives                              77           100         <.001
  Questions answered                        67.5          96.2       <.001
H&P
  Disease stage                             82.5         100          .004
  KPS                                       87.5         100          .012
  Prior RT                                  62.5         100         <.001
  Patient examination during treatment      27.5         100         <.001
End of treatment
  Energy used                               85           100          .005
  Treatment tolerance                       90           100          .031
  Follow-up plan                            77.5         100         <.001
Follow-up
  First                                     79.5          94.3        .049
  Ongoing                                   56.4          98         <.001

H&P = history and physical; KPS = Karnofsky Performance Status; QI = quality improvement; RT = radiation treatment.

in 14 of the documentation parameters (Table 3), across all medical record documentation areas, including history and physical parameters, informed consent, on-treatment, end-of-treatment, and follow-up documentation. Several of the other parameters examined showed a strong trend toward improvement after the accreditation process.

Quality improvement initiatives can focus on structure and/or process measures. Accreditation programs ensure that institutions achieve accepted standards of care by reviewing practice processes and documentation. The goal of achieving accreditation focused our department's quality improvement initiatives primarily on process improvement and documentation. Therefore, all our data collection and analyses reflect process measures or documentation compliance. Documentation compliance was dramatically improved and maintained one year after our departmental accreditation application; however, determining whether compliance with these criteria will be maintained long term is difficult. Additionally, assessing whether increased compliance actually corresponds with improved clinical outcomes is extremely difficult, owing to the lack of clinical endpoints.

This type of research is challenging because a single performance improvement project may include multiple facets of change, occurring at various times throughout a clinical department. We chose to combine all documentation, process, and policy efforts into one quality improvement project focused on complying with ACR accreditation requirements, and we used the ACR onsite review as the time point of the intervention to determine pre- and post-accreditation compliance. We feel that this type of extended-time, repeated-measures evaluation is useful in radiation oncology, a technologically evolving field in which procedural, technical, and clinical processes change rapidly and frequently. Our single-institution study demonstrated that participating in a voluntary accreditation program had a positive impact on a number of items related to radiation oncology documentation. Further study is needed to assess the long-term impact of accreditation and the appropriate frequency of recurrent external reviews.

REFERENCES

1. Demanes DJ, Franklin G, Gilbert R, et al. ACR-ASTRO practice parameter for communication: radiation oncology. Available at: www.acr.org/w/media/ACR/Documents/PGTS/guidelines/Comm_Radiation_Oncology.pdf. Accessed September 8, 2015.
2. Ellerbroek, Hartford A, Rosenthal S, et al. ACR practice parameter on informed consent—radiation oncology. Available at: www.acr.org/w/media/ACR/Documents/PGTS/guidelines/Informed_Consent_Rad_Onc.pdf. Accessed September 8, 2015.
3. Radiation oncology practice accreditation FAQ. Available at: www.acr.org/QualitySafety/Accreditation/RO/FAQ#1. Accessed September 8, 2015.

Kamila Nowak, MD, Voichita Bar-Ad, MD, Linda Ferguson, CRNP, Amy S. Harrison, MS, Yan Yu, PhD, MBA, and Laura Doyle, MS, are from the Department of Radiation Oncology, Sidney Kimmel Medical College at Thomas Jefferson University, Philadelphia, Pennsylvania. John McAna, PhD, MA, is from the Jefferson College of Population Health, Thomas Jefferson University, Philadelphia, Pennsylvania.

The authors have no conflicts of interest related to the material discussed in this article.

Laura Doyle, MS: Department of Radiation Oncology, Sidney Kimmel Medical College at Thomas Jefferson University, 111 S 11th St, Philadelphia, PA 19107; e-mail: [email protected].
