Implementation of a pilot surveillance program for smaller acute care hospitals


Noleen J. Bennett, RN, MPH,a Ann L. Bull, BSc(hons), MApEpid, PhD,a David R. Dunt, PhD, MB, BS, FFPH,b Denis W. Spelman, MB, BS, FRACP, FRCPA, MPH,c Philip L. Russo, RN, MCE,a and Michael J. Richards, MB, BS, FRACP, MDa Melbourne, Australia

Background: An infection control (IC) surveillance program for smaller (<100 acute beds) hospitals was piloted for 18 weeks in 14 hospitals. The aim of the pilot stage was to test a theoretical program in the context in which it was to be implemented.

Method: An evaluation framework was developed, outlining the program's intended activities for data collection, management, analysis, reporting, and use. This framework was used as a reference to interview each of the 12 IC nurses participating in the pilot stage.

Results: The preferred case finding methodologies were not uniformly applied. Management, analysis, and reporting of data were delayed because of infrequent and irregular IC hours and laboratory reporting. Reports were not always distributed to key persons. Specific action was only taken in response to the process (and not outcome) module reports.

Conclusion: Discrepancies between the theoretical and actual implementation of a surveillance program for smaller hospitals were highlighted. The program will need to be revised before it is rolled out to all 89 eligible hospitals across Victoria. (Am J Infect Control 2007;35:196-9.)

In late 2003, a novel infection control (IC) surveillance program for smaller (<100 acute beds) hospitals was piloted in the state of Victoria, Australia. Fourteen hospitals participated over 18 weeks. The pilot stage was considered important because guidelines outlining simple yet effective IC programs specifically for smaller hospitals had not been widely published.1 Recommendations for IC programs had mostly been based on studies undertaken in larger (≥100 acute beds) hospitals.2 The specific aim of the pilot stage was to highlight any discrepancies between intended and actual activities in regard to the collection, management, analysis, reporting, and use of the program's data. The information obtained is to be used to revise the program before it is "rolled out" to all 89 smaller hospitals across Victoria.

From the Victorian Nosocomial Infection Surveillance System (VICNISS) Coordinating Centre,a the School of Population Health, The University of Melbourne,b and the Microbiology and Infectious Diseases Unit, Alfred Hospital,c Melbourne, Australia. Address correspondence to Noleen Bennett, RN, MPH, Senior Infection Control Consultant, Victorian Hospital Acquired Surveillance System Coordinating Centre, 10 Wreckyn St, North Melbourne 3061, VIC Australia. E-mail: [email protected]. Copyright © 2007 by the Association for Professionals in Infection Control and Epidemiology, Inc. doi:10.1016/j.ajic.2006.04.209

METHODS

A theoretical evaluation framework (Table 1) was developed after consultation with the program's key stakeholders and an analysis of the relevant literature.3 For each pilot hospital, this framework was used as a reference to collect information about the program's implementation. Each of the 12 IC nurses who were primarily responsible for the program's implementation was interviewed at least once by the same Victorian Hospital Acquired Infection Surveillance System (VICNISS) Coordinating Centre (CC) IC nurse. Table 2 outlines the surveillance modules included in the pilot program.4-9 Multiple educational strategies were developed to assist the IC nurses in collecting data for these modules. These included a manual that outlined the standardized definitions, data collection forms, and reporting instructions for each data field to be used. The advantages of prospectively collecting surveillance data3,8 were highlighted.

RESULTS

Data collection

Fifty percent of the surveillance plans were submitted by the due date. One hospital had planned to


Table 1. Evaluation framework

Objective 1. To collect the data accurately
Activities:
- Surveillance plans outlining modules to be undertaken are completed by the pilot IC nurses.
- Standard data collection forms are used by the pilot IC nurses.
- Prospective case finding methodologies are consistently and uniformly applied by the pilot IC nurses.
- Completed data collection forms are forwarded (before the due date) by the pilot IC nurses to the VICNISS Coordinating Centre.

Objective 2. To manage and analyze data
Activities:
- Data are checked and entered onto an aggregate database at the VICNISS Coordinating Centre.

Objective 3. To report data (in a timely manner)
Activities:
- "User friendly" reports are generated by the VICNISS Coordinating Centre employees.
- Surveillance reports are distributed back to the pilot IC nurses within 1 month.
- Reports are distributed by the pilot IC nurses to identified key persons.

Objective 4. To use data
Activities:
- Data are used by hospitals to guide the planning, implementation, and evaluation of policies/programs to prevent and control hospital-acquired infections.

Table 2. Pilot surveillance modules

Module: Surgical antibiotic prophylaxis (process indicator; refs 4,5)
- Requirement: at least 1 process indicator surveillance module was required.
- Participating hospitals: 3
- Numerators: (1) patients who received prophylactic antibiotics consistent with current recommendations; (2) patients who received prophylactic antibiotics within 2 hours before surgical incision; (3) patients who received prophylactic antibiotics that were discontinued within 24 hours postsurgery.
- Denominators: (1) all patients who underwent a procedure in 1 of the 8 listed VICNISS surgical procedure groups*; (2) and (3) all patients from denominator group 1 who were given a prophylactic antibiotic.
- Reporting time frame: as soon as possible after data completion for 25 consecutive cases.

Module: Health care workers and measles vaccination (process indicator; ref 6)
- Requirement: as above.
- Participating hospitals: 13
- Numerator: all permanently employed health care workers born after 1970 who were susceptible to measles.
- Denominator: all permanently employed health care workers.
- Reporting time frame: as soon as possible after data completion.

Module: Multiresistant organism infections (outcome indicator; ref 7)
- Requirement: required, except for hospitals with 50-99 acute beds.
- Participating hospitals: 14
- Numerator: all patients with new MRSA and VRE infections.
- Denominator: acute occupied bed days.
- Reporting time frame: for each month, up to 2 weeks into the next month.

Module: Bloodstream infections (outcome indicator; ref 8)
- Requirement: required.
- Participating hospitals: 14
- Numerator: all patients with new primary laboratory-confirmed bloodstream infections.
- Denominator: acute occupied bed days.
- Reporting time frame: for each month, up to 2 weeks into the next month.

Module: Outpatient hemodialysis event (outcome indicator; ref 9)
- Requirement: optional.
- Participating hospitals: 4
- Numerator: all chronic hemodialysis outpatients who developed a positive blood culture or who were commenced on IV vancomycin.
- Denominator: patient months (number of chronic outpatient hemodialysis patients for each month).
- Reporting time frame: for each month, up to 2 weeks into the next month.

Module: Surgical site infections (outcome indicator; ref 8)
- Requirement: optional.
- Participating hospitals: none
- Numerator: all surgical inpatients who developed a superficial, deep, or organ space infection.
- Denominator: all patients who underwent a procedure in the chosen VICNISS surgical procedure group.†
- Reporting time frame: for each month, up to 6 weeks after data completion.

Ref, reference; MRSA, methicillin-resistant Staphylococcus aureus; VRE, vancomycin-resistant Enterococcus.
*For the Surgical Antibiotic Prophylaxis module, surgical procedure groups included appendicectomy, cholecystectomy, colon surgery, caesarean section, gastric surgery, hip prosthesis, abdominal hysterectomy, and knee prosthesis.
†To be eligible, a hospital had to perform at least 70 procedures within 1 of the 20 listed surgical procedure groups.
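The outcome modules above report a count (numerator) against acute occupied bed days (denominator). As an illustration only, a monthly rate per 1,000 acute occupied bed days could be computed as below; the function name and the figures are hypothetical and are not taken from the pilot data:

```python
def rate_per_1000_bed_days(new_infections, occupied_bed_days):
    """Monthly infection rate per 1,000 acute occupied bed days.

    Hypothetical sketch; not part of the VICNISS program itself.
    """
    if occupied_bed_days <= 0:
        raise ValueError("occupied bed days must be positive")
    return 1000 * new_infections / occupied_bed_days

# Made-up example: 2 new MRSA/VRE infections over 1,250 bed days
print(rate_per_1000_bed_days(2, 1250))  # 1.6
```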


participate in the "Surgical Antibiotic Prophylaxis" module, but surgical procedure numbers were unexpectedly low. No hospitals were eligible to participate in the "Surgical Site Infection" module. All 12 IC nurses agreed that the standard paper data collection forms were "simple to use." The program's manual was used at least once by 11 of the IC nurses to check reporting instructions. The use of information technology resources to support data collection was not considered necessary. For each surveillance module, there was variation in which occupational group(s) collected the data, the data sources used, and/or how the data were collected. For example, for the "Multi-resistant Organism" module, IC nurses at 4 hospitals checked monthly laboratory reports; for those patients with positive cultures, the medical records were reviewed. (This was sometimes problematic because of the incompleteness of documentation in medical records.) At the other 10 hospitals, data were prospectively collected. The IC nurse at 5 of these hospitals was able to monitor for infections while employed on the wards. At 9 hospitals, it was estimated that it took an IC nurse up to 0.5 hours per week to collect the data. At the other 5 hospitals, 1 to 2.5 hours per week was required to collect the data. For the 12 hospitals that had designated IC hours, 4.2% to 50% (mean, 15.2%) of these hours were required by the IC nurse to collect data.

Data management and analysis

In total, 192 data collection forms were faxed by IC nurses to the VICNISS CC. Of the 70 monthly data summary collection forms, only 26 (37%) were faxed by the specified deadline. The most common reasons for late submission were infrequent and irregular IC hours and delayed laboratory reporting. The time taken to enter data onto the VICNISS CC aggregate database was not measured; however, the VICNISS CC IC nurse responsible for this task "did not consider it onerous." It was "sometimes necessary" for the VICNISS CC IC nurse to contact a pilot IC nurse to clarify 1 or more data fields.

Data reporting

In June 2004, 4 weeks later than expected, the pilot surveillance reports were posted on the password-secure VICNISS Web site. The main reason for the delay was the late submission of data from the hospitals. Eleven of the IC nurses found the reports (presented as tables) to be "user friendly." In 12 hospitals, IC nurses identified up to 6 key persons to whom the VICNISS reports should be directly forwarded. In 4 hospitals, these key persons included clinicians in addition to executive managers. Medical staff and pharmacists were not represented at any IC-related meetings for 4 and 6 hospitals, respectively.

Data usage

Three IC nurses across 5 hospitals reported that specific action had been taken in response to 1 of the process surveillance module reports. Two reasons given by IC nurses for not using the outcome-based data were that "it was not statistically meaningful" and "no major issues were detected." Alternative options suggested for ensuring that the outcome data reports were useful included "pooling" of data from geographically and functionally similar hospitals, longer reporting periods, and reporting of sentinel events.
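The "pooling" option suggested above amounts to summing numerators and denominators across similar hospitals before dividing, rather than computing each small hospital's individually unstable rate. A minimal sketch, with hypothetical hospital names and figures:

```python
# Hypothetical sketch of pooling outcome data across functionally
# similar smaller hospitals; none of these figures come from the pilot.
hospitals = [
    {"name": "Hospital A", "infections": 1, "bed_days": 600},
    {"name": "Hospital B", "infections": 0, "bed_days": 450},
    {"name": "Hospital C", "infections": 2, "bed_days": 950},
]

# Sum counts first, then divide once, so low-volume hospitals
# contribute in proportion to their occupied bed days.
total_infections = sum(h["infections"] for h in hospitals)
total_bed_days = sum(h["bed_days"] for h in hospitals)
pooled_rate = 1000 * total_infections / total_bed_days

print(f"Pooled rate: {pooled_rate:.2f} per 1,000 acute occupied bed days")
```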

DISCUSSION

Working closely together during the pilot stage, the VICNISS CC and pilot hospital IC nurses were able to obtain useful information about the implementation of a surveillance program in smaller hospitals. Discrepancies between intended and actual activities in regard to the collection, management, analysis, reporting, and use of the program's data were highlighted. Most notably, the prospective case finding methodology was not uniformly applied; management, analysis, and reporting of data were delayed; and reports were not always distributed to key persons. Specific action was only taken in response to the process (not outcome) module reports. These discrepancies in particular are to be addressed before the pilot program is rolled out to all eligible hospitals across Victoria.

Data validation is an essential activity of any group that aggregates data from multiple collectors.8 As part of the pilot stage, an extensive study was undertaken to estimate the accuracy of reported multiresistant organism and bloodstream infections by calculating the sensitivity, predictive value positive, and specificity. The results of this study and an assessment of the supporting educational strategies implemented are to be submitted later for publication.

References

1. Scheckler WE. Hospital epidemiology and infection control in small hospitals. In: Mayhall CG, editor. Hospital epidemiology and infection control. Philadelphia: Lippincott Williams and Wilkins; 2004. p. 1849-53.
2. Haley RW, Culver DH, White JW. The efficacy of infection surveillance and control programs in preventing nosocomial infections in US hospitals. Am J Epidemiol 1985;121:182-205.
3. Lee TB, Baker OG, Lee JT, Scheckler WE, Steele L, Laxton CE. Recommended practices for surveillance. Am J Infect Control 1998;26:277-88.
4. Centers for Medicare and Medicaid Services and the Centers for Disease Control and Prevention. National Surgical Infection Prevention Medicare Quality Improvement Project. Available at: www.surgicalinfectionprevention.org. Accessed November 9, 2005.
5. Therapeutic guidelines: antibiotic. Version 11. Victoria: Therapeutic Guidelines Limited; 2000. p. 119-27.
6. National Health and Medical Research Council. The Australian immunisation handbook. Canberra: Australian Government Department of Health and Ageing; 2003. p. 182-92.
7. Collignon P, Looke D, Ferguson J, McLaws ML, Olsen D. Surveillance definitions for multi-resistant organisms (MROs). Aust Infect Control 2002;7:i-iv.
8. Gaynes RP, Horan TC. Surveillance of nosocomial infections. In: Mayhall CG, editor. Hospital epidemiology and infection control. Philadelphia: Lippincott Williams and Wilkins; 1999. p. 1285-317.
9. Tokars JI, Miller ER, Stein G. New national surveillance system for hemodialysis-associated infections: initial results. Am J Infect Control 2002;30:288-95.