ORIGINAL REPORTS

Surgical Hospital Audit of Record Keeping (SHARK)—A New Audit Tool for the Improvement in Surgical Record Keeping

Perbinder Grewal, MBBS

Queen Alexandra Hospital, Portsmouth, United Kingdom

INTRODUCTION: Accurate and legible record keeping is a crucial part of good medical practice. Surgical Hospital Audit of Record Keeping (SHARK) is a new audit and teaching tool for junior doctors. The author has designed the tool, based on the Royal College of Surgeons guidelines, to anonymously score the different surgical teams’ medical records within a hospital. It takes into account regular record keeping during ward rounds, together with the operation note and admission clerking.

METHODS: The SHARK audit tool assesses 45 individual areas within surgical records. Fifteen points are apportioned for an initial surgical clerking, 13 for a subsequent record entry, and 17 for the operation note, giving an overall score out of 45. It was implemented at 2 hospitals and used to educate medical students.

RESULTS: The results were poor and improved with education at both sites. There was 80% total agreement, with a κ coefficient for interobserver reliability of 0.6.

CONCLUSION: This study shows that the SHARK tool is simple to use, repeatable, and reliable in improving record keeping. (J Surg Educ 70:373-376. © 2013 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

KEY WORDS: record keeping

COMPETENCIES: Patient Care, Practice-Based Learning and Improvement, Systems-Based Practice

INTRODUCTION

Accurate and legible record keeping is a crucial part of good medical practice. With the advent of the new European Working Time Directive, precise record keeping and handover have become even more important for patient safety.

Correspondence: Inquiries to Perbinder Grewal, Queen Alexandra Hospital, Portsmouth PO6 3LY, UK; e-mail: [email protected]

Good Surgical Practice states that documentation must be legible, easily identifiable (patient and clinician), dated, and timed, and must contain a clear plan.1 Surgical Hospital Audit of Record Keeping (SHARK) is a new audit and teaching tool for junior doctors. The author has designed the tool, based on the Royal College of Surgeons guidelines, to anonymously score the different surgical teams’ medical records within a hospital. It takes into account regular record keeping during ward rounds, together with the operation note and admission clerking. By means of a numerical score, the tool can be used to compare records between different surgical subspecialties or firms, between house officers at different stages of foundation year 1, and ultimately between hospitals. This study assessed the use of the tool in a clinical setting and as a teaching tool for medical students.

METHODS

The SHARK audit tool assesses 45 individual areas within surgical records. Fifteen points are apportioned for the initial surgical clerking, 13 for a subsequent record entry, and 17 for the operation note, giving an overall score out of 45. For the individual scoring points, refer to the SHARK pro forma (Table 1). The tool was implemented in 2 hospitals (University College London Hospital [UCLH] and Broomfield Hospital), where a junior doctor scored 5 sets of notes from each consultant surgical firm, looking at the current admission only. Average scores were generated per surgical firm, and the juniors were subsequently educated on the importance of record keeping and the necessary inclusions. Two further analyses were then carried out on a monthly basis with repeated education. Statistical analysis was performed using SPSS via a paired t test. Interobserver reliability was measured by concordance between reviewers. A sample of 10 medical records was audited by 2 reviewers on 2 separate occasions. A κ coefficient of greater than 0.6 indicates interobserver reliability.11

SHARK was also used as a teaching tool within a lecture for final-year medical students. The students were asked to take medical notes for a simulated ward round, and their records were marked using the SHARK tool. Immediately after the marks were distributed, there was a short presentation on the appropriate information that should be entered in medical records on a ward round. Thereafter, the medical students were asked to repeat the task.

Journal of Surgical Education © 2013 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved. 1931-7204/$30.00 http://dx.doi.org/10.1016/j.jsurg.2012.12.003
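The 15/13/17 section weighting can be expressed as a small scoring helper. The section names and example marks below are illustrative only — the actual checklist items are listed in the SHARK pro forma (Table 1) and are not reproduced here — so treat this as a sketch of the arithmetic rather than the published tool:

```python
# Sketch of the SHARK section weighting: 15 + 13 + 17 = 45 points.
# Section names are paraphrased from the paper; individual checklist
# items live in the pro forma (Table 1) and are not reproduced here.
SECTION_MAX = {
    "initial_clerking": 15,
    "subsequent_entry": 13,
    "operation_note": 17,
}

def shark_total(section_scores: dict) -> tuple:
    """Return (total, percentage) for one set of notes.

    Raises ValueError if a section score is out of range, which would
    indicate a transcription error on the audit form.
    """
    for section, score in section_scores.items():
        if not 0 <= score <= SECTION_MAX[section]:
            raise ValueError(f"{section} score {score} out of range")
    total = sum(section_scores.values())
    return total, round(100 * total / sum(SECTION_MAX.values()), 1)

# Hypothetical audit of one set of notes:
print(shark_total({"initial_clerking": 12,
                   "subsequent_entry": 10,
                   "operation_note": 14}))   # (36, 80.0)
```

Averaging such totals across the 5 sets of notes per firm yields the per-firm scores compared between audit cycles.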

FIGURE 2. SHARK scores at Broomfield Hospital in 3 cycles of audit: cycle 1, preteaching; cycles 2 and 3, postteaching.

RESULTS

Five records were assessed from 5 firms in each hospital in 3 rounds of audit, a total of 150 medical records. Cycle 1 of the audit was performed before education on medical record keeping (UCLH 1 and Broomfield 1). Cycles 2 and 3 were performed after education of the junior doctors writing in the medical notes. The data have been tabulated: Tables 2-4 show the results recorded for rounds 1 to 3 at UCLH, Tables 5-7 the results for rounds 1 to 3 at Broomfield, and Tables 8 and 9 the results of the medical student teaching. The results have been split into each of the scoring sections—initial entry, subsequent entry, and operation note—together with the overall score. The mean scores from the 2 hospitals are shown in Figures 1 and 2. There was a statistically significant (p = 0.03) improvement in all the firms over the 3 cycles at UCLH. Figure 2 demonstrates the overall scores from Broomfield, which improved posteducation (p = 0.01). Commonly missed entries in the medical record were patient number, date and time, consultant, name and post of doctor, diagnosis, and results. Many operation notes did not specify the diagnosis or the tissue samples sent. The scores from the medical student teaching session, shown in Figure 3, improved significantly (p < 0.05).
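The paper reports paired t tests run in SPSS on the per-firm scores. As an illustration of the same calculation, the sketch below computes the paired t statistic by hand for 5 hypothetical firm means before and after teaching; the numbers are invented, not the study data:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic: mean of the differences over its standard
    error, with degrees of freedom n - 1."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n)), n - 1

# Hypothetical per-firm mean SHARK scores (out of 45), cycle 1 vs cycle 3:
pre = [20, 22, 25, 18, 24]
post = [30, 33, 35, 28, 36]
t, df = paired_t(pre, post)
print(round(t, 1), df)   # 26.5 with 4 degrees of freedom
```

The resulting t statistic would then be referred to a t distribution with df degrees of freedom to obtain the p value, which is what SPSS does internally.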

FIGURE 1. SHARK scores at UCLH in 3 cycles of audit: cycle 1, preteaching; cycles 2 and 3, postteaching.

Interobserver Reliability

Of the 10 medical records examined by the 2 doctors, there was concordance in 8 records. This gives 80% total agreement, with a κ coefficient for interobserver reliability of 0.6.
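The reported figures (8 of 10 concordant records, κ = 0.6) are consistent with Cohen's kappa for two raters making a binary judgement per record. The labels below are invented to reproduce those numbers, since the actual rating data are not published:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning binary (0/1) labels:
    observed agreement corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_a = sum(rater_a) / n          # proportion of 1s from rater A
    p_b = sum(rater_b) / n          # proportion of 1s from rater B
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)  # chance agreement
    return (observed - expected) / (1 - expected)

# Invented ratings for 10 records, agreeing on 8 of them:
a = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
b = [1, 1, 1, 1, 0, 1, 0, 0, 0, 0]
print(round(cohens_kappa(a, b), 2))   # 0.6
```

Because kappa discounts chance agreement, 80% raw concordance maps to κ = 0.6 here, which is exactly the threshold the study's cited measurement text treats as acceptable interobserver reliability.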

FIGURE 3. SHARK scores during a lecture to medical students, preteaching and postteaching.

DISCUSSION

Medical record keeping is a fundamental part of efficient patient care. Medical records are kept not only for clinical purposes but also for reporting individual departmental activity, monitoring overall hospital performance, and conducting research. Good and accurate records are also essential in responding to and defending against complaints and claims.12

Note-keeping tools, such as CRABEL and ANKLe, have been designed previously. CRABEL was designed in 2001, in response to Good Surgical Practice, as a quick, simple, and reliable method for auditing the quality of medical records.2 It assessed the initial clerking, 5 subsequent entries, and the discharge letter to give an overall total. Its authors demonstrated an improvement in the quality and accuracy of surgical record keeping with their tool, a finding replicated by a separate independent study of maxillofacial surgery records.3 Their tool has also been used to compare the quality of elective and emergency vascular surgical admissions.4 It is a simple tool that can be easily adapted for the audit process. However, further studies of the CRABEL scoring system showed that an initial improvement was not maintained in subsequent audit cycles.5 In addition, the CRABEL score gives no weight to the operation note, which is crucial in the postoperative period and which we felt was a major component of surgical records. CRABEL also marks 5 subsequent note entries, which penalizes the note keeper repeatedly for a single omission and weights the score too heavily toward subsequent entries. Finally, it included marking of the consent form and discharge summary; these 2 items have not been included in the SHARK scoring system, as a National Health Service consent form has been adopted nationally and discharge summaries are not a good measure of day-to-day record keeping.

The ANKLe score was devised in 2008 as a means of auditing otolaryngology emergency clinic record keeping.6 It showed an improvement in quality with the use of a pro forma. However, it was designed around focused elements of otolaryngology and, being specific to a clinic setting, did not assess an operative record. It is therefore of minimal value in the wider surgical arena for auditing purposes.

Crawford et al. compared the quality of integrated care pathways with traditional record keeping in orthopedic surgery over a 3-month period.7 They found medical notation to be of higher quality in traditional record keeping than in the integrated care pathways. However, they did find that with both methods the frequency of omissions was high.
Omissions included the date and time of entries, the appropriate entry of investigation results, and the clear identification of individual clinicians’ entries. Conversely, more recently in 2009, surgeon education and the use of a formal checklist were shown to produce better operation notes for total hip replacements, in line with recommendations from the British Orthopaedic Association.8

With the advent of the European Working Time Directive, clinical records and operative notes are becoming more important for the on-call surgeon. Clinical records are frequently cited as a major weakness for the defense during medicolegal investigations. Rogers et al. audited 100 consecutive operation notes against the Royal College of Surgeons guidelines, with the aim of drawing up a pro forma to aid surgeons in their record keeping, and compared trainee operation notes with those of consultants.9 The vast majority of notes had no diagram to demonstrate the surgical findings. Specialist surgeons were more likely to describe the operative actions accurately but less likely to describe wound closure methods or dressings used; they were also less likely to complete adequate postoperative orders. Lefter et al. analyzed the quality of operation notes with the aid of a medicolegal lawyer in conjunction with a medical expert; they found that over 50% of the notes were incomplete for various reasons and over 15% contained no postoperative instructions.10

The author developed an audit tool to support the implementation of good surgical record keeping and has piloted it in 2 hospitals. The results showed poor performance by junior doctors at the start of auditing. Once the results were presented and appropriate teaching provided, repeat analysis showed an improvement in record keeping; this was replicated at both hospital sites. Interobserver agreement was 80%. Using the tool to educate medical students also produced a large improvement in scores. These scores were still much lower than those of the foundation doctors, reflecting limited teaching of medical students on the appropriate information to record in medical notes and their lack of experience on ward rounds.

FURTHER RESEARCH

A criticism of this tool is that junior doctors may improve their medical record keeping simply through normal on-the-job education and experience. SHARK could therefore be used to compare the improvement produced by education with the tool against that gained through experience alone.

CONCLUSION

This study shows that the SHARK tool is simple to use, repeatable, and reliable in improving record keeping. The tool can be used in educating medical students and junior doctors.

ACKNOWLEDGMENT

Mr James Neffendorf
Mr Geoffrey Roberts

REFERENCES

1. The Royal College of Surgeons of England. Good Surgical Practice. London: The Royal College of Surgeons of England; 2008.
2. Crawford JR, Beresford TP, Lafferty KL. The CRABEL score—a method for auditing medical records. Ann R Coll Surg Engl. 2001;83(1):65-68.
3. Dhariwal DK, Gibbons AJ. The CRABEL score—setting standards in maxillofacial medical note-keeping. Br J Oral Max Surg. 2004;42(3):200-202.
4. Suh J, Roake JA, Lewis DR. Quality of clinical notes for vascular surgery admissions: a CRABEL score review. ANZ J Surg. 2009;79(7-8):539-543.
5. Ho MY, Anderson AR, Nijjar A, et al. Use of the CRABEL Score for improving surgical case-note quality. Ann R Coll Surg Engl. 2005;87(6):454-457.
6. Dexter SC, Hayashi D, Tysome JR. The ANKLe score: an audit of otolaryngology emergency clinic record keeping. Ann R Coll Surg Engl. 2008;90(3):231-234.
7. Crawford JR, Shanahan M. Documentation in orthopaedic surgery: do integrated care pathways work? Ann R Coll Surg Engl. 2003;85(3):197-199.
8. Morgan D, Fisher N, Ahmad A, Alam F. Improving operation notes to meet British Orthopaedic Association guidelines. Ann R Coll Surg Engl. 2009;91(3):217-219.
9. Rogers A, Bunting M, Atherstone A. The quality of operative notes at a general surgery unit. S Afr Med J. 2008;98(9):726-728.
10. Lefter LP, Walker SR, Dewhurst F, Turner RW. An audit of operative notes: facts and ways to improve. ANZ J Surg. 2008;78(9):800-802.
11. Streiner D, Norman G. Health Measurement Scales: A Practical Guide to Their Development and Use. 2nd ed. Oxford University Press; 1995.
12. MDDUS (The Medical and Dental Defence Union of Scotland). Available at: http://www.mddus.com/media/291125/essential%20guide%20to%20medical%20and%20dental%20records12-09.pdf; 2012. Accessed November 2012.

SUPPLEMENTARY DATA

Supplementary data associated with this article can be found in the online version at doi:10.1016/j.jsurg.2012.12.003.
