Structured assessment of microsurgery skills in the clinical setting*

Journal of Plastic, Reconstructive & Aesthetic Surgery (2010) 63, 1329–1334

WoanYi Chan*, Niri Niranjan, Venkat Ramakrishnan
St Andrew's Centre for Plastic Surgery & Burns, Broomfield Hospital, Court Road, Chelmsford, Essex CM1 7ET, UK
Received 19 April 2009; accepted 13 June 2009

KEYWORDS Microsurgery; Assessment; Skills; Training; Microsurgery training

Summary Microsurgery is an essential component of plastic surgery training. Competence has become an important issue in current surgical practice and training, and the complexity of microsurgery requires detailed assessment of, and feedback on, its skill components. This article proposes a method of Structured Assessment of Microsurgery Skills (SAMS) in a clinical setting. Three types of assessment (i.e., a modified Global Rating Score, an errors list and a summative rating) were incorporated to develop the SAMS method. Clinical anastomoses were recorded on video using a digital microscope system and were rated by three consultants independently and in a blinded fashion. Fifteen clinical cases of microvascular anastomoses performed by trainees and a consultant microsurgeon were assessed using SAMS. The consultant consistently had the highest scores, and construct validity was also demonstrated by improvement in the SAMS scores of microsurgery trainees. The overall inter-rater reliability was strong (α = 0.78). The SAMS method provides both formative and summative assessment of microsurgery skills. It is demonstrated to be a valid, reliable and feasible assessment tool of operating room performance that provides systematic and comprehensive feedback as part of the learning cycle.
© 2009 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

* Douglas Murray Prize 2007 at the West Midlands National Plastic & Burns Surgery Meeting, October 2007, Birmingham, UK and Kilner Prize 2007 at BAPRAS Winter Scientific Meeting, December 2007, London, UK. Also presented at Euromicro, 9th Congress of the European Federation of Microsurgical Societies, June 2008, Turku, Finland and 12th International Perforator flap Course, September 2008, Coimbatore, India; and in part at Conjoint Annual Scientific Congress RACS/CSHK, May 2008, Hong Kong and International Symposium on Plastic Surgery ‘A Moment of Reflection’, June 2008, Bologna, Italy; and 5th Congress of the World Society of Reconstructive Microsurgery, June 2009, Okinawa, Japan. * Corresponding author. E-mail address: [email protected] (WoanYi Chan).

Microsurgery is an indispensable technique for complex reconstructions today. As such, it is an essential component in the training of a plastic and reconstructive surgeon. The medical profession in this era is increasingly under pressure to assure healthcare quality. Specialist training must reduce or prevent extreme variation in surgical outcomes. Assessment is generally acknowledged to be fundamental to the education and training process. A plethora of approaches, instruments and developments in assessment have been described in medical education literature, each with its own merits and disadvantages.

1748-6815/$ - see front matter © 2009 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved. doi:10.1016/j.bjps.2009.06.024

An ideal assessment method for surgical skills is based on objective, structured criteria, is cost-effective and acceptable to all stakeholders involved, and has an educational impact.1,2

It has been well recognised that a major determinant of success or failure in free tissue transfer reconstructions is operator related. Retrospective reviews of large series from many centres have shown a learning curve for surgeons before a high success rate is achieved.3 In addition to meticulous dissection of the vascular pedicle and preservation of the vascular network within the flap, the success of a microsurgical free tissue transfer depends absolutely on maintaining arterial inflow and venous outflow through a patent microsurgical anastomosis. Technical performance of the microsurgical anastomosis is therefore of paramount importance, and microsurgery training should include competency-based assessment of technical skills in clinical practice. We present a method of Structured Assessment of Microsurgery Skills (SAMS) in the clinical setting.

Materials and methods

Development of the SAMS method

A review of the assessment of surgical skills was performed, covering methods described in the medical education literature, PubMed-indexed articles and online resources on surgical training. Objectivity and a structured approach are considered key features in the assessment of technical competence. The Objective Structured Assessment of Technical Skills (OSATS), a multi-station performance-based assessment of technical skills developed by the Research in Education group in Toronto, is a validated methodology that has been used successfully in bench-model examination of technical competence in various surgical specialties. Independent observers mark the performance of seven items of surgical skill using a five-point global rating scale (GRS) with descriptive parameters and a task-specific checklist.4,5 The principles of these methods have also been adopted by the Intercollegiate Surgical Curriculum Project in current specialist training programmes.6 Modifications of the GRS and task-specific checklists have been applied in laboratory-based microsurgery skills studies,7,8 but these have neither taken into account the clinical variations and difficulties nor been validated for the clinical setting.

Our SAMS method is a comprehensive approach to the assessment of, and feedback on, microsurgery skills, designed to define competence and advance the learning curve in microsurgery in the clinical setting. Its development involved a complete deconstruction and analysis of the skills and tasks involved in performing a clinical microvascular anastomosis. Clinical performances by expert microsurgeons and trainees were observed to define the essential items for a structured assessment and feedback method. The developed method was subsequently reviewed by two microsurgeons to confirm that the described clinical microsurgery skills items were coherent and comprehensible, and that the assessment methodology could be interpreted unambiguously by non-educationalists.


The SAMS methodology

The SAMS methodology contains three assessments (a GRS, an errors list and a summative rating) and a comments box for free commentary (Figure 1). The modified GRS consists of 12 items, grouped into four main areas of microsurgery skills: dexterity, visuo-spatial ability, operative flow and judgement. Each principal area is further subdivided into three technical components:

- Dexterity (steadiness, instrument handling, tissue handling)
- Visuo-spatial ability (dissection, suture placement, knot technique)
- Operative flow (steps, motion, speed)
- Judgement (irrigation, patency test, bleeding control)

Figure 1 Form for a structured assessment of microsurgery skills.

Dexterity is the most basic prerequisite for starting a microsurgical procedure, and steadiness, instrument handling and tissue handling relate to it. Steadiness, that is, the control of tremor, is a prerequisite for handling micro-instruments comfortably; the use of micro-instruments can be awkward if they are held inappropriately. Dextrous handling of tissue is important to minimise tissue damage at all times and so reduce the risk of thrombosis.

Visuo-spatial ability is a further step in becoming familiar with operating under the microscope. It is required for vessel-wall dissection, suture placement and knot tightening. Well-prepared vessel ends improve visualisation of the vessel walls and ensure accurate suture placement. The placement and spacing of sutures require visuo-spatial awareness to avoid catching the back wall and entangling sutures. Knot technique and tightening under the microscope are also performed under vision rather than by feel.

Operative flow relates to completing the whole anastomosis efficiently, and many factors determine it. Good dexterity and visuo-spatial ability facilitate the steps of a microvascular anastomosis, and knowledge and control of those steps are important for progressing with the procedure. The control of each movement, that is, motion, contributes to an efficient operative flow; an experienced surgeon is efficient in each movement and therefore operates faster.

Judgement is a very important skill and demonstrates the ability to recognise, prevent and manage complications. It is required for irrigation, the patency test and bleeding control. Irrigation can help to distend the lumen and prevent tissue desiccation, but overuse increases surface tension and causes suture adherence. A patency test needs to be performed delicately to avoid injury to the intima. Sutures placed adequately, with judgement, avoid anastomotic leak and so control bleeding.

A task-specific checklist is a very useful method for assessing novices in microsurgery in a controlled, course-based environment. In clinical practice, however, various anastomosis techniques are employed and would require different checklists. A checklist may also restrict the performer's flexibility, as pre-set marks dictate pass or fail, and in a clinical situation it is difficult to weight the different tasks. A step not done well but compensated for by other scores may be unacceptable in the clinical situation; for example, tearing the vessel wall through awkward tissue and instrument handling may not be clearly highlighted in a task-specific checklist.

For constructive feedback, an errors list is more useful in clinical practice to highlight errors in skills, such as inappropriate magnification, an excessive number of sutures and vessel desiccation. It is important that errors are not perpetuated, particularly in an era of impending reductions in training time. The errors list in the SAMS method is non-exhaustive and is intended to prompt the assessor to identify mistakes and errors. The comments box allows additional errors to be annotated, and particular difficulties or special circumstances of the procedure to be recorded. A summative conclusion on a five-point rating scale is also used to give summative feedback on the overall performance and to indicate the level of skill attained. The scale for indicative skill is derived from the Dreyfus model of skills acquisition, which identifies at each stage the capacities the performer has acquired and the higher-order capacity that is to be attained next. The model has been adapted to describe five levels of expertise in the clinical environment.9
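To make the structure of the SAMS record concrete, the sketch below encodes the 12 GRS items, the errors list and the summative rating as a simple data record in Python. This is an illustrative encoding only; the class name, field names and the total-score helper are hypothetical and are not part of the published SAMS form.

```python
# Hypothetical encoding of a SAMS assessment record. The item names follow the
# published GRS domains and components; the data layout itself is illustrative.
from dataclasses import dataclass, field
from typing import Dict, List

GRS_ITEMS: Dict[str, List[str]] = {
    "dexterity": ["steadiness", "instrument handling", "tissue handling"],
    "visuo-spatial ability": ["dissection", "suture placement", "knot technique"],
    "operative flow": ["steps", "motion", "speed"],
    "judgement": ["irrigation", "patency test", "bleeding control"],
}

@dataclass
class SAMSAssessment:
    grs_scores: Dict[str, int]                       # each of the 12 items rated 1-5
    errors: List[str] = field(default_factory=list)  # non-exhaustive errors list
    summative_rating: int = 1                        # five-point, Dreyfus-derived scale
    comments: str = ""                               # free commentary box

    def total_grs_score(self) -> int:
        """Sum of the 12 GRS item scores (maximum 60 on a five-point scale)."""
        expected = [item for items in GRS_ITEMS.values() for item in items]
        missing = [item for item in expected if item not in self.grs_scores]
        if missing:
            raise ValueError(f"Missing GRS items: {missing}")
        return sum(self.grs_scores[item] for item in expected)
```

With all 12 items scored on a five-point scale, the maximum total would be 60, which is consistent with the consultant's mean total GRS score of 54.5 reported in the Results.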

Validation study

To validate this assessment methodology, 15 clinical cases of microvascular anastomoses performed by trainees and a consultant microsurgeon were recorded using a digital microscope system; the recorded view was therefore exactly the same as that seen by the operator. Of the 15 cases, 10 were performed by five microsurgery trainees and five by a consultant microsurgeon, so the anastomosis technique was nearly identical across cases. All videos were edited to retain only the venous anastomoses for blinded assessment by three consultants; veins are considered more difficult to anastomose because of their fine vessel walls and were therefore chosen to standardise the assessment as far as possible. SPSS version 14.0 (SPSS Inc., Chicago, IL, USA) was used for the statistical analysis, and inter-rater reliability was determined using Cronbach's α coefficient.
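For readers unfamiliar with the statistic, Cronbach's α for a set of raters can be computed directly from the ratings matrix, as in the short Python sketch below. This re-implements the standard formula rather than the SPSS routine used in the study, and the example scores are invented.

```python
# Illustrative computation of Cronbach's alpha for inter-rater reliability.
# Rows are cases (recorded anastomoses), columns are raters; numbers are invented.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a cases x raters matrix of scores."""
    k = scores.shape[1]                          # number of raters ("items")
    item_vars = scores.var(axis=0, ddof=1)       # variance of each rater's scores
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: total GRS scores given by three raters to five recordings (invented data).
ratings = np.array([
    [52, 55, 54],
    [38, 40, 37],
    [35, 33, 36],
    [44, 46, 43],
    [30, 29, 31],
])
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")
```

Values of α above roughly 0.7 are conventionally taken to indicate acceptable agreement, a threshold the overall reliability of 0.78 reported below exceeds.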

Anastomoses on chicken thigh vessels performed by three novices in microsurgery were also recorded, to identify the range of skill levels covered by the assessment; a clinical anastomosis on a patient would have been unacceptable for ethical and patient-safety reasons. A dedicated workshop was organised for this purpose, in which the participants were taught to perform a microvascular anastomosis through video demonstration and instruction by experienced seniors.

Results

All recorded clinical anastomoses were patent and no postoperative complications occurred. The consultant consistently had the highest scores in all technical components of the modified GRS and in the summative scores, with no errors, as rated by all three assessors (Figure 2). The consultant had a mean total GRS score of 54.5 ± 3.2 and the trainees 37.6 ± 4.7 (p = 0.048). Trainees had lower scores but reached borderline competent or competent levels (rating scale two or three) in most parameters. Intra-individual improvement could also be demonstrated after a six-month fellowship with the microsurgery firm; in particular, a shift to higher GRS scores was noted in the parameters of visuo-spatial ability, operative flow and judgement. Inappropriate magnification, incorrect grasping of tissue, a desiccated or flooded surgical field and an inappropriate patency test were common errors among the trainees. Summative ratings of overall performance and indicative skill corresponded to similar overall GRS scale levels. The overall inter-rater reliability index was strong (0.78), although the reliability index for each technical component of the GRS varied.
Figure 2 SAMS scores of three different surgeons (mean rating score for each assessed parameter; Trainee A = halfway through the microsurgery fellowship, Trainee B = beginning of the microsurgery fellowship).

Figure 3 Inter-rater variability of the parameters assessed.

As demonstrated in the graph, the ratings for parameters such as steadiness, dissection, irrigation use and bleeding control showed only moderately strong correlation (0.4–0.6) (Figure 3). The videos of the novices, who performed anastomoses on chicken thigh vessels, demonstrated a significant difference in all parameters, with scores at the lower end of the scale. Reliability was not tested for these recordings, as the obvious difference of the chicken tissue made it clear that the performances were by inexperienced operators.

Discussion

Competence can be defined as the quality or extent of having the necessary ability or knowledge to do something successfully.10 Although surgical competence is multimodal, proficiency in the technical skills needed to perform an operative procedure is fundamental to a successful outcome. Structured assessment of technical skills and systematic feedback are considered very important in the current practice of surgical training. The UK Postgraduate Medical Education and Training Board defines assessment as the 'process of measuring an individual's progress and accomplishments against defined standards and criteria, which often includes an attempt at measurement. The purpose of assessment in an educational context is to make a judgement about mastery of skills or knowledge; to measure improvement over time; to arrive at some definitions of strengths and weaknesses; to rank people for selection or exclusion, or perhaps to motivate them.'11

Objective assessments are becoming increasingly important in surgical training programmes. Retrospective evaluation of training often leads to recall bias, and direct observation by experts alone is an unreliable method of assessment. The shift to competency-based training requires robust assessment systems. The GRS tool is a valid and reliable assessment tool of operating room performance.12,13 Deconstructing all the skills involved in a microvascular anastomosis provided a detailed and structured method for the assessment of microsurgery skills. Using video recordings, nowadays easily made with digital systems, objective assessments can be performed by the trainer at a convenient time and the performances reviewed by the trainee alongside the constructive feedback given. This method of analysing videos and assessing trainees' competence takes about 1 h of a consultant's time for each assessment and would form part of the formal workplace-based assessments that are mandatory today. Although labour-intensive, video analysis provides a qualitative method of assessment, and performance can be quantified through a scale with descriptive anchor points. Structured feedback based on objective assessment can be valuable in improving operating room performance.14

The devised microsurgery assessment methodology showed good reliability and construct validity. It is interesting to note that some microsurgery skills parameters showed a lower reliability index. A degree of subjectivity can be expected in any type of assessment, and this may be related to the varying expectations and microsurgery experience within the assessors' group. Steadiness, in terms of tremor, may be difficult to assess on video and could be measured more accurately with computerised systems. For example, the Imperial College Surgical Assessment Device employs hand-motion analysis technology and has been validated to measure objectively the number of hand movements, hand-travel distance and direction, and acceleration changes in microsurgical performance.7 However, it was unavailable for this study. Furthermore, positioning the electromagnetic trackers on the dorsum of the hands may interfere with sterility and with the surgeon's fine movements, and is therefore not suitable in an actual clinical setting of microsurgery. Moreover, a significant cost (approximately $5000 per unit) is involved.15
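As an illustration of the kind of metrics such motion analysis produces, the sketch below derives a total hand-travel distance and a crude count of discrete movements from sampled 3-D hand positions. This is a simplified stand-in written for this discussion, not the algorithm used by the Imperial College device, and the speed threshold is an arbitrary assumption.

```python
# Simplified hand-motion metrics from sampled 3-D positions (illustrative only).
import numpy as np

def motion_metrics(positions: np.ndarray, sample_rate_hz: float,
                   speed_threshold: float = 0.02) -> dict:
    """Path length (total hand travel) and a crude movement count.

    positions: (n_samples, 3) array of hand coordinates in metres.
    A "movement" is counted each time the speed rises above an arbitrary
    threshold (m/s) after having been below it.
    """
    steps = np.diff(positions, axis=0)                 # displacement per sample
    step_lengths = np.linalg.norm(steps, axis=1)
    path_length = step_lengths.sum()                   # total distance travelled
    speeds = step_lengths * sample_rate_hz             # approximate instantaneous speed
    moving = speeds > speed_threshold
    movements = int(np.count_nonzero(moving[1:] & ~moving[:-1]))  # below-to-above transitions
    return {"path_length_m": float(path_length), "movements": movements}
```

Counting threshold crossings in this way is only a rough proxy for the validated movement detection used in such devices, but it conveys why fewer, shorter movements are taken as a marker of economy of motion.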

Dissection of the vessels, irrigation use and bleeding control can be related to personal style and technique and therefore produce lower reliability when assessed in isolation; the overall inter-rater reliability of the modified GRS was, however, strong.

Assessments are important for evaluating skills and supervising the quality of surgeons' training. The SAMS methodology combines formative and summative assessment, which are considered powerful modes of training,1,16 and can be used in the workplace-based environment as well as in laboratory-based microsurgery courses. With minor modifications to the anchored descriptors, the modified GRS can also be applied to the assessment of microvascular anastomoses performed with venous mechanical couplers; similarly, with a further modification deleting some items, neural anastomoses can be assessed. Virtual-reality models may provide a new standard for training and assessment of microsurgery skills in the future, but further experiments and validation, in particular translation to the real clinical setting, are still awaited.17

Limited training time and litigation awareness are rendering a long learning curve in microsurgical skills unethical and unfeasible. The SAMS methodology provides a tool to address these current issues, to enhance the training experience and to validate technical competence in microsurgery. However, a microvascular anastomosis is only one part of a free tissue transfer operation, and the success of such operations is multi-factorial.

Conflict of interest

None.

Funding

None.

Acknowledgement

We would like to thank Mr Jonathan Britto, Consultant Plastic Surgeon, for his help with the assessments of the video recordings. We also acknowledge Dr Fernando Bello, Senior Lecturer in Surgical Graphics & Computing, Imperial College London, for supervising the first author in conducting this study as part of her research for an MEd in Surgical Education.


References

1. Schuwirth LWT, van der Vleuten CPM. How to design a useful test: the principles of assessment. Guide 5, ASME. Edinburgh: Association for the Study of Medical Education; 2007.
2. Wong JA, Matsumoto ED. Primer: cognitive motor learning for teaching surgical skill - how are surgical skills taught and assessed? Nat Clin Pract Urol 2008;5:47-54.
3. Khouri R. Avoiding free flap failure. Clin Plast Surg 1992;19:773-81.
4. Reznick R, Regehr G, MacRae H, et al. Testing technical skill via an innovative bench station examination. Am J Surg 1996;172:226-30.
5. Faulkner H, Regehr G, Martin J, et al. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med 1996;71:1363-5.
6. Intercollegiate Surgical Curriculum Project. Available from: http://www.iscp.ac.uk/Assessment/WBA/PBA.aspx [accessibility verified April 18, 2009].
7. Grober ED, Hamstra ST, Wanzel KR, et al. Validation of novel and objective measures of microsurgical skill: hand-motion analysis and stereoscopic visual acuity. Microsurgery 2003;23:317-22.
8. Atkins JL, Kalu PU, Lannon DA, et al. Training in microsurgical skill: does course-based learning deliver? Microsurgery 2005;25:481-5.
9. Dreyfus SE, Dreyfus HL. A five-stage model of the mental activities involved in directed skill acquisition. United States Air Force Office of Scientific Research (F49620-79-C-0063), University of California at Berkeley; 1980.
10. Soanes C, Stevenson A. Concise Oxford English dictionary. 11th ed. Oxford University Press; 2004.
11. Postgraduate Medical Education and Training Board. Available from: http://www.pmetb.org.uk [accessibility verified April 18, 2009].
12. Reznick RK. Teaching and testing technical skills. Am J Surg 1993;165:358-61.
13. Doyle JD, Webber EM, Sidhu RS. A universal global rating scale for the evaluation of technical skills in the operating room. Am J Surg 2007;193:551-5.
14. Grantcharov TP, Schulze S, Kristiansen VB. The impact of objective assessment and constructive feedback on improvement of laparoscopic performance in the operating room. Surg Endosc 2007;21:2240-3.
15. Datta V, Mackay S, Mandalia M, et al. The use of electromagnetic motion tracking analysis to objectively measure open surgical skill in the laboratory-based model. J Am Coll Surg 2001;193:479-85.
16. Aggarwal R, Grantcharov TP, Darzi A. Framework for systematic training and assessment of technical skills. J Am Coll Surg 2007;204:697-705.
17. Kalu PU, Atkins J, Baker D, et al. How do we assess microsurgical skill? Microsurgery 2005;25:25-9.