In-training gastrointestinal endoscopy competency assessment tools: Types of tools, validation and impact


Accepted Manuscript

In-training gastrointestinal endoscopy competency assessment tools: types of tools, validation and impact

Dr. Catharine M. Walsh, MD, MEd, PhD, FAAP, FRCPC

PII: S1521-6918(16)30011-7
DOI: 10.1016/j.bpg.2016.04.001
Reference: YBEGA 1422
To appear in: Best Practice & Research Clinical Gastroenterology
Received Date: 16 March 2016
Revised Date: 24 March 2016
Accepted Date: 7 April 2016

Please cite this article as: Walsh CM, In-training gastrointestinal endoscopy competency assessment tools: types of tools, validation and impact, Best Practice & Research Clinical Gastroenterology (2016), doi: 10.1016/j.bpg.2016.04.001. This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.


In-training gastrointestinal endoscopy competency assessment tools: types of tools, validation and impact

Catharine M Walsh1

1Division of Gastroenterology, Hepatology and Nutrition and the Learning Institute, Hospital for Sick Children, the Department of Paediatrics, and the Wilson Centre, University of Toronto, Toronto, Canada

Word count: 3942 words (6538 with references); No. of Figures: 3; No. of Tables: 2

Corresponding Author: Dr. Catharine M. Walsh
Highest Academic Degree(s): MD, MEd, PhD, FAAP, FRCPC
Affiliations: Division of Gastroenterology, Hepatology and Nutrition and the Learning Institute, Hospital for Sick Children, the Department of Paediatrics, and the Wilson Centre, University of Toronto, Toronto, Canada
Address: Hospital for Sick Children, Division of Gastroenterology, Hepatology and Nutrition, 555 University Ave, Room 8409, Black Wing, Toronto, ON, Canada M5G 1X8
Phone: 416.813.7654 x309432
Fax: 416.813.6531
Email: [email protected]


Abstract: The ability to perform endoscopy procedures safely, effectively and efficiently is a core element of gastroenterology practice. Training programs strive to ensure learners demonstrate sufficient competence to deliver high-quality endoscopic care independently at completion of training. In-training assessments are an essential component of gastrointestinal endoscopy education, required to support training and optimize learners' capabilities. There are several approaches to in-training endoscopy assessment, from direct observation of procedural skills to monitoring of surrogate measures of endoscopy skills such as procedural volume and quality metrics. This review outlines the current state of evidence as it pertains to in-training assessment of competency in performing gastrointestinal endoscopy as part of an overall endoscopy quality and skills training program.


Key words:

• Endoscopy, Gastrointestinal/education
• Endoscopy, Gastrointestinal/standards
• Assessment
• Clinical Competence
• Educational Measurement
• Education, Medical, Graduate/standards
• Patient simulation


Gastrointestinal endoscopy training largely occurs during formalized gastroenterology training programs of at least 2 years' duration. An increasing focus on quality, patient safety, and social accountability has resulted in a paradigm shift across postgraduate medical education from a time- and process-based system that specifies the amount of time required to "learn" specified content to a competency-based system that defines outcomes of training [1]. Competency-based education implies a training process that results in documented achievement of the requisite knowledge, skills and attitudes for competent independent medical practice [2]. Gastroenterology training programs are obliged to ensure trainees are competent to perform endoscopic procedures safely and effectively, without prescribed oversight, at completion of training. Assessment is required to support this goal. Assessment acts to optimize learners' capabilities through the provision of motivation and direction for future learning; it permits documentation of competence prior to entering unsupervised practice (i.e., certification) and helps protect society from substandard care [3]. This review examines how endoscopic competence is conceptualized, outlines the importance of integrating assessment throughout the endoscopy learning cycle, and discusses the validity of currently available in-training assessment methods and measures for gastrointestinal endoscopy.

Defining endoscopic competence: What skills should be assessed?

A key goal of gastrointestinal endoscopy training programs, professional organizations and accreditation bodies is to develop competent professionals capable of providing high-quality patient care. In relation to the skill of gastrointestinal endoscopy, competence has been defined as the minimum level of skill, knowledge, and/or expertise, derived through training and experience, required to safely and proficiently perform a task or procedure [4]. The requisite skills to perform endoscopic procedures have traditionally been categorized into 2 core skill domains: technical and cognitive. Examples of technical or psychomotor skills related to endoscopy include scope handling and strategies for scope advancement, loop reduction, withdrawal and mucosal inspection [5,6]. Cognitive competencies are reflective of knowledge and the application of endoscopically derived information to clinical practice. Examples of cognitive skills include selection of the most appropriate endoscopic test to assess and/or treat the clinical problem at hand, lesion recognition and sedation management [5,6].

Acquisition of technical and cognitive skills is fundamental to providing high-quality patient care; however, there are additional non-technical skills that are required to perform endoscopic procedures safely and proficiently. The need to address these competencies is explicitly outlined within general competency-based frameworks from accreditation bodies such as the Accreditation Council for Graduate Medical Education's (ACGME) Core Competencies in the United States [7] and the Royal College of Physicians and Surgeons of Canada's (RCPSC) CanMEDS framework [8]. Additionally, the importance of assessing non-technical components of endoscopic competence is recognized by gastroenterology-focused organizations such as the American Society of Gastrointestinal Endoscopy [9], the Canadian Association of Gastroenterology [10] and the North American Society for Pediatric Gastroenterology, Hepatology and Nutrition [6]. The importance of non-technical competencies has also been emphasized by the recognition that procedure-related adverse events are more likely to originate from behavioural failures, such as a communication failure, rather than from a lack of technical expertise [11]. Furthermore, literature has shown that failures in non-technical skills, such as teamwork and situational awareness, are associated with decreased technical performance [12]. With regard to endoscopy, there is literature to suggest that non-technical skills play a pivotal role in high-quality endoscopic practice. Twenty of 21 recommendations stemming from the 2004 National Confidential Enquiry into Perioperative Death [13], which investigated deaths occurring within 30 days of therapeutic gastrointestinal endoscopy procedures in the United Kingdom, highlighted deficiencies in non-technical skills such as patient assessment, decision making and teamwork, as opposed to technical skills.

A clear understanding of the competencies required to perform high-quality endoscopic procedures is fundamental to the development of a framework for assessment of endoscopic competence. The extant literature highlights that technical and cognitive skills are necessary but not sufficient to ensure development and maintenance of competence in gastrointestinal endoscopy. Non-technical skills are an integral facet of competent endoscopic practice and an important contributor to patient safety and clinical outcomes. Endoscopic competence should, therefore, be conceptualized as encompassing 3 core competency domains: technical, cognitive and integrative competencies (see Figure 1) [14]. Integrative competencies are higher-level competencies required to perform an endoscopic procedure that complement an individual's technical skills and clinical knowledge to facilitate effective delivery of high-quality endoscopic care in varied contexts [15]. The term "integrative" reflects the complex and interdependent relationships between non-technical skills, knowledge and technical performance. Integrative competencies include core skills such as communication and clinical judgement that allow individuals to integrate their knowledge and technical expertise to function effectively within a healthcare team, adapt to varied contexts, tolerate uncertainty, and ultimately provide safe and effective patient care. Reflective of this framework of endoscopic competence, assessment methods and measures should ideally reflect the full scope of technical, cognitive and integrative competencies required for performance of high-quality endoscopic procedures.


[Insert Figure 1]

Assessment Goals during Training

Assessment is an integral component of gastrointestinal endoscopy education that drives both teaching and learning. While assessment can serve many purposes, from an educational perspective assessment is generally subdivided into three categories: diagnostic, formative and summative. Diagnostic assessment is used for planning purposes. It helps trainers identify learners' baseline knowledge, skills and misconceptions prior to beginning a learning activity. Formative assessment serves a developmental purpose and is process focused. It is typically embedded within the instructional process and acts to provide trainees with informative, timely feedback and benchmarks to enable them to reflect on their performance and modify their thinking and behaviour to improve learning [3,16]. Additionally, formative feedback acts to reinforce trainees' intrinsic motivation to learn, promotes self-reflection, helps students identify learning gaps, clarifies desired outcomes and encourages a dialogue about learning. The feedback provided by formative assessment can also be used by endoscopy training programs to identify curricular deficiencies and by endoscopy trainers to help guide improvements in ongoing teaching to facilitate learning. Summative assessment, alternatively, is outcome focused, with the goal of producing an overall judgment to determine competence, readiness for independent practice or qualification for advancement [3]. It is used to indicate the extent of a learner's success in meeting an intended outcome. Summative assessments must have sufficient psychometric rigor as they are employed to establish competence and, as a by-product, to promote patient safety. While summative assessment affords professional self-regulation and accountability, it may not provide sufficient feedback to direct learning [3,17].

Assessment is an ongoing process that needs to be thoughtfully integrated throughout the endoscopy learning cycle, from training to accreditation to independent clinical practice (see Figure 2). At the start of training, diagnostic assessment can be used to determine trainees' baseline skill level to facilitate planning. During training (or re-training), formative assessment should be used to provide trainees with feedback on which to build their knowledge and skills, thus facilitating skill acquisition and optimizing learning [18]. Summative assessments are required at completion of training to enable board certification and/or medical licensure decisions to be made about whether an endoscopist is competent to practice independently. During subsequent independent practice, formative feedback can be used to promote quality improvement in patient care. Additionally, summative assessments are required to ensure ongoing maintenance of competence and provision of high-quality endoscopic services.

[Insert Figure 2]

Selection of In-training Assessment Methods

The Miller pyramid provides a framework that educators can use to help guide selection of assessment methods to target different facets of clinical competence, including "knows," "knows how," "shows how," and "does" [19]. The framework centers on learners' cognition at the lower end and moves towards a focus on learners' behaviours, thus emphasizing the importance of assessments conducted within the authentic clinical environment as a means of assessing clinical competence. Figure 3 outlines the Miller pyramid [19] with each of the 4 levels matched to assessment methods of relevance to gastrointestinal endoscopy.

[Insert Figure 3]

Current State of In-training Gastrointestinal Endoscopy Competency Assessment

To support the provision of high-quality endoscopic care, in-training gastrointestinal endoscopy competency assessment measures are required by program training directors to monitor trainees' progress, provide feedback for improvement, enhance learning, identify trainees who require more focused training, and ultimately to determine when a trainee has demonstrated sufficient competence to enter practice without direct supervision. High-quality assessment is reliant on the existence of tools and measures that are both reliable and valid. Reliability refers to the consistency or reproducibility of assessment outcomes over time or occasions [20], whereas validity reflects the degree to which an assessment measures what it is intended to measure (i.e., the outcome of interest) [21]. The following section outlines assessment methods that are commonly used in training to assess competence in performing gastrointestinal endoscopic procedures, including procedural volumes, simulation-based assessments, quality metrics and direct observational assessment tools.

Procedural Volumes


Traditionally, the number of endoscopic procedures completed under supervision sufficed as a surrogate for competent performance [22–24]. Although adequate volume is necessary to achieve competence, performance of a pre-determined number of procedures does not ensure competence. Research has shown that there is wide variation in skill among endoscopists with similar levels of experience [25,26]. Additionally, the rate at which trainees learn is influenced by a host of factors, including training intensity [25], the presence of breaks during training [27], use of training aids (e.g., magnetic endoscopic imagers [28]), quality of instruction received, and a trainee's innate ability. Furthermore, the accuracy of log books used to record procedural numbers has been questioned, and these records do not provide learners and educators with specific information about the nature of learning achieved [29].

Reflective of these concerns, current gastrointestinal endoscopy credentialing guidelines specify "competence thresholds," as opposed to absolute procedural number requirements that guarantee competence. A "competence threshold" is a recommended minimum number of supervised procedures that a trainee is required to perform before competence can be reliably assessed. There is great variability with regard to the competence thresholds outlined in current credentialing guidelines for adult and pediatric upper endoscopy and colonoscopy [6,10,30–38]. Additionally, recent studies examining the validity evidence of adult procedural volume recommendations suggest that the published minimum required numbers may significantly underestimate the amount of training required to achieve competence [25,39–43]. Procedural volume should, therefore, only be used as a "competence threshold"; performance of a predefined number of procedures should not be the sole criterion for competence. The outstanding question remains: what is the best way to assess learning to gauge progress and determine when trainees are competent for independent practice?


Simulation-based Assessments

While there is substantive evidence that virtual reality endoscopy simulation-based training can be used to speed up the early learning curve and reduce patient burden [44,45], the validity evidence for simulation-based assessment of gastrointestinal endoscopic skills remains limited. Assessment utilizing simulation technology is appealing to educators as it offers a proxy for clinical encounters and enables reproducible and standardized assessments at the "does" level of Miller's pyramid [19]. Additionally, simulation permits assessment of trainees as they perform tasks independently in a risk-free environment, thus eliminating concerns for patient safety. Furthermore, simulation facilitates assessment of integrative competencies such as communication and teamwork through endoscopy-based Integrated Procedural Performance Instrument [46] format assessment scenarios. These are hybrid simulations in which a learner is assessed performing a simulated procedure in a naturalistic setting, while interacting with team members (e.g., endoscopic assistant, anesthesiologist) and an actor portraying the patient.

There are a number of compelling reasons to implement endoscopic simulation-based assessments; however, prior to widespread adoption, further research is required to ensure these assessments can reliably distinguish between endoscopists with a range of endoscopic experience and are predictive of actual clinical performance [47]. Tools commonly used to assess simulated performance include performance metrics, motion analysis and/or direct observational assessment tools. Virtual reality endoscopy simulators typically generate performance metrics such as withdrawal time and patient discomfort [48]. Research assessing the validity evidence of simulator-derived metrics has yet to demonstrate that they are capable of meaningfully discriminating between endoscopists across skill levels [49–61], and two studies of moderate quality revealed these metrics do not correlate with performance scores assigned by blinded experts [62,63]. Performance metrics derived from tasks performed on low-fidelity part-task endoscopy simulators (e.g., speed, precision) are also being studied as a means to assess technical skills [64]; however, further validity evidence is required before they are adopted broadly.

Assessments based on motion analysis quantify performance objectively using information generated by motion tracking hardware and/or software, derived from movements of the endoscopist and/or procedural instrument(s) (e.g., number of movements, hand trajectory) [65]. While a potentially promising means of objectively assessing endoscopic technical skills within both the simulated and clinical setting, research to date has been limited [66–70], and further validity evidence of the technology and metrics is required. Direct observational assessment tools, which are reliant on an external rater to observe and assess learners, use pre-defined criteria that are built around an assessment framework (see section "In-training Direct Observational Assessment Tools", below). These tools are advantageous as they facilitate feedback provision and potentially enable one to measure transfer of skills between the simulated and clinical environment. To date, however, there is limited data examining reliability and validity evidence of a direct observational tool for simulated endoscopy [62,63,71,72]. Of note, virtual reality endoscopic simulation has recently been integrated into the board certification process for General Surgery in the United States through the Fundamentals of Endoscopic Surgery (FES) Program. The FES performance-based manual skills assessment consists of 5 simulation-based tasks intended to assess fundamental technical skills related to endoscopy [73]. While the assessment has good test-retest reliability (ICC = 0.85), scores correlated only modestly with performance of colonoscopy in the clinical setting, and assessors were not blinded to endoscopists' skill level [74]. While this is a promising first step in the application of endoscopic simulation-based assessment, further research is required to determine whether passing scores are a reliable and valid marker of competence in performing clinical endoscopic procedures.

Quality Metrics as an Assessment Tool

Reflective of the healthcare system's increasing focus on delivery of safe, effective, equitable, and high-quality care, current endoscopy credentialing guidelines highlight the importance of using evidence-based endoscopy quality and safety metrics to help determine when a trainee has demonstrated sufficient competence for unsupervised practice [10,30]. Endoscopy training programs are increasingly requiring learners to track quality metrics, such as independent cecal intubation rate, bowel preparation quality, and patient comfort, so they can be integrated into the assessment process. Although quality metrics may reflect performance at the "does" level of Miller's pyramid [19], their utility during training is limited as they do not provide learners and program directors with informative feedback to help pinpoint deficiencies. Additionally, while there is evidence to show that quality and safety indicators can be used to authenticate provision of safe, high-quality endoscopic care in adult practice [75,76], additional studies are required to provide validity evidence for their use as surrogate measures of endoscopic skills during training. Furthermore, with regard to pediatric endoscopy, quality and safety indicators derived from adult practice may not apply directly to the specific needs of pediatric patients and their families [77,78].


In-training Direct Observational Assessment Tools

In line with a competency-based educational model, accreditation bodies, such as the ACGME and RCPSC, and endoscopy training and credentialing guidelines have emphasized the need for continuous assessment during patient-based training. This allows training programs to monitor the learning curves of trainees as they progress towards competence. Direct observation of procedural skills is the preferred method to support ongoing skills assessment. In comparison with other performance metrics, such as procedural volume and quality indicators, structured direct observational assessment tools are advantageous as they provide a framework for teaching, help trainers pinpoint specific deficiencies and facilitate the provision of detailed feedback to enhance performance. Despite recognition of the importance of direct observation of procedural skills, the United Kingdom's Joint Advisory Group on Gastrointestinal Endoscopy is the only organization to date that has formally incorporated a direct observational assessment tool into its credentialing guidelines [36,37].

Endoscopy has been identified as a core competency for both adult and pediatric gastroenterology training. In the United States, Entrustable Professional Activities (EPAs) have recently been developed to outline the core activities of the gastroenterology profession that should be assessed. An EPA is a core unit of professional work that can be entrusted to a learner to perform independently once sufficient competence has been achieved [79]. Endoscopy is the principal activity of two of 13 EPAs developed for adult gastroenterology training: (1) perform upper and lower endoscopic evaluation of the luminal GI tract for screening, diagnosis, and intervention; and (2) perform endoscopic procedures for the evaluation and management [80]. With regard to pediatric gastroenterology, endoscopy is the principal activity of one of 5 discipline-specific EPAs: perform quality upper and lower endoscopic evaluation of the luminal gastrointestinal tract for screening, diagnosis, and intervention [81]. Workplace-based observation and assessment at the "does" level of the Miller pyramid [19] is required to assess performance of EPAs within the real clinical setting. To support this goal, direct observational assessment tools with strong evidence of reliability and validity are required.

Characteristics of published direct observational assessment tools for colonoscopy and upper endoscopy are outlined in Table 1, including the tool development strategy, target population (adult and/or pediatric endoscopists), primary assessment purpose, format, and competency domain(s) (technical, cognitive and/or integrative) assessed. Evidence of reliability and validity of each tool is outlined in Table 2. Five sources of validity evidence are provided [82], using previously defined operational definitions: (1) content (processes taken to ensure that items represent the intended assessment construct); (2) response process (relationship between the construct and the thought processes of the raters); (3) internal structure (reliability and factor analysis); (4) relations to other variables (association with scores from another instrument or feature that has an expected relationship (e.g., training level)); and (5) consequences (impact of the assessment on participants and programmes) [21].

[Insert Table 1 and Table 2]

To date, there is no published direct observational assessment tool for upper endoscopy that has strong evidence of reliability and validity. With regard to adult colonoscopy, 4 direct observational assessment tools have been developed and validated in a more systematic manner as compared with other published tools: the Gastrointestinal Endoscopy Competency Assessment Tool (GiECAT) [83], the Mayo Colonoscopy Skills Assessment Tool (MCSAT) [84], the Assessment of Competency in Endoscopy (ACE) Colonoscopy Skills Assessment Tool and the Joint Advisory Committee on GI Endoscopy's Direct Observation of Procedure (JAG-DOPS) Assessment Tool [85]. As an assessment measure, the GiECAT has a number of strengths. Use of Delphi consensus methodology allowed for development of a tool that is reflective of practice across institutions. The GiECAT was specifically designed to assess all domains of competence (cognitive, integrative and technical) related to colonoscopy in an integrated manner, a factor that is known to facilitate learning [86]. Additionally, it addresses performance of all components of a colonoscopy procedure, including pre-, intra-, and post-procedural aspects of care [83]. Furthermore, there is strong reliability and validity evidence of the GiECAT for use as a formative assessment tool in the clinical setting for both gastroenterological and surgical trainees [87]. To date, however, there is no data assessing minimal acceptable criteria for competency based on GiECAT scores. The MCSAT has been used widely at the Mayo Clinic in Rochester since 2007, and minimal acceptable criteria for competency have been established based on longitudinal analysis of data from that institution [42]. However, the MCSAT is limited in that it was developed using local expertise and it centers predominantly on the intra-procedural aspects of colonoscopy. Furthermore, the reliability of the tool has not been systematically assessed. The ACE Colonoscopy Skills Assessment Tool was developed by the American Society for Gastrointestinal Endoscopy Training Committee based on the format of the MCSAT. While minimal acceptable criteria for competency have been established based on longitudinal analysis of multi-centre data, evidence regarding the reliability of the tool remains lacking. The JAG-DOPS tool has been formally integrated into training and credentialing guidelines in the United Kingdom [36,37]; however, there is no published data outlining its psychometric properties within the adult or pediatric training context. The only validity evidence available examines its use within the context of summative evaluations of practicing adult endoscopists [88]. With regard to pediatric colonoscopy, key differences in adult and pediatric endoscopic practice highlight the need for a pediatric-specific assessment measure. The Gastrointestinal Endoscopy Competency Assessment Tool for pediatric colonoscopy (GiECATKIDS) is the only currently available assessment tool with strong evidence of reliability and validity that has been developed within the pediatric context [15,89].

Summary

Endoscopy training programs aim to ensure trainees are competent to perform safe and highquality endoscopic procedures without direct supervision. To support this goal rigorously

TE D

developed assessment tools with strong evidence of reliability and validity are required for continuous assessment throughout training and, ultimately, to verify that a trainee has demonstrated sufficient competence for independent practice. While great strides have been made in recent years with regard to the development of in-training gastrointestinal endoscopy competency assessment tools, looking to the future, additional research is required to compare the most promising direct observational tools to determine which is most suitable for widespread implementation for upper endoscopy and colonoscopy. Subsequently, development of a national or international database would facilitate the development of average learning curves of assessment scores for upper endoscopy and colonoscopy based on aggregate data. This would enable determination of specific milestones for endoscopists at different levels of training and facilitate comparison of trainees across programs to support competency-based training. The psychometric properties of in-training assessment measures developed to date have largely been evaluated within the context of formative assessment. Further studies are necessary to determine whether the acceptably high reliability (i.e., > 0.90) required for high-stakes summative assessments can be achieved for an endoscopic assessment tool [20]. With regard to formative assessment, the optimal frequency of use of an in-training endoscopic competency assessment instrument remains unknown. Finally, more work is needed to determine how best to integrate formative and summative assessments into training to optimize the learning function of assessment, as it is well known that trainees tend to focus on skills on which they expect to be tested [90]. Ultimately, meaningful competency assessment metrics should be inextricably woven within a core endoscopy curriculum to ensure optimal integration of teaching, learning, feedback and assessment.
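The internal-consistency figures discussed in this paper (e.g., Cronbach's α, and the > 0.90 threshold for high-stakes use) can be computed directly from rating data. The following is a purely illustrative sketch, not part of any published tool: the function name `cronbach_alpha` and the sample ratings are hypothetical. It estimates internal consistency from a subjects-by-items score matrix and compares it against a summative-assessment threshold.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a list of subjects, each a list of k item ratings.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores),
    using population variances computed across subjects.
    """
    k = len(scores[0])

    def pvar(xs):
        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([subject[i] for subject in scores]) for i in range(k)]
    total_var = pvar([sum(subject) for subject in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)


# Hypothetical ratings: 4 trainees scored on 3 items of a rating scale.
ratings = [[3, 4, 3], [4, 4, 5], [2, 3, 2], [5, 5, 5]]
alpha = cronbach_alpha(ratings)
print(round(alpha, 2), alpha > 0.90)  # is the scale reliable enough for summative use?
```

In practice, inter-rater reliability (e.g., ICC or generalizability coefficients, as reported in Table 2) would be estimated alongside internal consistency; this sketch covers only the latter.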

Acknowledgements

Catharine M. Walsh is supported by a Canadian Child Health Clinician Scientist Program Career Development Award. The funder had no role in the design of this manuscript; the analysis and interpretation of the data; the preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication.

Conflict of Interest statement

The author reports no conflicts of interest and has nothing to declare.

Practice Points:

• Assessment is an essential component of gastrointestinal endoscopy education that drives both teaching and learning.

• Structured direct observational assessment tools provide a framework for teaching, facilitate provision of detailed and specific feedback, aid in the identification of skill deficits, and can be used to generate aggregate assessment data across training programs to help gauge trainees' progress toward specific competency-based milestones.

Research Agenda:

• Research is necessary to compare the direct observational tools for gastrointestinal endoscopy that have been developed to date to determine which is most suitable for widespread implementation.

• Studies utilizing large-scale aggregate data are necessary to enable determination of specific milestones for endoscopists at different levels of training and to facilitate comparison of trainees across programs to support competency-based training.

• The optimal manner by which to integrate formative and summative assessments into training needs to be determined.
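The aggregate learning-curve idea above can be illustrated with a small sketch. This is a hypothetical example (the function `aggregate_learning_curve`, the sample scores, and the threshold are illustrative assumptions, not drawn from any published database): given each trainee's sequential assessment scores, it averages scores at each procedure number across the cohort and reports the first procedure count at which the cohort mean reaches a chosen competency benchmark.

```python
def aggregate_learning_curve(trainee_scores, threshold):
    """Mean assessment score at each procedure number across trainees.

    trainee_scores: list of per-trainee lists of sequential assessment scores.
    threshold: competency benchmark on the same scale as the scores.
    Returns (curve, crossing): curve[i] is the cohort mean at procedure i + 1;
    crossing is the first procedure number whose mean reaches the threshold
    (None if it is never reached).
    """
    longest = max(len(scores) for scores in trainee_scores)
    curve = []
    for i in range(longest):
        # average over trainees who have performed at least i + 1 procedures
        at_i = [scores[i] for scores in trainee_scores if len(scores) > i]
        curve.append(sum(at_i) / len(at_i))
    crossing = next((i + 1 for i, mean in enumerate(curve) if mean >= threshold), None)
    return curve, crossing


# Hypothetical data: two trainees' scores over their first four procedures.
curve, crossing = aggregate_learning_curve([[1, 2, 3, 4], [2, 3, 4, 5]], threshold=3.5)
print(curve, crossing)  # cohort mean per procedure, and first threshold crossing
```

A real milestone analysis (e.g., the contrasting-groups cut-points described for the MCSAT and ACE in Table 2) would use far larger samples and more robust curve-fitting, but the aggregation step is essentially the one sketched here.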

References

1. Leung W. Competency based medical training: review. BMJ. 2002;325(7366):693–6.

2. Long DM. Competency-based residency training: the next advance in graduate medical education. Acad Med. 2000;75(12):1178–83.

3. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387–96.

4. Eisen GM, Baron TH, Dominitz JA, Faigel DO, Goldstein JL, Johanson JF, et al. Methods of granting hospital privileges to perform gastrointestinal endoscopy. Gastrointest Endosc. 2002;55(7):780–3.

5. Sedlack RE. Colonoscopy. In Cohen J (ed). Successful Training in Gastrointestinal Endoscopy. 1st edn, pp 42-72. Oxford: Wiley-Blackwell; 2011.

6. Leichtner AM, Gillis LA, Gupta S, Heubi J, Kay M, Narkewicz MR, et al. NASPGHAN guidelines for training in pediatric gastroenterology. J Pediatr Gastroenterol Nutr. 2013;56 Suppl 1:S1–8.

7. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29(7):648–54.

8. Frank JR, Snell L, Sherbino J, editors. The Draft CanMEDS 2015 Physician Competency Framework – Series IV. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2015 March.

9. Faigel DO, Baron TH, Lewis B, Petersen B, Petrini J. Ensuring Competence in Endoscopy. American Society for Gastrointestinal Endoscopy Website. 2006. Available from: http://www.asge.org/uploadedFiles/Publications_and_Products/Practice_Guidelines/competence.pdf. Accessed March 1, 2016.

10. Romagnuolo J, Enns R, Ponich T, Springer J, Armstrong D, Barkun AN. Canadian credentialing guidelines for colonoscopy. Can J Gastroenterol. 2008;22(1):17–22.

11. Yule S, Flin R, Paterson-Brown S, Maran N. Non-technical skills for surgeons in the operating room: a review of the literature. Surgery. 2006;139(2):140–9.

12. Hull L, Arora S, Aggarwal R, Darzi A, Vincent C, Sevdalis N. The impact of nontechnical skills on technical performance in surgery: a systematic review. J Am Coll Surg. 2012;214(2):214–30.

13. Cullinane M, Gray A, Hargraves C, Lucas S, Schubert M, Sherry K, et al. Scoping our practice. The 2004 report of the National Confidential Enquiry into Patient Outcome and Death. London; 2005. Available from: http://www.ncepod.org.uk/2004report/. Accessed March 1, 2016.

14. Walsh CM. Development and Validation of the Gastrointestinal Endoscopy Competency Assessment Tools for Adult and Pediatric Colonoscopy [dissertation]. University of Toronto; 2014. 221 p.

15. *Walsh CM, Ling SC, Walters TD, Mamula P, Lightdale JR, Carnahan H. Development of the Gastrointestinal Endoscopy Competency Assessment Tool for Pediatric Colonoscopy (GiECATKIDS). J Pediatr Gastroenterol Nutr. 2014;59(4):480–6.

16. Shute VJ. Focus on formative feedback. Rev Educ Res. 2008;78(1):153–89.

17. Govaerts MJB, van der Vleuten CPM, Schuwirth LWT, Muijtjens AMM. Broadening perspectives on clinical performance assessment: rethinking the nature of in-training assessment. Adv Health Sci Educ Theory Pract. 2007;12(2):239–60.

18. Wass V, van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945–9.

19. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63–7.

20. Downing SM. Reliability: on the reproducibility of assessment data. Med Educ. 2004;38(9):1006–12.

21. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119(2):166.e7–16.

22. Cass OW, Freeman ML, Peine CJ, Zera RT, Onstad GR. Objective evaluation of endoscopy skills during training. Ann Intern Med. 1993;118(1):40–4.

23. Chak A, Cooper GS, Blades EW, Canto M, Sivak MV. Prospective assessment of colonoscopic intubation skills in trainees. Gastrointest Endosc. 1996;44(1):54–7.

24. Parry BR, Williams SM. Competency and the colonoscopist: a learning curve. Aust N Z J Surg. 1991;61(6):419–22.

25. Ward ST, Mohammed MA, Walt R, Valori R, Ismail T, Dunckley P. An analysis of the learning curve to achieve competency at colonoscopy using the JETS database. Gut. 2014;63(11):1746–54.

26. Dafnis G, Granath F, Påhlman L, Hannuksela H, Ekbom A, Blomqvist P. The impact of endoscopists' experience and learning curves and interendoscopist variation on colonoscopy completion rates. Endoscopy. 2001;33(6):511–7.

27. Jorgensen JE, Elta GH, Stalburg CM, Kolars JC, Stout JM, Korsnes SJ, et al. Do breaks in gastroenterology fellow endoscopy training result in a decrement in competency in colonoscopy? Gastrointest Endosc. 2013;78(3):503–9.

28. Shah SG, Brooker JC, Williams CB, Thapar C, Saunders BP. Effect of magnetic endoscope imaging on colonoscopy performance: a randomised controlled trial. Lancet. 2000;356(9243):1718–22.

29. Klasko SK, Cummings RV, Glazerman LR. Resident data collection: do the numbers add up? Am J Obstet Gynecol. 1995;172(4 Pt 1):1312–6.

30. Adler DG, Bakis G, Coyle WJ, DeGregorio B, Dua KS, Lee LS, et al. Principles of training in GI endoscopy. Gastrointest Endosc. 2012;75(2):231–5.

31. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Colon and Rectal Surgery. 2011. ACGME Website. Available from: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/060_colon_rectal_surgery_07012014_TCC.pdf. Accessed March 1, 2016.

32. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in general surgery. 2012. ACGME Website. Available from: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/440_general_surgery_07012014.pdf. Accessed March 1, 2016.

33. Hori Y. Granting of privilege for gastrointestinal endoscopy. Surg Endosc. 2008;22(5):1349–52.

34. Conjoint Committee for the Recognition of Training in Gastrointestinal Endoscopy. Requirements for CCRTGE Recognition. 2014. Available from: http://www.conjoint.org.au/information.html#requirements. Accessed March 1, 2016.

35. The European Section and Board of Gastroenterology and Hepatology. The Blue Book. 2012. Available from: http://www.easl.eu/_newsroom/latest-news/the-blue-book-describes-the-european-gastroenterology-and-hepatology-specialist-competences. Accessed March 1, 2016.

36. Joint Advisory Group on GI Endoscopy (UK). JAG Trainee Certification Process - Colonoscopy. 2011. Available from: http://www.thejag.org.uk/Portals/0/file/Training and Certification in Endoscopy - Guidance for Colonoscopy 07_02_11.pdf. Accessed March 1, 2016.

37. BSPGHAN Endoscopy Working Group. JAG Paediatric Endoscopy Certification, version 2.1. 2014. Available at: http://www.thejag.org.uk/downloads%5CJAG%20certification%20for%20paediatric%20trainees%5CJAG%20Paediatric%20Certification%202.1%20300513.pdf. Accessed March 1, 2016.

38. Worthington D, American Academy of Family Physicians. Colonoscopy: Procedural Skills (Position Paper). Am Fam Physician. 2000;62(5):1177–82.

39. Shahidi N, Ou G, Telford J, Enns R. Establishing the learning curve for achieving competency in performing colonoscopy: a systematic review. Gastrointest Endosc. 2014.

40. Cass OW. Training to competence in gastrointestinal endoscopy: a plea for continuous measuring of objective end points. Endoscopy. 1999;31(9):751–4.

41. Spier BJ, Benson M, Pfau PR, Nelligan G, Lucey MR, Gaumnitz EA. Colonoscopy training in gastroenterology fellowships: determining competence. Gastrointest Endosc. 2010;71(2):319–24.

42. *Sedlack RE. Training to competency in colonoscopy: assessing and defining competency standards. Gastrointest Endosc. 2011;74(2):355–66.

43. Koch AD, Haringsma J, Schoon EJ, de Man RA, Kuipers EJ. Competence measurement during colonoscopy training: the use of self-assessment of performance measures. Am J Gastroenterol. 2012;107(7):971–5.

44. Walsh C, Sherlock M, Ling S, Carnahan H. Virtual reality simulation training for health professions trainees in gastrointestinal endoscopy. Cochrane Database Syst Rev. 2012;(6):CD008237.

45. Singh S, Sedlack RE, Cook DA. Effects of simulation-based training in gastrointestinal endoscopy: a systematic review and meta-analysis. Clin Gastroenterol Hepatol. 2014;12(10):1611–23.

46. Kneebone RL, Nestel D, Moorthy K, Taylor P, Bann S, Munz Y, et al. Learning the skills of flexible sigmoidoscopy - the wider perspective. Med Educ. 2003;37(Suppl 1):50–8.

47. Cohen J, Thompson CC. The next generation of endoscopic simulation. Am J Gastroenterol. 2013;108(7):1036–9.

48. Sedlack RE. Competency assessment: it's time to expect more from our simulator. Dig Liver Dis. 2012;44(7):537–8.

49. Elvevi A, Cantù P, Maconi G, Conte D, Penagini R. Evaluation of hands-on training in colonoscopy: is a computer-based simulator useful? Dig Liver Dis. 2012;44(7):580–4.

50. McConnell RA, Kim S, Ahmad NA, Falk GW, Forde KA, Ginsberg GG, et al. Poor discriminatory function for endoscopic skills on a computer-based simulator. Gastrointest Endosc. 2012;76(5):993–1002.

51. Plooy AM, Hill A, Horswill MS, Cresp ASG, Watson MO, Ooi S-Y, et al. Construct validation of a physical model colonoscopy simulator. Gastrointest Endosc. 2012;76(1):144–50.

52. Verdaasdonk EGG, Stassen LPS, Schijven MP, Dankelman J. Construct validity and assessment of the learning curve for the SIMENDO endoscopic simulator. Surg Endosc. 2007;21(8):1406–12.

53. Felsher JJ, Olesevich M, Farres H, Rosen M, Fanning A, Dunkin BJ, et al. Validation of a flexible endoscopy simulator. Am J Surg. 2005;189(4):497–500.

54. Fayez R, Feldman LS, Kaneva P, Fried GM. Testing the construct validity of the Simbionix GI Mentor II virtual reality colonoscopy simulator metrics: module matters. Surg Endosc. 2010;24(5):1060–5.

55. Koch AD, Buzink SN, Heemskerk J, Botden SMBI, Veenendaal R, Jakimowicz JJ, et al. Expert and construct validity of the Simbionix GI Mentor II endoscopy simulator for colonoscopy. Surg Endosc. 2008;22(1):158–62.

56. Sedlack RE, Kolars JC. Validation of a computer-based colonoscopy simulator. Gastrointest Endosc. 2003;57(2):214–8.

57. Mahmood T, Darzi A. A study to validate the colonoscopy simulator. Surg Endosc. 2003;17(10):1583–9.

58. Grantcharov TP, Carstensen L, Schulze S. Objective assessment of gastrointestinal endoscopy skills using a virtual reality simulator. JSLS. 2005;9(2):130–3.

59. Sedlack RE, Coyle WJ, Obstein KL, Al-Haddad MA, Bakis G, Christie JA, et al. ASGE's assessment of competency in endoscopy evaluation tools for colonoscopy and EGD. Gastrointest Endosc. 2014;79(1):1–7.

60. Sedlack RE, Baron TH, Downing SM, Schwartz AJ. Validation of a colonoscopy simulation model for skills assessment. Am J Gastroenterol. 2007;102(1):64–74.

61. Haycock AV, Bassett P, Bladen J, Thomas-Gibson S. Validation of the second-generation Olympus colonoscopy simulator for skills assessment. Endoscopy. 2009;41(11):952–8.

62. Moorthy K, Munz Y, Orchard TR, Gould S, Rockall T, Darzi A. An innovative method for the assessment of skills in lower gastrointestinal endoscopy. Surg Endosc. 2004;18(11):1613–9.

63. Sarker SK, Albrani T, Zaman A, Kumar I. Procedural performance in gastrointestinal endoscopy: live and simulated. World J Surg. 2010;34(8):1764–70.

64. Thompson CC, Jirapinyo P, Kumar N, Ou A, Camacho A, Lengyel B, et al. Validation of an endoscopic part-task training box as a skill assessment tool. Gastrointest Endosc. 2015;81(4):967–73.

65. Mason JD, Ansell J, Warren N, Torkington J. Is motion analysis a valid tool for assessing laparoscopic skill? Surg Endosc. 2013;27(5):1468–77.

66. Mohankumar D, Garner H, Ruff K, Ramirez FC, Fleischer D, Wu Q, et al. Characterization of right wrist posture during simulated colonoscopy: an application of kinematic analysis to the study of endoscopic maneuvers. Gastrointest Endosc. 2014;79(3):480–9.

67. Appleyard MN, Mosse CA, Mills TN, Bell GD, Castillo FD, Swain CP. The measurement of forces exerted during colonoscopy. Gastrointest Endosc. 2000;52(2):237–40.

68. Shergill AK, Asundi KR, Barr A, Shah JN, Ryan JC, McQuaid KR, et al. Pinch force and forearm-muscle load during routine colonoscopy: a pilot study. Gastrointest Endosc. 2009;69(1):142–6.

69. Obstein KL, Patil VD, Jayender J, San José Estépar R, Spofford IS, Lengyel BI, et al. Evaluation of colonoscopy technical skill levels by use of an objective kinematic-based system. Gastrointest Endosc. 2011;73(2):315–21.

70. Ende AR, Shah PM, Chandrasekhara V, Egorov V, Pasechnik A, Korman LY, et al. Quantitative force application during simulated colonoscopy is significantly different between novice and expert endoscopists. Gastrointest Endosc. 2014;79(5S):AB217 [abstract Su1568].

71. Haycock A, Koch AD, Familiari P, van Delft F, Dekker E, Petruzziello L, et al. Training and transfer of colonoscopy skills: a multinational, randomized, blinded, controlled trial of simulator versus bedside training. Gastrointest Endosc. 2010;71(2):298–307.

72. Park J, MacRae H, Musselman LJ, Rossos P, Hamstra SJ, Wolman S, et al. Randomized controlled trial of virtual reality simulator training: transfer to live patients. Am J Surg. 2007;194(2):205–11.

73. Vassiliou MC, Dunkin BJ, Fried GM, Mellinger JD, Trus T, Kaneva P, et al. Fundamentals of endoscopic surgery: creation and validation of the hands-on test. Surg Endosc. 2014;28(3):704–11.

74. Mueller CL, Kaneva P, Fried GM, Feldman LS, Vassiliou MC. Colonoscopy performance correlates with scores on the FES™ manual skills test. Surg Endosc. 2014;28(11):3081–5.

75. Tinmouth J, Kennedy E, Baron D, Burke M, Feinberg S, Gould M, et al. Guideline for Colonoscopy Quality Assurance in Ontario. Toronto (ON): Cancer Care Ontario; 2013 Sept 9. Program in Evidence-based Care Evidence-based Series No.: 15-5 Version 2.

76. *Armstrong D, Barkun A, Bridges R, Carter R, de Gara C, Dube C, et al. Canadian Association of Gastroenterology consensus guidelines on safety and quality indicators in endoscopy. Can J Gastroenterol. 2012;26(1):17–31.

77. Lightdale JR, Acosta R, Shergill AK, Chandrasekhara V, Chathadi K, Early D, et al. Modifications in endoscopic practice for pediatric patients. Gastrointest Endosc. 2014;79(5):699–710.

78. Forget S, Walsh CM. Pediatric endoscopy: need for a tailored approach to guidelines on quality and safety. Can J Gastroenterol. 2012;26(10):735.

79. Ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39(12):1176–7.

80. Rose S, Fix OK, Shah BJ, Jones TN, Szyjkowski RD, Bosworth BP, et al. Entrustable professional activities for gastroenterology fellowship training. Gastrointest Endosc. 2014;80(1):16–27.

81. NASPGHAN EPA Taskforce. Entrustable Professional Activities. 2016. NASPGHAN Website. Available from: http://www.naspghan.org/content/132/en/training/opportunities/Entrustable-ProfessionalActivities. Accessed March 1, 2016.

82. Messick S. Validity. In Linn RL (ed) Educational Measurement. 3rd edn, pp 13–104. New York: American Council on Education and Macmillan; 1989.

83. *Walsh CM, Ling SC, Khanna N, Cooper MA, Grover SC, May G, et al. Gastrointestinal Endoscopy Competency Assessment Tool: development of a procedure-specific assessment tool for colonoscopy. Gastrointest Endosc. 2014;79(5):798–807.

84. *Sedlack RE. The Mayo Colonoscopy Skills Assessment Tool: validation of a unique instrument to assess colonoscopy skills in trainees. Gastrointest Endosc. 2010;72(6):1125–33.

85. Joint Advisory Group on GI Endoscopy. Formative DOPS assessment form - colonoscopy and flexible sigmoidoscopy. The Joint Advisory Group on GI Endoscopy Website. 2010. Available from: http://www.thejag.org.uk/downloads/DOPS Forms For International and reference use only/Formative DOPS Assessment Form - Colonoscopy and FS.pdf. Accessed March 1, 2016.

86. Beard JD, Marriott J, Purdie H, Crossley J. Assessing the surgical skills of trainees in the operating theatre: a prospective observational study of the methodology. Health Technol Assess. 2011;15(1):1–162.

87. *Walsh CM, Ling SC, Khanna N, Grover SC, Yu JJ, Cooper MA, et al. Gastrointestinal Endoscopy Competency Assessment Tool: reliability and validity evidence. Gastrointest Endosc. 2015;81(6):1417–24.

88. *Barton JR, Corbett S, van der Vleuten CP. The validity and reliability of a Direct Observation of Procedural Skills assessment tool: assessing colonoscopic skills of senior endoscopists. Gastrointest Endosc. 2012;75(3):591–7.

89. *Walsh CM, Ling SC, Mamula P, Lightdale JR, Walters TD, Yu JJ, et al. The Gastrointestinal Endoscopy Competency Assessment Tool for pediatric colonoscopy. J Pediatr Gastroenterol Nutr. 2015;60(4):474–80.

90. Kromann CB, Jensen ML, Ringsted C. The effect of testing on skills learning. Med Educ. 2009;43(1):21–7.

91. *Sedlack RE, Coyle WJ. Assessment of competency in endoscopy: establishing and validating generalizable competency benchmarks for colonoscopy. Gastrointest Endosc. 2016;83(8):516–23.

92. Boyle E, Al-Akash M, Patchett S, Traynor O, McNamara D. Towards continuous improvement of endoscopy standards: validation of a colonoscopy assessment form. Colorectal Dis. 2012;14(9):1126–31.

93. American Association for the Study of Liver Diseases, American College of Gastroenterology, American Gastroenterological Association, American Society for Gastrointestinal Endoscopy. The Gastroenterology Core Curriculum, Third Edition. Gastroenterology. 2007;132(5):2012–8.

94. Sarker SK, Albrani T, Zaman A, Patel B. Procedural performance in gastrointestinal endoscopy: an assessment and self-appraisal tool. Am J Surg. 2008;196(3):450–5.

95. Vassiliou MC, Kaneva PA, Poulose BK, Dunkin BJ, Marks JM, Sadik R, et al. Global Assessment of Gastrointestinal Endoscopic Skills (GAGES): a valid measurement tool for technical skills in flexible endoscopy. Surg Endosc. 2010;24(8):1834–41.

96. Vassiliou MC, Kaneva PA, Poulose BK, Dunkin BJ, Marks JM, Sadik R, et al. How should we establish the clinical case numbers required to achieve proficiency in flexible endoscopy? Am J Surg. 2010;199(1):121–5.

97. Shah SG, Thomas-Gibson S, Brooker JC, Suzuki N, Williams CB, Thapar C, et al. Use of video and magnetic endoscope imaging for rating competence at colonoscopy: validation of a measurement tool. Gastrointest Endosc. 2002;56(4):568–73.

98. Sullivan ME, Ortega A, Wasserberg N, Kaufman H, Nyquist J, Clark R. Assessing the teaching of procedural skills: can cognitive task analysis add to our traditional teaching methods? Am J Surg. 2008;195(1):20–3.

99. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273–8.

100. Vassiliou MC, Feldman LS, Andrew CG, Bergman S, Leffondré K, Stanbridge D, et al. A global assessment tool for evaluation of intraoperative laparoscopic skills. Am J Surg. 2005;190(1):107–13.

FIGURE CAPTIONS

Figure 1: Conceptual framework of endoscopic competence and examples of corresponding competencies within each domain

Figure 2: Framework for the integration of assessment throughout the endoscopy learning cycle

Figure 3: The learning assessment pyramid outlining methods of assessment relevant to gastrointestinal endoscopy skills

TABLES

Table 1: Characteristics of published endoscopy direct observational assessment tools

| Assessment tool | Procedure(s) | Endoscopist population | Primary purpose | Scale(s) | Competency domains (technical, cognitive and/or integrative) |
| ASGE's Assessment of Competency in Endoscopy (ACE) [59,91] | Colonoscopy | Adult | Formative | Procedure-specific GRS | T, C, I |
| Competency-based Colonoscopy Assessment Form [92] | Colonoscopy | Adult | Formative | Procedure-specific GRS; checklist | T, C, I |
| Diagnostic Colonoscopy and Upper Endoscopy Procedural Competency Forms [93] | Colonoscopy | Adult | Formative | Procedure-specific GRS | T, C, I |
| Gastrointestinal Endoscopy Competency Assessment Tool (GiECAT and GiECATkids) [15,83,87,89] | Colonoscopy | Adult and pediatric | Formative | Procedure-specific GRS; checklist | T, C, I |
| Generic and Specific Endoscopic Technical Skills [94] | Colonoscopy | Adult | Not clearly stated | Generic GRS; procedure-specific GRS | T, C, I |
| Global Assessment of Gastrointestinal Endoscopic Skills (GAGES) [95,96] | EGD and colonoscopy | Adult and pediatric | Research outcome measure | Procedure-specific GRS | T |
| Global Rating Scale [72] | Colonoscopy | Adult | Research outcome measure | Generic GRS | T |
| Joint Advisory Committee on GI Endoscopy's Direct Observation of Procedure (JAG-DOPS) Assessment Tool [88] | Colonoscopy | Adult and pediatric | Summative | Procedure-specific GRS | T, C, I |
| Mayo Colonoscopy Skills Assessment Tool (MCSAT) [42,84] | Colonoscopy | Adult | Formative | Procedure-specific GRS | T, C, I |
| NASPGHAN Pediatric Colonoscopy and EGD Training Score Sheets [6] | EGD and colonoscopy | Pediatric | Formative | Procedure-specific GRS | T, C, I |
| Objective Structured Video Assessment Score [97] | Colonoscopy | Adult | Not clearly stated | Procedure-specific GRS | T |
| Procedural Checklist and Cognitive Decision Points – Colonoscopy [98] | Colonoscopy | Adult | Not clearly stated | Procedural checklist; cognitive demands checklist | T, C, I |
| Rotterdam Assessment Form for Colonoscopy (RAF-c) [43] | Colonoscopy | Adult | Formative | Objective measures (distance, time); procedure-specific visual analog scales; improvement plan | T, I |
| Scale for Measuring Technical Skill in Performance of Colonoscopy [23] | Colonoscopy | Adult | Research outcome measure | Procedure-specific GRS | T |

ASGE = American Society of Gastrointestinal Endoscopy; C = cognitive; EGD = esophagogastroduodenoscopy; GRS = global rating scale; I = integrative; NASPGHAN = North American Society for Pediatric Gastroenterology, Hepatology and Nutrition; OSATS = Objective Structured Assessment of Technical Skills; T = technical

Table 2: Evidence of reliability and validity of published endoscopy direct observational assessment tools (evidence is organized by source: content, response process, internal structure (reliability), relationship with other variables, and consequences)

Colonoscopy Assessment Tools

ASGE's Assessment of Competency in Endoscopy (ACE) [59,91]
• Content: Refinement of a previously validated instrument (Mayo Colonoscopy Skills Assessment Tool, MCSAT [84]); expert review
• Relationship with other variables: Discriminative - significant improvement in scores with experience (p < 0.001)
• Consequences: Rigorous cut-point established - contrasting groups method used to establish minimal competency criteria of 3.5 that was achieved after 255 colonoscopies

Competency-based Colonoscopy Assessment Form [92]
• Content: Expert panel discussions; hierarchal task analysis; modification of existing instruments (OSATS [99] and the Generic and Specific Endoscopic Technical Skills assessment tool [94]); pilot testing and revision
• Relationship with other variables: Discriminative - significant difference between novice, intermediate and advanced groups for mean CL, GRS and overall score (p < 0.001); 2/11 CL items and 6/8 GRS items discriminated significantly between the 3 groups

Diagnostic Colonoscopy Procedural Competency Form [93]
• Content: Expert review

Gastrointestinal Endoscopy Competency Assessment Tool (GiECAT) [83,87]
• Content: Systematic literature review; Delphi consensus methodology with an international panel of experts
• Response process: Scoring rubric based on literature review and Delphi panel feedback
• Internal structure (reliability): Inter-rater reliability - total score ICC1,1 = 0.85 (95% CI, 0.73-0.92), GRS score ICC1,1 = 0.85 (95% CI, 0.73-0.92), CL score ICC1,1 = 0.81 (95% CI, 0.67-0.90). Test-retest reliability - total score ICC2,1 = 0.91 (95% CI, 0.85-0.95), GRS score ICC2,1 = 0.93 (95% CI, 0.88-0.96), CL score ICC2,1 = 0.80 (95% CI, 0.68-0.88). Internal consistency - GRS Cronbach's α = 0.98, CL Cronbach's α = 0.91. Item-total correlations ranged from 0.83-0.95; inter-item correlations ranged from 0.78-0.95. Item analysis (Pearson's r) - total technical item score and technical PGA score, r = 0.85 (p < 0.001); total cognitive item score and cognitive PGA score, r = 0.82 (p < 0.001); total integrative item score and integrative PGA score, r = 0.82 (p < 0.001); GRS and CL scores, r = 0.82 (p < 0.001)
• Relationship with other variables: Discriminative - scores differed significantly between novice, intermediate and advanced endoscopists for total, GRS and CL scores (p < 0.001). Concurrent (Spearman's ρ) - significant correlation (p < 0.001) of GiECAT total, GRS and CL scores with (1) number of lifetime colonoscopies (total: 0.78, GRS: 0.80, CL: 0.71); (2) cecal intubation rate (total: 0.81, GRS: 0.82, CL: 0.75); (3) TI intubation rate (total: 0.82, GRS: 0.82, CL: 0.77); and (4) physician global assessment of skills (total: 0.90, GRS: 0.94, CL: 0.77)
• Consequences: Areas under the receiver-operating curve comparing endoscopists with PGA scores reflecting competence versus non-competence were 0.98 (95% CI, 0.95-1.00), 0.98 (95% CI, 0.95-1.00), and 0.91 (95% CI, 0.83-0.98) for GiECAT total, GRS and CL scores, respectively

Gastrointestinal Endoscopy Competency Assessment Tool for pediatric colonoscopy (GiECATkids) [15,89]
• Content: Systematic literature review; Delphi consensus methodology with a North American panel of experts
• Response process: Scoring rubric based on literature review and Delphi panel feedback
• Internal structure (reliability): Inter-rater reliability - total score ICC1,1 = 0.88 (95% CI, 0.74-0.95), GRS score ICC1,1 = 0.79 (95% CI, 0.56-0.91), CL score ICC1,1 = 0.89 (95% CI, 0.75-0.95). Test-retest reliability - total score ICC2,1 = 0.94 (95% CI, 0.90-0.97), GRS score ICC2,1 = 0.94 (95% CI, 0.90-0.97), CL score ICC2,1 = 0.84 (95% CI, 0.74-0.91). Internal consistency - GRS Cronbach's α = 0.98, CL Cronbach's α = 0.87. Item-total correlations ranged from 0.87-0.95; inter-item correlations ranged from 0.77-0.92. Item analysis (Pearson's r) - total technical item score and technical PGA score, r = 0.94 (p < 0.001); total cognitive item score and cognitive PGA score, r = 0.85 (p < 0.001); total integrative item score and integrative PGA score, r = 0.91 (p < 0.001); GRS and CL scores, r = 0.88 (p < 0.001)
• Relationship with other variables: Discriminative - scores differed significantly between novice, intermediate and advanced endoscopists for GiECATkids total, GRS and CL scores (p < 0.001). Concurrent (Spearman's ρ) - significant correlation (p < 0.001) of GiECATkids total, GRS and CL scores with (1) number of lifetime colonoscopies (total: 0.91, GRS: 0.92, CL: 0.84); (2) cecal intubation rate (total: 0.82, GRS: 0.85, CL: 0.77); (3) TI intubation rate (total: 0.82, GRS: 0.82, CL: 0.80); and (4) physician global assessment of skills (total: 0.95, GRS: 0.94, CL: 0.89)
• Consequences: Areas under the receiver-operating curve comparing endoscopists with PGA scores reflecting competence versus non-competence were 0.99 (95% CI, 0.96-1.00), 0.98 (95% CI, 0.95-1.00), and 0.99 (95% CI, 0.97-1.00) for GiECATkids total, GRS and CL scores, respectively

Generic and Specific Endoscopic Technical Skills [94]
• Content: Expert panel discussions; hierarchical task analyses
• Internal structure (reliability): Inter-rater reliability - generic scale Cronbach's α = 0.85; specific scale Cronbach's α = 0.80
• Relationship with other variables: Discriminative - significant difference between novice and experienced endoscopists for mean total generic scale (p = 0.003) and specific scale (p = 0.004) scores

Global Assessment of Gastrointestinal Endoscopic Skills (GAGES) [95,96]
• Content: Expert review; based on existing instruments (OSATS [99] and GOALS [100])
• Internal structure (reliability): Inter-rater reliability - attending versus observer, ICC = 0.97 (95% CI, 0.92-0.99); attending versus endoscopist (self-rating), ICC = 0.89 (95% CI, 0.81-0.93). Internal consistency - Cronbach's α = 0.95
• Relationship with other variables: Discriminative - significant difference between novice and experienced endoscopists for total score (p < 0.001). Concurrent - Pearson's correlation between GAGES upper endoscopy and colonoscopy scores = 0.75 (p < 0.001)

Global Rating Scale [72]
• Content: Expert review; refinement of an existing instrument (OSATS [99])
• Relationship with other variables: Discriminative - significant difference between simulation-trained and untrained novices for mean total score (t(22) = 1.84, p < .04)

Joint Advisory Committee on GI Endoscopy's Direct Observation of Procedure (JAG-DOPS) Assessment Tool [88]
• Content: Expert review (multi-center, multi-disciplinary); pilot testing and revision; survey
• Internal structure (reliability; senior endoscopists only): Reliability achieved using 2 cases and 2 assessors, G = 0.81; for 1 case and 1 assessor, G = 0.65; for 3 cases and 4 assessors, G = 0.90
• Relationship with other variables (senior endoscopists only): Concurrent - grades 'mirrored' global evaluation in 97% of assessments (measure of agreement not provided); Pearson's correlation with (1) MCQ test = 0.276 (p = 0.001); (2) polyp-detection rate = 0.119 (p > 0.05); (3) cecal intubation rate = 0.122 (p > 0.05); (4) number of procedures in the preceding year = -0.164 (p > 0.05); and (5) number of lifetime procedures = 0.039 (p > 0.05)
• Consequences: Cut-point (senior endoscopists only) - 96% agreement across the crucial pass/fail divide (levels 4 and 3 versus 2 and 1)

Mayo Colonoscopy Skills Assessment Tool (MCSAT) [42,84]
• Content: Test blueprint developed based on review of professional society recommendations, published reviews and expert opinion
• Response process: Test security - electronic database (password protected)
• Internal structure (reliability): Item analysis (Pearson's correlation) - mean cognitive item score and overall cognitive score, r = 0.79 (p < 0.01); mean motor item score and overall motor score, r = 0.88 (p < 0.01)
• Relationship with other variables: Discriminative - significant differences in mean cognitive score and in mean motor score between novice, intermediate and advanced endoscopists (p < 0.0001)
• Consequences: Trainees performing 1 SD below their peers were provided additional practice for remediation of specific skills until their scores improved to within 1 SD of their peers. Rigorous cut-point established - contrasting groups method used to establish minimal competency criteria of 3.5 that was achieved after 275 colonoscopies

NASPGHAN Pediatric Colonoscopy Training Score Sheet [6]

Objective Structured Video Assessment Score [97]
• Internal structure (reliability): Inter-rater reliability - κ = 0.63 (p < 0.001)
• Relationship with other variables: Discriminative - significant difference in mean total score between endoscopists rated as incompetent, reasonably competent and fully competent as per the global assessment (p < 0.0001)

Procedural Checklist and Cognitive Decision Points – Colonoscopy [98]
• Content: Cognitive task analysis

Rotterdam Assessment Form for Colonoscopy (RAF-c) [43]
• Content: Based on existing instruments (JAG-DOPS (Barton et al. 2012), OSATS (Reznick et al. 1997) and Park et al.'s Global Rating Scale (2007))

Scale for Measuring Technical Skill in Performance of Colonoscopy [23]

Esophagogastroduodenoscopy Assessment Tools
• Expert review

AC C

ASGE’s Assessment of Competency in

Endoscopy (ACE) EGD Skills

Assessment Tool [59] Diagnostic Upper Endoscopy Procedural

41

ACCEPTED MANUSCRIPT

Competency Form [93] Global Assessment of • Expert review • Based on existing

Gastrointestinal

• Inter-rater reliability

• Discriminative:

- Attending versus observer:

- Significant difference

instruments:

ICC = 0.96 (95% CI, 0.90-

between novice and

(GAGES) [95,96]

OSATS [99] and

0.99))

experienced endoscopists

GOALS [100]

- Attending versus

for total score (p < 0.001)

endoscopist (self-rating):

0.85)

• Concurrent:

SC

ICC = 0.89 (95% CI, 0.67-

RI PT

Endoscopy Skills

- Pearson’s correlation

M AN U

between GAGES upper

• Internal consistency

- Cronbach’s α = 0.89

endoscopy and colonoscopy scores = 0.75 (p < 0.001)

NASPGHAN Pediatric EGD

TE D

Training Score Sheets [6]

ASGE = American Society for Gastrointestinal Endoscopy; CI = confidence interval; CL = checklist, G = generalizability coefficient; GOALS = Global Operative Assessment of Laparoscopic Skills; GRS = global rating

EP

scale; MCQ = multiple choice test; NASPGHAN = North American Society for Pediatric Gastroenterology, Hepatology and Nutrition; OSATS = Objective, structured Assessment of Technical Skills; PGA = physician global

AC C

assessment; TI = terminal ileum
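The internal consistency values reported above (e.g., Cronbach’s α of 0.80–0.95) follow the standard formula α = k/(k−1) × (1 − Σσ²ᵢ/σ²ₜ), where k is the number of items, σ²ᵢ the variance of each item and σ²ₜ the variance of total scores. A minimal illustrative sketch follows; the trainee ratings below are invented for demonstration and are not data from any of the cited studies:

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a subjects-by-items matrix of scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(item_scores[0])                       # number of items
    items = list(zip(*item_scores))               # transpose: one tuple per item
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical data: 4 trainees rated on 3 perfectly consistent items.
scores = [
    [1, 1, 1],
    [2, 2, 2],
    [3, 3, 3],
    [4, 4, 4],
]
print(cronbach_alpha(scores))  # perfectly correlated items -> alpha = 1.0
```

Real item responses are never perfectly correlated, so observed values such as the GAGES α of 0.95 fall below this ceiling; values above roughly 0.7–0.8 are conventionally taken as adequate for this kind of rating scale.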

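The contrasting groups method used to set the MCSAT minimal competency criterion of 3.5 works by comparing the score distributions of examinees independently judged competent versus not yet competent, and placing the cut score where misclassification of the two groups is minimized. A simplified sketch of that logic, using invented scores on a hypothetical 1–5 scale (not the MCSAT data):

```python
def contrasting_groups_cutpoint(not_competent, competent):
    """Choose the cut score minimizing total misclassification:
    'not competent' examinees at/above the cut (false passes) plus
    'competent' examinees below the cut (false failures)."""
    scores = sorted(not_competent + competent)
    # Candidate cuts: midpoints between adjacent observed scores.
    candidates = [(a + b) / 2 for a, b in zip(scores, scores[1:])]

    def misclassified(cut):
        false_pass = sum(s >= cut for s in not_competent)
        false_fail = sum(s < cut for s in competent)
        return false_pass + false_fail

    return min(candidates, key=misclassified)

# Invented illustrative ratings on a 1-5 global competence scale.
novices = [2.5, 3.0, 3.2, 3.4]   # judged not yet competent
experts = [3.6, 3.8, 4.0, 4.2]   # judged competent
print(contrasting_groups_cutpoint(novices, experts))  # cut-point near 3.5
```

In practice the two distributions overlap, so the method is usually applied to fitted distributions rather than raw scores, and the resulting cut is reviewed by the standard-setting panel before adoption.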