MEDICI-196; No. of Pages 9 medicina xxx (2017) xxx–xxx
Original Research Article
Evaluation of clinical teaching quality in competency-based residency training in Lithuania
Eglė Vaižgėlienė a,*, Žilvinas Padaiga a, Daiva Rastenytė b, Algimantas Tamelis c, Kęstutis Petrikonis b, Cornelia Fluit d
a Department of Preventive Medicine, Faculty of Public Health, Medical Academy, Lithuanian University of Health Sciences, Kaunas, Lithuania
b Department of Neurology, Medical Academy, Lithuanian University of Health Sciences, Kaunas, Lithuania
c Department of Surgery, Medical Academy, Lithuanian University of Health Sciences, Kaunas, Lithuania
d Radboudumc Health Academy, Nijmegen, The Netherlands
article info

Article history:
Received 31 May 2017
Received in revised form 17 August 2017
Accepted 28 August 2017
Available online xxx

Keywords:
EFFECT
Clinical teaching
Quality
Residents
Competency-based medical education

abstract

Background and aim: In 2013, all residency programs at the Lithuanian University of Health Sciences were renewed into competency-based medical education (CBME) curricula. In 2015, we implemented the validated EFFECT questionnaire together with the EFFECT-System for quality assessment of clinical teaching in residency training. The aim of this study was to investigate the influence of characteristics of the resident (year of training) and the clinical teacher (gender, age, and type of academic position) on teaching quality, as well as to assess areas for teaching quality improvement.
Materials and methods: Residents from 7 different residency study programs filled out 333 EFFECT questionnaires evaluating 146 clinical teachers. We received 143 self-evaluations of clinical teachers using the same questionnaire. Items were scored on a 6-point Likert scale. The main outcome measures were residents' mean overall scores (MOS), mean subdomain scores (MSS) and clinical teachers' self-evaluation scores. Overall comparisons of MOS and MSS across study groups and subgroups were done using Student's t test and ANOVA for trend. The intraclass correlation coefficient (ICC) was calculated to assess how residents' evaluations matched the self-evaluation of each particular teacher. To indicate areas for quality improvement, the respective (sub)domain score was subtracted from each item's mean score.
Results: MOS for the domains of "role modeling", "task allocation", "feedback", "teaching methodology" and "assessment" were rated significantly higher by residents than by teachers (P < 0.01). Teachers who filled out self-evaluation questionnaires were rated significantly higher by residents in role modeling subdomains (P < 0.05). Male teachers were rated significantly higher than female teachers in the (sub)domains "role modeling: CanMEDS roles and reflection", "task allocation", "planning" and "personal support" (P < 0.05). Teachers aged 40 years or younger were rated higher (P < 0.01). Residents' ratings by type of teachers' academic position differed significantly in almost all (sub)domains
* Corresponding author at: Department of Preventive Medicine, Faculty of Public Health, Medical Academy, Lithuanian University of Health Sciences, Tilžės 18, 47181 Kaunas, Lithuania. E-mail address:
[email protected] (E. Vaižgėlienė). http://dx.doi.org/10.1016/j.medici.2017.08.002 1010-660X/© 2017 The Lithuanian University of Health Sciences. Production and hosting by Elsevier Sp. z o.o. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Please cite this article in press as: Vaižgėlienė E, et al. Evaluation of clinical teaching quality in competency-based residency training in Lithuania. Medicina (2017), http://dx.doi.org/10.1016/j.medici.2017.08.002
(P < 0.05). No correlation was observed between the MOS based on a teacher's self-evaluation and the MOS given by residents for that teacher (ICC = 0.055, P = 0.399). The main areas for improvement were "feedback" and "assessment".
Conclusions: Resident evaluations of clinical teachers are influenced by teachers' age, gender, year of residency training, type of teachers' academic position and whether or not a clinical teacher performed self-evaluation. Development of CBME should focus on continuous quality evaluation, educational support for clinical teachers and the implementation of an e-portfolio.
© 2017 The Lithuanian University of Health Sciences. Production and hosting by Elsevier Sp. z o.o. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
1. Introduction
High quality and safe patient care can only be assured if physicians receive high-quality clinical teaching during their residency training [1–4]. Residency training aims to educate physicians by helping them acquire the necessary clinical skills, knowledge, and competencies, enabling them to provide evidence-based health care services [5–7]. Therefore, medical schools are increasingly focused on improving clinical teaching by implementing competency-based learning, with subsequent assessment of quality in residency training according to accreditation and quality assurance standards [8–10]. Ultimately, societies need to know how, where and by whom physicians were trained [4]. Following the standards of medical training in Europe, the Lithuanian University of Health Sciences (LSMU) renewed its residency programs into competency-based medical education (CBME) curricula. These changes set new demands for residents' teachers and new requirements for the assessment and assurance of quality. To guide the realization of CBME in practice, in 2015 we implemented the validated EFFECT questionnaire (Evaluation and Feedback For effective Clinical Teaching) together with the EFFECT-System (EFFECT-S) for quality assessment of clinical teaching in residency training at LSMU [11–13]. For such assessments to be properly interpreted and used to improve clinical teaching quality, it is essential to understand what might influence the outcomes of residents' evaluations and clinical teachers' self-evaluations [14]. Previous studies showed that resident evaluations of clinical teachers with the EFFECT questionnaire are influenced by teachers' gender, year of residency training and type of hospital.
Because the findings of the original study may not generalize beyond the Dutch healthcare and training system [15], and given the organization of residency studies in Lithuania, we aimed to investigate the influence of characteristics of the resident (year of training) and the clinical teacher (gender, age, and type of academic position) on teaching quality, the relation between clinical teachers' self-scores and residents' scores, and areas for teaching quality improvement.
2. Materials and methods
2.1. Medical education in Lithuania
After a six-year undergraduate medical education program, graduates apply for residency study programs in one of 57 medical specialties (3–6 years). Two universities have the right to conduct residency study programs in Lithuania: the Lithuanian University of Health Sciences and Vilnius University. During residency training, residents complete a theoretical course delivered by university lecturers and conduct clinical practice under the supervision of a clinical teacher in the University Hospital or in residency bases accredited by the university for certain cycles of the respective residency program. Residents are supervised by clinical teachers who hold either an academic position (professor, associate professor, lecturer or assistant) or a non-academic position (i.e., a non-academic teacher who is a hospital employee with no employment contract with the University). CBME was implemented in 2013.
2.2. EFFECT evaluation system
The System for Evaluation and Feedback For Effective Clinical Teaching (EFFECT-S) contains 5 steps: (1) creating commitment in the department, (2) filling out questionnaires, (3) producing a feedback report, (4) a dialog between a supervisor and two residents to discuss the data, guided by a facilitator, and (5) a group discussion within the department about the overall teaching quality. The validated EFFECT questionnaire is based on theories of workplace learning and clinical teaching and incorporates CanMEDS (Canadian Medical Education Directives for Specialists) competencies [11]. EFFECT contains 58 items grouped into 7 domains of clinical teaching: (1) role modeling, (2) task allocation, (3) planning, (4) feedback, (5) teaching methodology, (6) personal support, and (7) assessment [16]. The role-modeling domain contains 4 subdomains (clinical skills, CanMEDS competencies, academic research, and reflection), and the feedback domain contains 2 subdomains (process and content). Items were scored on a six-point Likert scale (1, very poor; 2, poor; 3, intermediate; 4, satisfactory; 5, good; 6, excellent), with an additional option of "not (yet) able to evaluate". The option "not (yet) able to evaluate" was chosen
if a specific item did not (yet) occur during clinical teaching. Prior to analysis of study results, the Lithuanian version of the EFFECT questionnaire was validated [13].
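As a minimal illustration, the scoring of one completed questionnaire could be handled as follows (a sketch only: the response coding and the item labels are assumptions for illustration, not the actual EFFECT implementation):

```python
# Sketch: scoring one EFFECT questionnaire on the 6-point Likert scale.
# "NAE" stands in for "not (yet) able to evaluate"; such answers are
# excluded from the mean rather than counted as zero. Item labels invented.

LIKERT = {"very poor": 1, "poor": 2, "intermediate": 3,
          "satisfactory": 4, "good": 5, "excellent": 6}

def questionnaire_mean(responses):
    """responses: {item: answer label}; mean score ignoring 'NAE' answers."""
    scores = [LIKERT[a] for a in responses.values() if a != "NAE"]
    return sum(scores) / len(scores) if scores else None

answers = {
    "gives me enough freedom": "good",
    "reviews the learning objectives": "satisfactory",
    "reviews my portfolio": "NAE",   # did not (yet) occur -> excluded
}
mean_score = questionnaire_mean(answers)  # (5 + 4) / 2 = 4.5
```

If every item of a questionnaire is answered "not (yet) able to evaluate", no mean can be computed, which is why the sketch returns None in that case.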
2.3. Participants
This study was carried out in 7 residency programs conducted at LSMU Hospital Kauno Klinikos. Residents (n = 182) and their clinical teachers (n = 284) were invited by e-mail to fill out the online EFFECT questionnaire. Residents were asked to evaluate the supervisors with whom they had actually been working during their residency studies. Residents could decide how many teachers they wanted to evaluate and did not necessarily fill out the questionnaire for every teacher they worked with. Data collection took place during 2015–2016 in the residency study programs of anesthesiology reanimatology, dermatovenerology, emergency medicine, cardiology, neurology, physical medicine and rehabilitation, and obstetrics and gynecology.
2.4. Data analysis
Statistical analysis was conducted using IBM SPSS 20.0 software. In descriptive statistics, means, standard deviations, and percentages were calculated. For the EFFECT scale and its domains, the mean overall score (MOS) and mean (sub)domain scores (MSS) were calculated by averaging the scores of all items with responses other than "not yet able to evaluate". To assess how residents' evaluations matched the self-evaluation of each particular teacher, the intraclass correlation coefficient (ICC) was calculated. Overall comparisons of MOS and MSS across study groups (residents and teachers) and subgroups (by age, gender, year of training, academic position) were conducted using the Student t test and ANOVA for trend. Statistical significance was set at P < 0.05. In addition, items were analyzed by subtracting the respective (sub)domain score from each item's mean score. Negative values can be considered as indicating potential for quality improvement, whereas positive values can be considered a strong point. We considered a deviation of ≥0.20 as relevant. The study was approved by the Bioethics Centre of LSMU. The heads of each residency program decided to take part in the study on a voluntary basis.
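The two analysis steps described above can be sketched in Python (illustrative only: the data are invented, and since the paper does not state which ICC model was used, a simple one-way random-effects ICC is shown as one common variant):

```python
# Sketch of the two analysis steps described above, with invented data.

def icc_oneway(pairs):
    """One-way random-effects ICC for n targets with k = 2 ratings each
    (e.g., a teacher's self-evaluation MOS paired with the residents' MOS)."""
    n, k = len(pairs), 2
    grand = sum(sum(p) for p in pairs) / (n * k)
    means = [sum(p) / k for p in pairs]
    # Between-target and within-target mean squares from one-way ANOVA.
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for p, m in zip(pairs, means) for x in p) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def item_deviations(domain_items):
    """For each item, its mean minus the mean of its (sub)domain; negative
    values mark items scoring below the (sub)domain average."""
    out = {}
    for domain, items in domain_items.items():
        item_means = {i: sum(s) / len(s) for i, s in items.items()}
        domain_mean = sum(item_means.values()) / len(item_means)
        out[domain] = {i: round(m - domain_mean, 2) for i, m in item_means.items()}
    return out

# Invented example: two items in one (sub)domain ("not yet able to
# evaluate" answers assumed already excluded from the score lists).
scores = {
    "feedback process": {
        "bases feedback on observations": [5, 5, 6, 5],
        "reminds of previous feedback": [4, 4, 5, 4],
    },
}
deviations = item_deviations(scores)
# Flag deviations of -0.20 or worse, mirroring the study's threshold.
flagged = [i for d in deviations.values() for i, v in d.items() if v <= -0.20]
```

In this toy example, "reminds of previous feedback" would be flagged: its item mean (4.25) lies 0.5 below the subdomain mean (4.75).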
3. Results
A total of 333 questionnaires were filled out to assess 146 clinical teachers. As the evaluation was anonymous, we could not calculate the number of questionnaires filled out by each resident. We received 143 self-evaluations (42 from male and 101 from female teachers). Of the 333 questionnaires, 36.9% were filled out by first-year; 24.3%, by second-year; 25.2%, by third-year; and 13.5%, by fourth-year residents. Table 1 shows the residency programs, their duration, the number of questionnaires filled out by residents evaluating their clinical teachers, the number of teachers in each program and the number of self-evaluations of clinical teachers. The results of MOS and MSS across study groups (residents and teachers) and subgroups (by age, gender, year of training,
academic position, and presence or absence of self-evaluation) on the EFFECT questionnaire are provided in Table 2. The MOS and MSS for the different covariates were generally high. The mean residents' scores for all domains except "planning" and "personal support" were significantly higher than the mean supervisors' scores for the corresponding self-evaluated domains (P < 0.01). Clinical teachers aged up to 40 years were rated higher than teachers aged 41 years or older in all domains except "feedback content" (P < 0.01). No significant differences were found between the same age groups in teachers' self-evaluations. Male teachers were rated significantly higher than female teachers in the "role modeling CanMEDS roles", "role modeling reflection", "task allocation", "planning" and "personal support" (sub)domains (P < 0.05). The MOS and MSS of clinical teachers did not differ by gender in self-evaluation. Teachers who filled out the self-evaluation questionnaire were rated significantly higher by residents in the "role modeling clinical skills", "role modeling CanMEDS" and "role modeling academic research" subdomains than those who did not (5.36 vs 5.01, 5.36 vs 4.89, and 5.33 vs 5.00, respectively; all P < 0.05). Except for "feedback content", residents' ratings by type of teachers' academic position differed significantly (P < 0.05, ANOVA for trend). The MOS of residents' ratings was highest for associate professors and non-academic teachers (5.30 and 5.22) and lowest for lecturers and assistants (4.83 and 4.84). Assistants and lecturers were rated lower than teachers with other academic positions in all (sub)domains except "personal support" and "assessment". Non-academic teachers were rated generally high in all domains: from 5.10 in "feedback process" to 5.53 in "assessment".
In the "role modeling academic research" subdomain, the highest scores were given to professors and the lowest to assistants (5.63 and 4.94, respectively). Professors were rated lowest in the "personal support" and "assessment" domains (4.67 and 4.48, respectively). We observed statistically significant differences in the MSS of all (sub)domains except "clinical skills" by year of residency training (P < 0.05, ANOVA for trend). Generally, MSS values were high in the first year of training and decreased in the second year, especially in "feedback process", "feedback content", "teaching methodology" and "assessment". MSS increased again in the third year in the same (sub)domains of "feedback process", "feedback content" and "assessment", as well as in "role modeling academic research". A strong MSS decrease (by more than 0.5 point) was observed in the fourth year for all (sub)domains. As a group, clinical teachers scored themselves significantly lower on most (sub)domains than residents did. However, the individual MOS based on a clinical teacher's self-evaluation did not correlate with the residents' MOS for that teacher (ICC = 0.055, P = 0.399). Clinical teachers who received lower ratings from residents (MOS up to 4.3) tended to rate themselves higher, whereas teachers who received the highest scores (MOS from 5.5) rated themselves lower than residents did. The Figure presents the comparison of the MOS between 83 individual clinical teachers' self-evaluations and the MOS of the residents' (one or several) evaluations of each particular teacher.
Table 1 – Number of questionnaires filled out by residents, number of teachers evaluated in the program and their self-evaluations per residency program.

Residency program (duration in years) | No. of questionnaires filled out by residents | No. of teachers in the program | No. of teachers evaluated in the program | No. of self-evaluations by teachers
Anesthesiology reanimatology (4) | 57 | 70 | 21 | 30
Cardiology (4) | 45 | 53 | 22 | 22
Dermatovenerology (4) | 37 | 10 | 8 | 8
Emergency medicine (5)a | 115 | 74 | 56 | 24
Neurology (4) | 20 | 22 | 12 | 14
Obstetrics and gynecology (4) | 30 | 37 | 16 | 34
Physical medicine and rehabilitation (3) | 29 | 18 | 11 | 11
Total | 333 | 284 | 146 | 143

a New residency program which started in 2013.
Table 3 presents the differences between the scores of (sub)domain items and the respective (sub)domain average score, with negative deviations indicating potential for quality improvement. Negative resident item deviations of ≥0.20 from the respective (sub)domain score (MSS) were seen in all domains except planning. According to the teachers' self-evaluations, planning could be improved in all aspects; "reserves time to supervise/counsel me" (item 22) especially requires more attention. The role modeling competence "cooperating with colleagues" should be improved (item 6). Based on the teachers' self-evaluation scores, further potential for improvement lies in role modeling "organizing his or her own work adequately" and "handling complaints and incidents" (items 7 and 10). Teachers rated themselves relatively low on being a leading example of how a resident wants to perform as a specialist (item 15). Negative values for the task allocation skills "teaches me how to organize and plan my work" (item 20) and "prevents me from having to perform too many tasks irrelevant to my learning" (item 21) were observed in residents' ratings. Teachers also rated themselves low on teaching how to organize and plan work (item 20). Both residents (item 29) and especially teachers (items 28, 29) indicated that the quality of the provided feedback could be improved. We found highly negative values from both residents and teachers for the teaching skill "reviews the learning objectives" (item 37). Teachers also rated "reviewing reports" (item 40) lower, indicating another area for possible improvement. Teachers and residents strongly agreed that teachers could improve the personal support skill "helps and advises me on how to maintain a good work–home balance" (item 50). Finally, within the assessment domain, a large proportion of residents and teachers indicated that they were not able to rate the items (more than 70% and 60%, respectively).
4. Discussion
Our results show that residents' evaluations of their clinical teachers were influenced by teachers' age, gender, type of academic position, year of residency training and whether teachers filled out a self-evaluation. Clinical teachers tended to evaluate themselves lower than residents did in all (sub)domains, whereas individual teachers who received lower ratings from residents tended to rate themselves higher. We identified potential areas for teaching quality improvement. In our study, younger clinical teachers received higher scores for their clinical teaching performance than older teachers. Other studies have reported similar results [17]. In the clinical teaching setting, this could mean that younger teachers are more positive toward teaching residents than their older colleagues. Having recently been residents themselves, younger teachers might be more compassionate toward younger colleagues, assigning more time to teaching and being less hierarchical than older teachers. Other authors have indicated that the chance of being rated as a better teacher increases with more time allocated to daily clinical teaching [17]. It remains to be explored why younger clinical teachers are rated higher by residents and what implications this could have for quality improvement of residency studies. Male clinical teachers were evaluated higher than female teachers in all domains. The literature on gender-related teaching performance also indicates that male teachers are more likely to receive better scores for their teaching quality, whereas female teachers could be expected to do better in communicative areas [17,18]. These results differ from the original studies in the Netherlands, where female teachers received significantly higher scores than male teachers in almost all domains [14]. In our study, non-academic teachers were highly valued by residents and in some EFFECT (sub)domains received the highest scores. Non-academic teachers are hospital employees with no work contract with the University, and they often do not identify themselves as formal clinical teachers.
This situation was illustrated in their written comments: ‘‘I cannot identify myself as a clinical teacher as no such official position exists, and only resident supervisors are present. I consider that while listening to me, gathering history, examining patients, etc. those residents, who are willing to learn, gain sufficient information from me. I cannot allocate more time to teaching as there is no such an official position.’’
Table 2 – Mean overall scores and mean subscale scores by teachers' age group, gender, academic position, presence or absence of self-evaluation and year of residency training.

Group | Mean overall score | Role modeling: clinical skills | Role modeling: CanMEDS roles | Role modeling: academic research | Role modeling: reflection | Task allocation | Planning | Feedback: process | Feedback: content | Teaching methodology | Personal support | Assessment
Residents' ratings (n = 333) | 5.04 | 5.27 | 5.22 | 5.28 | 5.01 | 5.04 | 5.03 | 4.84 | 5.00 | 5.04 | 5.03 | 5.10
Teachers' self-evaluation (n = 143) | 4.74 | 4.77 | 4.94 | 4.81 | 4.77 | 4.75 | 4.88 | 4.51 | 4.78 | 4.78 | 4.90 | 4.42
P* | <0.001 | <0.001 | <0.001 | <0.001 | 0.003 | 0.001 | 0.088 | <0.001 | 0.007 | <0.001 | 0.106 | <0.001
Residents' ratings by teachers' age
19–40 years | 5.32 | 5.46 | 5.48 | 5.50 | 5.31 | 5.30 | 5.37 | 5.11 | 5.16 | 5.30 | 5.41 | 5.49
41 years and more | 4.90 | 5.18 | 5.10 | 5.18 | 4.88 | 4.91 | 4.87 | 4.71 | 4.93 | 4.92 | 4.86 | 4.91
P* | <0.001 | 0.009 | <0.001 | 0.004 | 0.001 | 0.001 | <0.001 | 0.002 | 0.080 | 0.001 | <0.001 | 0.002
Self-evaluation by age
19–40 years | 4.76 | 4.77 | 4.88 | 4.75 | 4.71 | 4.68 | 4.92 | 4.48 | 4.87 | 4.65 | 4.99 | 4.65
41 years and more | 4.73 | 4.77 | 4.98 | 4.86 | 4.81 | 4.80 | 4.86 | 4.53 | 4.73 | 4.62 | 4.84 | 4.35
P* | 0.775 | 0.998 | 0.327 | 0.485 | 0.307 | 0.333 | 0.592 | 0.711 | 0.245 | 0.791 | 0.150 | 0.214
Residents' ratings by teacher gender
Female | 4.95 | 5.21 | 5.14 | 5.20 | 4.89 | 4.92 | 4.91 | 4.76 | 4.95 | 4.97 | 4.91 | 5.02
Male | 5.19 | 5.38 | 5.36 | 5.43 | 5.24 | 5.24 | 5.24 | 4.98 | 5.08 | 5.17 | 5.25 | 5.21
P* | 0.025 | 0.109 | 0.042 | 0.057 | 0.006 | 0.006 | 0.012 | 0.098 | 0.327 | 0.074 | 0.009 | 0.366
Self-evaluation by gender
Female | 4.71 | 4.73 | 4.91 | 4.78 | 4.76 | 4.71 | 4.84 | 4.47 | 4.76 | 4.63 | 4.88 | 4.38
Male | 4.80 | 4.86 | 5.00 | 4.90 | 4.81 | 4.86 | 4.98 | 4.62 | 4.83 | 4.65 | 4.94 | 4.50
P* | 0.370 | 0.334 | 0.408 | 0.477 | 0.620 | 0.214 | 0.282 | 0.262 | 0.540 | 0.901 | 0.601 | 0.569
Self-evaluation
Filled out | 5.13 | 5.36 | 5.38 | 5.38 | 5.11 | 5.07 | 5.12 | 4.91 | 5.08 | 5.08 | 5.19 | 5.24
Not filled out | 4.88 | 5.01 | 4.89 | 4.89 | 4.90 | 4.97 | 4.93 | 4.63 | 4.90 | 4.89 | 4.99 | 5.17
P* | 0.126 | 0.035 | 0.044 | 0.008 | 0.263 | 0.574 | 0.359 | 0.144 | 0.340 | 0.288 | 0.298 | 0.748
Residents' ratings by type of clinical teacher academic position
None | 5.22 | 5.35 | 5.29 | 5.11 | 5.26 | 5.29 | 5.33 | 5.10 | 5.14 | 5.23 | 5.36 | 5.53
Assistant | 4.84 | 5.12 | 5.02 | 4.94 | 4.85 | 4.85 | 4.93 | 4.52 | 4.78 | 4.85 | 4.94 | 5.15
Lecturer | 4.83 | 5.03 | 5.04 | 5.33 | 4.65 | 4.83 | 4.68 | 4.69 | 4.83 | 4.85 | 4.72 | 5.11
Assoc. Prof. | 5.30 | 5.55 | 5.50 | 5.56 | 5.31 | 5.24 | 5.30 | 5.02 | 5.34 | 5.24 | 5.40 | 5.40
Professor | 4.96 | 5.41 | 5.34 | 5.63 | 5.01 | 4.85 | 4.86 | 4.77 | 5.00 | 4.98 | 4.67 | 4.48
P** | 0.013 | 0.019 | 0.035 | 0.002 | 0.004 | 0.008 | 0.003 | 0.021 | 0.058 | 0.039 | 0.000 | 0.009
Year of residency
First | 5.21 | 5.30 | 5.34 | 5.27 | 5.15 | 5.26 | 5.22 | 5.10 | 5.24 | 5.27 | 5.24 | 5.20
Second | 4.86 | 5.20 | 5.08 | 5.10 | 4.99 | 4.89 | 4.94 | 4.54 | 4.57 | 4.82 | 5.02 | 4.87
Third | 5.20 | 5.46 | 5.39 | 5.58 | 5.13 | 5.14 | 5.11 | 5.01 | 5.27 | 5.16 | 5.15 | 5.39
Fourth | 4.57 | 4.99 | 4.83 | 5.09 | 4.46 | 4.51 | 4.52 | 4.38 | 4.66 | 4.58 | 4.30 | 4.45
P** | <0.001 | 0.050 | 0.003 | 0.018 | 0.007 | <0.001 | 0.009 | <0.001 | <0.001 | <0.001 | <0.001 | 0.034

* P, independent t test.
** P, ANOVA for trend.
Figure – Comparison of the MOS between individual clinical teachers' self-evaluations (light triangles) and the MOS of the resident(s) evaluations of each particular clinical teacher (dark dots). The X-axis represents the sequence of clinical teachers who filled out a self-evaluation (n = 83), starting with the one who received the lowest MOS from residents. The Y-axis represents the MOS calculated from the questionnaires filled out for that particular teacher.
These results demonstrate that non-academic teachers are strongly valued by residents for their role in residency teaching. It is therefore important to emphasize their significant contribution to clinical teaching, and this contribution needs to be discussed at an organizational level. As we expected, professors were excellent role models in academic research, while assistants received the lowest scores in this subdomain. In contrast, professors were rated lowest in the "personal support" and "assessment" domains, although these are their specific tasks. It is difficult to speculate about the reasons for these findings; one might be that professors have too little time for these roles. A greater hierarchical gap between residents and professors cannot be excluded as a possible reason either. Assistants and lecturers were rated lower on more than half of the EFFECT subscales, which might indicate that these teachers need to improve their teaching skills through additional training courses. First- and third-year residents were more positive about their teachers than residents in the second and fourth years of training. In Lithuania, a resident's status generally changes from junior to senior in the third year of training; as an exception, in 3-year residency programs the status changes in the second year. Consequently, the degree of responsibility assigned and the level of trust from teachers increase considerably. This might make residents more satisfied with the "real life" situation, as they finally start to practice their skills and competencies. Residents in their fourth year of training were most critical of their teachers. One could speculate that residents in their last year feel closer to the level of their teachers but are still in a dependent hierarchical relation. Future research is needed to explain these differences.
Teachers who filled out self-evaluation questionnaires were rated significantly higher by residents in role modeling than those who did not. One explanation could be that teachers who are willing to critically evaluate their own clinical teaching skills, and for this purpose fill out the self-evaluation form, are more aware of what residents learn from their behavior or role modeling and are more open to feedback. Subsequently, this can have a positive effect on the quality of their teaching [19]. At the other end of the scale are teachers who are not willing to reflect critically on their teaching and role modeling behavior. In this light, further research on teachers' attitudes toward teaching, their openness to critical reflection and their feedback-seeking or feedback-avoiding behavior, in relation to their teaching qualities, would be of interest. Overall, current clinical teaching in the 7 residency programs was rated high. First, teachers need to improve their role modeling competences, mainly in the areas of collaboration, health advocacy and leadership, and on specific items of the planning, teaching and personal support domains. Second, there is a strong need to improve teachers' feedback skills in CBME, as effective feedback promotes learning at all levels. Clinical teachers' skills in providing effective feedback can be improved through the implementation of faculty development programs and an e-portfolio. Finally, the university should focus on assessment, as it is the weakest part of current clinical teaching and needs urgent improvement. It should be taken into consideration that assessment in CBME is a complex, frequent, and continuous process that requires formative and summative approaches [20–22]. The mastering of entrustable professional activities (EPAs) and milestones could aid the development of CBME curricula [23]. The strengths of our study include its multispecialty sample and a sufficient number of evaluations to perform the analysis. Additionally, we used a valid questionnaire that is strongly linked to the theory of workplace learning.
As participation in the survey was voluntary, we did not aim at a representative sample for each residency program. Therefore, we could not compare MOS and MSS among programs, which can be considered a limitation of our study.
Table 3 – Difference from the (sub)domain average, and proportions of residents and teachers not yet being able to evaluate a specific item (NAE, %).

EFFECT (sub)domains and items | Resident: Value | Resident: NAE (%) | Clinical teacher: Value | Clinical teacher: NAE (%)
Role modeling
Role modeling clinical skills
1. Asks for a patient history | 0.06 | 9.01 | 0.11 | 4.20
2. Examines a patient | 0.05 | 9.01 | 0.07 | 4.90
3. Performs clinical actions | 0.07 | 6.01 | 0.16 | 1.40
Role modeling general CanMEDS roles
4. Cooperates with other health professionals while providing care to patients and relatives | 0.03 | 2.40 | 0.05 | 2.80
5. Communicates with patients | 0.02 | 1.20 | 0.11 | 2.10
6. Cooperates with colleagues | 0.20 | 0.00 | 0.07 | 0.00
7. Organizes his or her own work adequately | 0.02 | 1.20 | 0.21 | 0.70
8. Applies guidelines and protocols | 0.18 | 1.50 | 0.05 | 0.00
9. Treats patients respectfully | 0.09 | 1.20 | 0.30 | 2.10
10. Handles complaints and incidents | 0.02 | 18.62 | 0.21 | 21.68
11. Has a bad news conversation | 0.04 | 22.82 | 0.12 | 11.89
Role modeling scholarship
12. Applies academic research results | 0.00 | 3.30 | 0.00 | 5.59
Role modeling professionalism
13. Indicates when he/she himself/herself does not know something | 0.09 | 5.11 | 0.24 | 1.40
14. Reflects on his/her own actions | 0.02 | 1.50 | 0.04 | 0.00
15. Is a leading example of how I want to perform as a specialist | 0.12 | 0.90 | 0.31 | 6.29
Task allocation
16. Gives me enough freedom to perform tasks on my own that suit my current knowledge/skills | 0.21 | 0.90 | 0.05 | 0.00
17. Gives me tasks that suit my current level of training | 0.16 | 0.60 | 0.09 | 0.70
18. Stimulates me to take responsibility | 0.18 | 0.80 | 0.15 | 0.00
19. Gives me the opportunity to discuss mistakes and incidents | 0.05 | 5.11 | 0.07 | 0.70
20. Teaches me how to organize and plan my work | 0.22 | 7.51 | 0.29 | 3.50
21. Prevents me from having to perform too many tasks irrelevant to my learning | 0.31 | 8.11 | 0.06 | 9.79
Planning
22. Reserves time to supervise/counsel me | 0.09 | 1.20 | 0.49 | 1.40
23. Is available when I need him/her during my shift | 0.05 | 17.42 | 0.23 | 11.19
24. Sets aside time when I need him/her | 0.08 | 3.60 | 0.30 | 0.70
Giving feedback
Process (quality) of the feedback
25. Bases feedback on concrete observations of my actions | 0.07 | 9.61 | 0.10 | 2.10
26. Indicates what I am doing correctly | 0.08 | 3.90 | 0.30 | 0.70
27. Discusses what I can improve | 0.08 | 6.31 | 0.05 | 2.10
28. Lets me think about strengths and weaknesses | 0.12 | 10.21 | 0.32 | 4.20
29. Reminds me of previously given feedback | 0.26 | 19.22 | 0.51 | 13.29
30. Formulates feedback in a way that is not condescending or insulting | 0.11 | 8.11 | 0.30 | 5.59
Content of the feedback
31. My clinical and technical skills | 0.06 | 12.31 | 0.02 | 0.70
32. How I monitor the boundaries of my clinical work | 0.03 | 15.62 | 0.11 | 6.29
33. How I collaborate with my colleagues in patient care | 0.02 | 16.52 | 0.04 | 5.59
34. How I apply evidence-based medicine to my daily work | 0.01 | 13.51 | 0.06 | 4.20
35. How I make ethical considerations explicit | 0.02 | 16.22 | 0.10 | 3.50
36. How I communicate with patients | 0.00 | 15.92 | 0.11 | 3.50
Teaching abilities (methodology)
37. Reviews the learning objectives | 0.35 | 14.41 | 0.48 | 6.29
38. Asks me to explain my choice for a particular approach (status, resign form, etc.) | 0.05 | 12.61 | 0.08 | 3.50
39. Discusses the possible clinical courses and/or complications | 0.02 | 7.51 | 0.03 | 2.80
40. Reviews my reports | 0.11 | 22.22 | 0.32 | 23.08
41. Stimulates me to find out things for myself | 0.05 | 4.80 | 0.04 | 2.80
42. Stimulates me to ask questions | 0.20 | 0.60 | 0.58 | 0.00
43. Stimulates me to actively participate in discussions | 0.02 | 6.31 | 0.27 | 2.80
44. Explains complex medical issues clearly | 0.06 | 4.80 | 0.02 | 3.50
Personal support
45. Treats me respectfully | 0.17 | 0.00 | 0.23 | 0.00
46. Is an enthusiastic instructor/supervisor | 0.13 | 0.00 | 0.07 | 1.40
47. Lets me know I can count on him/her | 0.02 | 0.30 | 0.21 | 0.70
48. Supports me in difficult situations (e.g. during morning report) | 0.00 | 6.91 | 0.14 | 4.90
Please cite this article in press as: Vaižgėlienė E, et al. Evaluation of clinical teaching quality in competency-based residency training in Lithuania. Medicina (2017), http://dx.doi.org/10.1016/j.medici.2017.08.002
Table 3 (Continued). EFFECT (sub)domains and items; resident and clinical teacher scores.

EFFECT (sub)domains and items                                              Resident          Clinical teacher
                                                                           Value   NAE (%)   Value   NAE (%)
Personal support (continued)
  48. Is open to personal questions/problems                               0.08    13.81     0.06    3.50
  49. Helps and advises me on how to maintain a good
      work–home balance                                                    0.45    28.53     0.59    9.79
Assessment
  50. Prepares progress reviews                                            0.21    75.08     0.25    62.24
  51. Makes a clear link with previously set learning objectives
      during these reviews                                                 0.04    73.57     0.18    62.24
  52. Gives me the opportunity to raise issues of my own                   0.25    71.77     0.43    62.24
  53. Formulates next-term learning objectives during these
      reviews with me                                                      0.03    72.67     0.02    62.24
  54. Explains how staff was involved in the assessment                    0.21    75.68     0.30    65.03
  55. Reviews my portfolio during the assessment                           0.03    73.57     0.23    64.34
  56. Pays attention to my self-reflection                                 0.13    72.97     0.03    62.24
  57. Gives a clear and exhaustive assessment                              0.06    72.07     0.20    61.54

NAE, not (yet) able to evaluate. Negative values of at least 0.20 in absolute value (bold in the original) are considered indicators of areas for potential improvement, whereas positive values of at least 0.20 indicate areas of high-quality clinical teaching.
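The footnote's threshold rule is simple arithmetic and can be sketched as a short script. This is an illustrative sketch only: the function name and the handful of item/value pairs are my own choices (the values are taken from a few rows of Table 3), not part of the authors' analysis pipeline.

```python
# Classify EFFECT items following the Table 3 rule: differences with an
# absolute value of at least 0.20 are noteworthy, and the sign decides
# the direction (negative = improvement area, positive = strength).
# Function name and the sample items below are illustrative, not from
# the original study materials.

THRESHOLD = 0.20

def classify(value, threshold=THRESHOLD):
    """Classify an item-level score difference against the threshold."""
    if value <= -threshold:
        return "area for improvement"
    if value >= threshold:
        return "high-quality teaching"
    return "neutral"

# A few example rows (signs assumed for illustration):
items = {
    "36. Reviews the learning objectives": 0.35,
    "28. Reminds me of previously given feedback": -0.26,
    "5. Communicate with patients": 0.02,
}

for name, value in items.items():
    print(f"{name}: {classify(value)}")
```

Applying the same rule to both resident and teacher columns makes the two perspectives directly comparable item by item.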
Future research should focus on explaining and reducing the variation in teaching quality across residency programs. Research should also track changes in the quality of competency-based residency studies in Lithuania over time.
4.1. Implications for practice
For the successful development of CBME programs in practice, it is essential that clinical supervisors have insight into their strengths and weaknesses in the teaching qualities that are essential for competency-based learning and teaching. The strategy for further effective CBME development at LSMU should concentrate on the following directions:
- Regular evaluation of the quality of residency training, formulating strong points and points for improvement concerning teaching quality;
- Training of clinical supervisors based on the areas for improvement identified during quality evaluation within a specific residency program;
- Provision of adequate educational support for individual teachers who may face problems in their new educational roles during the implementation of CBME in practice;
- Provision of organizational support for departments to implement competency-based education successfully;
- Implementation and integration of the e-portfolio into the residents' competence assessment system.
5. Conclusions
Resident evaluations of clinical teachers are influenced by teachers' age and gender, year of residency training, the type of teachers' academic position, and whether or not the clinical teacher performed a self-evaluation. Current clinical teaching was generally evaluated highly; however, we identified potential areas for teaching quality improvement. Furthermore, residents' and teachers' perceptions of most strong and weak points of good clinical teaching differ. To reach agreement on realizing further changes, both residents and clinical teachers need to share their perceptions of clinical teaching quality in order to understand the meaning of differences in evaluations.
Conflict of interest

The authors state no conflict of interest.
references
[1] Leach DC. Changing education to improve patient care. Qual Saf Health Care 2001;10(Suppl. 2):ii54–8.
[2] Engbers R, de Caluwé LIA, Stuyt PMJ, Fluit CRMG, Bolhuis S. Towards organizational development for sustainable high-quality medical teaching. Perspect Med Educ 2013;2(February (1)):28–40.
[3] Farnan JM, Petty LA, Georgitis E, Martin S, Chiu E, Prochaska M, et al. A systematic review: the effect of clinical supervision on patient and residency education outcomes. Acad Med 2012;87(April (4)):428–42.
[4] van der Leeuw RM, Lombarts KMJMH, Arah OA, Heineman MJ. A systematic review of the effects of residency training on patient outcomes. BMC Med 2012;10:65.
[5] Fluit CR, Bolhuis S, Grol R, Laan R, Wensing M. Assessing the quality of clinical teachers: a systematic review of content and quality of questionnaires for assessing clinical teachers. J Gen Intern Med 2010;25(December (12)):1337–45.
[6] Sutkin G, Wagner E, Harris I, Schiffer R. What makes a good clinical teacher in medicine? A review of the literature. Acad Med 2008;83(May (5)):452–66.
[7] Beckman TJ, Cook DA, Mandrekar JN. What is the validity evidence for assessments of clinical teaching? J Gen Intern Med 2005;20(October (12)):1159–64.
[8] Leung W. Competency based medical training: review. BMJ 2002;325(September (7366)):693–6.
[9] ECAMSQ_presentation.pdf. Available from: https://www.uems.eu/__data/assets/pdf_file/0009/1206/ECAMSQ_presentation.pdf [cited 17.04.17].
[10] esg_2015.pdf. Available from: http://www.eua.be/Libraries/quality-assurance/esg_2015.pdf?sfvrsn=0 [cited 01.02.17].
[11] Fluit C, Bolhuis S, Grol R, Ham M, Feskens R, Laan R, et al. Evaluation and feedback for effective clinical teaching in postgraduate medical education: validation of an assessment instrument incorporating the CanMEDS roles. Med Teach 2012;34(11):893–901.
[12] Fluit L. Evaluation and feedback for effective clinical teaching. UB Nijmegen [Host]; 2013.
[13] Vaižgėlienė E, Padaiga Ž, Rastenytė D, Tamelis A, Petrikonis K, Kregždytė R, et al. Validation of the EFFECT questionnaire for competence-based clinical teaching in residency training in Lithuania. Medicina (Kaunas) 2017. pii: S1010-660X(17)30025-3.
[14] Fluit CRMG, Feskens R, Bolhuis S, Grol R, Wensing M, Laan R. Understanding resident ratings of teaching in the workplace: a multi-centre study. Adv Health Sci Educ 2014;20(October (3)):691–707.
[15] Scheepers RA, Lombarts KM, van Aken MA, Heineman MJ, Arah OA. Personality traits affect teaching performance of attending physicians: results of a multi-center observational study. PLoS ONE 2014;9(May (5)):e98107.
[16] Fluit CRMG, Feskens R, Bolhuis S, Grol R, Wensing M, Laan R. Repeated evaluations of the quality of clinical teaching by residents. Perspect Med Educ 2013;2(April (2)):87–94.
[17] Arah OA, Heineman MJ, Lombarts KMJMH. Factors influencing residents' evaluations of clinical faculty member teaching qualities and role model status. Med Educ 2012;46(April (4)):381–9.
[18] Morgan HK, Purkiss JA, Porter AC, Lypson ML, Santen SA, Christner JG, et al. Student evaluation of faculty physicians: gender differences in teaching evaluations. J Womens Health 2016;25(March (5)):453–6.
[19] Passi V, Johnson S, Peile E, Wright S, Hafferty F, Johnson N. Doctor role modelling in medical education: BEME Guide No. 27. Med Teach 2013;35(September (9)):e1422–36.
[20] Hawkins RE, Welcher CM, Holmboe ES, Kirk LM, Norcini JJ, Simons KB, et al. Implementation of competency-based medical education: are we addressing the concerns and challenges? Med Educ 2015;49(November (11)):1086–102.
[21] Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR; International CBME Collaborators. The role of assessment in competency-based medical education. Med Teach 2010;32(August (8)):676–82.
[22] Boet S, Pigford A-AE, Naik VN. Program director and resident perspectives of a competency-based medical education anesthesia residency program in Canada: a needs assessment. Korean J Med Educ 2016;28(June (2)):157–68.
[23] Touchie C, ten Cate O. The promise, perils, problems and progress of competency-based medical education. Med Educ 2016;50(January (1)):93–100.