Acceptability, utility, and undergraduate nursing student satisfaction with a video assessment of clinical skills

Peter Lewis, Leanne Hunt, Lucie M. Ramjan, Miranda Daly, Rebecca O'Reilly, Yenna Salamonson

PII: S0260-6917(19)30483-6
DOI: https://doi.org/10.1016/j.nedt.2019.104244
Reference: YNEDT 104244
To appear in: Nurse Education Today
Received date: 27 March 2019
Revised date: 6 September 2019
Accepted date: 10 October 2019

Please cite this article as: P. Lewis, L. Hunt, L.M. Ramjan, et al., Acceptability, utility, and undergraduate nursing student satisfaction with a video assessment of clinical skills, Nurse Education Today (2018), https://doi.org/10.1016/j.nedt.2019.104244
Acceptability, utility, and undergraduate nursing student satisfaction with a Video Assessment of Clinical Skills

Peter Lewis RN, Dip App Sc (Nursing), BA, PhD
Senior Lecturer, Western Sydney University, School of Nursing and Midwifery | Locked Bag 1797, Penrith NSW 2751
E: [email protected]

Leanne Hunt RN, MHM, PhD
Senior Lecturer, Western Sydney University, School of Nursing and Midwifery | Locked Bag 1797, Penrith NSW 2751
E: [email protected]

Lucie M. Ramjan RN, BN (Hons), PhD
Associate Professor, Western Sydney University, School of Nursing and Midwifery | Locked Bag 1797, Penrith NSW 2751
E: [email protected]

Miranda Daly RN, BN (Hons), GradCertHSM, GradDipAdvClinNurs, MN, PhD Candidate
Lecturer, Western Sydney University, School of Nursing and Midwifery | Locked Bag 1797, Penrith NSW 2751
E: [email protected]

Rebecca O'Reilly RN, RM, PhD
Senior Lecturer, Western Sydney University, School of Nursing and Midwifery | Locked Bag 1797, Penrith NSW 2751
E: [email protected]

Yenna Salamonson RN, BSc, MA(Ed&Wk), PhD
Professor, Western Sydney University, School of Nursing and Midwifery | Locked Bag 1797, Penrith NSW 2751
E: [email protected]
Abstract

Background: Clinical skill assessment via the Objective Structured Clinical Assessment (OSCA) presents many challenges for undergraduate nursing students. These include high levels of anxiety that can compromise performance during the assessment, inconsistent assessor reliability, and a lack of correspondence with clinical skills performance in the real world. The implementation of a Video Assessment of Clinical Skills (VACS) that integrates formative feedback may be a way to address the challenges posed by OSCA assessment.

Objectives: The aim of this study was to examine the acceptability, utility, and nursing student satisfaction with a formative feedback strategy, the Video Assessment of a Clinical Skill (VACS).

Design: A cross-sectional survey.

Settings: Undergraduate Bachelor of Nursing degree students from a large Australian university.

Participants: Third (final) year undergraduate nursing students enrolled in a Bachelor of Nursing program.

Methods: Participants were recruited via purposive sampling. A pre-survey (prior to the VACS assessment) and a post-survey (after the VACS assessment) were completed. This paper reports on the open-ended responses in the post-survey that explored students' insights into, and perceptions of, formative feedback and its impact on their learning for the VACS assessment.

Results: A total of 732 open-ended responses were analysed, with findings organised into three major themes: (i) Flexibility and reflexivity, (ii) Editing and repeated attempts, and (iii) Working together.

Conclusions: The Video Assessment of a Clinical Skill demonstrated good utility, acceptability, and satisfaction among undergraduate nursing students.
Factors contributing to undergraduate nursing students' satisfaction with a Video Assessment of Clinical Skills

Introduction

Undergraduate nursing students are required to demonstrate competencies in clinical skills performance during their undergraduate degree as an indicator of their capacity to deliver care safely in a clinical setting (Watson et al., 2002). As early as the 1980s, the Objective Structured Clinical Examination (OSCE) was developed to assess medical students' clinical competencies (Harden, 1988). In the 1990s, the Objective Structured Clinical Assessment (OSCA) was developed to assess undergraduate nursing students' competence to perform clinical skills (Bujack et al., 1991). The implementation of these clinical assessments has varied over time. For example, they may include assessment of physical examination skills, diagnostic skills, communication skills, or any combination of these (Michels et al., 2012). They may also be a summative assessment of student performance that evaluates student learning against a benchmark or a set of criteria. Should the final grade be satisfactory, the student is judged to be competent and is able to progress to the next stage of the course (ACT Government Education, 2016; Duers and Brown, 2009).
Background/Literature

Despite the widespread implementation of this mode of clinical assessment, challenges have been reported by individual students, as immediate feedback has not always been provided to help students improve their clinical performance (Cazzell and Rodriguez, 2011). Furthermore, the OSCE or OSCA can be so anxiety-provoking that some students who are clinically competent may be assessed as incompetent on the basis of a single examination (Bouchoucha et al., 2013; East et al., 2014).
Inconsistencies in the reliability of this mode of clinical assessment have also been reported (Daly et al., 2017), calling into question the trustworthiness of an assessment mode that relies on a single assessor with variable qualifications and experience and a lack of standardisation in training (Turner and Dankoski, 2008). The OSCE and OSCA have also been criticised as unrealistic, or as being inconsistent with the performance of a clinical skill in the real world (East et al., 2014).

Feedback provided to students with the intention of improving performance is a crucial component of learning and ongoing professional development. It can be used both to affirm adequate performance and to correct errors in performance (Cazzell and Rodriguez, 2011). However, formative feedback that assists students to identify their strengths and weaknesses, and highlights areas of practice in which students might benefit from further instruction, is not always provided in the OSCE or OSCA (Cazzell and Rodriguez, 2011; Duers and Brown, 2009).
An emerging method of assessment involves the use of digital video recordings for assessing clinical skills (Strand et al., 2013). This method of instruction has been evaluated as satisfactory by students, who value its flexibility and opportunity for repetition (Barratt, 2010; Kelly et al., 2009), and it helps lower students' anxiety levels (Cardoso et al., 2012). In this mode of clinical assessment, students are the producers of the video, and their active engagement extends to both skill performance and the use of technology for video production (Purpora and Prion, 2018; Strand et al., 2013).

While video-based clinical assessment has distinct advantages and disadvantages compared with the OSCA or OSCE, less is known about acceptability of, and student satisfaction with, video assessment of clinical skills. Despite the increasing use of student-produced video for clinical assessment in nursing education (Purpora and Prion, 2018; Strand et al., 2013; Yoo et al., 2009), studies have usually involved small student cohorts located at a single campus site. Thus, the aim of this study was to examine the factors contributing to nursing student satisfaction with a novel, formative feedback strategy, the Video Assessment of a Clinical Skill (VACS), in a multi-campus university involving a large number of nursing students who were completing the VACS within the same time period. This study aims to contribute an evaluation of a novel method of assessing clinical skills in undergraduate nurses. The research question of this study was: 'What were the contributing and limiting factors that impacted on students' satisfaction with the VACS?'
Methods

Design and setting

This cross-sectional study was undertaken with third (final) year undergraduate nursing students enrolled in a Bachelor of Nursing program at a large multi-campus Australian university in Autumn 2017. In this paper, we report on the open-ended responses from the post-survey; reporting complied with the COREQ checklist for qualitative research.
Study Intervention
The VACS was introduced to address student feedback about the difficulties in completing an OSCA, to decrease the stress students experienced whilst undertaking competency assessments, and to create a mechanism to provide the students with ongoing formative feedback about their performance of a prescribed set of clinical skills. Although the VACS was ultimately used as a summative assessment in that students who failed to complete the task satisfactorily risked failing the unit, it did provide students with opportunities to receive formative feedback on their performance of the task during semester.
Third year undergraduate nursing students enrolled in the Bachelor of Nursing program at this university were required to complete the VACS as part of their course assessment. All students had completed at least two semesters of instruction in professional clinical practice prior to enrolling in this third-year unit. All previous professional clinical practice had been partially assessed using an OSCA. All students had therefore completed between two and four OSCAs prior to undertaking the VACS. The VACS required students to create a video of themselves performing a prescribed clinical task (hanging a bag of intravenous fluids) and to upload the video to YouTube for access by assessors. Secure access to the videos was provided by use of a password issued only to the assessors. Assessors then provided formative feedback to students on their performance of the task. Feedback was designed to assist students to improve their performance of the task in a second attempt. This cycle of video production, assessment, and revision was completed three times. After the third attempt, students' performance of the clinical skill was judged by assessors to be either satisfactory or unsatisfactory. An unsatisfactory grade in the assessment meant that the student had to re-enrol in the unit the following year.
Recruitment and sampling
Following ethical approval from the University Human Research Ethics Committee, all students who were enrolled in the third-year clinical practice unit were invited to participate in this study (n=1,277). A purposive sampling technique was used, whereby students were informed about the study and invited to participate during class in Week 1 of this unit of study.
Data collection

Surveys were the most appropriate method of gathering data for this project: they ensured consistency in the purposefully constructed questions used to measure the study variables, and they allowed for large-volume distribution within a limited time frame (Da Costa and Schneider, 2015). Students who consented to participate in the study completed a pre-survey (Week 1) and a post-survey (Week 12). Items in the survey tool were generated by members of the research team based on a review of the literature, to ensure that the inclusion of each item was based upon research evidence (Bouchoucha et al., 2013). This was the first occasion on which the survey had been used.

The pre-survey collected demographic and administrative data on students as well as their reflections on previous OSCA experiences. It asked students to rate their responses to 17 statements about their experience of the OSCA on a scale from 1 to 7, where 1 equated to strongly disagree and 7 equated to strongly agree. The post-survey asked students to rate the usefulness of the feedback that they received on their VACS on a scale from 0 (not useful at all) to 10 (extremely useful). The post-survey also collected open-ended responses that explored students' insights into, and perceptions of, formative feedback and its impact on their learning. Three questions were used to elicit open-ended responses: What were the positive aspects of undertaking this assessment? What were the negative aspects of undertaking this assessment? How will the feedback you received assist you with your clinical skill assessment?
Data Analysis
The thematic analysis procedure followed the steps described by Braun and Clarke (2006). According to Braun and Clarke (2006: 84), a semantic approach to thematic analysis focuses on the explicit, surface meanings of the data and produces a description of what participants have revealed, enabling the researcher to theorise about the meanings and implications of what has been revealed. A semantic approach to coding was adopted in this study because of the brief and direct written responses provided by participants to the open-ended questions.

The open-ended responses from students in the post-survey were manually entered into an Excel spreadsheet. The comments were then read and re-read by the first two authors independently, using a deductive, "top-down" approach to focus on the particular topics of interest in this study (Braun and Clarke, 2006: 84): the positive and negative experiences of the VACS provided by participants. Common concepts and categories were combined to generate themes and sub-themes, with confirmation of the credibility of interpretation provided by the remaining authors.
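To make the tallying step of this deductive coding concrete, the sketch below shows one way manually assigned codes exported from the spreadsheet could be counted by theme. This is an illustrative sketch only, not the authors' procedure: the file name, column heading, and code-to-theme mapping are hypothetical, and only the three theme names come from the study's findings.

```python
# Minimal sketch (not the study's actual workflow): tally analyst-assigned codes
# from the post-survey spreadsheet export into the three reported themes.
# File name, column name, and the code-to-theme mapping are hypothetical.
import csv
from collections import Counter

CODE_TO_THEME = {
    "own_time": "Flexibility and reflexivity",
    "self_review": "Flexibility and reflexivity",
    "re_recording": "Editing and repeated attempts",
    "video_editing": "Editing and repeated attempts",
    "peer_support": "Working together",
    "assessor_feedback": "Working together",
}

theme_counts = Counter()
with open("post_survey_codes.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):              # one row per open-ended response
        theme = CODE_TO_THEME.get(row["assigned_code"])  # code entered by the analysts
        if theme:                               # ignore codes outside the mapping
            theme_counts[theme] += 1

for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} responses")
```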
Ethical considerations

To ensure an arm's-length recruitment process, none of the class tutors who invited students to participate was an investigator in this study. In addition to a verbal briefing about the study, all students were provided with a Participant Information Sheet and Consent Form. The voluntary nature of participation was highlighted during the invitation, and students were free to withdraw from the study at any time.

Although the Principal Investigator on this study was the unit coordinator, all surveys were distributed and collected by a research assistant (RA) and not by the coordinator or any member of the unit teaching team. Survey responses were de-identified by the RA prior to viewing by the research team, and this process was explained to students in the Participant Information Sheet. Students were also advised that participation was voluntary and that their choice whether or not to participate had no bearing on the result that they could achieve in the unit, their relationship with the teaching or research team, or their relationship with the University. Approval to conduct this study was obtained from the University's Human Research Ethics Committee (Approval No. Removed for blind peer review).
Rigour

We have followed the framework described by Kitto and colleagues (Kitto et al., 2008) to assess the rigour of this study. The framework poses specific questions in relation to the following criteria: clarification, justification, procedural rigour, representativeness, interpretation, reflexivity and evaluative rigour, and transferability.

Our study met the criteria of clarification and justification, with the statement of a clear aim and research question and justification for the qualitative approach adopted. Procedural rigour has been met with clear documentation of data collection procedures and transparency in the thematic analysis process. While there is broad representation of students in this study, conceptual generalisability is limited by the sample being one cohort, from one university. Interpretative rigour was maintained through the use of two researchers in the coding and discussions around theme development, as well as the inclusion of deviant cases. Reflexivity and evaluative rigour have been accounted for with declaration of any influential relationships between the researchers, topic and participants, and clear articulation of ethical approval. Finally, a critical evaluation of the application of findings to knowledge and practice is discussed, with appropriate recognition of the limitations inherent within this study design.

Findings
Of the 1,277 students approached, 731 (57.32% of those eligible) provided a VACS satisfaction rating and open-ended responses to all three questions. The characteristics of participants who provided these open-ended responses are summarised in Table 1. Participants in this study provided comments on a range of factors associated with the conduct of the VACS assessment; many of the factors were multifaceted. As a result of the analysis, findings have been organised into three major themes that captured the opportunities and challenges identified by students: (i) Flexibility and reflexivity, (ii) Editing and repeated attempts, and (iii) Working together.

Satisfaction rating

Overall, students rated the feedback that they received on their VACS as more useful than not (Table 1), with a median rating of 6 (Range: 0 to 10).
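As a small illustration of how the summary statistic reported above, and the mean [median] (SD) format used in Table 1, can be produced from 0 to 10 usefulness ratings, the sketch below uses Python's statistics module with invented example ratings rather than the study data.

```python
# Illustrative only: compute the mean [median] (SD) summary used for the
# 0-10 VACS usefulness ratings. The ratings below are invented example data.
from statistics import mean, median, stdev

ratings = [6, 7, 4, 9, 6, 2, 8, 5, 6, 10, 3, 7]  # hypothetical 0-10 ratings

print(f"n = {len(ratings)}")
print(f"mean [median] (SD) = {mean(ratings):.1f} "
      f"[{median(ratings):.0f}] ({stdev(ratings):.1f})")
```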
Flexibility and reflexivity

Students sometimes referred to the timing of the task as being relaxed. They expressed appreciation of the time that they were given to practise and of the extended period of time provided for the completion of the assessment task. Students were generally satisfied with the amount of time available for the completion of the assessment.

    More chance to practise and record the skill before submit it in my own time.

    I have more time to prepare or I can do it in my own time suitable for me.

    There is a flexibility & time to fix the issues.

For some, completing the assessment in their own time meant delaying completion of the task until they had taken advantage of the maximum amount of time available for preparation and practice.

    The issue of time and pressure in completing the assessment was reduced.

    Having enough time to prepare was practising several times before the actual [assessment], breaking the assessment into small parts.

Flexibility also meant that students who preferred to complete the assessment task sooner rather than later could do so.

    Ability to complete the assessment asap rather than later.

The flexible timing diminished some of the adverse effects of a compulsory, singular deadline such as that associated with the OSCA.
    This assessment gave me more chance to do the skill and I am not as nervous as the [OSCA] exam.

    The best part was that we had the opportunity to do it again and pick the best one. Less nervous & pressure.

The video provided students with the opportunity to see what they had done and to reflect on the quality of their technique. Reflection was linked to the idea of time, in that the recording and viewing of the video provided a natural buffer between the performance of the skill and the student's access to an "objective" view of their performance.

    Also watching yourself back makes it easier to know where you have to improve as you see it for yourself.

    Gave me the chance to look at myself in a practical environment therefore giving me the opportunity to criticize myself.

Importantly, viewing the video provided students with a form of feedback to which they were unaccustomed but which had the benefit of enabling them to self-correct, in some cases when they would not otherwise have had that opportunity.

    Positive aspect of the assessment is that we can realise where we made mistake and reflect on that mistake in order to correct it.

    I was able to watch myself and identify missed problems/clinical issues, room for improvement & skills ……

Reflection was sometimes expressed in terms of goal orientation or achievement. The goal was to correct mistakes. Occasionally, students also reflected on what they had done well.

    Watching myself do the skill is the best way to identify what I'm doing well or not.
Editing and repeated attempts

The technology that facilitated the VACS had two advantages: students said that repeated attempts at completing the skill gave them the opportunity to perfect the skill itself, and they also spoke of the opportunity to edit their video with the aim of perfecting their representation of the skill.

    You can do the assessment multiple times until you're happy with the result.

    …we did the skill numerous times, so we can perfect it.

    The anxiety is less because you can take the video many times or you can retake if you feel the first one you did is not good.

    Chance to re-do it if we failed the first time and chance to edit the video to perfection/satisfaction.

Many students spoke positively of the opportunity to master the learning of the clinical skill being examined by the VACS.

    Ability to perfect the skills.

    Students are able to master the skill.

    It helped me grow my experience & understanding on PCA (Patient-controlled analgesia) and the correct administration & work before giving medication.

In addition, students spoke positively of the opportunity provided by the VACS to learn new 'technical' skills of video recording and editing.

    I was able to know how to upload on YouTube.

    Making the video myself & editing is needed knowledge of computer skills. Therefore I was able to learn more about IT and search about it. Enhanced my self confidence.

Some spoke of the value of the VACS assessment in their preparation for clinical practice.

    It will help me to do the same skills in my future nursing career.

    The positive aspect is that I practice the skills properly and get me ready when I already work in the nursing field.
Technical Challenges
The challenge of the VACS mentioned most frequently by participants was that of technology. Completion of the VACS required students to use technological devices and platforms that were unfamiliar to them. This resulted in a large number of comments and complaints about the challenges that they experienced in completing the assessment as a result of the technology that was available for use. One complaint was that the technology itself was limited. Students commented that when they used an iPad to record their video, the available memory was insufficient to complete the video task.

    The iPads we are supplied with do not have enough storage to do the VACs without deleting everything on it.

    I prefer doing an actual OSCA instead of VACS because I had lots of problems downloading the video and my memory on iPad kept on running out.

This led to frustration with what students described as the time-consuming task of deleting files from the iPad in order to make space for the video.

A second challenge was that many students described having difficulty uploading their video to YouTube, an internet-based video and image sharing platform. Some students identified that uploading their videos was time consuming, and others described their frustration when the platform failed to upload their videos successfully.

    The video uploading in YouTube account was all confusing.

    We had trouble uploading the video on YouTube and our link used to open on some computers and would not open on other computer.
Two major complaints made by students about the video recording and uploading aspect of the assessment were that the task was difficult because of the level of technical expertise required to complete it, and that it was an unreasonable expectation for undergraduate nursing students.

    You almost need an IT degree to upload a video.

    We are not doing an IT degree. I had technical difficulties but my video itself was adequate. I think it is unfair that my ignorance in YouTube has affected my mark.

Students implied that a lack of training in IT or videography put them at a disadvantage in terms of completing the assessment.

Students also complained that their videos had to be uploaded to what they described as a publicly available website. This implied that they believed that whatever they posted to YouTube would be accessible to a worldwide audience, and they spoke about their discomfort at this perceived breach of privacy.

    my personal details are now on a public platform which I would not normally do and am not comfy with.

    I feel that it was breech of my privacy to upload my details in social media.

    Should not be uploaded to YouTube for the world to see - breach of privacy.

In fact, only academic staff with password access to the video submissions could view the videos uploaded to YouTube; all videos had password-protected and restricted access.
Working together

Working with other students

For some students, the opportunity to work with friends to complete the VACS was identified as supportive of their learning.

    I enjoyed being able to film with friends and not feeling the pressure of making mistake.

However, for others, working with fellow students was challenging. For the activity to be completed effectively and efficiently, three students had to be present at one time. Some students described having difficulty negotiating with others to be available and present for the assessment task. Busy schedules were identified as a barrier to teamwork, and this was sometimes said to be compounded by the administrative difficulties of booking time and space in the clinical practice laboratories.

    It's hard to work around my colleagues schedule & personal commitments.

    Hard to organise with same students who do not turn up on your VAC day yet you are given one chance to do it in the Sim [simulation] room.

Other students complained about the technical competence of their peers to complete the video capture in a satisfactory way. In other words, some students complained that their grade suffered because of their peers' inability to use the video recording system effectively.

    I had to find someone to record that knew what you needed. I asked them to take specific shots & they didn't so I was told to redo.

    Student might not shoot video properly and we need to get unsatisfactory on that.
Working with academics: Preparation and feedback

The comprehensive feedback that staff provided attracted positive comments from some students. Participants identified that the VACS can be cross-marked (i.e. have another assessor grade performance) in a way that the OSCA cannot, and that markers' comments led to improved performance of the skill.

    We receive critique on our first for attempt and can use the comments to improve for final VACS.

    Gaining valuable feedback from assessors.

    [Assessment] can be cross-marked.

However, the more common feedback from students referred to inconsistency between assessors. Some participants indicated that the marking criteria were applied inconsistently by staff from one student to another. Some objected that the marking criteria were applied too subjectively, which could be expressed in terms of the marker being too harsh or too picky.

    The marking was not uniform. Some students got a 1 even when they did not mention baseline observations in their clips and others failed even when they reflected on the issues & corrected them.

    Unfair & inconsistent marking from a range of markers that didn't know the right marking guidelines.

    People failed whilst some others passed on the same thing - marking was inaccurate.

One other complaint was that markers focused on elements of the task that did not pertain to the performance of the skill and which were outside the students' control. For example, some students commented that markers fed back that they should use new rather than recycled equipment, without apparently considering that recycled equipment was all that was available to a large number of students.
Discussion
This paper reports the findings from an analysis of open-ended responses to survey questions in a study of the factors contributing to third year undergraduate nursing students' satisfaction with a Video Assessment of Clinical Skills in a multi-campus Australian university. Findings suggested that students both valued the advantages of, and experienced challenges with, the VACS in the areas of flexibility and reflexivity, video editing and repeated attempts at the clinical skill, and working together with other students and academic staff. As all participants had previously experienced the OSCA, it was inevitable that comparisons between the VACS and the OSCA were made. Indeed, the administration of the VACS was deliberately conceptualised as an alternative method of assessment to the OSCA, and students were unlikely to judge their satisfaction with the VACS without reference to the OSCA.

Although the VACS was assessed and graded by academic staff, participants also identified the value of the inclusion of self-assessment in the VACS, which was previously unavailable in the OSCA. Self-assessment provided the opportunity for students to observe and reflect on their own performance (Chae and Ha, 2016). The recording and uploading device for the VACS was not prescribed; however, many students used iPads issued to them when they commenced their course. Although the iPad was the most easily accessible and user-friendly device, its memory capacity was small, which created additional challenges for students.
As participants were familiar with the OSCA process, the additional requirement to learn and use unfamiliar technology was an added complexity that detracted from their satisfaction with the VACS. This finding is contrary to the findings of Pereira et al. (2014), which indicate that nursing students embraced opportunities to develop skills in the use of computer technology. Perhaps the high-stakes nature of the VACS nullified any enjoyment students might have had in this potential learning opportunity. Opportunities have previously been identified for nursing students to develop skills in the use of computer technology (Forbes et al., 2016).

While some participants commented that they had received valuable, timely, and actionable feedback from their markers, others challenged the credibility of the marking of the VACS when they perceived that they were not being held to an objective standard of assessment. These findings resonate with previous research on the unacceptability of marking by global impression rather than by the strict criteria applied to OSCA assessment (Yeates et al., 2013).

On the basis of the findings of this study, two recommendations can be made. First, administrators of practical assessment should create opportunities to enhance markers' understanding of the application of the assessment criteria. Second, academic staff can assist students to develop realistic expectations about how the assessment task is implemented and what sorts of feedback to expect from markers.
Strengths and limitations

The response rate for this survey was high, which promotes confidence in the findings. The total number of responses provided perspectives from a broad cross-section of the student cohort who participated in the VACS. The design of this study, however, provided limited opportunity for an in-depth exploration of students' experiences and the reasons for their satisfaction or dissatisfaction with the VACS. While the format of the survey instrument enabled a broad sample of students to identify and briefly describe factors that contributed to, and factors that detracted from, their satisfaction with the VACS, the format provided only limited opportunity to explore and develop a rich and detailed understanding of these factors. A more sophisticated qualitative research design that provides for a more elaborate discussion of students' experiences will be required to explore the factors contributing to their level of satisfaction with the VACS in future.

Another limitation of the study is that it was conducted with one cohort of students at one institution. This limited the opportunity to compare satisfaction with the VACS method of assessment between cohorts of students and across institutions. For example, it may be that the VACS is more acceptable to third year nursing students, who have had more time to develop skills in self-directed learning, than it is for first year nursing students, who are novices and might respond more favourably to the contained format of the OSCA. Further research of a longitudinal nature and across institutions will be required in order to produce a nuanced understanding of the acceptability, utility, and satisfaction of the VACS for different cohorts of nursing students.

Conclusion
As an assessment strategy and as a means of recognising strengths and weaknesses in the performance of a clinical skill, the Video Assessment of Clinical Skills demonstrated good utility, acceptability, and satisfaction among the nursing students who participated in this study. Participants also valued the formative feedback that they received from assessors, which contributed to their satisfactory completion of the assessment task.
References

ACT Government Education, 2016. Teacher's guide to assessment. Retrieved 24 June 2019, from https://www.education.act.gov.au/__data/assets/pdf_file/0011/297182/TeachersGuide-To-Assessment.pdf

Barratt, J., 2010. A focus group study of the use of video-recorded simulated objective structured clinical examinations in nurse practitioner education. Nurse Educ. Pract. 10 (3), 170-175. https://doi.org/10.1016/j.nepr.2009.06.004.

Bouchoucha, S., Wikander, L., Wilkin, C., 2013. Nurse academics perceptions of the efficacy of the OSCA tool. Collegian 20 (2), 95-100. https://doi.org/10.1016/j.colegn.2012.03.008.

Braun, V., Clarke, V., 2006. Using thematic analysis in psychology. Qualitative Research in Psychology 3 (2), 77-101.

Bujack, L., McMillan, M., Dwyer, J., Coordinator, C., Hazeton, M., 1991. Assessing comprehensive nursing performance: The objective structured clinical assessment (OSCA) Part 1 — Development of the assessment strategy. Nurse Educ. Today 11 (3), 179-184. https://doi.org/10.1016/0260-6917(91)90057-H.

Cardoso, A.F., Moreli, L., Braga, F.T.M., Vasques, C.I., Santos, C.B., Carvalho, E.C., 2012. Effect of a video on developing skills in undergraduate nursing students for the management of totally implantable central venous access ports. Nurse Educ. Today 32 (6), 709-713.

Cazzell, M., Rodriguez, A., 2011. Qualitative analysis of student beliefs and attitudes after an Objective Structured Clinical Evaluation: Implications for affective domain learning in undergraduate nursing education. J. Nurs. Educ. 50 (12), 711-714. https://doi.org/10.3928/01484834-20111017-04.

Chae, Y.J., Ha, Y.M., 2016. Effectiveness of education program for core fundamental nursing skills using recording video with smartphone and formative feedback. Journal of Digital Convergence 14 (6), 285-294.

Da Costa, C., Schneider, Z., 2015. Quantitative data collection and study validity. In: Schneider, Z., Whitehead, D., LoBiondo-Wood, G., Haber, J. (Eds.), Nursing and Midwifery Research: Methods and Appraisal for Evidence Based Practice. Elsevier, Chatswood, NSW.

Daly, M., Salamonson, Y., Glew, P.J., Everett, B., 2017. Hawks and doves: The influence of nurse assessor stringency and leniency on pass grades in clinical skills assessments. Collegian 24 (5), 449-454. https://doi.org/10.1016/j.colegn.2016.09.009.

Duers, L.E., Brown, N., 2009. An exploration of student nurses' experiences of formative assessment. Nurse Educ. Today 29 (6), 654-659. https://doi.org/10.1016/j.nedt.2009.02.007.

East, L., Peters, K., Halcomb, E., Raymond, D., Salamonson, Y., 2014. Evaluating objective structured clinical assessment (OSCA) in undergraduate nursing. Nurse Educ. Pract. 14 (5), 461-467. https://doi.org/10.1016/j.nepr.2014.03.005.

Forbes, H., Oprescu, F.I., Downer, T., Phillips, N.M., McTier, L., Lord, B., et al., 2016. Use of videos to support teaching and learning of clinical skills in nursing education: A review. Nurse Educ. Today 42, 53-56.

Harden, R., 1988. What is an OSCE? Med. Teach. 10 (1), 19-22.

Kelly, M., Lyng, C., McGrath, M., Cannon, G., 2009. A multi-method study to determine the effectiveness of, and student attitudes to, online instructional videos for teaching clinical nursing skills. Nurse Educ. Today 29 (3), 292-300. https://doi.org/10.1016/j.nedt.2008.09.004.

Kitto, S.C., Chesters, J., Grbich, C., 2008. Quality in qualitative research: Criteria for authors and assessors in the submission and assessment of qualitative research articles for the Medical Journal of Australia. Medical Journal of Australia 188, 243-246.

Michels, M.E.J., Evans, D.E., Blok, G.A., 2012. What is a clinical skill? Searching for order in chaos through a modified Delphi process. Med. Teach. 34 (8), e573-e581. https://doi.org/10.3109/0142159X.2012.669218.

Purpora, C., Prion, S., 2018. Using student-produced video to validate head-to-toe assessment performance. J. Nurs. Educ. 57 (3), 154-158. https://doi.org/10.3928/01484834-20180221-05.

Strand, H., Fox-Young, S., Long, P., Bogossian, F., 2013. A pilot project in distance education: Nurse practitioner students' experience of personal video capture technology as an assessment method of clinical skills. Nurse Educ. Today 33 (3), 253-257. https://doi.org/10.1016/j.nedt.2011.11.014.

Turner, J.L., Dankoski, M., 2008. Objective structured clinical exams: A critical review. Fam. Med. 40 (8), 574-578.

Watson, R., Stimpson, A., Topping, A., Porock, D., 2002. Clinical competence assessment in nursing: a systematic review of the literature. J. Adv. Nurs. 39 (5), 421-431. https://doi.org/10.1046/j.1365-2648.2002.02307.x.

Yeates, P., O'Neill, P., Mann, K., Eva, K., 2013. 'You're certainly relatively competent': Assessor bias due to recent experiences. Medical Education 47 (9), 910-922.

Yoo, M.S., Son, Y.J., Kim, Y.S., Park, J.H., 2009. Video-based self-assessment: Implementation and evaluation in an undergraduate nursing course. Nurse Educ. Today 29 (6), 585-589. https://doi.org/10.1016/j.nedt.2008.12.008.
Table 1
Characteristics of open-ended question participants (n = 731)

Variable                                                        Value
Age, mean [median] (SD) years (Range: 19 to 59 years)           28.9 [27.0] (8.3)
Sex, n (%)
  Male                                                          131 (18)
  Female                                                        600 (82)
Country of birth, n (%)
  Australia                                                     256 (35)
  Born outside Australia                                        475 (65)
Language spoken at home, n (%)
  English only                                                  226 (31)
  Other than English                                            505 (69)
Enrolment category, n (%)
  Domestic                                                      519 (71)
  International                                                 212 (29)
Student rating of VACS, mean [median] (SD) (Range: 0 to 10)     5.7 [6] (2.8)