Nurse Education in Practice (2004) 4, 250–257
doi:10.1016/j.nepr.2004.01.003
www.elsevierhealth.com/journals/nepr
Using portfolios in the assessment of learning and competence: the impact of four models

Ruth Endacott a,*, Morag A. Gray b, Melanie A. Jasper c, Mirjam McMullan d, Carolyn Miller e, Julie Scholes e, Christine Webb f

a La Trobe University, P.O. Box 199, Bendigo, Vic. 3552, Australia
b Napier University, 74 Canaan Lane, Edinburgh EH9 2TB, UK
c Institute of Medicine, Health and Social Care, University of Portsmouth, St George's Building, 141 High Street, Portsmouth PO1 2HY, UK
d Institute of Health Studies, University of Plymouth, Plymouth PL4 8AA, UK
e Centre for Nursing and Midwifery Research, University of Brighton, Westlain House, Falmer, Brighton BN1 9PT, UK
f Institute of Health Studies, University of Plymouth, Exeter EX2 6AS, UK

* Corresponding author. Tel.: +61-3-5444-7814; fax: +61-3-5444-7977. E-mail address: [email protected].
Accepted 22 January 2004
KEYWORDS
Portfolio; Assessment; Learning; Competence

Summary This paper discusses the diversity of portfolio use highlighted in a study funded by the English National Board for Nursing, Midwifery and Health Visiting exploring the effectiveness of portfolios in assessing learning and competence (Endacott et al., 2002). Data collection was undertaken in two stages: through a national telephone survey of Higher Education Institutions (HEIs) delivering nursing programmes (stage 1); and through four in-depth case studies of portfolio use (stage 2). Data collection for stage 2 was undertaken through fieldwork in four HEIs purporting to use portfolios as an assessment strategy, and in their associated clinical placement settings. Four approaches to the structure and use of portfolios were evident from the stage 2 case study data; these were characterised as the shopping trolley, toast rack, spinal column and cake mix. The case study data also highlighted the evolutionary nature of portfolio development and a range of additional factors influencing the effectiveness of their use, including the language of assessment, the degree of guidance, and the expectations of clinical and academic staff.
© 2004 Elsevier Ltd. All rights reserved.
Introduction

The use of portfolios for the purpose of assessment and personal development has seen huge growth across professional groups (Wilkinson et al., 2002; McMullan et al., 2003). Authors generally agreed (Glen and Hight, 1992; Cayne, 1995; Jasper, 1995) that the theoretical basis underpinning the use of portfolios was the andragogical approach espoused by Knowles (1975). Knowles (1990) and Cayne (1995) further proposed that, whilst not everyone will have these 'adult learning' tendencies,
portfolio preparation can help to nurture and develop them, given a facilitative climate. There was also general agreement in the literature that principles of student-active, experiential learning were central to the portfolio approach (Kolb, 1984; Hull and Redfern, 1996; Redfern, 1998; Quinn, 1998).

Despite this level of consensus regarding theoretical approaches to portfolio use, the literature revealed considerable variation in terms of portfolio structure. This variation was such that it was not possible to identify a typology of portfolio use from written accounts alone. Fieldwork undertaken in Higher Education Institutions and clinical settings during stage 2 of the study described in this paper did, however, enable identification of different approaches to the use and development of portfolios. These were conceptualised into four models (Webb et al., 2002), referred to as the shopping trolley, toast rack, spinal column and cake mix. This paper discusses these models in the context of the theoretical principles described above and makes recommendations for improving the effectiveness of portfolio development and use.
Research design

The ENB, in commissioning this study, identified four research aims:
1. To assess the extent to which the use of portfolios promotes the integration of theory and practice.
2. To analyse the validity and credibility of the assessment of multiple sources of evidence within portfolios.
3. To explore approaches to the inclusion of portfolios within quality assurance systems.
4. To evaluate the relationship between portfolios and the judgement of clinical competence.

A telephone survey was undertaken for stage 1 of the study, in order to identify the breadth and diversity of portfolio use. A case study design (Simons, 1980; Merriam, 1988) was used for stage 2, with the 'case' defined as the Higher Education Institution (HEI) and its associated practice settings. The views and experiences of individuals (academics, students and clinicians) involved in designing, implementing and using portfolios form an essential part of the data. The case study approach allowed the researchers to seek out the perspectives of these different stakeholders (Stake, 1975) through interviews. Observation in the HEIs and practice settings provided the research team with contextual dimensions for the interview data.

The theoretical basis for data analysis was grounded theory, underpinned by theoretical sampling and constant comparative method. Case study data were analysed first by listing the topic areas raised by interviewees about portfolios. Comparisons were made between views held by different people in each sample for their similarities and differences. From the topic areas which emerged, the researchers 'progressively focused' (Parlett and Hamilton, 1972) on areas which illuminated the key features of portfolios. Discussion of the data by the whole research team brought out themes from the cases and raised further questions to pursue and check from the observations and interviews.

Ethics approval was gained from an NHS Multi-Centre Research Ethics Committee and relevant Local Research Ethics Committees. Informed consent was gained from all participants (nurses). Verbal assent was gained from patients and/or relatives who were present when observation was undertaken in practice settings.

Results
Analysis of the data revealed four approaches to the structure and use of portfolios; these were characterised as the shopping trolley, toast rack, spinal column and cake mix. These provide a useful framework for analysing the effectiveness of portfolios in assessing learning and competence.
The shopping trolley

Similar to the suitcase or 'portmanteau' described by Wilkinson et al. (2002), this type of portfolio acted as a repository for artefacts collected during the course (for example, articles used for assignments or relevant to particular lectures; copies of practice assessment reports; assignment feedback). There was little cohesion evident in the portfolio, and little attempt to link the evidence to learning outcomes or competencies. Where reflective accounts were included, they were likely to stand alone rather than reflecting on the evidence/artefacts provided. The types of artefacts collected were chosen by the student and not necessarily viewed by a mentor or lecturer. The
structure used was likely to be a large ring binder that held artefacts in different sections. The portfolio itself was not assessed, and might not even be acknowledged as a portfolio by the students (see Fig. 1): ‘If you went up to one of our current pre-registration students and asked to see their portfolio they may not know what you mean. If they did, it would probably be a random collection of articles, information related to a subject, it may include assessed work, but the portfolio itself is not assessed nor does it have any structure so the student might not identify the portfolio as a specific thing’ (B/Head of Pre-Registration Programmes).
Few examples of this model were found across the case study sites, illustrating the developing nature of the use of portfolios over the past decade. At all sites, evidence was found of 'shopping trolleys' in previous curricula, which had evolved at least into 'toast racks' at the time of the fieldwork. It appeared that as lecturers became more familiar with the use of portfolios through experience, they began to understand their potential in demonstrating learning and competence, as opposed to requiring simply a collection of evidence.

Figure 1. The shopping trolley model.

The toast rack

The portfolio was made up of discrete elements (the toast) that assessed different aspects of practice and/or theory, for example a skills log or reflective account. They remained separate when collected into a binder, with the binder (the rack) simply acting as a convenient device for keeping the elements in one place, although this was likely to be structured through required sections identified by the HEI. There was no overarching narrative to connect the various sections, different people might participate in the assessment of the various sections, and some sections might not be assessed at all. The portfolio itself may or may not be assessed or reviewed (see Fig. 2).

If reflective components were included, they tended to stand alone as artefacts, such as critical incident analysis or journal entries, rather than being integrated with other elements. Similarly, these components did not include reflection on the selection or evaluation of the portfolio material, as advocated by Wisker (1996) and Crandall (1998). Examples of 'toast rack' portfolios were found at each case study site. However, many were evolving further to incorporate elements demonstrating the development of reflective practice, thus moving towards the 'spinal column' model.

Figure 2. The toast rack.

The spinal column

The portfolio was structured around practice competencies or learning outcomes (the 'vertebrae' making up the central column), and evidence was slotted in to demonstrate how each competence had been met. Within this model there may be reflective accounts that consider more than one competency, and overarching competencies that require multiple pieces of evidence as proof of achievement. The emphasis was on the original work of the student, whilst the evidence was used to support or illustrate the case being made. This was more sophisticated than the toast rack in that assessors needed to see explicit evidence of learning and competence through the student's writing. Hence the focus for assessment was each competency statement, and the evidence that counted as verification for the claims being made (see Fig. 3).

Figure 3. The spinal column.
The cake mix

Evidence from theory and practice was integrated into the portfolio and the whole ('cake') was assessed. There was an overarching narrative which combined the elements, and it was this narrative that was assessed rather than the discrete components. A reflective commentary of some sort, whether directed by lecturers or at the discretion of the student, aimed to demonstrate the student's critical and analytical skills by asking them to consider how they had achieved what they did, how the evidence supported this, and what they had learnt. Hence the cake was more than the sum of its individual parts, and it was the whole (or the cake) that was assessed rather than the ingredients. Reflectivity, practice and professional development were likely to be features of this model (see Fig. 4).

Figure 4. The cake mix.

The cake mix approach was chosen at one case study site because it provided the ability to assess the student across the range of activity (A/A&E Course leader). This was considered key by the course leader, who saw the significance of using a portfolio approach as combining evidence of ongoing progress and development as well as providing the end point of assessment. She emphasised the integrative function of the portfolio in enabling the students to explore the complexity of A&E situations.
Discussion

Evolution of portfolio use

Regardless of the model of portfolio used, a key feature of the diversity of portfolio use emerged from the data: the evolutionary nature of portfolio development. Related to this, perhaps inevitably, was the tendency for institutions to use more than one model. In one case study site all four models were evident in different courses across pre- and post-registration provision.

A key way in which portfolios have evolved is the requirement for students to interact with and reflect upon what the evidence demonstrates and how this matches competencies, particularly in the pre-registration partnership curriculum (UKCC, 1999). This was particularly evident in the spinal column and cake mix models. It underpins the tendency for the portfolio not to be regarded as an endpoint for assessment in the same way that an essay might be, but rather as a dynamic document, used both formatively and summatively. Whilst some academics in this study voiced reservations about this approach, most considered that, although it might pose challenges, it was better than previous models in:
1. Facilitating the application of theory to practice.
2. Enabling students to develop skills for reflective practice.
3. Providing evidence of competence.

This reflects Price's (1994) contention that the portfolio has a dual role in both product (outcomes assessment) and process (personal and professional growth). The degree of structure had also evolved in some sites:

Initially, the structure of the portfolio was very loose and student dependent, but this was considered not to be successful as the students were unsure as to how to use them, so more structure was built in, based on learning outcomes and being more prescriptive (A/MW/fieldnotes).
More specific areas of portfolio use that had evolved included:
1. Greater guidance about the nature of evidence expected (for example, by identifying essential and supplementary evidence).
2. Using the same evidence for a number of outcomes.
3. Prescribed limits to portfolio size.
4. Emphasis on the quality of evidence rather than its quantity.
5. More emphasis on the student's input, with the use of dialogue pages, rather than focusing purely on assessor commentary.
Factors influencing the effectiveness of portfolio use

The research team acknowledges the evolutionary state of portfolio development at the time of data collection. However, it is evident from the data that a number of factors contributed to the effectiveness of using a portfolio for the assessment of learning and/or competence. These factors are discussed below; much of this discussion also relates to wider issues of assessment in professional practice.

The optimum structure for the portfolio

It is evident from the data that some degree of structure was required for the portfolio to redress the fault line created by the problem of "not knowing what you don't know". However, the integration of evidence and of theory and practice was more important than the proforma. The four models that emerged through the case study data collection had varying degrees of structure, but it was clear that the shopping trolley and toast rack approaches created extra work for students and assessors with limited benefits in terms of integrating theory and practice; enabling reconstruction (and a greater understanding) of practice; and enabling the student to see their own progress and personal development through the programme. Both the spinal column and cake mix approaches were closer to achieving these goals. A further feature of 'optimum structure' (seen in these latter models) was the construction of the portfolio to enable triangulation of evidence. However, this worked best when the goal of triangulation was completeness rather than convergence; this also reflects the need to consider a qualitative, interpretive approach to assessment, using criteria such as dependability and credibility rather than validity and reliability (Webb et al., 2003).

For some programmes, reflective accounts – the key element of a successful portfolio (Howarth, 1999) – were restricted by the amount of space allocated in the portfolio. This reflects similar findings from a previous study in which the reflective component of a programme was seen as crucial but also constrained by the reality of the clinical setting (Scholes and Endacott, 2002; Endacott et al.,
2003). Some stakeholders made suggestions for improvement in the use of portfolios, focusing on streamlining to reduce the workload and time involved for all parties. A 'tick box' or 'checklist' format was the only alternative suggested. Some students wanted a pocket-sized clinical skills booklet that could be carried around and be readily available for assessors to sign. This suggestion confirms the impression that students and many of the assessors appeared to view the portfolio more as an assessment tool than a learning tool.

Appropriate guidance

Some students were vociferous regarding inconsistency between academic staff in terms of their advice and expectations of the portfolio. Students believed that they had to tailor their portfolio to the requirements of the lecturer who was going to be involved in the assessment. This led to a conflict for students between wanting to maintain their own identity within the portfolio and addressing the requirements set by their lecturer, particularly when different lecturers had different expectations. In some cases, the practice assessor would also seek to place their stamp upon the way in which the portfolio was presented. This also had repercussions for the participant's ownership of the portfolio and how active a role students took in their own learning, one of the key principles of portfolio use espoused in the literature (Cayne, 1995; Jasper, 1995; Redfern, 1998). From the researchers' perspective, guidance given to students by academics was at times 'inappropriate', for example, in the reflective section of the portfolio, placing emphasis on 'feelings' rather than critique, analysis and cognitive aspects. This reflects concerns by other authors that lecturers are not necessarily all good facilitators of self-reflection (Gerrish, 1993; Karlowicz, 2000).

Realistic expectations

Students could spend many hours preparing for and writing up versions of the reflective accounts until they and their assessors were satisfied with the standard. For these students the tendency was to identify an outcome that they needed to complete in a particular placement and then seek an incident that could be used to fulfil this requirement, rather than having an experience and identifying the need to reflect on it. Most incidents selected were 'positive' in that they provided a constructive opportunity to give care and say how students felt about the episode. Rarely was a 'negative' incident involving poor care criticised, due to reluctance to criticise nurses working in the placement setting.
Workload concerns were reflected in the literature, from both the student completion and lecturer assessment perspectives (Snadden et al., 1996; Harris et al., 2001), with the further suggestion that the anxiety this provokes for students will have a negative impact on their motivation (Mitchell, 1994).

Assessment language

In one site, the portfolio tended to focus heavily on process rather than outcome assessment. This was evident even in the clinical skills component, since students wrote evidence against the statement to show that they had been involved in carrying out the skill at some level (observation, participation, internalisation, identification). Actual skill attainment did not seem to be measured or evaluated. Hence selection of an appropriate assessment framework is central if the portfolio is to be used to assess competence, as well as (or instead of) learning.

It was also important that the development of the portfolio reflected the real world of practice, such that the competencies and work expected of students and assessors were realistic and valued. There was a tendency for a range of stakeholders (for example, the regulatory body, university, Department of Health) to impose their requirements on the portfolio content, resulting in competing tensions. There was a strong sense that many of the competency statements advocated at national level were necessarily abstract, to accommodate the different needs of stakeholders. However, students and assessors (and many academics) wanted a level of specificity to enable practice to be assessed in a meaningful manner. To make sense of assessment tools, students deconstructed the learning outcomes, sometimes translating these into a format or language that reflected their practice, and then reconstructed their evidence to meet university requirements for the portfolio. It appeared that some students developed personal theories of nursing, midwifery or health visiting practice through this process, depending on the tool in use and the students' academic and professional maturity. This process, and the criteria used to assess portfolios, are addressed in detail elsewhere (Scholes et al., in press; Webb et al., 2003).

At one case study site, the guidelines for portfolio use distinguished assessment 'in' practice from assessment 'of' practice. The former definition referred to a portfolio that included practice assessment tools and was completed in practice and signed off by practice assessors. The latter definition conceived the whole portfolio as an assessment of practice, with written evidence to support the
theoretical outcomes and reflective accounts. In this instance, the actual portfolio was marked by the lecturer and the student was given a pass/fail mark.

It is interesting to note that few examples of cake mix portfolios were found across the sites. Where they were being used, this was for post-qualification courses aimed at developing specialist or advanced practitioners at level three or postgraduate level. What this suggests is that the activities required for portfolios of this sort map on to the academic skills found at Honours or Master's level in terms of demonstrating higher levels of critical analysis and evaluation, together with accurate assessment of a student's own skills, knowledge and practice acquisition. A key feature of these portfolios was their formative and developmental use as a learning as well as assessment strategy. Hence, the portfolios were considered as dynamic in nature, being used actively in clinical/practical situations, as well as crossing the divide into academic settings for discussions in seminars and tutorials. They contained a great deal of reflective material, including journals and logs, that required the student to reflect upon their experiences and make sense of them in theoretical terms, for translation back into practice.
Conclusions

A recurring theme across the data was the evolutionary nature of portfolio development. Stakeholders frequently made reference to how things used to be or how things were going to change in the future. On one level, this was encouraging, highlighting the acknowledgement that the structures and processes for portfolio use were not yet right. Whilst this was a positive dynamic, it also meant that students and assessors were generally working with a system under development, and were usually very aware of this. On a positive note, this evolutionary phase did increase the opportunity for stakeholders (including students and assessors) to influence future developments.

In order to optimise the effectiveness of portfolios, the authors recommend, first, that the portfolio structure should strike a balance between providing sufficient evidence to enable triangulation and a judgement about the student's competence, and not creating an overwhelming or unrealistic workload. Similarly, the danger of streamlining to the extent of producing a 'sterile' portfolio should be avoided (Redfern, 1998; Stockhausen, 1996). The limitations of the shopping trolley and toast rack approaches to portfolio
structure should be heeded, in order to prevent excessive workload investment for little return. Secondly, the degree of structure used for the portfolio should be appropriate for the academic level and level of experience of the students, at both pre- and post-registration level. A balance also has to be struck between encouraging the student to become selective and autonomous in their construction of evidence (Crandall, 1998) and acknowledging that a self-directed approach will not suit all learning styles (Snadden and Thomas, 1998). Thirdly, there needs to be adequate preparation and support of students, staff and assessors and, fourthly, the model of portfolio used should require the student to transform material into evidence, rather than simply bolt on material that has not been synthesised.

The amount of input needed to support staff and students should not be underestimated. This is a continual process, demanding time and effort, but an important investment if portfolio use is to be effective. It is essential that the developmental aspects of critical reflection are acknowledged through providing direction and feedback to all parties.

Portfolios appear to offer the promise of integrating theory and practice by giving students the responsibility to provide evidence of the outcomes of their learning and its processes, how they have developed personally and professionally, and where further learning is needed. Combined with the holistic emphasis on a variety of assessment methods, a portfolio can incorporate evidence from more quantifiable approaches, such as skills checklists, to overcome the criticism that portfolios mainly assess writing skills. However, this potential will only be realised with considerable investment from all parties.
Acknowledgements

The study was funded by the former English National Board for Nursing, Midwifery and Health Visiting, whose responsibility for its oversight then passed to the Department of Health. The views expressed in this paper are those of the authors and do not necessarily reflect the opinions of the funding bodies.
References

Cayne, J., 1995. Portfolios: a developmental influence? Journal of Advanced Nursing 21, 395–405.
Crandall, S., 1998. Portfolios link education with practice. Radiologic Technology 69, 479–482.
Endacott, R., Scholes, J., Freeman, M., Cooper, S., 2003. The reality of clinical learning in critical care settings: a practitioner: student gap. Journal of Clinical Nursing 12, 778–785.
Endacott, R., Jasper, M., McMullan, M., Miller, C., Pankhurst, K., Scholes, J., Webb, C., 2002. Evaluation of the use of portfolios in the assessment of learning and competence in nursing, midwifery and health visiting. Unpublished report to the Department of Health, Leeds.
Gerrish, K., 1993. An evaluation of a portfolio as an assessment tool for teaching practice placements. Nurse Education Today 13, 172–179.
Glen, S., Hight, N.F., 1992. Portfolios: an 'affective' assessment strategy? Nurse Education Today 12, 416–423.
Harris, S., Dolan, G., Fairbairn, G., 2001. Reflecting on the use of student portfolios. Nurse Education Today 21, 278–286.
Howarth, A., 1999. The portfolio as an assessment tool in midwifery education. British Journal of Midwifery 7, 327–329.
Hull, C., Redfern, L., 1996. Profiles and Portfolios. Macmillan Press Ltd, Basingstoke.
Jasper, M., 1995. The potential of the professional portfolio for nursing. Journal of Clinical Nursing 4, 249–255.
Karlowicz, K.A., 2000. The value of student portfolios to evaluate undergraduate nursing programs. Nurse Educator 25, 82–87.
Knowles, M., 1975. Self-directed learning: a guide for learners and teachers. Follett, Chicago.
Knowles, M., 1990. The Adult Learner: A Neglected Species, fourth ed. Gulf Publishing, Houston.
Kolb, D.A., 1984. Experiential learning. Prentice-Hall, London.
McMullan, M., Endacott, R., Gray, M., Jasper, M., Miller, C., Scholes, J., Webb, C., 2003. Portfolios and assessment of competence: a review of the literature. Journal of Advanced Nursing 41 (3), 283–294.
Merriam, S.B., 1988. Case Study Research in Education: A Qualitative Approach. Jossey-Bass, London.
Mitchell, M., 1994. The views of students and teachers on the use of portfolios as a learning and assessment tool in midwifery education. Nurse Education Today 14, 38–43.
Parlett, M., Hamilton, D., 1972. Evaluation as Illumination: a new approach to the study of innovatory programmes. Occasional Paper No. 9, Centre for Research in Educational Science, Edinburgh.
Price, A., 1994. Midwifery portfolios: making reflective records. Modern Midwife 4, 35–38.
Quinn, F.M., 1998. Reflection and reflective practice. In: Quinn, F.M. (Ed.), Continuing Professional Development in Nursing: A Guide for Practitioners and Educators. Stanley Thornes Publishers Ltd, Cheltenham, pp. 121–145.
Redfern, L., 1998. The power of the professional profile. In: Quinn, F.M. (Ed.), Continuing Professional Development in Nursing. Stanley Thornes, Cheltenham, pp. 167–181.
Scholes, J., Jasper, M., Endacott, R., Gray, M., Miller, C., McMullan, M., Webb, C., in press. Making portfolios work: a process of deconstruction and reconstruction. Journal of Advanced Nursing.
Scholes, J., Endacott, R., 2002. Evaluation of the effectiveness of educational preparation for critical care nursing. ENB, London.
Simons, H., 1980. Case study in the context of educational research and evaluation. In: Simons, H. (Ed.), Towards a Science of the Singular. Occasional Paper No. 10, Centre for Applied Research in Education, University of East Anglia.
Snadden, D., Thomas, M.L., Griffin, E.M., Hudson, H., 1996. Portfolio-based learning and general practice vocational training. Medical Education 30, 148–152.
Snadden, D., Thomas, M.L., 1998. Portfolio learning in general practice vocational training – does it work? Medical Education 32, 401–406.
Stake, R., 1975. Programme Evaluation, Particularly Responsive Evaluation. Occasional Paper No. 5, Western Michigan University Evaluation Centre, Kalamazoo.
Stockhausen, L.J., 1996. The clinical portfolio. Australian Electronic Journal of Nursing Education 2, 1–11.
United Kingdom Central Council for Nursing, Midwifery and Health Visiting, 1999. Fitness for practice: the UKCC commission for nursing and midwifery education. UKCC, London.
Webb, C., Endacott, R., Gray, M., Jasper, M., Miller, C., McMullan, M., Scholes, J., 2002. Models of portfolios. Medical Education 36 (10), 897–898.
Webb, C., Endacott, R., Gray, M., Jasper, M., McMullan, M., Scholes, J., 2003. Evaluating portfolio assessment systems: what are the appropriate criteria? Nurse Education Today 23 (8), 600–609.
Wilkinson, T.J., Challis, M., Hobma, S.O., Newble, D.I., Parboosingh, J.T., Sibbald, R.G., Wakeford, R., 2002. The use of portfolios for assessment of the competence and performance of doctors in practice. Medical Education 36 (10), 918–924.
Wisker, G., 1996. Assessment for learning: encouraging personal development and critical response on a writing module by student-centred assessment and teaching/learning strategies. IETI 33, 58–65.