The nuts and bolts of evaluating science communication activities


Seminars in Cell & Developmental Biology 70 (2017) 17–25


Review

Suzanne Spicer
Office for Social Responsibility, The University of Manchester, 186 Waterloo Place, Oxford Road, Manchester M13 9PL, UK

Article history: Received 9 June 2017; Received in revised form 1 August 2017; Accepted 8 August 2017; Available online 18 August 2017

Keywords: Science communication; Public engagement; Evaluation; Impact

Abstract

Since 2008 there has been a focus on fostering a culture of public engagement in higher education, plus an impact agenda that demands scientists provide evidence of how their work, including their science communication, is making a difference. Good science communication takes a significant amount of time to plan and deliver, so how can you improve what you are doing and demonstrate whether you are having an impact? The answer is to evaluate. Effective evaluation needs to be planned, so this paper takes you step by step through the evaluation process, illustrated using specific examples.

Crown Copyright © 2017 Published by Elsevier Ltd. All rights reserved.

Contents

1. Introduction
2. How to start
3. Writing an evaluation plan
3.1. Aims and objectives
3.2. Audience
3.3. Key evaluation questions
3.4. Data collection
3.5. Analysing your evaluation data
3.6. Using your findings
3.7. Sharing your learning
4. Final remarks
References

1. Introduction

Evaluation can be perceived as a daunting task, complicated and demanding, but when used correctly it is an effective tool to reflect on and improve your science communication activities, as well as to determine their value and provide evidence of their impact. Or, as the W.K. Kellogg Foundation [1] puts it, evaluation serves "to prove that a project worked, but also to improve the way it works".

Science communication came out of a drive to improve the public's understanding of science. The Royal Society described it in 1985 [2] as a need to develop public awareness of the nature of science in order to improve debate and decision-making on how science and technology affect modern life.


More recently, Bruine de Bruin and Bostrom have highlighted it as having "to improve people's understanding of the decision-relevant issues, and if needed, promote behaviour change" [3]. In 2006 the Royal Society [4] surveyed the factors affecting science communication to understand what was encouraging, as well as inhibiting, scientists from undertaking engagement. This was developed further in a more recent survey conducted by a consortium of UK research funders [5] in 2015.

This focus on developing public awareness, and the identification of what was needed to support scientists to engage, led to efforts to foster a culture of public engagement in UK higher education. It began in 2008 with six Beacon partnerships [6], funded by the Higher Education Funding Councils, Research Councils UK and the Wellcome Trust to inspire a culture change in how universities engaged with the public.


Table 1
The stages of evaluation planning illustrated using three actual evaluation examples.

Aims

Wriggling Rangoli:
• To share knowledge and raise awareness of parasitic infections and the links to global poverty
• To gain relevant insights into the experiences of an Asian women community group who lived in areas affected by parasitic infections

From Supermarket to Sewers:
• To promote healthy eating by explaining how the body digests the food we eat

Science Spectacular:
• To contribute to the Manchester Science Festival's mission of creating a place for surprising, meaningful science where everyone can join in and be curious
• To provide a supported opportunity for researchers to engage families with current science research

Objectives

Wriggling Rangoli:
• To initiate a workshop to inspire the women and their children to share their experiences and design a creative representation of parasitic infections
• To create a traditional Rangoli mural during a one-day community festival and on campus at Manchester Museum as part of the Manchester Science Festival

From Supermarket to Sewers:
• To explain how the human digestive system works
• To develop an understanding of the importance of eating 5 fruit and vegetables a day
• To create a fun science show that young people enjoy
• To run 40 shows for school pupils aged 8–14 years old

Science Spectacular:
• To attract over 1500 people to the event
• To attract family groups from across Greater Manchester
• To provide fun, interactive table-top activities that bring science alive and make it meaningful to young people and their families
• To provide researchers with the logistical support and event organisation to enable them to deliver successful activities

Audiences

Wriggling Rangoli:
• University immunology researchers; women and children from an Asian community group in Manchester; worker from community partner organisation

From Supermarket to Sewers:
• School children; school teachers; museum presenters delivering the show

Science Spectacular:
• Families from Greater Manchester; University researchers and students; event organisers; science buskers; volunteers; Manchester Science Festival staff/evaluators

Evaluation Questions

Wriggling Rangoli:
• Has a sustainable two-way partnership been developed with a new audience?
• What have the researchers gained from working with a community group for the first time?
• What have the women gained from working with the researchers?
• Has there been an increase in awareness of the role of the scientist and worm infections and links to global poverty?

From Supermarket to Sewers:
• How many school pupils attended the shows?
• Do young people understand how we digest our food?
• Do young people understand the importance of eating a healthy diet?
• Did the young people enjoy the show?

Science Spectacular:
• How many people attended the event?
• Where have the people come from?
• What did visitors to the event think about the day?
• How many researchers were involved in the event and from which faculties?
• What was the key highlight for the researchers?
• Additional questions will be asked by the evaluators for the Manchester Science Festival

Data collection techniques

Wriggling Rangoli:
• Complete the project's baseline information sheet at the start and again at the end of the project
• Use a creative "tapeworm" to record how the women and children felt about their learning at the workshop, and participants at the public events
• Capture the numbers and demographics of the public attending the large-scale public art events
• Record key anecdotal points from discussions at the workshop and public-art events

From Supermarket to Sewers:
• 3 interactive questions will be built into the show
• In-depth interviews will be held with a selection of schools
• Staff debriefing meetings at the beginning, middle and end of the scheduled run of shows

Science Spectacular:
• Use of head count clickers at each welcome desk
• Record where visitors came from using postcode data
• Ask families for their thoughts of the event via a graffiti wall
• Email survey of stand organisers after the event
• Debrief by the organising team/volunteers 2–3 weeks after the event
• Miscellaneous, e.g. unprompted emails, independent evaluation by researchers

Analysis & interpretation of data

Wriggling Rangoli:
• Collate the baseline sheets and identify the baseline
• Group and analyse the tapeworm comments
• Gather reflections from colleagues and partners
• Draw out the key learning points and recommendations for next steps

From Supermarket to Sewers:
• Collate and group the responses to the interactive questions
• Analyse interviews held with a selection of schools
• Group and analyse the staff debriefing meetings

Science Spectacular:
• Group drawings and comments from graffiti wall
• Use postcode district maps to plot where visitors live
• Group responses from email survey and debrief meeting
• Submit final visitor numbers to Manchester Science Festival

Reporting

Wriggling Rangoli:
• Case study for funders

From Supermarket to Sewers:
• Assess how to improve the show and make changes
• Feed into a wider report on the secondary learning programme

Science Spectacular:
• Short written report for internal funders (also used by researchers)


Later initiatives included Research Councils UK's Research Catalyst funding (April 2012 to March 2015) and Catalyst Seed Fund (August 2016) initiatives [7]. But, as highlighted by the National Coordinating Centre for Public Engagement (NCCPE), this growth had "few safeguards for the delivery of high quality engagement" [8], with evaluation often being neglected.

More recently there has been a growing demand from organisations such as funders for evidence of how research is making a difference, and more funders now expect research grants to include aspects of science communication. For example, the UK's Biotechnology and Biological Sciences Research Council's impact policy [9] has a "commitment to identifying, understanding, enabling and publicising the impact of bioscience research in the UK and more widely as part of its support to the UK bioscience community". It sees the impact of research as very broad, encompassing economic, social and wider impact, and as the responsibility of all those involved, including researchers in receipt of its grants. Scientists therefore need to be able to assess and provide evidence of the impact of their work, including their science communication activities. As Jensen [10] points out, however, measuring complex outcomes and impact is not simple, and King et al. [11] identify that proving the causal effect of an activity can be difficult. So if you are spending a significant amount of time planning and delivering your science communication activities, you need to undertake planned and focused evaluation to really know whether what you are doing is of high quality or having an impact.

In the Research Councils UK's guide to evaluating public engagement (see Table 2), evaluation is described as a process that takes place before, during and after an activity or programme of activities. It not only allows you to determine whether the content and delivery of your science communication has worked but, more importantly, to identify what has not worked and why. Evaluation can provide evidence to demonstrate the value, benefits and impact of your engagement in relation to your intended objectives, for example an increase in awareness or understanding, or a change in behaviour. It helps you learn from your actions so you can be more successful in achieving your engagement objectives in the future.

The purpose of this paper is to demystify how you can evaluate your science communication activities by taking you through the evaluation process and, hopefully, encouraging you to start building it into your own activities. Three examples based on actual experiences of developing and conducting evaluation have been selected to illustrate the steps in the process (see Table 1). They differ in scale, content and delivery approach to represent the diversity of science communication activities.

The first example is ‘Wriggling Rangoli’ from Manchester, implemented by Pennock, Cruickshank and Else [12], who aimed to raise awareness of the dangers of parasitic infection and to share knowledge and experiences with those who had lived in affected areas before moving to the UK. Working in partnership with community organisations, they ran a workshop for local Asian women and their children, at which they shared their knowledge of the science of parasitic infections and invited the women to share their own experiences and knowledge.
The workshop also inspired the women and children to design a creative representation of parasitic infections, resulting in the creation of a rangoli mural, a colourful Indian design traditionally drawn on the floor near the entrance to a home using rice grains, flour, sand or chalk. In 2010 the women and children created a chalk rangoli at a local one-day community festival (see Fig. 1) and again in the city centre as part of the Manchester Science Festival, England's largest science festival, produced by the Museum of Science and Industry, which aims to create a place for "surprising, meaningful science where everyone can join in and be curious" [13].

Fig. 1. Child creating a rangoli mural based on worm infection.

Fig. 2. Interactive science at Science Spectacular.

The second example, ‘From Supermarket to Sewers’, is based on a science show originally developed by the Museum of Science and Industry for school children aged 8–14 years to deepen understanding of how the human digestive system works and encourage healthy eating. Offered as part of the Museum's school and family programme, the 25-minute interactive show used theatrical demonstrations and on-stage experiments to engage young people with the science of the human digestive system and address a key health issue.

The final example is ‘Science Spectacular’ [14], a one-day interactive science event for families run as the University of Manchester's main contribution to the Manchester Science Festival (see Fig. 2). Based on campus in two buildings, the event involves 200 researchers offering 40 different table-top interactive activities covering all areas of science, such as discovering how a plane flies, investigating the science behind music and learning how to help prevent antibiotic resistance. The event is organised by a small event team who provide the exhibition stands, marketing and general support for the researchers offering science communication activities. There are about 2000 visitors, the majority of whom are families from the Greater Manchester region.

2. How to start

Ideally you should start planning your evaluation at the same time as you are planning your science communication activity, event or programme.


Table 2
Useful resources and links.

• NCCPE: offers advice on evaluation and runs training workshops. https://www.publicengagement.ac.uk/plan-it/evaluating-public-engagement
• The University of Sheffield: an evaluation toolkit with top tips. https://www.sheffield.ac.uk/ris/publicengagement/masterclasses-and-resource/toolkits/evaluation
• UCL: a guide on how to evaluate your public engagement activities. https://www.ucl.ac.uk/culture/sites/culture/files/event evaluation 0.pdf
• Better Evaluation: an international collaborative network that creates and shares information and advice on evaluation. http://www.betterevaluation.org/
• AHRC: a guide to self-evaluation. http://www.ahrc.ac.uk/documents/guides/understanding-your-project-a-guide-to-self-evaluation/
• CLAHRC: an evaluation guide developed for clinicians and NHS managers. http://clahrc-cp.nihr.ac.uk/wp-content/uploads/2012/07/Evaluation GUIDE.pdf
• Arts Council's Inspiring Learning for All: a useful resource bank. http://www.artscouncil.org.uk/resources
• Wellcome Trust: results of an engagement workshop on evaluating public engagement in the Wellcome Trust's UK Centres, 2015. https://wellcome.ac.uk/sites/default/files/wtp059889 0.pdf
• Jen Harvey (ed.), Evaluation Cook Book, The Institute for Computer Based Learning, Heriot-Watt University, 1998. http://www.icbl.hw.ac.uk/ltdi/cookbook/cookbook.pdf
• Healthcare Improvement Scotland, Evaluating Participation: a guide and toolkit for health and social care practitioners, Scottish Health Council, September 2013. http://www.scottishhealthcouncil.org/publications/research/evaluation toolkit.aspx#.WWOZHbsrLrc
• BMJ: useful resources on qualitative research methods and analysis. http://www.bmj.com/
• E. Jensen and C. Laurie, Doing Real Research: A Practical Guide to Social Research, Sage, 2016

Do not make the mistake of leaving it too late, or you run the risk of collecting the wrong data and not answering the questions you want answered, or of collecting more data than you have time to analyse or use. Having a thought-out plan helps avoid these issues and keeps your evaluation manageable.

Begin by considering why you are undertaking the evaluation. It is important to be clear about your purpose right from the start: is it to learn from and improve your practice, to demonstrate impact and success in achieving what you set out to do, or a mixture of both? This will shape your evaluation and ensure you are asking the right questions of the right people in the most relevant way. It will also ensure you are collecting the right amount of information and using your time effectively.

There are two main types of evaluation, formative and summative. Formative evaluation is about process and is when you assess whether your activity is working; think of it as testing your ideas. Does the activity work? Can it be improved? Is it suitable for the audience you are engaging with? This allows you to modify what you are doing as you go. Be wary of continually testing an activity: use your formative evaluation results to determine the point at which you know it is working and there are no more improvements you can make.

Summative evaluation usually happens at the end, and it assesses and evidences the impact of your science communication activity, project or programme. It addresses the question of whether your activity is making a difference. To demonstrate this successfully you need to plan how you will establish a baseline from which you can evidence any change. For example, to know whether people's knowledge, attitudes or behaviour have changed you need to know where they started from, so you could ask young people to draw what they think they know about your topic or research at the beginning of your session and then ask them to repeat the activity at the end.

When planning your evaluation you may want to ask for help and support from those who are more experienced. You could approach colleagues who are already evaluating their own science communication, social scientists, or professional support staff within your institution such as public engagement or impact officers. Alternatively, you can find useful guides and resources from other organisations and higher education institutions (see Table 2).
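To make the baseline idea concrete, here is a minimal sketch in Python of comparing self-rated knowledge collected at the start and end of a session; the ratings, the 1–5 scale and the variable names are invented purely for illustration.

# Hypothetical pre/post self-ratings (1 = know nothing, 5 = know a lot),
# collected at the start and end of a session to establish a baseline
# and evidence any change. All numbers are invented for illustration.
pre_scores = [2, 1, 3, 2, 2, 4, 1, 3]
post_scores = [4, 3, 4, 3, 5, 4, 2, 4]

def mean(values):
    return sum(values) / len(values)

improved = sum(1 for pre, post in zip(pre_scores, post_scores) if post > pre)
print(f"Baseline (pre) mean rating: {mean(pre_scores):.1f}")
print(f"Post-session mean rating:   {mean(post_scores):.1f}")
print(f"Participants rating higher: {improved} of {len(pre_scores)}")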

3. Writing an evaluation plan

To keep focused, it is a good idea to have a simple evaluation plan: a step-by-step guide which summarises the whole process, from the aims and objectives to how the results will be reported. If you are new to evaluation, keep things simple and realistic so you gradually become more confident and sophisticated in your strategies. A common mistake is to be too ambitious and make your evaluation overly complex.

3.1. Aims and objectives

Start by considering your aims and objectives. Your aims are what you want to achieve overall from your science communication activity, for example inspiring young people about science, raising awareness of your research findings, or involving the public in gathering data as part of a citizen science activity. Ideally have no more than two, so you stay focused on the most important outcomes you want to achieve. Table 1 shows the aims of the three science communication examples being used to illustrate the different stages of evaluation planning.

Your objectives are how you will implement your science communication activity to achieve your aims. Again, be realistic and focus on no more than five objectives. It is also important to keep them SMART [15], so when you have drafted your objectives, ask yourself these questions:

Specific: do your objectives state what you will do and with whom?
Measurable: can you measure their success?
Achievable: do you have enough time and resources to achieve your objectives?
Relevant: do they meet your aims?
Time-bound: do they include timescales?

3.2. Audience

It is important to identify who will be involved in your evaluation and what challenges may arise during the evaluation process. Remember to include not only your public audiences but also yourself and your team, and any partners you are working with. This will inform other areas of your planning, such as where, when and what techniques to use when collecting your evaluation data.


For example, the researchers on the ‘Wriggling Rangoli’ project realised that language could be a barrier, as many participants had English as a second language. They therefore identified staff who spoke Hindi and Bengali to act as interpreters, and generated visual resources to improve the communication of scientific ideas and to gather evaluation data.

3.3. Key evaluation questions

Careful thought has to be invested in the next stage of the process: identifying appropriate and effective key evaluation questions. These are the questions you want to answer; they should reflect your aims and objectives and relate to what you consider success will look like. Ideally have a minimum of two questions and a maximum of six, and ensure they relate to the evidence you can collect. They should measure not only outputs (the results of your activity, event or programme) but also outcomes (a change or benefit, such as an increase in awareness, knowledge or understanding; the development of skills or confidence; a change in behaviour or attitude; or a change in practice or procedure). Outputs are often easier to identify and measure, whereas outcomes can be more challenging but also more insightful, revealing the impact of your activity. For example, in ‘From Supermarket to Sewers’ (see Table 1) the first evaluation question relates to how many pupils attended the shows, so it is an output question. The other evaluation questions ask about understanding and enjoyment, which are outcome questions. The AHRC provides a definition of outputs, outcomes and impact in its guide to self-evaluation (see Table 2).

3.4. Data collection

To answer your evaluation questions you need to collect data, which is likely to comprise a mixture of quantitative data (numerical, factual answers that can be counted, such as visitor numbers, selections from predetermined answers to questions, or web page downloads) and qualitative data (open responses such as answers to open-ended questions, drawings, videos or observations).

You do not necessarily have to evaluate everyone and every activity or event you undertake. To make it more manageable, you can sample. There are various types of sampling. Random sampling is where everyone in your selected audience group has an equal chance of contributing to your evaluation. Alternatively, you could use systematic sampling, for example asking every nth person when completing a face-to-face survey. At ‘Science Spectacular’ a random sampling approach is used to gather information on what the public thought of the event, whereas a more focused sampling approach is used to gain feedback from the researchers, with only the stand holder lead contacts asked to complete an online survey. Whichever sampling technique you use, ensure you explain what you did when reporting your findings, to give context to your results. You can find more information on sampling on the Better Evaluation website (see Table 2).

There are a variety of methods you can use to collect your evaluation data, but it is important to ensure you use appropriate techniques, as each one has its own costs and benefits. Think about your public audience and which methods they are more likely to respond to or engage with. For example, the researchers working on the ‘Wriggling Rangoli’ project had to factor in that English was not the first language of their community partners, so they used visuals such as a giant tapeworm mounted on the wall of the room where the workshop was being held (see Fig. 3).
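As a small illustration of the random and systematic sampling approaches described above, the following Python sketch draws both kinds of sample from an invented visitor list; the list, the sample size and the every-tenth interval are assumptions for the example only.

import random

# Invented register standing in for the visitors to an event.
visitors = [f"visitor_{i:03d}" for i in range(1, 201)]

# Random sampling: every visitor has an equal chance of being asked.
random_sample = random.sample(visitors, k=20)

# Systematic sampling: ask every nth person, here every 10th.
systematic_sample = visitors[::10]

print(f"{len(random_sample)} randomly sampled, {len(systematic_sample)} systematically sampled")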
At ‘Science Spectacular’ a graffiti wall allowed young children who were not yet able to write to express their thoughts through drawings. Another consideration is how much time you and your public will have to conduct the evaluation, and where it will take place.

Fig. 3. The evaluation “tapeworm” used at the ‘Wriggling Rangoli’ workshop.

Will they want to leave because your event has finished? Are families distracted because it is time to eat lunch? In ‘From Supermarket to Sewers’, where the science show was only one part of a programmed visit to the museum, it was decided to include some evaluation in the show itself and then to prearrange with a sample of schools to remain after the show had finished to answer more in-depth questions. Finally, you may choose a technique that appears easy to set up, like videoing, only to find afterwards that you have hours of video to watch and analyse. Think through how much time you can commit to analysing the information you have collected. Useful resources with information about different evaluation techniques are provided in Table 2.

Having considered who you will be asking to contribute to your evaluation, the environment within which you will be conducting it and the resources you have available, you can now decide which data collection technique or techniques you will use. They can vary from the traditional questionnaire and interview to more creative methods such as drawing or graffiti walls. Better Evaluation (see Table 2) gives examples and categorises them into:

A. Information from individuals: for example, in-depth interviews or questionnaires, comment cards, online surveys, drawings, emails, placing stickers on a scale or image, recording experiences on scales radiating outwards like a dart board, using an event passport [16], self-selected voting by placing a ball in a tube or a coloured token in a jar, using an electronic voting pad or voting app, project diaries or logs, or a social media (Facebook) wall.
B. Information from groups: for example, focus groups, mapping knowledge and understanding to show change after the outreach, World Café [17], graffiti walls, or moving physically into spaces to rate experiences such as good, OK or not good.
C. Observation: for example, observation of an outreach activity by a neutral observer, or use of video or photography to record the activity and/or interviews with participants.
D. Physical measurements: for example, collecting postcode data, web site analytics, or head counts.
E. Reviewing existing records and data: for example, planning notes, meeting minutes, personal logs and diaries.

You can find useful guides to different types of data collection methods from teaching and learning as well as from other sectors that engage with the public. For example, the Institute for Computer Based Learning at Heriot-Watt University produced a guide aimed at lecturers interested in evaluating materials for their effectiveness in achieving specific learning objectives, and the Scottish Health Council's guide and toolkit for health and social care practitioners provides excellent advice on public involvement and participation (see Table 2).
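As a rough sketch of working with physical measurements such as the postcode data mentioned in category D above, the short Python example below counts visitors per postcode district (the outward part of a UK postcode); the postcodes listed are invented for illustration.

from collections import Counter

# Invented visitor postcodes collected at a welcome desk.
postcodes = ["M13 9PL", "M14 5RB", "OL6 7PQ", "M13 9PY", "SK4 2AA", "M14 6GH"]

# The outward code (the part before the space) identifies the postcode
# district, which is enough to plot roughly where visitors travelled from.
districts = Counter(code.split()[0] for code in postcodes)
for district, count in districts.most_common():
    print(district, count)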


Fig. 4. An example of a closed question.

When collecting your evaluation data, remember to adopt the same ethical approach that you would when conducting your research. Treat all participants with respect and inform them that evaluation is taking place. Always ask permission to record or observe them and, particularly when working with young people, follow your institution's consent procedures or use the guidelines provided by the UK's NSPCC [18]. If you ask people for sensitive information such as their ethnic background, religious beliefs or political opinions, you need to be aware of the legal implications of the UK's Data Protection Act [19] or its equivalent in other countries. Be clear why you need personal information and, if it is essential to your evaluation, how you will anonymise the data, who will have access to it, where you will store it safely and how long you will keep it. If you are planning to publish your evaluation results then you need to go through your institution's ethics board.

Whichever technique you decide to use, it is very likely that you will have to ask a question or a series of questions to gather your evaluation data. Firstly, decide whether you want to use open or closed questions. Open questions invite free, open-ended responses, for example "Overall, what did you think of the activity?" They are more challenging to analyse but give a richer picture and can reveal unanticipated outcomes. Closed questions give a series of options for the participant to select from; an example is given in Fig. 4. Closed questions are easier to analyse but do not necessarily give you an in-depth insight into what people think. If you are asking a series of questions, such as in a survey, it is a good idea to use a mixture of closed and open questions.

When writing questions, be clear and avoid using language and jargon that could be misunderstood. Keep your language simple and easy to understand. Check that none of your questions are leading or biased in a way that could skew your results. For example, the question "Would you agree that the workshop increased your knowledge of biology?" would be better phrased as "Did the workshop increase your knowledge of biology?" Try to avoid asking people to be hypothetical and predict their behaviour too far into the future, for example "Would you visit this festival next year?"; it is not realistic for people to know what they will be doing in 12 months' time. Instead you could ask "How likely are you to visit the festival next year?" and offer scaled responses: very likely, likely, maybe, not very likely. Be careful not to be ambiguous or too vague: for example, "Where have you come from?" could get responses such as from the car park, from Birmingham, or from a specific address. And do not ask multiple questions in one, such as "Why did you come to the event today and what did you think of it?"; people tend to answer only part of the question or give an answer that might not be true for both elements. Aim to have a balanced number of positively and negatively framed questions, as this will help to ensure that participants answer truthfully, without feeling that they have to give the answers they think you want.

Finally, you might also like to consider having a question that looks for unanticipated positive (and negative) outcomes so you do not miss results you had not planned for, for example "Was there anything you were not expecting?" or "Is there anything else you would like to add?" It is not as easy as you might think to write effective questions, so allocate enough time to test them out, simply by asking your colleagues, family or friends, to ensure they will provide the information you want.

3.5. Analysing your evaluation data

When planning your evaluation, decide how much data you want to collect and how much time and capacity you will have available to analyse it. There is no point in spending energy on collecting data unless you plan to analyse it and learn from the results. Now consider how you will analyse the information you have collected to assess whether you have answered your key evaluation questions. How you do this depends on whether the data is quantitative (numbers) or qualitative (words and images). Useful information is given in various resources such as the RCUK guide to evaluating public engagement (see Table 2).

For quantitative data, use a spreadsheet or statistical package such as Microsoft Excel, Prism or SPSS. If you have used a questionnaire to collect your data, give each completed form a unique code and put that in the first column of your spreadsheet so you can refer back to original responses if you need to check any information later. Then enter your data so that each row in the spreadsheet corresponds to one respondent's answers. If you have used rating options as answers to your questions, code them, for example: strongly agree = 1, agree = 2, neither = 3, disagree = 4, strongly disagree = 5, not sure = 6. Enter these codes as the responses to each question. This then allows you to use functions within the spreadsheet package, such as frequencies and pivot tables, as well as visual tools such as bar charts. You can find an example of a spreadsheet in the RCUK guide (see Table 2).

There are various tools and methods you can use to analyse qualitative data. The BMJ has an excellent collection of resources on conducting and analysing qualitative research (see Table 2), and there is also qualitative analysis software such as NVivo [20]. One approach you can use is category analysis, which involves grouping similar responses into categories that can then be counted. Start by getting an overview of your responses and then look for commonalities, as these will form your categories. These can be simple, such as negative and positive comments, or more complex, depending on your questions and the responses. The aim is not to generalise the findings but to understand the views of the respondents. You can then use highlighter pens, Post-it notes or a sorting technique to identify which response sits under which category.
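The following Python sketch illustrates both steps described above: coding rating-scale answers and counting their frequencies, and a simple keyword-based category analysis of open comments. The responses, the category names and the keywords are all invented for the example.

from collections import Counter

# Invented answers to one closed question, one per respondent.
responses = ["strongly agree", "agree", "agree", "neither", "strongly agree",
             "disagree", "agree", "not sure", "agree", "strongly agree"]

# Code the rating options as numbers, as described above.
codes = {"strongly agree": 1, "agree": 2, "neither": 3,
         "disagree": 4, "strongly disagree": 5, "not sure": 6}
coded = [codes[answer] for answer in responses]
print("Frequency of each coded rating:", sorted(Counter(coded).items()))

# Simple category analysis of open comments: group free-text responses
# under categories using illustrative keywords, then count each category
# and express it as a percentage. Comments and keywords are invented.
comments = ["great fun, the kids loved it",
            "very educational show",
            "fun and interactive",
            "inspiring - my daughter wants to be a scientist",
            "we learnt a lot"]
categories = {"fun/interactive": ["fun", "interactive"],
              "educational": ["educational", "learnt"],
              "inspiring": ["inspiring", "scientist"]}
counts = Counter()
for comment in comments:
    for category, keywords in categories.items():
        if any(word in comment.lower() for word in keywords):
            counts[category] += 1
for category, count in counts.most_common():
    print(f"{category}: {count} ({100 * count / len(comments):.0f}%)")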


For example, in 2014 at Science Spectacular the graffiti wall responses completed by families to the question "What did you think of today?" were categorised into:

1. Visual drawings
2. Inspiring
3. Educational
4. Interactive, fun activities
5. Specifically named activities.

Now count how many responses are in each category. This number can then be reported or expressed as a percentage. You can also illustrate each category with audience quotes, but check they are representative and give a balanced perspective.

3.6. Using your findings

Once you have analysed your data, you should interpret what you have found. List your key findings, both positive and negative, link them to your evaluation questions and critically reflect on what you have learned. If you are conducting formative evaluation, ask yourself: What has worked well? What has not worked so well? What could be improved? What should we do differently next time? Identify recommendations that can be taken forward. If you are evaluating the impact of your science communication activity, assess whether you have been successful in achieving your intended outcomes. What evidence supports your conclusions? What stories are emerging? Are there any unanticipated outcomes you had not planned for?

3.7. Sharing your learning

The final section of your evaluation plan is how you will use and share your findings and recommendations. It is important to plan this now, not only to maximise the effort you are putting into the evaluation but also to ensure you achieve your original purpose, that is, to improve your practice or evidence the impact of your activities.

Firstly, identify whether there is a key audience you have to report to, such as your funders. For example, ‘Wriggling Rangoli’ was funded by a small community engagement grant and the funder asked for a case study to be submitted at the end of the project, whereas ‘From Supermarket to Sewers’ and ‘Science Spectacular’ both had to submit evidence for internal reporting of targets and to justify core funding (see Table 1). Also consider who else might be interested in your findings, including the people you identified earlier as audiences in your evaluation. A formal report may not be the most suitable format, so think about your intended recipients and the format they would find easiest to access, such as a case study, an infographic or a short video.

If you are required to produce a formal report and are not given a template to follow, you should cover the context of your evaluation, including: a brief description of your activity; your aims, objectives and key evaluation questions; your evaluation methodology, including your sampling and collection techniques and how many people were involved; a summary of your findings and the evidence (you may wish to put the actual data in an appendix), including data visualisation (for example word clouds or pie charts) and respondent quotes; how you approached the analysis; your conclusions and recommendations; and an executive summary which draws together all the key messages and recommendations.

If you decide to use a less traditional format, there are free software applications such as Canva [21] that allow you to create your own infographic. For example, Policy@Manchester used an infographic to share the findings of a programme of events held over Policy Week 2015 [22].

You could ask the public for three words that describe their experience of your event and then create a word cloud using a package like Wordle [23], which visually depicts the words used and their frequency. Your findings could be put on the web as a case study, a blog article or a paper published on bioRxiv [24], F1000Research [25] or figshare [26]. You could create postcards and posters. Whatever platform you use, be clear about your key messages and share what you have learnt, both the positive and the negative.

4. Final remarks

Evaluation can be seen as yet another task in an already demanding workload, but I hope that the examples provided here demonstrate how it can be an effective tool to reflect on and improve your science communication activities and, with the growing demand for evidencing impact, possibly contribute towards your evidence of research impact. It will add value to the effort and time you have invested in developing and delivering your activities, allowing you to demonstrate not only high quality engagement but also evidence of outcomes. You should aim to integrate evaluation into your science communication work so it becomes an everyday part of developing activities, events or programmes.

Evaluation does not have to be daunting and, more often than not, the simpler your approach, the more effective your results. Remember that the key to successful evaluation is planning, which enables you to be more focused and so ensures that you make the best use of your time and effort. Do not be too ambitious or get caught up in the drive for perfection. Be clear about your purpose, prioritise what you want to find out and be realistic. Think creatively about how you will collect your evaluation data. Consider how you might build it into your science communication activities so that it is easy for you to collect and your public positively want to engage with it, rather than feeling it is rushed and bolted on at the end. If you are engaging with a new audience you are not very familiar with, think about which other sectors work with that group and look for any guides they have produced about relevant data collection methods. Also consider consulting colleagues who are experts in evaluation, or social scientists who are more familiar with quantitative and qualitative research methods.

Having spent time evaluating your science communication activities, take the time to analyse the data and act on the findings. By following the steps outlined in this article you will be able to assess whether you are delivering high quality science communication activities and whether your aims and objectives are being met. Build up your own experience by evaluating your evaluation and learning from your successes and failures. If a plan does not work, reflect on what did not work and why, and build those lessons into your next plan. As your confidence grows, effective evaluation will become a natural component of your approach to science communication.

References

[1] W.K. Kellogg Foundation, Logic Model Development Guide, W.K. Kellogg Foundation, 1998, p. 3.
[2] The Royal Society, The Public Understanding of Science: Report of a Royal Society Ad Hoc Group Endorsed by the Council of the Royal Society, The Royal Society, 1985, p. 5.
[3] W. Bruine de Bruin, A. Bostrom, Assessing what to address in science communication, Proc. Natl. Acad. Sci. 110 (Suppl. 3) (2013) 14062–14068.
[4] The Royal Society, Science Communication: Survey of Factors Affecting Science Communication by Scientists and Engineers, The Royal Society, 2006.
[5] Consortium of UK Public Funders of Research, Factors Affecting Public Engagement by Researchers: Reflection on the Changing Landscape of Public Engagement by Researchers in the UK, Wellcome Trust, 2015.
[6] Research Councils UK, Beacons for Public Engagement, http://www.rcuk.ac.uk/pe/beacons/.

[7] Research Councils UK, Embedding Public Engagement in Research, http://www.rcuk.ac.uk/pe/embedding/.
[8] S. Duncan, P. Manners, C. Wilson, Building an Engaged Future for UK Higher Education: Summary Report from the Engaged Futures Consultation, NCCPE, 2014.
[9] Biotechnology and Biological Sciences Research Council, Policy on Maximising the Impact of Research, BBSRC, 2012, p. 1.
[10] E. Jensen, The problems with science communication evaluation, J. Sci. Commun. 13 (1) (2014) C04.
[11] H. King, K. Steiner, M.A.R. Hobson, H. Clipson, Highlighting the value of evidence-based evaluation: pushing back on demands for ‘impact’, J. Sci. Commun. 14 (2) (2015) A02.
[12] NCCPE, Educating Community Groups About Parasite Infection and its Impact, https://www.publicengagement.ac.uk/case-studies/educating-community-groups-about-parasite-infection-and-its-impact.
[13] Museum of Science and Industry, Manchester Science Festival, http://www.manchestersciencefestival.com/.
[14] University of Manchester, Science Spectacular, http://www.engagement.manchester.ac.uk/highlights/manchester science festival/science spectacular/index.html.


[15] G.T. Doran, There's a S.M.A.R.T. way to write management's goals and objectives, Manage. Rev. 70 (1981) 35–36.
[16] A. Prokop, S. Allan, The Brain Box, https://mcrbrainbox.wordpress.com/.
[17] The World Café, http://www.theworldcafe.com/.
[18] NSPCC, Guidance for photographing and recording children during events and activities, https://www.nspcc.org.uk/preventing-abuse/safeguarding/photography-sharing-images-guidance/.
[19] UK Government, Data Protection Act, https://www.gov.uk/data-protection/the-data-protection-act.
[20] NVivo, http://www.qsrinternational.com/nvivo-learning/getting-started/win11.
[21] Canva, https://www.canva.com/create/infographics/.
[22] University of Manchester, Policy Week 2015 (Policy@Manchester), http://documents.manchester.ac.uk/display.aspx?DocID=26809.
[23] Wordle, http://www.wordle.net/.
[24] bioRxiv, http://www.biorxiv.org/.
[25] F1000Research, https://f1000research.com/.
[26] figshare, https://figshare.com/.