Effective Evaluation of Equine Extension Programs


Journal of Equine Veterinary Science 32 (2012) 616-619. doi:10.1016/j.jevs.2012.02.006


Short Communication

Krishona Lynn Martinson, PhD (a), Thomas Bartholomay, MS (a), Kathleen P. Anderson, PhD (b), Christine Skelly, PhD (c), Elizabeth Greene, PhD (d)

(a) Department of Animal Science, University of Minnesota, St. Paul, MN
(b) Department of Animal Science, University of Nebraska, Lincoln, NE
(c) Department of Animal Science, Michigan State University, East Lansing, MI
(d) Department of Animal Science, University of Vermont, Burlington, VT

Corresponding author: Krishona Lynn Martinson, PhD, Department of Animal Science, University of Minnesota, 1364 Eckles Avenue, St. Paul, MN 55108. E-mail: [email protected]

Article history: Received 16 December 2011; Received in revised form 27 January 2012; Accepted 27 February 2012; Available online 5 April 2012

Keywords: Behavior change; Demographics; Evaluation; Knowledge gain; Program impact

Abstract

Evaluation has become a more significant component of planning and delivering extension programs, as federal partners and granting agencies are requesting information on program and integrated grant outcomes, including participant learning gains, behavior change, and program-generated impacts. Effective evaluation of equine extension programs involves a balance between asking enough well-designed questions to obtain desired information and keeping the evaluation tool brief enough to encourage participant completion. For most faculty, the difficulty with evaluation lies in developing appropriate and useful questions. The objective of this article was to share examples of questions successfully used to evaluate six key equine extension program areas: participant demographics, program logistics, participant behavior change, participant knowledge gain, teaching effectiveness, and program impact. Data generated by postprogram evaluations can be a source of statistically sound information that can be shared with administration, stakeholders, and granting agencies. Extension personnel can use evaluation data to improve planning and delivery of extension programs and to demonstrate teaching ability and program impacts.

© 2012 Elsevier Inc. All rights reserved.

1. Introduction

The importance of extension program evaluation is well documented [1,2]. Increasingly, federal partners and granting agencies, such as the United States Department of Agriculture National Institute of Food and Agriculture, are requesting information on integrated grant and program outcomes, including participant learning gains, behavior change, and program impacts. As university faculty are being asked to demonstrate their effectiveness in generating behavior change and impacts [3], evaluation has become a more significant component of planning and delivering extension programs. Results of program evaluations are used to improve extension programs, assess teaching faculty for promotion and tenure, and report to stakeholders and federal partners to secure existing and future funding.

The University of Minnesota (UMN) equine extension team used an industry evaluation to develop its adult equine extension program in 2004 [4]. Since then, postprogram evaluations have been used to assess program effectiveness and to aid in program planning. However, the difficulty with evaluation lies in developing appropriate and useful questions that yield measurable, desired feedback. Our objective is to share examples of questions successfully used to evaluate six key equine extension program areas: participant demographics, program logistics, participant behavior change, participant knowledge gain, teaching effectiveness, and program impact.

2. Materials and Methods

In Minnesota, local focus groups are used to develop multiple face-to-face adult equine extension programs throughout the state. Programs offer concurrent topics presented as a mixture of hands-on and lecture sessions, range in length from a half day to a full day, and attract 50-100 paying participants. A UMN extension evaluation specialist was enlisted to develop postprogram evaluation questions aimed at effectively collecting and documenting participant demographics, knowledge gains, behavior change, program impact, and satisfaction with program logistics. The resulting two-page postprogram evaluation consisted of three sections focusing on (1) presenters and subject matter, (2) program logistics and outcomes, and (3) demographic information of the participants. Paper copies of the postprogram surveys were distributed at the beginning of the program and collected immediately after the program concluded. The evaluation tool was broad enough to use at all UMN equine extension programs, allowing data to be combined across multiple years. Numerous other states and the eXtension HorseQuest Community of Practice have since used this evaluation tool to collect postprogram information [5].

3. Results and Discussion

The evaluation questions listed below have been successfully used in equine extension programs at the UMN since 2007 and at the University of Nebraska, Michigan State University, University of Vermont, and eXtension HorseQuest Community of Practice since 2009 [3,5].

3.1. Participant Demographics

Knowledge of participant demographics is essential for better understanding audience needs and developing more effective programming. Demographics also allow for statistical evaluation of changes and impacts within and between subgroups of participants. Although demographic questions are generally easier to develop, they still require a deliberate strategy (Table 1).

Table 1. Questions and response options for evaluation of participant demographics

1. How many horses do you own and/or manage for others? Response options: none; 1-5; 6-10; or 11 or more.
2. How many acres do you own/rent that are designated for pasture or turnout? Response options: none; 1-5; 6-10; or 11 or more.
3. How do you describe yourself? Response options: horse owner; barn manager; industry professional; or other.
4. What is your age? Response options: teens; 20s; 30s; 40s; 50s; or 60 or older.
5. What is your gender? Response options: male or female.
6. How many equine extension programs have you attended in the past? Response options: none, this is my first program; one; two; or three or more.
7. How far did you travel to attend this program? Response options: <25 miles; 26-50 miles; 51-75 miles; 76-100 miles; or >100 miles.
8. Have you ever visited our Website? Response options: yes or no.
9. Do you have access to and use the Internet and/or e-mail? Response options: yes or no.
10. What horse-related topics would you like to learn about at future equine extension programs (circle only five topics)? Response options: basic care; breeding and foaling; care of elderly horses; dentistry; facilities; fly and pest control; horse health; hoof care and lameness; horse behavior; horse hay; legal issues; manure management; minerals and vitamins; nutrition; pasture management; poisonous plants; prepurchase examinations; rehabilitation therapies; research updates; supplements; tack fitting; trail riding; trailering; vaccinations; and other.

Demographic data can be used in a variety of ways. From 2006 through 2011, postprogram evaluation results indicated that 35% of UMN equine extension program participants were in their 50s and 12% were 60 years of age or older, so these two groups accounted for almost half of all program participants. In addition, 85% of participants were female, and 96% had access to or used the Internet and e-mail. Based on these demographic data, the UMN equine extension team developed a Facebook fan page with the goal of expanding its programmatic reach through social media. The fastest growing demographic of Facebook users is women aged between 30 and 60 years [6], which closely matches the primary audience of most adult equine extension programs, and more than 1.4 million Facebook users "like" horses [7].
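The subgroup comparisons mentioned above can be produced with a simple cross-tabulation of evaluation responses against a demographic question from Table 1. The sketch below is a minimal, hypothetical example (the field names and sample records are illustrative, not drawn from the published data set) that crosses the age question with the planned-change question discussed in Section 3.3.

```python
# Minimal sketch: cross-tabulate an evaluation response by a demographic subgroup.
# The records below are hypothetical; real data would come from the paper evaluations.
from collections import defaultdict

responses = [
    {"age": "50s", "plans_change": "strongly agree"},
    {"age": "50s", "plans_change": "agree"},
    {"age": "30s", "plans_change": "neutral"},
    {"age": "60 or older", "plans_change": "agree"},
]

# counts[age_group][answer] -> number of participants giving that answer
counts = defaultdict(lambda: defaultdict(int))
for record in responses:
    counts[record["age"]][record["plans_change"]] += 1

for age_group, answers in sorted(counts.items()):
    total = sum(answers.values())
    breakdown = "; ".join(f"{answer}: {n}/{total}" for answer, n in sorted(answers.items()))
    print(f"{age_group} -> {breakdown}")
```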

3.2. Program Logistics

Although participant responses about program logistics are not commonly shared with administration and stakeholders, participant input is valuable when planning future extension programs. Also, by regularly monitoring the perceived value of programs relative to their cost, organizers can make decisions pertaining to appropriate fee structures. From 2007 through 2011, 97% of UMN extension horse owner program attendees indicated that programs were very affordable (64%), fairly affordable (16%), or affordable (17%). This validated the program fee structure of $15-$50, because participants believed the programs were worth the cost.

The following questions, with various response scales, can be asked to evaluate program logistics; assigning each response option the numerical value shown in parentheses helps quantify and compare responses (a scoring sketch follows the questions below):

Question 1: To what degree were you satisfied with the materials/handouts from this program? Response options are very dissatisfied (1), dissatisfied (2), satisfied (3), and very satisfied (4).
Question 2: To what degree do you value the program and materials provided compared with the cost of the program? Response options are not affordable (1), somewhat affordable (2), affordable (3), fairly affordable (4), and very affordable (5).
Question 3: Why did you choose to attend this location? Response options are date, facility, location, topics, and speakers.
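Because each response option carries a numerical value, logistics answers can be converted to scores and compared across programs or years. The following is a minimal sketch, assuming the affordability answers to Question 2 are recorded as the text labels above; the sample answers are hypothetical.

```python
# Minimal sketch: convert affordability labels (Question 2 above) to their
# numerical values and summarize them. Sample answers are hypothetical.
AFFORDABILITY_SCALE = {
    "not affordable": 1,
    "somewhat affordable": 2,
    "affordable": 3,
    "fairly affordable": 4,
    "very affordable": 5,
}

answers = ["very affordable", "affordable", "very affordable", "fairly affordable"]

scores = [AFFORDABILITY_SCALE[a] for a in answers]
mean_score = sum(scores) / len(scores)

# Share of participants rating the program "affordable" or better (value >= 3)
share_affordable = sum(s >= 3 for s in scores) / len(scores)

print(f"Mean affordability score: {mean_score:.2f}")
print(f"Rated affordable or better: {share_affordable:.0%}")
```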


3.3. Participant Behavior Change

Documenting behavior change helps identify participants' plans to apply new knowledge and skills gained from programs. Actual, as opposed to planned, implementation of new knowledge or skills can be determined through follow-up surveys sent electronically or via the U.S. Postal Service 3-6 months after the program. Behavior change can also be gleaned by asking repeat attendees to report changes they have made to their horse operation as a result of previous extension programs. When providing lists of possible changes, topics should reflect previous extension program content. To assess potential and actual participant behavior change, the following questions and response scales can be used:

Question 1: Based on this information, I plan to make at least one change in my horse operation. Response options are strongly disagree (1), disagree (2), neutral (3), agree (4), and strongly agree (5).
Question 2: If you have attended an equine extension program in the past, please check all changes you have made to your horse farm/operation based on information learned at that program. Response options are (1) buy or grow better horse hay, (2) improved or developed a first aid kit, (3) improved hoof care management, (4) had the soil in my pasture tested, (5) improved fly and pest control around the horse facility, (6) now rotationally graze pasture, (7) improved vaccination and deworming, (8) improved bandaging skills, (9) made changes to a horse's nutrition, (10) other, and (11) I have not made any changes to my horse farm or operation based on equine extension programs I have attended.

From 2007 through 2010, past UMN equine extension program participants indicated that the top five changes they incorporated into their farm operation were changes to horse nutrition, making or improving a first aid kit, improving vaccination and deworming protocols, incorporating rotational grazing, and buying or growing better horse hay. These data can be shared with administration and stakeholders to demonstrate participant behavior changes directly attributable to extension programming.
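A check-all-that-apply question such as Question 2 above can be tallied to produce a ranked list of reported changes, like the top-five summary in the preceding paragraph. A minimal sketch with hypothetical responses:

```python
# Minimal sketch: tally check-all-that-apply answers (Question 2 above) and
# report the most frequently selected changes. The responses listed are hypothetical.
from collections import Counter

responses = [
    ["changes to horse nutrition", "improved first aid kit"],
    ["improved vaccination and deworming", "changes to horse nutrition"],
    ["rotational grazing", "buy or grow better horse hay", "changes to horse nutrition"],
]

change_counts = Counter(change for checked in responses for change in checked)

print("Most frequently reported changes:")
for change, n in change_counts.most_common(5):
    print(f"  {change}: {n}")
```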

3.4. Participant Knowledge Gain

Documented knowledge gain by participants is a measure often used to assess program success. Pre-post retrospective questions can be particularly useful because participants are able to make more realistic determinations of their previous knowledge after they have heard the presentation, thereby decreasing participant overestimation of beginning knowledge. Participant knowledge gain not only aids in determining program impacts but also tends to align with, and reinforce, participant behavior changes and faculty teaching ability. In theory, the greater the percentage of knowledge gain, the more effective the faculty member was at teaching the topic. Pre-post retrospective questions are also useful for assessing previous knowledge of broader participant groups, as a type of needs assessment. By knowing the clientele's general knowledge level, the curriculum can be designed and technical levels can be set to address knowledge gaps. The following two pre-post retrospective questions and response options can be used to help determine participant knowledge gain:

Question 1: Before this presentation, how much did you know about this subject? Response options are very little (1), little (2), some (3), much (4), and very much (5).
Question 2: Now I know ____ about this subject. Response options are very little (1), little (2), some (3), much (4), and very much (5).

In the fall of 2009, 51 horse owners attended an extension horse owner program focusing on equine metabolic syndrome and genetics. Initial and final knowledge levels were summarized by averaging the weighted participant responses from the evaluation questions. The initial average value was then divided by the final average value and multiplied by 100 to determine the percentage knowledge gain. Participants indicated average knowledge levels of 2.1 and 1.8 before the presentations on equine metabolic syndrome and genetics, respectively. Participant knowledge levels increased to 3.7 (equine metabolic syndrome) and 3.4 (genetics) after the presentations. This represented knowledge gains of 57% and 53% for the topics of equine metabolic syndrome and genetics, respectively.
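The knowledge-gain calculation described above, dividing the initial average rating by the final average rating and multiplying by 100, can be reproduced directly from the reported group averages. A minimal sketch (individual participant ratings are not published, so the averages are entered directly):

```python
# Minimal sketch of the percentage-knowledge-gain calculation described in
# Section 3.4: initial average rating divided by final average rating, times 100.
# The averages are those reported for the fall 2009 program.
def percentage_knowledge_gain(initial_avg: float, final_avg: float) -> float:
    """Return the percentage knowledge gain as defined in Section 3.4."""
    return initial_avg / final_avg * 100

topics = {
    "equine metabolic syndrome": (2.1, 3.7),
    "genetics": (1.8, 3.4),
}

for topic, (initial_avg, final_avg) in topics.items():
    gain = percentage_knowledge_gain(initial_avg, final_avg)
    print(f"{topic}: {gain:.0f}% knowledge gain")  # prints 57% and 53%
```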

3.5. Teaching Effectiveness

To further evaluate teaching ability, questions from the UMN's Student Rating of Teaching evaluation were included in UMN equine extension program evaluations. Evaluating extension faculty on the same scale as teaching faculty is important for promotion and tenure and for continued improvement in teaching. The following question and response scale can be used to evaluate extension teaching:

Question 1: The instructor presented the subject matter clearly. Response options are strongly disagree (1), disagree (2), neutral (3), agree (4), and strongly agree (5).

3.6. Program Impact

Program impact can be the most challenging program component to measure. All the questions previously discussed can be used collectively to demonstrate program impact. However, the following questions, with various response scales, can be included on postprogram evaluations to help further identify program impacts:

Question 1: I found this information useful in management of my horse operation. Response options are strongly disagree (1), disagree (2), neutral (3), agree (4), and strongly agree (5).
Question 2: To what degree were you satisfied with this program? Response options are very dissatisfied (1), dissatisfied (2), satisfied (3), and very satisfied (4).

Fig. 1. An example format for evaluating extension program impact and teaching effectiveness. The form lists the topic and presenter; asks participants to rate "I found this information useful in management of my horse operation," "Based on this information, I plan to make at least one change in my horse operation," and "The instructor presented the subject matter clearly" on a 1-5 scale from strongly disagree (1) to strongly agree (5); asks "Before this presentation, how much did you know about this subject?" and "Now I know…" on a 1-5 scale from very little (1) to very much (5); and provides space for comments.

Question 3: To what degree did this program improve your ability to make more informed decisions regarding your horse farm or operation? Response options are very low (1), low (2), fair (3), high (4), and very high (5).

Figure 1 gives an example of how evaluation questions can be combined to assess program impact, including teaching effectiveness. This format can be repeated for each topic and speaker combination at a program. It is also important to ask for, and provide space for, written comments.

4. Conclusions

Effective evaluation of equine extension programs involves a balance between asking enough well-designed questions to obtain the desired information and keeping the evaluation tool brief enough to encourage participant completion. Evaluation questions that have withstood the test of time can be used in many extension programs to gather information on participant demographics, program logistics, participant behavior change, participant knowledge gain, teaching effectiveness, and program impact. All these areas are necessary to effectively evaluate equine extension programs. Data generated by postprogram evaluations can be a source of statistically sound information that can be shared with administration, stakeholders, and granting agencies. Extension personnel can use evaluation data to improve the planning and delivery of extension programs and to demonstrate teaching ability and program impacts.

References

[1] Meyer M, Foord K. Consumer preferences and perceptions of gardening information. Hort Technol 2008;18:162-7.
[2] Higginbotham GE, Kirk JH. Survey results from participants of a short course for dairy herdsman. J Ext [Online] 2006;44(2), Article 2RIB4. Available at: http://www.joe.org/joe/2006april/rb4.php.
[3] Martinson K, Bartholomay T. Evaluating equine extension and outreach programs. J Equine Vet Sci 2009;29:454-5.
[4] Martinson K, Hathaway M, Wilson J, Gilkerson B, Peterson P, Del Vecchio R. University of Minnesota horse owner survey: building an equine extension program. J Ext [Online] 2006;44(6). Available at: http://www.joe.org/joe/2006december/rb4.shtml.
[5] Anderson K, Greene EA, Martinson K. Assessing the impact and usefulness of extension horses. J Equine Vet Sci 2011;31:345-6.
[6] Pew Research Center. Global publics embrace social networking [Online]. Available at: http://pewglobal.org/files/2010/12/Pew-GlobalAttitudes-Technology-Report-FINAL-December-15-2010.pdf. Accessed December 15, 2011.
[7] Facebook Horses page [Online]. Available at: http://www.facebook.com/#!/pages/Horses/111933198826503. Accessed December 15, 2011.