Evaluation of Evaluation '85


Although diverse methodologies were represented at the conference, a concern exists that the range or scope of the evaluation profession remains somewhat limited. On the positive side, there were some presentations regarding evaluation in automated environments. These presentations seem very relevant, considering the increased use of computers in all types of organizations. As evaluators, we will continue to face the challenge of carrying out evaluation in these automated environments and with these computer tools. Presentations in this area provide evidence that evaluation can adapt to change and continue to grow. Overall, we believe the use of evaluation will continue to thrive and prosper in business. However, in a desire to ensure the health of our common organization, we continue to support efforts to encourage greater contribution from other professionals in business and industry. We realize the effort must be put forth not only by the academically oriented organizations, but also by those of us in the private sector. The conference, we believe, has the potential to increase its scope and to meet the needs of a small but growing constituent group. It has to some extent met our needs as a vehicle for sharing the work we do, as a resource for potential employees, and as a place to learn and exchange ideas with others involved in the profession. We hope future conferences will show increased participation from private sector evaluators.

Evaluation of Evaluation '85

James E. McLean
Research and Evaluation Committee
University of Alabama

Evaluation '85 was evaluated by asking participants to fill out evaluation feedback forms. A total of 168 of the pre-session participants and 93 of the approximately 861 regular session participants filled out the evaluation surveys. The pre-session survey forms were distributed and collected at the end of each session. Those returned represent over 90% of those who attended. The regular session evaluation forms were distributed as part of the information packet participants received upon on-site registration. Participants were asked to return the completed forms to a box at the registration table or mail them to an address on the form. Of the 105 completed forms, 23 were returned by mail.

PRESESSION WORKSHOP EVALUATION

Of the presession participants returning survey forms, 39% indicated they had previous evaluation experience as a manager, 51% had experience as a manager of program evaluation, and 86% had experience as a program evaluator. Satisfaction with the various aspects of the presessions was quite high. In all, 83% were either "satisfied" or "very satisfied" with the lectures, 70% with the case studies, 75% with the discussion, and 77% with the use of examples. The program leaders also were rated very well. Of the respondents questioned, 93% rated the leaders' content expertise as either "good" or "excellent," 85% rated the leaders' leadership "good" or "excellent," 77% rated their group skills "good" or "excellent," and 89% rated the program leaders' ability to communicate "good" or "excellent." Further, 70% of the participants indicated that the amount of time allocated was "about right." Using a 4-point graphic scale (with 4 high) to rate the degree to which the workshops' goals were met, the mean was 3.1 with a standard deviation of .9. Over 78% of the participants were positive in their response to this question.

EVALUATION OF REGULAR MEETING

Of the 105 respondents to the evaluation survey, 31% were members of the Canadian Evaluation Society, 53% were members of ENet, and 44% were in the Evaluation Research Society; 28% belonged to more than one of the organizations. The primary occupational setting for 39% of the respondents was universities or colleges, followed closely by government, which included 37% of the respondents. Consulting was the next largest category with 10% of the respondents.

TABLE 1
Descriptive Evaluation of Evaluation '85

                                             Percentage choice distribution*
                                          Ineffective                  Effective
Feature                             N**      1      2      3      4      5    Mean     SD
Awards luncheon                     60       0      5     22     35     38     4.1    .90
Exhibits                            78      23     40     26      9      3     2.3   1.01
Invited speakers                    89       0      1     24     45     30     4.0    .77
Job bank                            28       4     32     39     18      7     2.9    .98
Panels/Symposia                     95       1      3     24     51     21     3.9    .82
Paper presentations                 98       3      6     33     41     17     3.6    .95
Presessions                         57       5      7     18     33     37     3.9   1.15
Round tables                        53       -      6     21     40     30     3.9   1.04
Software information exchange       12      17     17     25     33      8     3.0   1.28
Scheduled social hours              84       2      5     18     32     43     4.1   1.01
Teaching of evaluation materials
  exchange                          11       0      0      0     36     63     4.6    .51
Workshops format                    44       7      7     23     41     23     3.7   1.12

*Rounded to nearest percentage.
**Excludes both omissions and "not able to judge" responses.

No other occupational setting had more than 5% of the respondents. The primary job responsibility of 58% of the respondents was evaluation, with much smaller percentages reporting administration, research, and teaching as their primary responsibilities.

Participants were asked to rate each feature of Evaluation '85 on a 5-point scale, where 1 = "very ineffective" and 5 = "very effective." Table 1 presents a summary of these data. The data in Table 1 may be interpreted in a number of ways. For example, the column labeled "N" (number of responses) in the table can be viewed as a relative index of the number of participants who took advantage of each feature. The most popular features were paper presentations, panels/symposia, and invited speakers. The "percentage choice distribution" and mean can be viewed as an index of the perceived effectiveness of each feature. The teaching of evaluation materials exchange was viewed as most effective (mean = 4.6), whereas the exhibits were viewed as the least effective feature (mean = 2.3). It is interesting to note that the least attended feature (the teaching of evaluation materials exchange, with N = 11) was also viewed as the most effective.


The last column in the table, SD, can be viewed as an index of agreement about the effectiveness of a feature. The most agreement was found for the teaching of evaluation materials exchange (SD = .5), whereas the least agreement was found for the software information exchange (SD = 1.28). It is interesting to note that these were also the two features least well attended by the respondents.
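As a rough illustration of how the means and standard deviations in Table 1 follow from the published rating distributions, the short Python sketch below recomputes both statistics for the scheduled social hours feature. It uses the rounded percentages from Table 1, so the results only approximate the reported mean of 4.1 and SD of 1.01; the script is an illustrative reconstruction, not part of the original survey analysis.

# Recompute the mean and standard deviation for one Table 1 feature from its
# rounded percentage distribution on the 1 (very ineffective) to
# 5 (very effective) rating scale. Percentages are for "Scheduled social hours".
ratings = [1, 2, 3, 4, 5]
percentages = [2, 5, 18, 32, 43]

total = sum(percentages)                      # close to 100, up to rounding
mean = sum(r * p for r, p in zip(ratings, percentages)) / total
variance = sum(p * (r - mean) ** 2
               for r, p in zip(ratings, percentages)) / total
sd = variance ** 0.5

print(f"mean = {mean:.2f}, sd = {sd:.2f}")    # about 4.1 and 1.0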

SUMMARY

Overall, Evaluation '85 was viewed by the participants in a very positive manner. A high percentage of pre-session participants were satisfied or very satisfied with the quality of their sessions. The most popular features of Evaluation '85 (paper presentations, panels/symposia, and invited speakers) were rated as effective. The exhibits at Evaluation '85 were viewed less favorably by the participants.

AEA Membership and Recruitment Committee

Jana Kay Smith

Marc Braverman

Membership and recruitment will play an important role in AEA activities this year. Our 1986 goals include the following:

• Conduct a mail campaign that will reach 20,000 individuals with invitations to join AEA and attend the annual conference;
• Recruit student members by distributing posters and brochures to training institutions; and
• Develop a membership directory that will list professional affiliations, to be updated at regular intervals.

One strategy that has often proved highly successful in the past is growth through personal contact. Association members have shared information on their organization with their colleagues, and have pro-