
Studies in Educational Evaluation, Vol. 9, pp. 319-325, 1983
0191-491X/83 $0.00 + .50
Copyright © 1984 Pergamon Press Ltd. Printed in Great Britain. All rights reserved.

EVALUATION: APPRAISING THE SYSTEM*

Donald E. Elson
Virginia Polytechnic Institute and State University, College of Education, Blacksburg, Virginia 24061-3299

* Article adapted from a paper of the same title presented at Evaluation '82, Baltimore, Maryland, October 28-30, 1982.

INTRODUCTION

Government-funded, large-scale educational programs carry strict requirements for the performance of annual evaluations. Within this yearly cycle, the question of the utility of the evaluation findings becomes increasingly important. This paper examines the efforts made within one such government-funded program to increase the relevance and the usefulness of a mandated annual evaluation report.

The context of the case study to be examined is a statewide vocational education program operating within the state of Virginia. Under United States law (Federal law P.L. 94-482), all such vocational programs must undergo mandatory evaluation once every five years. The intent of Congress was to use the evaluations for management and improvement of programs. These evaluations can be designed with an emphasis on program improvement, compliance, or a combination of both. Hendrickson (1981) found that while evaluation activities varied in scope and depth, compliance seemed to be the only outcome in many states.

The current trend within the U.S. of reduced funding for educational programs from all levels of government does not automatically reduce the need for evaluation data. Limited budgets make it imperative that a concerted effort be made to provide the vital information needed by state and local planners in the most efficient and expeditious manner. Norton and McCaslin (1976, p. 329) indicated: "The evaluation effort should be viewed as a process which seeks program improvement and progressive change rather than program condemnation." They described improvement stating:

    Improvement implies change. There is some risk involved, of course, with any change because only positive change is useful and desirable. To insure that changes made are positive ones which will lead to increased program effectiveness and/or efficiency, carefully designed and conducted evaluations are essential as a source of reliable information. Given factual information about a program, the decision maker can identify alternative actions and choose the action or combination of actions most likely to result in improved practice (p. 238).

Improved practice, through the use of factual program information by school divisions, is the major goal of the Vocational Education Evaluation in Virginia (VEEVA) system, which has been developed, refined by field test, and implemented in Virginia (Elson & Frary, 1982a).

VEEVA: THE SYSTEM

The Divisions of Vocational and Adult Education, Virginia Department of Education, contracted with Virginia Polytechnic Institute and State University to develop and implement strategies for evaluating vocational education programs across the state. The Supervisor for State Planning and Evaluation is responsible for all program improvement evaluations. While the activities of VEEVA provide data for meeting the Federal mandate, the major goal of the project is to improve vocational education.

Program evaluation for the purpose of improving vocational education in Virginia has gone through an evolutionary process. Beginning in 1973, the evaluation was conducted as a local self-evaluation. On-site team evaluations and extensive data collection procedures were developed and field tested as more emphasis was placed on the importance of evaluation and the need to verify the self-evaluation. Use of a representative sample of school divisions each year was initiated in the 1978-79 school year. For the purposes of this study, the term "school division" includes jointly administered vocational centers. The school divisions were divided into five representative groups to facilitate the process of evaluation. Grouping of the school divisions was based on geographic location, division size, nature of the locality, size of the division's total vocational offerings, and joint service to localities by one facility.

The evolutionary process continued as attempts were made to provide the most efficient evaluation system possible. With reduced funding and increased evaluation activities by other state agencies, on-site team evaluations were discontinued after the 1980-81 school year. The evaluation instruments were finalized prior to the 1980-81 year.

A division report is prepared for each school division to present the results of the analysis in tabular form. The report contains the results from the Program Evaluation Form for the division, the Teacher Questionnaire by school, and the Student Questionnaire by school and by program within the school. Also included in the report are program enrollment data provided by the Virginia Vocational Research Coordinating Unit.

For economic reasons and to protect the privacy of the data, distribution of these reports is restricted by the VEEVA staff to the primary stakeholders. These include the school division administrators (superintendents, evaluation coordinator, and principals), the Supervisor for State Planning and Evaluation, and the state vocational program area supervisors. Any additional distribution of the report to other stakeholders is the prerogative of the school division administrators. The administrators are encouraged to share the report with their vocational teachers, students, parents, and advisory councils. A state report, containing a summary of the data and conclusions, is prepared for more general distribution (Elson & Frary, 1982b).

Upon receipt of a school division report, it is the responsibility of each state vocational program area supervisory staff to work with their particular vocational instructors in the school division and the local director of vocational education to develop recommendations for improving the vocational program. The development of the recommendations may be accomplished in one visit, but it may take a series of visits throughout the year to bring about the needed improvements.

PURPOSE OF THE STUDY

Those being evaluated seldom have an opportunity to evaluate the evaluation system. The purposes of this study are to appraise the usefulness of the VEEVA evaluation procedures and reports to the school division and to solicit recommendations from the evaluation coordinators for improving the evaluation system. More specifically, the study seeks to answer the following research questions:

1. Who has access to the division report?
2. How useful is the division report?
3. How helpful are the visits by the state supervisory staff?
4. How can the VEEVA system be improved?

To answer these questions, data were collected from 49 evaluation coordinators through a two-page questionnaire. The questionnaire included checklist items, rating scale items, and open-ended questions soliciting suggestions for improving the procedures.
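As an illustration of the tabulation behind Tables 1 and 2, the sketch below shows how check-all-that-apply questionnaire responses can be turned into counts and percentages of the 48 usable questionnaires. The respondent records and item names are hypothetical and are not data from the study, and Python is used purely for illustration; only the arithmetic (item count divided by 48, rounded to the nearest whole percent) is consistent with the figures reported in the tables.

# Minimal sketch (hypothetical data): tabulating check-all-that-apply
# questionnaire items into counts (N) and percentages of usable responses.
# Only the arithmetic mirrors Tables 1 and 2; the records are invented.

USABLE_RESPONSES = 48  # usable questionnaires returned by coordinators

# Each respondent checks every group that has seen the division report.
responses = [
    {"superintendent", "principals", "all teachers"},
    {"superintendent", "local supervisors"},
    {"principals", "advisory council members"},
    # ...remaining hypothetical respondents omitted
]

def tabulate(checked_sets, n_usable):
    """Return {item: (count, percent)} for a check-all-that-apply item."""
    counts = {}
    for checked in checked_sets:
        for item in checked:
            counts[item] = counts.get(item, 0) + 1
    # round half up to the nearest whole percent
    return {item: (c, int(100 * c / n_usable + 0.5)) for item, c in counts.items()}

for item, (count, pct) in sorted(tabulate(responses, USABLE_RESPONSES).items()):
    print(f"{item:30s} N={count:2d}  {pct:3d}%")

For example, the 42 coordinators who reported sharing the report with the superintendent yield int(100 * 42 / 48 + 0.5) = 88, the percentage shown in Table 1.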

RESULTS

The results have been organized and presented in accordance with the above research questions. The information presented in this section is based on the usable responses from 48 evaluation coordinators.

1. ACCESS TO THE REPORT

In an attempt to determine access to the VEEVA division report, the evaluation coordinators were asked who had seen the report. As shown in Table 1, the report was shared with 88% of the division superintendents and 85% of the principals. A study of Table 1 shows that six coordinators did not share the division report with the teachers. Involving teachers in the VEEVA regional meetings was suggested as a way of increasing their access and involvement. Fourteen coordinators did not share the report with the advisory councils. Six coordinators shared the report with the advisory council chairperson only, while seven coordinators shared the report only with the advisory council members. Both the chairperson and members had access to the report in 21 school divisions.

2. HOW USEFUL WAS THE REPORT?

Many times evaluations are conducted, but the results are never used overtly. However, in this study 44, or 92%, of the evaluation coordinators used the evaluation data to plan for improved individual programs (Table 2). The data were used in 38 school divisions to help individual teachers. It should be noted that three evaluation coordinators said the report was "stuck on the shelf." A more formal implementation phase was suggested as a way of improving the use of the evaluation results.

TABLE 1: Access to VEEVA Division Report

Items                                                          N*    %
a. Superintendent                                              42   88
b. Principals                                                  41   85
c. Local supervisors                                           25   52
d. Department chairpersons                                     31   65
e. All teachers                                                34   71
f. Representative sample of teachers                            8   17
g. Advisory council chairperson                                27   56
h. Advisory council members                                    28   58
i. Parents                                                      3    6
j. Other (school boards, director of vocational education,
   director of guidance, state supervisor, administrative
   review vocational representative)                            6   13

*N=48 - Respondents were asked to check all applicable items.

TABLE 2: Use of Division Report

Items                                                          N*    %
a. Planning for improving individual programs                  44   92
b. Helping individual teachers                                 38   79
c. Selecting Advisory Council projects                          8   17
d. As documentation for needed funding                         18   38
e. As good publicity for vocational education                  16   33
f. It has just been stuck on the shelf                          3    6
g. Other: "Improved communication at all levels. Big
   improvement over previous system."                           1    2

*N=48 - Respondents were asked to check all applicable items.

3. HOW HELPFUL WERE THE RESULTS?

VEEVA provides the data to the decision-makers for their use in improving vocational programs. A major step in the decision-making process is the formulation of recommendations based on the VEEVA data and other data which may be available. The evaluation coordinators were asked to rate the helpfulness of the state supervisors' visits to their school divisions. The visits were rated as moderately to extremely helpful by 87% of the coordinators. The coordinators suggested that more follow-up be provided by the supervisors and that they work more directly with the teachers.

A series of one-day workshops was conducted across the state to provide local vocational administrators with basic information on the utilization of evaluation data as part of another state-funded project. It was considered important to determine if the evaluation coordinators needed additional assistance in understanding and using the evaluation results beyond that provided by the report and by the supervisory visits. Approximately 80% of the coordinators indicated some need for such assistance.

4. IMPROVING THE VEEVA SYSTEM

Additional suggestions given by the evaluation coordinators for improving VEEVA procedures included using fewer forms, creating separate versions of the Program Evaluation Form to make it more specific to vocational programming, and improving the quality of the forms. It was also suggested that the importance and accuracy of the data should be stressed to teachers at their summer conferences. When asked for suggestions for improving the report, the coordinators wrote that the report should be kept short and simple and that the recommendations should be tailored to each division. The exact meaning of the last suggestion is not clear, since no specific recommendations are included in the report.

RECOMMENDATIONS AND IMPLICATIONS

These recommendations and implications, while based on experiences with the system used within the state of Virginia, nonetheless provide insight into the improvement of evaluation systems and the use of evaluation results generally.

1. INCREASE ACCESS TO THE EVALUATION REPORT BY STAKEHOLDERS AT THE LOCAL LEVEL

More emphasis needs to be placed on providing stakeholders access to the report. Any improvements in the delivery or content of a program ultimately devolve on the teacher. Advisory councils are organized "to advise local education officials on current job needs and the relevance of courses being offered by their educational agency and to assist in the development of the annual vocational plan" (Ross, 1981, pp. 1-2). Access to the evaluation data will help advisory councils fulfill their mission. Only limited results can be expected if the teachers, advisory councils, and other stakeholders are not involved in the total evaluation and planning cycle.

2. EXPAND THE USE OF THE EVALUATION REPORT

A high percentage of the evaluation coordinators indicated that the report was used to improve individual programs, while a somewhat lower percentage indicated that the report was used to help teachers. These are two important uses of the report; however, many other uses are possible. Lee (1979) suggested that evaluation data are used in determining policy, establishing goals and objectives, developing plans and planning details, taking administrative action, allocating and re-allocating funds, obtaining additional funds, adding or dropping courses and programs, changing curricula, changing enrollment and completion requirements, and public relations. Some of these uses may be subsumed under the two major uses noted by the coordinators. However, many of these and other possible uses should be explored and put into operation by evaluators.

3. PLACE GREATER EMPHASIS ON THE ROLE OF THE STATE SUPERVISORY STAFF IN EVALUATION AND PROGRAM IMPROVEMENT

The role of the state supervisors in program improvement should be carefully defined. This role should include working directly with the teachers. Franchek (1981, p. 10) reported that "Recent studies identify the need for more clearly defined procedures...to promote utilization. It would appear that the evaluation requirements specified in [Federal Law] P.L. 94-482 call for resources and expertise which many state staffs do not have." Eighty-seven percent of the evaluation coordinators rated supervisory visits as moderately or extremely helpful. It would be worthwhile to investigate the reasons for the lower ratings by the remainder of the coordinators. It may be found that certain staff development activities need to be created to assist the state supervisory staff in formulating recommendations for improving programs based on a variety of available information, including the evaluation results.

4. DEVELOP IN-SERVICE ACTIVITIES TO ASSIST THE EVALUATION COORDINATORS AND/OR LOCAL VOCATIONAL DIRECTORS IN USING EVALUATION DATA TO PLAN AND IMPLEMENT PROGRAM IMPROVEMENT ACTIVITIES

Joyce and Showers (1982) indicated that the development of a skill does not ensure transfer of that skill into practice. In-service programs need to be expanded not only to develop the evaluation and planning skills of the coordinators or directors, but also to provide follow-up activities to assist them as they put these skills to use in planning their local programs.

5. INVESTIGATE INNOVATIVE WAYS OF IMPROVING THE EVALUATION SYSTEM

The VEEVA evaluation system received positive ratings and some helpful suggestions from the evaluation coordinators. As new and better ways of data collection and presentation of the results are developed, they should be incorporated into the system. Evaluators should continually explore ways of improving the evaluation system, and implement those that are most efficient and effective in improving educational programs. Stevenson concluded that:

    If evaluation does not result in changes in programs which benefit students, the intent has been thwarted and a lot of resources wasted. All of those involved in the evaluation effort--administrators, advisory committee members, evaluators, teachers, students, team members, and all others--can legitimately ask, "What is done differently as a result of evaluation and how will program output be improved as a result of these changes?" (1979, p. 61).

REFERENCES

ELSON, D. E., & FRARY, J. Vocational Education Evaluation in Virginia, 1982-83. Blacksburg: Virginia Polytechnic Institute and State University, 1982(a).

ELSON, D. E., & FRARY, J. VEEVA State Report, 1981-82. Blacksburg: Virginia Polytechnic Institute and State University, 1982(b).

FRANCHEK, S. J. Using Evaluation Results. RD #212. Columbus, OH: National Center for Research in Vocational Education, 1981.

HENDRICKSON, G. Evaluating Vocational Education: The Federal Stimulus. Washington, D.C.: The National Institute of Education, 1981.

JOYCE, B., & SHOWERS, B. The coaching of teaching. Educational Leadership, 1982, 40 (1), 4-10.

LEE, A. M. Use of Evaluative Data by Vocational Educators. IS #156. Columbus, OH: National Center for Research in Vocational Education, 1979.

NORTON, R. E., & McCASLIN, N. Evaluation of special group programs. In J. E. Wall (ed.), Vocational Education for Special Groups. Sixth Yearbook. Washington, D.C.: American Vocational Association, 1976.

ROSS, N. A Guide for Local Advisory Councils for Vocational Education. Richmond: Virginia Department of Education, 1981.

STEVENSON, W. W. Vocational Education Evaluation: Problems, Alternatives, Recommendations. RD #182. Columbus, OH: National Center for Research in Vocational Education, 1979.

THE AUTHOR

DONALD E. ELSON is Associate Professor and general program area leader in the Division of Vocational and Technical Education at Virginia Polytechnic Institute and State University, Blacksburg, Virginia.