Evaluation and Program Planning, Vol. 8, pp. 161-169, 1985
Printed in the USA. All rights reserved.
0149-7189/85 $3.00 + .00
Copyright © 1985 Pergamon Press Ltd
DOCUMENTATION IN EVALUATION RESEARCH:
Managerial and Scientific Requirements

WILLIAM E. POLLARD
Emory University School of Medicine

ALFRED C. COOPER and DEBORAH H. GRIFFIN
Grady Memorial Hospital
ABSTRACT

Documentation in evaluation research consists of written material, in human- or machine-readable form, pertaining to the plans, activities, and results of the project. It is argued here that good documentation is essential for effective management of evaluations, and for responsible reporting of the research procedures and findings. Documentation relating to electronic data processing activity is especially important. The purpose of this paper is to stimulate consideration and discussion of documentation, and to emphasize its importance in evaluation research. The role of documentation in the planning and control functions of project management is reviewed, and the importance of documentation in the assessment of research quality with respect to objectivity, validity, and replicability is discussed. Reasons for poor documentation are considered. An outline of documentation required in different phases of research projects is provided, and recommendations for improving the quality of documentation are presented.
The purpose of this paper is to stimulate consideration and discussion of documentation, and to emphasize its importance in evaluation research. By documentation we mean written material, in human- or machine-readable form, that pertains to the plans, activities, and results of the project. This includes material necessary for internal project operations, as well as deliverables in written form that satisfy external reporting requirements. Documentation thus includes a wide variety of material ranging from project correspondence and minutes of meetings to final reports and descriptions of archived data files. Indeed, documentation constitutes the major tangible product of the evaluation. We argue that good documentation is essential for effective management of evaluation projects and for responsible scientific reporting of research procedures and findings. Investigators in charge of evaluation studies are responsible for efficiently achieving project objectives and for meeting standards of the scientific community for objectivity, validity, and replicability; a well-planned, comprehensive documentation system is a key factor in meeting these responsibilities. There is a considerable range in the scope of evaluations, and documentation requirements will differ accordingly. A small client satisfaction study carried out by one person over a period of a few days in an outpatient health facility may require only a few pages of documentation. On the other hand, a nationwide longitudinal study of educational development involving the collection of survey and testing data on a very large sample of students could require thousands of pages of documentation. Whatever the features of a particular study may be, evaluators need to give some consideration to what kinds of documents they will need in conducting their research projects.
An earlier version of this article was presented October 28, 1982 at the Evaluation Research Society Annual Meeting, Baltimore, Maryland. Requests for reprints should be sent to William E. Pollard, Department of Psychiatry, Grady Memorial Hospital, 80 Butler Street, S.E., Atlanta, GA 30303.
Good documentation will not just happen; it requires planning. Documentation will accumulate whether it is planned or not, and planning enables one to control this accumulation to meet project needs. Unfortunately, essential elements of documentation are often handled haphazardly or ignored altogether. Although there is an extensive literature on practical issues in the design of evaluations and the analysis of data, documentation is seldom addressed. Problems with inadequate documentation, especially in the area of computerized data analysis, are increasingly being recognized; many of the difficulties encountered in the secondary analysis of evaluations, for example, are seen to relate to the lack of documentation (Hedrick, Boruch, & Ross, 1978; Linsenmeier, Wortman, & Hendricks, 1981).

In this paper, we examine the role of documentation in project management and scientific reporting. Although documentation in evaluation research has not been given much attention until rather recently, the topic of documentation has been treated extensively in the data processing literature. Much of this literature pertains to software and system development cycles, and may seem somewhat removed from the needs of evaluators. Yet writers in this area have given considerable thought to the use of documentation in managing complex development projects and producing a usable product, and we draw on this literature, as well as the evaluation and general social science research literature, in our discussion. In the following sections, the need for good documentation is discussed; reasons for poor documentation are considered; the necessary elements in a comprehensive system of documentation are described, along with references to the literature in this area; and recommendations for improving the quality of documentation are discussed.

MANAGERIAL REQUIREMENTS

Project management involves the judicious application of project resources to achieving project objectives. Managers are responsible for planning how these objectives are to be achieved and for controlling the project according to plan. Internal documentation is necessary for a number of reasons in carrying out both the planning and control functions.

First, the very process of writing down plans and decisions is important because it encourages, if not forces, thorough consideration and articulation. Few of us would deny the value of such written materials, yet as Brooks (1975, p. 111) points out in his discussion of project management, there is often a tendency to jump from general planning meetings and discussions into the tasks at hand. He goes on to argue that writing the decisions down is essential:

Only when one writes do the gaps appear and the inconsistencies protrude. The act of writing turns out to require hundreds of mini-decisions, and it is the existence of these that distinguishes clear, exact policies from fuzzy ones.

Burrill and Ellsworth (1980, p. 50) note that such planning documents have value in preventing the "seat-of-the-pants" management style.

Second, good documentation is necessary for clear communication on what the project objectives are, what decisions were made, what actions were taken, and what was achieved. This is especially important for communication in large projects between different task groups working simultaneously and between groups working at different points in time. For example, interviewing operations may have terminated by the time that the statistical analysts perform their tasks, yet detailed information on interviewing operations and problems encountered by the interviewers in the field can be critical in properly interpreting results. Documentation of the planning process and the decisions that were made is also very important. This process may involve a number of meetings over a period of time with a changing cast of characters due to schedule conflicts, presence of consultants, and so on. Unless decisions, and the reasoning behind them, are recorded and made available in the form of minutes and working papers, large amounts of time can be wasted in bringing everyone up to date and in rehashing issues that should have been resolved. Because planning typically involves highly paid consultants and professional staff, such waste of time can be costly.

Third, documentation is essential for management review of project activity and for facilitating management control. A number of writers concerned with management of data processing operations argue that a system of documentation should be established in conjunction with specification of project objectives and assignment of responsibility (Enger, 1976a, 1976b; Gaydasch, 1982; Gray & London, 1969; Jancura, 1977; Metzger, 1981). Documentation can be required at specific project control points to assure that desired actions have been taken and proscribed actions avoided. As Gray and London (1969) point out, project control becomes a built-in function of the documentation, thus freeing the manager from a policing role.

Fourth, documentation is important in minimizing the disruptive effects of staff turnover. In a review of applied research projects, Weiss (1973, p. 53) was struck by "the tremendous instability of evaluation staffs." She writes:

Part-time directors were the rule rather than the exception. Turnover in evaluation staff at all levels was phenomenal. It was not uncommon for a 3-year study to have had three or four different directors and three complete turnovers in research associates.
Unfortunately, information that isn't recorded in writing leaves the project when staff members leave. This problem can be especially acute in university-based projects employing students, because staff attrition is virtually built in by the educational cycle. However, simply having the information written down is not the full solution to this problem if the written material is prepared in some idiosyncratic style or filed in a disorganized manner; a well-organized documentation system is necessary for continuity of operation. It provides the necessary material for orientation and training of new staff. In some instances, replacement staff may need to be trained; in other instances, different types of personnel may be hired in the various project phases and training will be required. Green (1974/1977, p. 188) points out that in data processing operations, good documentation is the most practical means for training people in new jobs.

Fifth, documentation is necessary for efficient report writing. If planned in advance and prepared according to a standardized format, interim documents describing completion of various tasks can be assembled with little modification into progress reports and final reports, either in the main body or in appendixes. The report writing effort can thus be spread out over time, helping to minimize last-minute scrambles and overruns. Also, the material is written while the issues and details are fresh in the writer's mind. This saves time that is often spent in trying to reconstruct what happened in the past, and prevents loss of information due to lapses in memory.

Sixth, documentation provides an important historical record for planning future projects. A record of what was planned and what actually took place, along with relevant cost and staffing information, can be useful in designing future studies and avoiding problems encountered in the past.

Finally, documentation is necessary for controlling paperwork. This might seem surprising, because documentation is often seen as involving "too much paperwork." Metzger (1981, p. 47) writes:

I think one important cause of our so often getting buried under paper is that we don't take the time to define the documents we want to use on the project. As a result, whenever a project member needs to write something, he dreams up his own format and suddenly there is a new kind of document to file and keep track of. We probably need a little chaos in the world to keep us from growing too dull, but there are many better places to allow for the chaos; let's keep it out of the documentation system.
By carefully planning the kinds of documents that will be needed and how they will be used, the production of unnecessary paperwork can be minimized and the benefit from the documentation that is produced can be maximized.
SCIENTIFIC REQUIREMENTS

The virtue of scientific inquiry lies in its disciplined, objective approach to collecting and analyzing data, and to making valid inferences regarding the phenomena of interest. Documentation describing the investigation is extremely important because it is the basis for evaluating the scientific adequacy of the research. It is the responsibility of the investigator to provide a clear description and record of the design and its rationale, the data collection process and the data obtained, the processing and analysis of the data, and the findings and logic of the conclusions. Good documentation is basic to quality assessment in all of these areas; however, there are three respects in which good documentation is of special value in evaluation research.

First, it is necessary for assessing the objectivity of the investigation. This takes on special importance in evaluation research because evaluation is advocated primarily on the grounds of being a systematic, objective means of obtaining information for debate and decision making regarding public policy. Publicly funded evaluations should be documented in such a way that the procedures and results are auditable and available to interested constituencies. In discussing documentation standards in program evaluation, Robbin (1981, p. 86) writes:

As public policy makers become increasingly dependent on statistical data, it is critical that standards for data quality be established. Such standards must reflect the principle that statistical data represent objective and verifiable evidence that is unambiguously described so that analysis or evaluation based upon this evidence can be effectively reviewed, criticized, and replicated.

Hedrick, Boruch, and Ross (1978, p. 274) make a similar point in their discussion of secondary analysis of evaluations, recommending that "all data, documentation, and logs stemming from federally supported research used in policy, especially program evaluations, must be made available to the community of analysts." They make this argument on scientific, public interest, and economic grounds, along with the grounds of protecting the public and the scientific community from fraud. The latter point is worth noting, given recent widely discussed cases of scientific fraud (Broad & Wade, 1982). If the production of adequate documentation is required throughout the project, and is reviewed on a routine basis, it becomes considerably more difficult to fabricate or alter data without serious inconsistencies appearing.

Second, it is necessary for assessing the validity of the inferences drawn from the data.
Making causal inferences in evaluation studies requires a great deal of care. Evaluations are typically carried out in field settings, where experimental control is considerably less than in laboratory settings, and the quality of the design and the measurement procedures can be degraded by a variety of random and systematic factors. This can limit the kinds of inferences that can be made. Furthermore, many evaluations employ quasi-experimental designs which, depending on contextual factors, may require some strong assumptions to rule out plausible rival explanations of observed effects. Given these circumstances, any interpretation of the data requires fairly detailed information about the nature of the investigation. In the absence of good documentation, validity may be difficult to assess.

Third, it is essential for carrying out reanalysis and replication, and for interpreting comparisons with the original findings. These activities have a bearing on assessing objectivity and validity, as well as on extending or modifying the original findings, and there have been increasing calls for both reanalyses and replications of evaluations. Yet, without adequate documentation regarding how the study was actually implemented and how the analyses were carried out, successful reanalysis and replication can be difficult, if not impossible.

These points may seem obvious. Yet Mosteller, Gilbert, and McPeek (1980), in a review of 152 published clinical trials on cancer, found widespread deficiencies in the published documents regarding important procedural and statistical issues including randomization, statistical method, blindness, power, sample size, survival, and informed consent. Obviously not all details can be included in a journal article; however, attempts to obtain additional detailed information from the original investigators are often unproductive, as pointed out by Wolins (1962) and by Boruch, Cordray, and Wortman (1981) in their discussion of problems in secondary analysis. Keep in mind also that the papers considered by Mosteller and his colleagues had been subjected to review and had been published. As Schmandt (1978) notes, much policy research is not subjected to the traditional quality control system of the scientific disciplines; one might therefore expect even more problems with documentation and the assessment of scientific adequacy in this area.
REASONS FOR POOR DOCUMENTATION

Unfortunately, as most writers concerned with documentation note, high quality documentation is rare. If good documentation does indeed have the benefits just discussed, why is it so often ignored? Part of the problem is what Sonquist and Dunkelberg (1977, p. 405) describe as the "almost irresistible tendency to get on with the analysis and not to keep written records of what was done and in what sequence." Documentation is often seen as a chore by technical staff; Weinberg (1971, p. 262) comments on documentation being the "castor oil" of computer programming. The problem is that the benefits are often not realized until some later point in time; in addition, much of the documentation may be for the benefit of persons other than the writer. In their discussion of data archives, Mosteller, Gilbert, and McPeek (1980, p. 56) describe an all-too-common scenario:

By the time a research project is completed, the analysis finished, and publications prepared, not infrequently the investigators' attention has turned elsewhere, technical assistants are tired of the project, the grant overextended, and preserving the data gets short shrift. The original data, if not thrown out, are likely to be packed helter-skelter in a box in the far corner of the storeroom. Even an important, well-funded project is likely to end with the data tapes unmarked or preserved in such a way that it is difficult to tell the original materials from edited versions or those produced for special analyses.

Clearly, if documentation is not planned and supported by project management from the beginning, it will not be done.
TYPES OF DOCUMENTATION REQUIRED

In this section, an overview is provided of the document types required in different phases of evaluation studies. The purpose of this section is to provide a framework for thinking about and anticipating the kinds of documentation one would use in conducting evaluations. As was mentioned earlier, documentation needs will vary from project to project, and there is no single outline of document types that will fit every project. Yet, even though the relative emphasis on, and level of detail of, the different types of documentation may vary across projects, the functions they serve will have to be addressed. Discussion of detailed formats and specifications for the various document types is beyond the scope of this paper; however, references to sources discussing those details are provided at the end of this section.

In considering this material, one should keep in mind the importance of documentation planning. Some decision by project management is required regarding the document types and level of detail necessary for project management and scientific reporting. Most writers in this area emphasize that a documentation plan needs to be developed in the early stages of a project; it is very unlikely that adequate documentation will be developed without planning. One aspect of this planning is to define the types of documents that will be necessary. A second aspect is to define the procedures and resources for document production. A third, and very important, aspect is to define a system for documentation management. Many of the writers cited in this discussion recommend designating a documentation manager to handle this. This person would be responsible for monitoring documentation and its completion, filing it for easy retrieval, and controlling revisions and additions (a minimal sketch of such a document index follows Table 1 below).

The types of documents to be discussed are shown in Table 1. They are arranged in terms of project phases. Most evaluation projects involve a general planning phase, a design phase, a data collection phase, a data processing and analysis phase, and a summary and interpretation phase. These are in rough chronological order, although there is often overlap.
TABLE 1
TYPES OF DOCUMENTATION

Planning Documents
    Project phase plan
    Organization plan
    Documentation and reporting plan

Design Documents
    Detailed design plan
    Working papers

Data Collection Documents
    Data collection/field administration plan
    Data collection reports
    Editing and coding documents
    Data collection forms
    Data entry documents
    Working papers

Data Processing and Analysis Documents
    Data processing plan
    Data cleaning and validation report
    Data file documents
    File index system
    Data dictionary
    Processing summaries
    Program library
    Printed output
    Summary tables
    Working papers

Interpretation and Summary Documents
    Interim and final reports
    Executive summaries
    Press releases
    Project history
    Archival data file

General Project Documents
    Accounting records and cost reports
    Personnel files
    Contracts and agreements
    General project correspondence
    Project proposal
    Project library
    Index to project documentation
In addition to documentation for each of these project phases, some general administrative and reference documentation is required throughout the project; this is also shown in Table 1.

Planning Documents
These documents provide the reference point for all project activities. In the project phase plan, the objectives and tasks to be accomplished for each phase are spelled out. For example, the tasks in the design phase include defining the variables of interest, selecting measuring instruments, identifying populations to be studied, and so on. The phase plan should include a calendar with timelines for the major tasks within each phase, showing projected completion dates. The organization plan defines staffing, responsibilities, and flow of work. The documentation and reporting plan defines the documents to be produced for internal and external use; who will use them for what purpose; standardized formats; responsibilities for production, review, and maintenance; and the schedule for document production. A filing system and index for all documentation should be outlined as part of this plan.

Design Documents
The major document here is the detailed design plan. In this document, the relationship of the design to the general study objectives is spelled out. The variables of interest are defined, measuring instruments are selected, specific populations to be studied are identified, the sampling plan with the necessary sample size is specified, and the types of statistical analyses to be performed are described. For some projects, this may be very similar to material included in an initial project proposal; more often, however, the proposal will be less specific than the detailed design. This document provides the basis for planning the field administration and data analysis operations. In addition to the detailed design, various working papers including memos, consultants' recommendations, minutes, and proposals will be generated in this phase. A file of these papers should be maintained because they can be useful in reconstructing decisions when questions arise. Ideally, the rationale for the various aspects of the design should be covered in the detailed design document; however, unforeseen problems may sometimes require rethinking certain issues, and a review of the working papers can be of value.

Data Collection Documents
Given the data specifications in the detailed design document, the data collection/field administration plan contains a description of how these data will be obtained. Included here are plans for selection and training of data collection staff, contacting respondents or subjects, scheduling and monitoring data collection, and maintaining quality control.
The types of data collection reports will vary considerably depending upon the nature of the study and the data requirements; the operations necessary to obtain data from medical records, for example, will be quite different from those necessary to obtain data from surveys involving personal interviews. Included here might be training reports; quality control reports (such as those describing abstractor or interviewer reliability measures); reports on problems encountered in data collection and their resolution; and other documents such as training manuals, data collection assignments and schedules, and supervisor reports on data collection staff. Also included in the data collection documents are those pertaining to editing and, if necessary, coding of the raw data. The editing and coding documents consist of instructions, logs and records of what was done, and summary reports including discussion of data quality along with problems encountered in editing and coding and their solutions. Specific editing and coding actions should be indicated on the original data collection forms; these forms should be bound together and filed. Data entry documents include instructions to keypunchers and the coded data sheets. Again, any working papers should be retained along with the data collection documents.

Data Processing and Analysis Documents
The quality of documentation in the area of electronic data processing is critical; the level of detail and complexity is such that inadequate documentation can lead to considerable inefficiency, if not serious errors which are costly and time-consuming to correct. The data processing plan contains a description of the processing steps necessary for data cleaning and validation, file construction, and statistical analysis. The overall organization and sequence of these steps should be spelled out. Input and output specifications and a description of what is to be done should be part of the write-up of each step. Cleaning and validation operations involve checks for illegal codes and for inconsistencies among codes for different variables. A data cleaning and validation report on these operations should include a description of all errors encountered and actions taken, plus summary statistics on data quality. File construction operations may involve the rearrangement and transformation of data and the creation of derived variables. The resulting data files need to be described in data file documents in terms of identifiers, medium, contents, format, creation date and author, and backup files. Documentation for all files should be referenced in a central file index system. A data dictionary should be maintained with an entry for each variable. Each entry should contain a unique variable name, a variable number, aliases used in different computer runs, the source, file locations, legitimate codes, and any descriptive statistics that would aid understanding. A sketch of how such a dictionary entry might drive validation checks follows.
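The following is a minimal sketch, in Python, of a data dictionary entry and of how its list of legitimate codes might be used in the cleaning and validation step. The entry fields follow the list above; the variable, its codes, and the sample data are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    """One data dictionary entry, with the fields suggested above."""
    name: str                   # unique variable name
    number: int                 # variable number
    aliases: list[str]          # names used in different computer runs
    source: str                 # e.g., questionnaire item or record field
    file_locations: list[str]   # files (and columns) where the variable appears
    legitimate_codes: set[str]  # the only values accepted as valid

# A hypothetical entry for one survey item.
SEX = DictionaryEntry(
    name="SEX", number=7, aliases=["V007", "GENDER"],
    source="Interview item 3", file_locations=["MASTER.DAT col 12"],
    legitimate_codes={"1", "2", "9"})  # 1 = male, 2 = female, 9 = missing

def validate(entry: DictionaryEntry, values: list[str]) -> list[tuple[int, str]]:
    """Return (case number, value) pairs holding illegal codes, for
    inclusion in the data cleaning and validation report."""
    return [(i, v) for i, v in enumerate(values, start=1)
            if v not in entry.legitimate_codes]

# Example: the third case carries an illegal code and would be flagged.
errors = validate(SEX, ["1", "2", "7", "9"])
print(errors)  # [(3, '7')]
```

Consistency checks among codes for different variables can be written in the same style, with the dictionary serving as the single authoritative statement of what each variable may contain.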
In carrying out the statistical analysis, processing summaries should be prepared following each run or series of related runs. The summaries should include information on input files, program parameters (along with references to the programs used), output files, and steps taken to verify correct execution, along with a written description of what was done. The program library should include manuals for canned programs and full descriptions of programs developed by project staff, along with the source files. The descriptions of programs developed in-house should include a verbal overview, input and output specifications, parameters to be specified by the user, a flow chart, and a listing with liberal use of comment statements. The printed output should reference the appropriate run summary so that the input files and parameters, such as those for treating missing values, are clear. The output should be indexed and maintained in a central location, although to facilitate study of the results, duplicate copies of the output may sometimes be necessary. Summary tables should be prepared in a standardized format and should reference the printed output from which the contents were obtained. Any working papers should be maintained along with these documents.
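As an illustration of the processing summaries just described, here is a minimal sketch of a run summary captured as a structured record appended to a central log. The fields mirror the items listed above; the file names, program names, and log format are hypothetical.

```python
import json
from datetime import datetime, timezone

def run_summary(run_id: str, program: str, parameters: dict,
                input_files: list[str], output_files: list[str],
                verification: str, description: str) -> dict:
    """Assemble one processing summary; one such record should
    follow each run or series of related runs."""
    return {
        "run_id": run_id,
        "date": datetime.now(timezone.utc).isoformat(),
        "program": program,        # reference into the program library
        "parameters": parameters,  # e.g., missing-value treatment
        "input_files": input_files,
        "output_files": output_files,
        "verification": verification,
        "description": description,
    }

# Hypothetical example: a cross-tabulation run.
summary = run_summary(
    run_id="RUN-042",
    program="crosstab.py (program library entry P-11)",
    parameters={"missing_values": "excluded listwise"},
    input_files=["MASTER.DAT"],
    output_files=["RUN042.LIS"],
    verification="case counts checked against data file documents",
    description="Cross-tabulation of satisfaction by clinic site.")

# Appending to one log file keeps the summaries in a central location.
with open("processing_summaries.log", "a") as log:
    log.write(json.dumps(summary) + "\n")
```

The run identifier gives the printed output something unambiguous to reference, which is the point of the requirement above.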
Interpretation and Summary Documents
The usual documents prepared in this phase include interim and final reports, executive summaries, and press releases, if any. The project history is for the investigator's use in planning future projects. It contains a summary of important project events, along with a discussion of problems encountered and their solutions. A comparison between what was planned and what actually occurred should be included; this comparison should cover time, staffing, resources, and budget considerations. If the data are to be fully available for review, an archival data file must be prepared. The file must be accompanied by all the information necessary for a secondary analyst to use and interpret the data. If the documentation discussed in this section has been adequately prepared, the material needed to accompany the archival data file can be easily assembled.

General Project Documents
This category includes various administrative and reference documents that are not phase-specific. Included here are accounting records and cost reports, personnel files, contracts and agreements, general project correspondence, the project proposal, the project library of books and reprints with an index system, and an index to project documentation.

There are a number of publications that the reader may wish to consult for more detailed information on the planning and preparation of documentation. Sonquist and Dunkelberg (1977) provide a comprehensive discussion of the different types of documentation necessary in survey research. They give special attention to documentation of data files and data processing, and they provide samples of various document types. Assenzo and Lamborn (1981) review the kinds of documentation needed in clinical trials; and Mosteller, Gilbert, and McPeek (1980) make a number of specific recommendations for improvement in this area. Fiedler (1978) provides an overview of documentation necessary in field research; and Rogson (1975) outlines the types of documentation that will be produced in different phases of large studies. David, Gates, and Miller (1974) discuss their experiences in documenting a large archive of microeconomic data. Lefferts (1983) discusses the reporting program in grants management. Roistacher (1980) provides a detailed style manual for documenting data files, and Robbin (1981) draws on this material in providing specific guidelines for documenting data files in evaluation and policy research. Certain books in the data processing literature provide useful guidelines for developing documentation (Enger, 1976a, 1976b; Gaydasch, 1982; Gray & London, 1969; Kindred, 1973; Metzger, 1981; U.S. Department of Commerce, 1976). Most of these references contain discussion of the use of documentation for managerial purposes. Metzger (1981) is especially useful in this respect; there is much discussion of documentation in planning and of the use of documentation to control projects and to assure the quality of the results. The discussion is accompanied by format outlines for a wide variety of document types.

RECOMMENDATIONS FOR IMPROVING DOCUMENTATION
Given that good documentation is required for managerial and scientific purposes, what can be done to improve its quality? The first recommendation is a rather straightforward one, directed toward persons in charge of evaluation research projects, and is one that we have touched on already: documentation needs should be carefully considered by project management in the planning phase of the project. This means spelling out what documents will be needed for what purposes, who will produce them, when they will be produced, what format will be appropriate, how they will be maintained, and so on. If documentation is left for undesignated project staff members to do whenever they happen to feel like it, in whatever format and degree of detail they choose, and to file according to some system known only to them, it will be of limited value. A slapdash system may be adequate if there is no staff turnover, if no problems arise requiring review of plans and operations, if all staff know exactly what they need to accomplish by what date, if everyone involved has an excellent memory and no other tasks to distract them, and so on. However, such a fortuitous combination of circumstances will be rare in most evaluation studies. Furthermore, the documentation that is produced will be of limited value for purposes of outside review and reanalysis. Producing documentation does require time and money, and project management will need to decide what documentation will be necessary. The point here is that this should be an informed decision. In planning the project, it is essential that the costs and benefits of documentation be considered in deciding what will be necessary for managerial and scientific purposes. In some instances this may require additional staff, technical assistance, and other resources, especially in the technical areas of data base management and data processing, and these needs should be built into the project budget request.

Such planning would be greatly facilitated by the availability of models from which to work. A second recommendation, directed toward specialists in project management and data base management, is that standards and guidelines for documentation be developed and disseminated. It is a waste of time and effort for every project to have to reinvent the wheel in setting up a documentation system. Additional work, along the lines of that cited in the previous section, would be of great value in documentation planning. Furthermore, proposed standards should be tested; reports of experiences in using specific standards would be valuable. Information on what worked well and what did not, on costs, on effort required, and so on would greatly aid subsequent users. Practicing evaluators could contribute significantly in this regard.

A third, more general, recommendation is that documentation for data processing be given more attention in the training of evaluators. Many evaluators gain experience with data analysis and data file management during training, and in subsequent professional positions are often involved in directly supervising data processing for research projects. Unfortunately, documentation is seldom given much attention during training. As a result, many evaluators learn to live with inadequate, ad hoc systems. Training in planning and preparing adequate documentation could easily be combined with training in data analysis and file management. The availability of standards and guidelines, as discussed previously, would be of great value for training purposes.

A fourth, and final, recommendation is that more attention be given to project management in the training of evaluators and in discussions in the evaluation literature. In their treatment of the management of survey research projects, Sonquist and Dunkelberg (1977, p. 460) write, "keeping administrative control of the highly complex project that a survey
can be is probably the most underexplained topic (relative to its importance) in survey research methodology texts." As St. Pierre (1982, 1983) notes, the situation is not much different in the field of evaluation; he writes (1983, p. 1), "while the technical science or art of conducting field-based evaluations has advanced steadily over the past decades, the management of program evaluation has received little attention." We have argued that documentation is an essential ingredient in effective project management. Increased attention to research management could lead to better production and use of documentation, and, ultimately, to improved scientific reporting.
REFERENCES

ASSENZO, J. R., & LAMBORN, K. R. (1981). Documenting the results of a study. In C. R. Buncher & J. Y. Tsay (Eds.), Statistics in the pharmaceutical industry (pp. 251-299). New York: Marcel Dekker.

BORUCH, R. F., CORDRAY, D. S., & WORTMAN, P. M. (1981). Secondary analysis: Why, how, and when. In R. F. Boruch, P. M. Wortman, & D. S. Cordray (Eds.), Reanalyzing program evaluations: Policies and practices for secondary analysis of social and educational programs (pp. 1-20). San Francisco: Jossey-Bass.

BROAD, W., & WADE, N. (1982). Betrayers of the truth. New York: Simon and Schuster.

BROOKS, F. P., JR. (1975). The mythical man-month: Essays on software engineering. Reading, MA: Addison-Wesley.

BURRILL, C. W., & ELLSWORTH, L. W. (1980). Modern project management: Foundations for quality and productivity. Tenafly, NJ: Burrill-Ellsworth Associates.

DAVID, M. H., GATES, W. A., & MILLER, R. F. (1974). Linkage and retrieval of microeconomic data: A strategy for data development and use. A report on the Wisconsin assets and income archives. Lexington, MA: Lexington Books, D. C. Heath.

ENGER, N. L. (1976a). Documentation standards for computer systems. Fairfax Station, VA: Technology Press.

ENGER, N. L. (1976b). Management standards for developing information systems. New York: AMACOM.

FIEDLER, J. (1978). Field research: A manual for logistics and management of scientific studies in natural settings. San Francisco: Jossey-Bass.

GAYDASCH, A., JR. (1982). Principles of EDP management. Reston, VA: Reston Publishing.

GRAY, M., & LONDON, K. (1969). Documentation standards. Princeton, NJ: Brandon/Systems Press.

GREEN, J. D. (1977). Systems documentation, internal control, and the auditor's responsibilities. In E. G. Jancura (Ed.), Computers: Auditing and control (pp. 186-192). New York: Petrocelli/Charter. (Reprinted from The CPA Journal, July 1974, 25-28.)

HEDRICK, T. E., BORUCH, R. F., & ROSS, J. (1978). On ensuring the availability of evaluative data for secondary analysis. Policy Sciences, 9, 259-280.

JANCURA, E. G. (Ed.). (1977). Computers: Auditing and control. New York: Petrocelli/Charter.

KINDRED, A. R. (1973). Documentation and manuals. In Data systems and management: An introduction to systems analysis and design. Englewood Cliffs, NJ: Prentice-Hall.

LEFFERTS, R. (1983). The basic handbook of grants management. New York: Basic Books.

LINSENMEIER, J. A. W., WORTMAN, P. M., & HENDRICKS, M. (1981). Need for better documentation: Problems in a reanalysis of teacher bias. In R. F. Boruch, P. M. Wortman, & D. S. Cordray (Eds.), Reanalyzing program evaluations: Policies and practices for secondary analysis of social and educational programs (pp. 68-83). San Francisco: Jossey-Bass.

METZGER, P. W. (1981). Managing a programming project. Englewood Cliffs, NJ: Prentice-Hall.

MOSTELLER, F., GILBERT, J. P., & McPEEK, B. (1980). Reporting standards and research strategies for controlled trials: Agenda for the editor. Controlled Clinical Trials, 1, 37-58.

ROBBIN, A. (1981). Technical guidelines for preparing and documenting data. In R. F. Boruch, P. M. Wortman, & D. S. Cordray (Eds.), Reanalyzing program evaluations: Policies and practices for secondary analysis of social and educational programs (pp. 84-143). San Francisco: Jossey-Bass.

ROGSON, M. M. (1975). Documentation in massive social science experiments (Rand Paper Series P-5494). Paper presented at the 83rd Annual Convention of the American Psychological Association, Chicago, August 30-September 3, 1975.

ROISTACHER, R. C. (1980). A style manual for machine-readable data files and their documentation. Washington, DC: U.S. Department of Justice.

SCHMANDT, J. (1978). Scientific research and policy analysis. Science, 201, 869.

SONQUIST, J. A., & DUNKELBERG, W. C. (1977). Survey and opinion research: Procedures for processing and analysis. Englewood Cliffs, NJ: Prentice-Hall.

ST. PIERRE, R. G. (1982). Management of federally funded evaluation research: Building evaluation teams. Evaluation Review, 6, 94-113.

ST. PIERRE, R. G. (1983). Editor's notes. In R. G. St. Pierre (Ed.), Management and organization of program evaluation (pp. 1-3). New Directions for Program Evaluation, No. 18. San Francisco: Jossey-Bass.

U.S. DEPARTMENT OF COMMERCE, NATIONAL BUREAU OF STANDARDS. (1976). Guidelines for documentation of computer programs and automated data systems (Federal Information Processing Standards Publication 38). Washington, DC: U.S. Government Printing Office.

WEINBERG, G. M. (1971). The psychology of computer programming. New York: Van Nostrand Reinhold.

WEISS, C. H. (1973). Between the cup and the lip... Evaluation, 1, 49-55.

WOLINS, L. (1962). Responsibility for raw data. American Psychologist, 17, 657-658.