Lessons in Technology Assessment Methodology and Management at OTA

FRED B. WOOD
ABSTRACT

The demise of the congressional Office of Technology Assessment (OTA) was precipitated by larger political forces that in the end, despite a close fight, OTA was unable to withstand. In its time, OTA was the institutional leader in the technology assessment (TA) field. OTA defined and refined a widely respected assessment process that produced hundreds of critically acclaimed reports. OTA studies contributed to congressional deliberations and public debate on a wide range of topics. OTA's legacy also includes some important lessons in technology assessment methodology and management, with implications for reinventing technology assessment for legislative bodies such as the U.S. Congress. The lessons learned are likely to be key to the future of a next-generation OTA or the equivalent, and to other OTA-like organizations, whether in the public or private sector. Compared to the old OTA, a reinvented TA organization would have a more flexible product line and study process that can more closely match a variety of congressional needs, while at the same time retaining the OTA hallmarks of balance, objectivity, and broad participation. Methodological improvements are needed and can be implemented quickly, drawing on the OTA lessons and other TA activities in the United States and overseas. The imperative for OTA-like functions continues, given the ever more pervasive role of science and technology in society. As embodied in the Technology Assessment Act, the concept of TA is a noble one. The OTA experience in TA methodology and management should help technology assessors at home and abroad keep the dream alive. © 1997 Elsevier Science Inc.
Introduction

Technology assessment (TA) is as much about methodology or process, and how that process is managed, as it is about the assessment results and the substance of TA reports. How a TA is conducted and managed affects the credibility and utility of the results. This was doubly so for the congressional Office of Technology Assessment (OTA), because Congress chartered OTA to provide it with nonpartisan, objective, balanced analyses of technology-related issues. The closure of OTA in fiscal year 1996¹ prompted considerable soul-searching about the factors that may have played a role, and the methodology and management lessons that can be learned from OTA's experience.

FRED B. WOOD currently serves on the steering committee of the Institute for Technology Assessment (ITA), a not-for-profit private sector organization established to conduct assessment studies for public and private sector clients. Address correspondence to Dr. Fred B. Wood, 2318 N. Trenton Street, Arlington, VA 22207. E-mail: [email protected]
¹ OTA's authorizing statute, P.L. 92-484, is still in effect. OTA operations ceased as a result of congressional appropriations action that "zeroed-out" OTA's budget for fiscal year 1996, with the exception of a modest close-out budget to publish nearly completed reports and shut down the office in an orderly fashion.

Technological Forecasting and Social Change 54, 145-162 (1997) © 1997 Elsevier Science Inc.
This article reviews the evolution of OTA's assessment methodology and management, summarizes the OTA process in its mature state, and discusses lessons learned. Hopefully, these lessons can help assure that future OTAs, or their equivalent in the public and private sectors, fare better if given the opportunity. Technology assessment methodology as practiced by OTA was constantly evolving and periodically subjected to internal reviews. The Technology Assessment Act of 1972 was devoted primarily to articulating the need for technology assessment, the general purposes to be served by OTA, and the organization, powers, and duties of the OTA director, Technology Assessment Board, and Technology Assessment Advisory Council [1-3].² The statute did not provide or stipulate a methodology per se. It was left to successive OTA directors and staff to develop and refine a methodology and management process, largely through operational experience.
Evolution of OTA's TA Methodology and Management

OTA did, in its formative years, have available the results of academic research on TA philosophy and methodology. The technology assessment literature grew rapidly during the 1970s [4-14]. However, during the tenure of OTA's first director, Emilio Q. Daddario (from 1973, when OTA first began operations, to 1977), OTA had not yet reached a critical mass of staff, resources, and experience to establish a consistent methodology--even at a general level. The early methodology reflected some aspects of academic assessment methodology, but with the primary emphasis on practical ways to produce the first round of OTA reports. OTA made heavy use of outside contractors, and actively involved advisory panelists in preparing reports. Toward the end of Daddario's tenure, OTA's program management made a first effort to collect lessons learned--including methodology and management lessons--from the early years of TA experience at OTA [15]³ and in business and government generally [16]. Dan DeSimone served as acting director of OTA from late 1977 until the appointment of OTA's second director, Russell Peterson, in early 1978. During Peterson's tenure, OTA was preoccupied with political difficulties that crowded out attention to methodological improvements. This period of OTA's history included an extensive effort at outreach in identifying and developing various priority topics for future assessment [17, 18]. While this effort became politicized, due to concerns that OTA was becoming too independent from congressional oversight and needs, many of the topics identified did find their way into OTA's project portfolio in subsequent years. OTA's third director, John H. Gibbons, took office in June 1979, and within several months gave high-level attention to OTA's methodology in part through the establishment of an OTA-wide Task Force on TA Methodology and Management (chaired by the author of this article).⁴ The task force included representatives of all OTA programs and operating divisions, commissioned internal and external studies, and issued a comprehensive report in 1980 [19].⁵ The report reflected a consensus on

² P.L. 92-484, the Technology Assessment Act of 1972.
³ The report [15] includes a useful appendix by Huddle, F. P., of the Congressional Research Service on the definition and delivery of OTA products to the customer.
⁴ The Task Force was formally established on November 30, 1979.
⁵ Regarding the OTA assessment process, the report [19] recommended that OTA: (1) prepare a loose-leaf TA workbook; (2) institute project close-out reports; (3) develop a staff orientation program; (4) exchange learning on policy analysis; (5) devote additional resources to congressional relations; (6) clarify OTA policy on proposal preparation, report close-out and approval, and project review checkpoints; (7) allocate project follow-up time; (8) revise the "What Is OTA" pamphlet; and (9) prepare a paper (on the task force results) for the OTA annual report.
the major components of OTA's basic assessment methodology, for example, the use of advisory panels and workshops, targeted contract studies, extensive internal and external review, and other methods to help assure the involvement and participation of a wide range of stakeholders and perspectives [20]. This consensus emerged in large part from OTA's early experience during the Daddario and Peterson years. The task force also reached consensus on the need for tighter management of OTA studies, including so-called project review checkpoints that would help assure both timely completion and balanced, high-quality results. The task force did not reach consensus on a deeper level of technology assessment methodology, nor on specific assessment methods or techniques. The prevailing view then, and to a significant extent for the duration of OTA's existence, was that each study was unique when it came to the selection and use of specific methodologies. During Gibbons' 13-year tenure the topic of OTA's technology assessment methodology and management came up repeatedly in a variety of ways at annual senior management retreats,⁶ monthly management meetings, and periodic task forces that addressed various aspects of the OTA study process (e.g., writing, publishing, selecting panels, cross-program studies, internal review, policy analysis, report release and dissemination, close-out reports). Over time, many of the task force recommendations were approved and implemented.⁷ Gradually, during the 1980s, the basic OTA assessment methodology was refined and strengthened to the point where high consistency, at this general level, was achieved across the organization and for most studies [21, 22]. The management and budgeting of studies was tightened up considerably, compared to the early years of OTA. However, despite OTA's improved track record from the early 1980s to the early 1990s, important methodological and management challenges remained. Most major projects still took 15 to 24 months to complete, and cost, in current dollars, an average of about $500,000 per study.⁸ The full reports were typically lengthy, with many running 200-400 printed pages. By the late 1980s, various OTA managers and staff understood that the typical OTA report was taking too long to complete to best meet congressional needs, and was too lengthy (regardless of the quality) for many congressional staff (let alone members) to read. Steps were taken to redefine the OTA product line, with greater emphasis on shorter, reader-friendly summary reports and report briefs (the latter typically one to four pages long), the use of congressional staff briefings and testimony to deliver study results, and a range of internal staff workshops and other activities to share learning about assessment methodology--and thus hopefully improve the efficiency and timeliness of the OTA study process.

⁶ Some questioned the utility of management retreats, but the agendas generally were well developed and covered important topics such as: sharpening OTA's overall strategy; better knowing and meeting client needs; constantly learning how to improve the OTA process; strengthening staff training and development; and staying on top of breaking domestic and international science and technology issues.
⁷ See, e.g., Memo from OTA Director John H. Gibbons to the Task Force on TA Methodology and Management, Director's Response to Task Force Report and Recommendations, Washington, DC, Dec. 3, 1980, and Memo from OTA Director John H.
Gibbons to the Technology Assessment Board, OTA Task Force on TA Methodology and Management: Summary of Activities and Recommendations to Date, Washington, DC, Nov. 13, 1980.
⁸ This average cost estimate includes direct staff salaries and benefits, contracting, panel and workshop costs, travel, printing, and other costs directly associated with a specific study, but excludes general administrative and overhead costs.
Some of OTA's seasoned analysts and managers felt that further improvements in methodology--especially policy analysis methodology--were both possible and desirable, and that such improvements could translate into more useful and timely reports. In response, Gibbons established an internal task force on policy analysis that began work in the fall of 1992 and conducted a comprehensive review of many aspects of OTA's methodology. The Policy Analysis Task Force operated as a separate OTA study with an internal staff, external advisory committee, some outside contracting, and considerable outreach to congressional staff and the larger academic and research communities. The task force issued its report, after several rounds of review, in May 1993 [23].⁹ The task force identified a range of options that it concluded would improve not only the policy analysis component of OTA studies, but the entire assessment process as well.¹⁰ Roger Herdman, OTA's fourth director, with the advice of the task force and other OTA staff, began to implement some of the options. In parallel, anticipating the need for downsizing and budget cuts, Herdman established a Long Range Planning Task Force to study a range of possible alternative organizational structures for OTA. Based in part on the report of this task force [24], he ultimately implemented a reorganization that reduced the number of OTA divisions from three to two, and the number of programs from nine to six.¹¹ In addition, the new divisional managers began to implement a tighter project management system that clearly identified various project review checkpoints throughout the course of a study, with the intent of holding project directors more accountable for keeping studies on schedule and providing a stronger basis for identifying problems and taking corrective actions earlier. Also, several initiatives were underway to improve the use of electronic mail and the Internet in the conduct and dissemination of OTA studies. These changes were still in the early stages of implementation in late 1994. OTA had only begun to stabilize after a period of internal change, when powerful external political forces converged to threaten OTA's continued existence.¹² From December 1994 through the early months of 1995, OTA staff and others made numerous proposals

⁹ The report [23] was written in the format of a good OTA report; the major chapters were titled: (1) Findings and Options; (2) About This Study; (3) Responding to the Needs of Congress; (4) A Profile of 18 OTA Reports; (5) Telling a Good Story and Telling It Well; and (6) The Culture of OTA. The report appendices included "gems" of OTA policy analysis.
¹⁰ The report [23] identified the following options for OTA management action: (1) provide increased editorial assistance to projects to improve reader-friendliness of reports; (2) clarify OTA's policy regarding "recommendations" and "policy prescriptions" in options; (3) appoint a standing panel of senior staff upon which the director can call when the objectivity of a report is called into question; (4) encourage experiments with shorter assessments and with policy-relevant interim products and services; (5) establish a bimonthly or quarterly "Issues in Policy Analysis" lecture series on specific topics using people from outside OTA; (6) contract for the development of "sourcebooks" with key literature on specific topics in policy design and evaluation; (7) assign staff from outside the program in which an assessment is being done to help as "project kibitzers" or "shadow advisory panels" or in "shirtsleeves policy sessions"; (8) give new project directors or all project directors a few weeks to read reports from other programs; (9) establish a program to provide mentors for new project directors; (10) institute OTA staff-run seminars to facilitate the transfer of policy analysis skills and knowledge across the agency; and (11) reinstitute the OTA Congressional and Public Affairs Office's lectures on how Congress works.
¹¹ Appendix J to this report [24], by Alic, J., on OTA Organization and Management, provided suggestions on methodology improvements that were similar to and reinforced the earlier Policy Analysis Task Force report [23].
¹² The first indication of serious threat to OTA came in the form of the early December 1994 Senate Republican Conference legislative reform package that included the elimination of OTA.
to accelerate implementation of various changes.¹³ Some staff proposed significant reductions in the length, cost, and time to complete OTA studies. Others suggested a rethinking and redefinition of OTA products and services in ways that would be more timely and responsive to congressional needs.¹⁴ Still others emphasized the need to push for further improvements in assessment methodology.¹⁵ After the June 1995 House vote that kept OTA alive, Herdman appointed an internal task force to develop further recommendations for better matching OTA's study process to congressional needs.¹⁶ These initiatives were rendered moot by the July 1995 Senate and Conference Committee votes that terminated OTA operational funding for fiscal year 1996.

Role of TA Methodology and Management in OTA's Demise

Numerous factors contributed to the congressional decision to discontinue funding OTA. Most observers, including this author, believe that political factors were dominant. These include, in no particular order: the changing political philosophies and priorities of Congress and a felt need to cut legislative branch spending and eliminate a congressional agency; the large number of new members of Congress and staff in recent years unfamiliar with the work of OTA; the absence of a politically strong outside constituency for OTA (although OTA was and is held in high regard in the science and technology community, and among academics, researchers, and policy analysts in particular); a limited inside constituency on Capitol Hill, given that OTA served primarily committee chairs; a relatively weak political position on Capitol Hill compared to other congressional agencies; and a perception by some that over the years, OTA was more responsive to the majority party, despite the bipartisan board and the high percentage of studies that at least had a modicum of minority support. In sum, OTA was politically vulnerable at a time of major change in the larger politics of Congress. Despite this, the fight over

¹³ See, for example, Wood, F. B., Some Preliminary Thoughts on Restructuring OTA, Memo to the OTA Director and Senior Management, Jan. 18, 1995; Wood, F. B., Thoughts on OTA Testimony for FY96 Appropriations Hearing, Memo to the OTA Director, Feb. 7, 1995; Dougherty, D., Govan, E., Niblock, R., Shaw, A., Tunis, S., and Wyckoff, A., Program Directors' Draft Principles for a "New" OTA, Memo to the OTA Director and Senior Management, draft, Mar. 21, 1995. These memos collectively provided a possible template for a reinvented OTA.
¹⁴ Ibid. The OTA Director and TAB did agree on a set of restructuring principles that would increase the flexibility of OTA work products and the synchronization between OTA studies and reports and congressional needs. See Restructuring, Memo, April 1995.
¹⁵ See Wood, F. B., Memos, Jan. 18 and Feb. 7, 1995, op. cit., note 13. The OTA Director reemphasized implementation of various of the Policy Analysis report [23] options and appointed a new internal Committee on Policy Analysis, Memo, Mar. 3, 1995.
¹⁶ The "Task Force on the New OTA" and its mission were announced by Johnson, P., co-chair, in an all-OTA e-mail memo, July 17, 1995, and by e-mail memos from task force members to their own programs. See, e.g., Coates, V.
T., also Task Force co-chair, Memo to the Industry, Telecommunications, and Commerce Program staff, July 17, 1995, which listed these key questions: (1) What are OTA's one or two major "vulnerabilities" as revealed by recent events in Congress, and how might they be corrected or compensated for?; (2) Who is/are our most essential client(s) or audience(s) within Congress?; (3) What are the needs of those clients, and how well do we meet them--what are our strengths?; (4) Which of their needs do we not meet--what are our weaknesses?; (5) How could we better meet their needs?; (6) What penalties would we pay, if we change to meet their needs better--what would we have to give up, what values might we sacrifice, and who might we disappoint?; and (7) What are the distinctive differences between OTA and its sister agencies--how can we preserve those differences even as we change? The Task Force immediately went to work, convening meetings with program staffs. See, e.g., Coates, V., Industry, Telecommunications, and Commerce [Program] Brown Bag on the "New OTA," summary notes, Memo, July 19, 1995. Some TA professionals outside of OTA also provided ideas. See, e.g., Priest, C. W., "Reinvention of OTA," e-mail memo, July 24, 1995.
OTA's survival was close, with significant bipartisan support that fell just short of the votes needed to keep OTA alive.¹⁷ However, some of OTA's congressional critics did, in the public debate, raise issues that directly or indirectly related to methodology and management. The most notable was an expressed concern about OTA studies being good but too late and too long to be of use. In this view, OTA was out of step with the legislative process, reflecting, in effect, a mismatch between OTA work and the legislative rhythms and cycles of Congress. Others asserted that OTA was a luxury Congress could not afford, and that OTA studies duplicated work of other congressional agencies, notably the General Accounting Office (GAO) and Congressional Research Service (CRS). Still others observed that OTA-like studies were available on most topics from numerous other government and private sector sources, thus obviating the need for OTA (see, e.g., [25]). OTA's congressional advocates rebutted these concerns, emphasizing the differences between the basic missions and methodologies of OTA, GAO, and CRS, and the many checks and balances in the OTA process that are not employed by the other congressional agencies or by most executive branch or private sector study groups. Also, advocates pointed out that OTA studies did make numerous contributions to the legislative process, in many direct and indirect, visible and diffuse ways [26-29].¹⁸ OTA's efforts to reinvent itself to better match the legislative process did not progress fast enough or reach fruition soon enough. Whether or not an accelerated OTA effort would have made any difference in the final outcome is debatable and speculative. But had OTA produced consistently shorter studies and reports, developed wider congressional understanding of what made OTA's study process different and valuable, and forged closer and more varied linkage to the legislative process prior to the 1995 debate on OTA's fate, these arguments would have been less salient, at the least. The OTA methodology and management experience provides important lessons for any future efforts to re-establish a technology assessment capability in Congress, and for those already operating or seeking to establish technology assessment organizations or offices in the United States and overseas.
Strengths and Weaknesses of OTA's TA Methodology and Management

OTA's general methodology was really a process that involved the management of assessments as well as the methods used to do research and analysis. In this sense, by the late 1980s, the OTA methodology had matured to the point where, at a high level, considerable consistency was achieved across project lines.¹⁹ The strengths and weaknesses of several major steps in a typical OTA study are discussed below.

¹⁷ The floor vote in the House of Representatives on June 22, 1995, was 220 to 204 in favor of continued funding of OTA functions at $15 million as part of the Congressional Research Service in the Library of Congress. The vote in the Senate Committee on Appropriations on July 18, 1995, was 13-11 against an amendment that would have continued OTA as a separate entity with $15 million in FY96 funds. The floor vote in the Senate on July 20, 1995, was 54-45 against a similar amendment that would have continued OTA. On July 27, 1995, the House-Senate Conference Committee met to resolve differences between the House and Senate positions on OTA. On a tie vote, the Conference Committee failed to approve an amendment to continue funding for OTA operations.
¹⁸ See especially [26], Attachment 3, Legislative Impact Summary.
¹⁹ For a summary of the OTA process as of September 1995, see The Assessment Process at http://www.wws.princeton.edu:80/~ota/ns20/process_n.html. This web site also includes selected documents on the OTA history plus a complete set of all published OTA reports. Several OTA staff prepared their own syntheses of the OTA process, including methodological dimensions. See, e.g., Crane, A., and Friedman, R., Lecture Notes: Six Day Course on Technology Assessment and OTA, presented to the Chinese Academy of Sciences, July 1985.
PRE-REQUEST ACTIVITY

Informal discussions between congressional committee staff and OTA usually preceded formal requests for OTA studies. Staff-level interaction was frequently necessary to help identify appropriate and timely topics for OTA research. However, these efforts were usually ad hoc and fell considerably short of a systematic dialog with the committees about their technology assessment needs. OTA attempted to balance its work load among the various committees and between the House and Senate. Inevitably, however, some committees used OTA more than others, especially in the absence of a more formalized way to engage all committees in the early stages of identifying possible studies. Whatever political party was in the committee majority also tended to use OTA more often, given the majority's greater opportunities for setting agendas and initiating actions, compared to the minority. In the first few months of 1995, OTA did move more aggressively to attempt to match its expertise with a wider range of committees and subcommittees.²⁰ This appeared to be a constructive effort, but did not have an opportunity to be fairly tested, given that OTA's survival was in question.

FORMAL REQUEST
Whether preceded by informal discussions or not, almost every major OTA study was formally requested by the chair and/or ranking minority member of one or more committees. Both the Technology Assessment Board and the OTA director had the authority to request studies, but this authority was rarely used by TAB, and sparingly by the director, and then usually for small studies. The formal requests were important for laying out at least an initial framework of key issues and questions for OTA consideration. OTA managers and project staff considered request letters to be a key part of the "charter" for each study. On the other hand, the process of obtaining formal letters could take considerable time, and in general made it more difficult to respond to more immediate needs.

STUDY PROPOSAL
Once OTA received a letter requesting a study, and occasionally before a formal letter was received, OTA staff prepared a more detailed study proposal. Each proposal typically discussed the need for the study, identified other related research (including any other studies by congressional agencies), and laid out a research plan, schedule, and budget. Preparing the proposal generally served constructive purposes by requiring a more detailed examination of a potential study before making commitments, and by providing a more robust basis for decisions by OTA management and TAB as to whether the study as proposed was suitable for OTA. Most proposals were not unduly long, 10-15 pages. Starting in June 1994, all study proposals were required to include a technology page that explicitly identified the technologies to be assessed and cited the relevant authority in the Technology Assessment Act.²¹ This requirement, proposed by OTA and approved by TAB, responded to concerns of some members of the Senate Committee on Appropriations, Subcommittee on the Legislative Branch, that OTA studies were straying too far afield from technology.

²⁰ OTA Resource Briefs were prepared on several dozen topics; each brief listed relevant OTA staff and prior and current studies on the topic of interest. The briefs were directed primarily to the subcommittees. This effort was judged to be worthwhile, but would have taken several more months to come to fruition. See OTA memo from Johnson, P., Chair of the Task Force on Subcommittees, Mar. 20, 1995.
²¹ Memo from OTA Director Roger Herdman, Emphasizing Technology, Washington, DC, June 28, 1994.

The Act itself mentions "technology," "technological
applications," "technological programs," and "alternative technological methods of implementing specific p r o g r a m s " - - a fairly broad charter. 22Also, the Act clearly establishes a mandate to assess broadly the "physical, biological, economic, social, and political effects" of technology. 23 Despite the broad statutory language, the appropriate scope of O T A studies has been a matter of intermittent questioning on Capitol Hill, and perhaps an indicator of deeper philosophic or political concerns, well beyond what a "technology page" could reasonably address. The fundamental question of what is technology and technology applications under the Act, and thus within the mandate of OTA, was never fully addressed or resolved, but should be if a congressional T A function is re-established. STUDY APPROVAL The Technology Assessment Board approved all O T A studies, whether requested by committees, the O T A director, or TAB itself, except for small studies initiated within the director's discretion (under $50,000). Overall, the TAB approval process worked well in helping assure that studies had a reasonable degree of bipartisan support and were appropriate for OTA. Proposals generally were approved by TAB by a substantial bipartisan vote or unanimously. 24More broadly, the role of TAB gave credibility O T A ' s mission of conducting nonpartisan studies. TAB met once every two or three months on the average, and usually considered no more than three or four proposals per meeting. Also, TAB served to legitimize O T A as an independent congressional agency acting with the full authority and prestige of Congress, within the domain of the authorizing statute. This no doubt helped O T A obtain information and cooperation from all sectors of society [27]. SELECTION OF ADVISORY PANEL One of the most important steps in the O T A methodology was the selection of a project advisory panel. All major studies had advisory panels, and even most smaller studies at least used a workshop as a de facto panel. The typical advisory panel advised the O T A study team on the key issues, provided a wide range of perspectives and views relevant to the study topic, and reviewed and commented on draft materials and reports. The panels did not draft the O T A reports; nor were the panels held responsible for the study results. And the panel input was taken very seriously by O T A project staff (even though the panels were not charged with reaching a consensus). Panels ranged in size from about 14 to 24 persons, selected to represent academic, research, consumer, business, educational, technical, policy, and other stakeholders or viewpoints relevant to the study. However, the selection of panels frequently was a painstaking and time consuming process that could stretch over several m o n t h s - - a major potential delaying factor. Project staff screened possible panelists, prepared a comprehensive memo laying out the rationale and balance of the panel, and recommended specific panel members. Each proposed panel was reviewed by the O T A director and either approved as is, approved with modifications, or sent back to the project staff for further work. Panels usually met two or three times over the course of a study. Successful panel meetings required significant staff preparation time as well as interested and committed panel members. 22P.L.92-484, Sec. 3(c) (l-3). 23Ibid., Sec. 2 (d)(1). 24The more controversialproposalsusuallyweredeferred and/orreviseduntil a consensuscouldbe reached.
Some of the smaller, quicker turnaround OTA studies successfully used workshops, in lieu of panels, in order to expedite the study process.

DATA COLLECTION, ANALYSIS, AND SYNTHESIS

At the heart of every OTA study was intensive research by the project team, typically a project director and one to three other project staff. Some projects had only a project director. A very few projects had more than four total staff assigned. Multistaff project teams were often interdisciplinary. At a general level, the assessment methodology was similar across OTA projects in that most projects used a mix of literature reviews, interviews with technical and policy experts, agency and stakeholder briefings, site visits, an occasional survey, targeted contractor research, quantitative analyses where appropriate, the rare computer model, and a variety of workshops focused on specific technical or policy issues or stakeholder perspectives [23]. The exact mix and balance of research methods varied but usually drew from this general collection of techniques. The general wisdom at OTA was that there was no methodological prescription or formula for doing a technology assessment, other than the high-level consensus on the OTA process described in this article--and typically equated with OTA's methodology. To a large extent, the actual research process at OTA was eclectic. At OTA, a general methodology or process was prescribed, for example the use of advisory panels and external review, and a range of data (and information) collection techniques was encouraged. But project staffs were afforded substantial discretion in the selection and use of specific data collection and research methods to best match the varied challenges of each specific study. This was probably as it should have been in most instances. However, OTA had difficulty in developing and effectively sharing "successful" approaches to assessment, either at the tactical level of specific techniques or at the conceptual level. The latter might be called mid-level assessment methodology, if high-level equates to the overall OTA methodology or process and tactical-level refers to specific data collection or research methods. Examples of mid-level methodology would be general frameworks for: stages of technology development and application; generic types of stakeholders and values or perspectives; generic types of possible direct and indirect, short- and long-term impact areas; and a spectrum of types of policy options or interventions and their consequences. This mid-level methodology would provide a common conceptual frame of reference for many studies. Actual application of the frameworks would depend on the particular study focus and needs. The frameworks also could facilitate internal review and quality control.
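To make the notion of a mid-level framework more concrete, the following is a minimal illustrative sketch, not an OTA artifact; the class name, the helper function, and all category names are hypothetical. It simply shows how such a reusable template might be expressed as a data structure that a study team could instantiate, adapt to a particular assessment, and use as a coverage checklist during internal review.

```python
# Hypothetical sketch of a reusable "mid-level" TA framework: a checklist of
# stakeholder types, technology stages, impact areas, and policy option types.
# The categories below are illustrative only, not drawn from OTA documents.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class MidLevelFramework:
    stakeholder_types: List[str]
    technology_stages: List[str]
    impact_areas: List[str]
    policy_option_types: List[str]
    notes: Dict[str, str] = field(default_factory=dict)  # study-specific notes

    def coverage_gaps(self, addressed: List[str]) -> List[str]:
        """Return framework categories not yet covered by the study plan,
        as a simple aid to internal review and quality control."""
        all_items = (self.stakeholder_types + self.technology_stages +
                     self.impact_areas + self.policy_option_types)
        return [item for item in all_items if item not in addressed]


# A generic template a TA organization might maintain and refine over time.
GENERIC_TEMPLATE = MidLevelFramework(
    stakeholder_types=["researchers", "industry", "consumers", "regulators",
                       "state and local government", "public interest groups"],
    technology_stages=["research", "development", "demonstration",
                       "deployment", "widespread application"],
    impact_areas=["economic", "social", "environmental", "health and safety",
                  "legal and regulatory", "international"],
    policy_option_types=["no action", "further study", "R&D funding",
                         "regulation", "incentives", "information programs"],
)

if __name__ == "__main__":
    # A project team adapts the template to a specific study and checks which
    # categories its draft work plan has not yet addressed.
    addressed_so_far = ["researchers", "industry", "economic", "regulation"]
    print(GENERIC_TEMPLATE.coverage_gaps(addressed_so_far))
```

The point of the sketch is not the code itself but the design choice it illustrates: a shared, explicit template gives many studies a common frame of reference and a basis for review, while leaving actual application to the judgment of each project team.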
Both the 1979-1980 Task Force on TA Methodology and Management and the 1992-1993 Policy Analysis Task Force addressed this topic. The earlier methodology task force reviewed the academic literature and commissioned contractor papers on mid-level technology assessment methodology [20]. Some promising ideas were identified, but consensus was lacking on any OTA-wide approach, thus leaving this methodological arena up to individual project director discretion and generally at the periphery of OTA work [30].²⁵ The Policy Analysis Task Force similarly reviewed literature, commissioned papers, and identified ideas on strengthening mid-level policy analysis methodology for OTA.

²⁵ The Task Force commissioned the preparation of a draft handbook on TA methods [30] that was never finalized, widely distributed, or used.
The task force opted for a series of recommended options for process changes, such as internal advisory or "shadow" panels and policy workshops, that, while useful, continued the eclectic approach to methodology at OTA. The task force report itself did provide many helpful ideas on improving policy analysis and better matching policy analysis methods with the policy issues or questions at hand.²⁶ OTA was reasonably successful in bridging the gap between technology and policy in most assessments. The Technology Assessment Act authorized OTA to "identify alternative programs for achieving requisite goals," "make estimates and comparisons of the impact of alternative methods and programs," and "present findings of completed analyses to the requisite legislative authorities,"²⁷ and thus brought policy analysis within the umbrella of technology assessment. The Act itself was unclear as to whether OTA was authorized to make recommendations, although the legislative history suggests that OTA was to present options and not recommendations [1, 31, 32]. OTA operating policy permitted and encouraged the inclusion of options or evaluated options in reports, rather than recommendations, although the distinction was sometimes difficult to make or maintain. The policy analysis side of OTA work did need strengthening, hence the task force on this topic. Most relatively recent OTA studies made a significant effort to discern the policy options relevant to the technical or scientific issues under consideration. In this sense, OTA adopted through practice the integrated notion of technology assessment developed by the early TA academics and scholars in the 1960s and 1970s [see, e.g., 4, 8, 10, 14]. However, the depth and quality of policy analysis varied widely. Some reports identified and described options, but did not go further, while others examined and analyzed the possible implications of various policy actions. The Policy Analysis Task Force reaffirmed that policy analysis was central to most OTA studies, as OTA's primary mission was to advise a policy-making body, the U.S. Congress [23].

DRAFT REPORT REVIEW

Extensive review of draft reports was another bedrock of the OTA methodology. Reviewers included not only the advisory panelists but typically a cross-section of experts and stakeholders that participated or had an interest in a study. The total number of reviewers ranged from a few dozen to literally hundreds for major studies of broad interest. Project staff were required to prepare, for the OTA director, a lengthy memo summarizing the major review comments received and the OTA staff responses thereto. The review process was generally conducted in the open. Reviewers were asked to respect the limitations on the documents, namely that they were for review and comment only, and not for public attribution or distribution. The review system worked for the most part. There were a handful of occasions when draft materials were leaked to the press or otherwise prematurely released, in some cases creating a political problem.²⁸ And the volume of review comments could be daunting at times. But overall, the review process was a strength of OTA and an important check and balance on the study results. The greatest downside of the review process was simply the time and effort required, both to conduct the review and to interpret and use the results.
²⁶ The Task Force stimulated several process changes as well as a series of staff roundtables on topics such as the use of case studies, contracting, and workshops as part of OTA's assessment methodology.
²⁷ P.L. 92-484, Sec. 3(c)(4-6).
²⁸ One example is the unauthorized release of a draft paper on automated firearms purchaser checks. With regard to OTA, the issue was not so much the substance of the draft but the violation of the OTA process by stakeholders on one side of the issue. The premature and unauthorized release was perceived by some to have damaged the credibility of the OTA process.

In the last few years, OTA staff were rapidly expanding use of electronic mail and the Internet to expedite
the review process (and other project activities). Also, in recent years, OTA management strongly encouraged additional internal peer review of draft reports. Results of this initiative were mixed, largely because of the significant time and effort required to provide meaningful review comments.

REPORT PUBLICATION AND RELEASE

To many, perhaps most users of OTA work, the full OTA report was the primary product. Over its lifetime, OTA produced about 750 reports, many of which were (and continue to be) highly regarded in the relevant research and scientific disciplines [26, 28-30]. The length of OTA reports has been a continuing dilemma. The academic and research communities are used to long reports, as an indicator (not always accurate) of comprehensive and in-depth coverage. And much of OTA's technical credibility depended on favorable reviews by the academics and researchers. Yet most elected public officials, Congress included, do not have the time or inclination to read lengthy documents. OTA spent many years trying to resolve this tension. In OTA's early years, studies were largely contracted out, and reports were lengthy and unwieldy--perhaps unavoidably so. But the tenure of OTA directors Gibbons and Herdman was marked by intermittent efforts to ratchet down the length of OTA reports, emphasize the preparation of well-written report summaries and briefs and, in later years, develop alternative or supplemental electronic and video formats for report delivery. Various strategies were used to attempt to shorten reports, including a 200-printed-page ceiling, project review checkpoints, the use of project-level and OTA-wide editors, and emphasis on summary reports in lieu of full reports. These efforts met with mixed success. The quality, readability, layout, and format of OTA reports did dramatically improve over time [33]. For the last decade or so, many OTA reports received awards from library associations and government communication groups, in addition to continuing the long record of acclaim from the academic and research communities [26-29]. And OTA emphasized other, non-written ways to release and deliver reports to Congress--especially through staff briefings and committee hearings. In its final months, OTA was ramping up on the information superhighway to make its reports, summaries, and briefs electronically available via the Internet and World Wide Web.²⁹ Yet despite these many promising initiatives, OTA still had difficulty producing shorter "full" reports. The significance is not just that shorter reports are more likely to be read by congressional staff. If done right, shorter reports should take less time and money to draft, review, edit, and publish. But shorter reports do not happen by magic or by fiat, especially if quality is to be preserved. Changes in methodology, management expectations and oversight, and organizational culture, among others, would be needed.

STUDY FOLLOW-UP

The OTA assessment process did not end when a report was completed. Project staff were encouraged to brief the requesting committee staff in anticipation of report release, solicit quotes from members for use in the OTA press release, and offer to provide testimony or formal briefings on the report after release--depending on the level of committee interest.

²⁹ Ironically, within a few months after closing, all 750 OTA reports were made available in perpetuity both on-line and on CD-ROM.
For the web site, the URL is http://www.wws.princeton.edu:80/~ota/. The CD-ROM is available from the U.S. Government Printing Office.

For some reports, especially those on "high-profile" or
"hot" topics, the requesting or other committee(s) issued a press release in parallel with OTA's release, and on occasion the committee, an individual member, and/or OTA held a press conference. At the institutional level, OTA issued general information brochures, publication catalogs, and summaries of ongoing and recently completed work. During the 1990s, OTA developed targeted catalogs and brochures, organized by program or subject area (e.g., health, telecommunications, environment). On average, in a typical year, OTA released about 50 reports, testified 35 times before congressional committees, and briefed committee staff about 70 times. OTA staff conducted hundreds of informal discussions with committee staff in a given year, and participated in numerous professional meetings and conferences related to completed or ongoing OTA studies. The press gave extensive coverage to many OTA reports. The number of press clippings from all sources (daily, weekly, and trade press plus selected electronic media) that mentioned OTA reports or OTA ran about 5,000 per year, until OTA cut back on its clipping service. Overall, these numbers suggest that Congress, the press, and relevant professional and stakeholder groups were paying attention to and using OTA reports at least enough to justify OTA's continued existence. But given the congressional decision in 1995 to cease funding OTA, and notwithstanding overriding political factors, it would appear in retrospect that the numbers were deceiving and that a fresh look at follow-up--along with the rest of the assessment process--was (and is) needed.
Implications for the Future of OTA and Technology Assessment

The primary implication of this discussion of technology assessment as practiced by OTA is the rethinking of TA methodology and management to better meet the needs of elected public officials generally and the U.S. Congress in particular. OTA produced high quality reports at a cost and time per study that compared quite favorably with other technology assessment entities in and out of government, with the extra value-added that derived from the checks, balances, and openness built into OTA's methodology. While no process is perfect, OTA's reports generally reflected a high degree of balance, fairness, and objectivity in the presentation and analysis of competing information and views--certainly when compared to reports on similar topics prepared by advocacy, trade, industry, and even academic groups. The need for OTA-like functions continues, as science and technology become ever more pervasive in American and global society.³⁰ Members of Congress are responsible for representing the diverse interests of their constituencies, their home towns and states, and the nation as a whole. Science and technology related trends and issues impact the congressional agenda in ways large and small, obvious and subtle, immediate and long-term. A key question is how to evolve from the OTA of the past to an OTA or OTA-like organization(s) of the future that will better mesh with congressional needs. A three-pronged response could involve: a downsized and reinvented OTA or the equivalent; a

³⁰ A review of various computerized bibliographic databases found an overwhelmingly positive response to OTA in the press, academic, and research communities, and a strong sense during 1995 that OTA was serving a unique and important role in better informing Congress and the public on science and technology issues. Sources: review of 206 citations to OTA identified in a search of the ABI/Inform database for 1986-1995; review of 21 citations to OTA identified in a search of the newspaper abstracts index for 1994-1995 (Gelman Library, The George Washington University); review of 43 citations to OTA identified in a search of UMI/ProQuest for 1995-1996.
private not-for-profit Institute for Technology Assessment (ITA) or the equivalent; and various other public and private sector technology assessment entities. No other congressional agency has emulated the OTA process, or appears to have the desire to do so. At the time of this writing, it is not known whether the next Congress will decide to support a downsized and reinvented OTA or any OTA. Regardless, technology assessment organizations--whether a reinvented OTA, a not-for-profit ITA, the National Research Council, or others--that strive to serve congressional and legislative branch needs could benefit from lessons learned during OTA's earlier incarnation.

IDENTIFYING CONGRESSIONAL NEEDS AND BUILDING CONGRESSIONAL UNDERSTANDING OF TA
OTA needed a more systematic way to identify and refine congressional needs for technology assessment work, and to build more bridges and relationships with the client (while still maintaining independence, not always easy to do). Whether a reinvented OTA, ITA, or other TA organization, the TA staff could discuss upcoming needs at the beginning of each session (or between sessions or at other optimal times) with majority and minority staff of interested committees and subcommittees where appropriate. This information could be used to develop a committee-by-committee TA agenda and an overall program of assessment activities for each Congress. OTA needed a broader base of members and staff in Congress who understood what technology assessment could do for them. TA staff could provide briefings and seminars on selected science and technology topics, within a technology assessment framework, for interested members and staff. Key congressional staff could be invited to participate in periodic feedback sessions to further improve the utility of TA work products for Congress.³¹ The congressional authorizing committees could hold periodic oversight hearings on TA [32]. The TA organization could issue a periodic newsletter and/or an annual outlook report on key TA trends and issues, in order to help increase congressional and public awareness and interest. Also, TA staff need a good understanding of the legislative process and other congressional functions. TA staff could be required to participate in internal and external short courses on Congress, and some staff could be rotated for short-term assignments to committees or subcommittees, majority and minority, in order to better appreciate congressional needs first hand.³²

INCREASING THE FLEXIBILITY OF TECHNOLOGY ASSESSMENT WORK PRODUCTS
OTA needed more flexibility in the range of products and services offered in order to better match congressional needs. An effective TA organization could offer everything from the traditional "full" OTA assessment (e.g., 200+ pages, 18-24 months) to shorter reports, briefing papers, technology or policy workshops, early warning reports, testimony, and staff briefings. OTA was moving in this direction, but was still primarily identified with (and producing) the lengthy full reports.

MORE CLEARLY LINKING ASSESSMENT REQUESTS TO THE LEGISLATIVE PROCESS
Requests for OTA studies frequently made some general reference to legislative needs, but in many instances might have benefited from greater detail and precision in matching study products, including interim deliverables, with specific needs.

³¹ For example, OTA's Policy Analysis Task Force used a small group of congressional staff advisers.
³² OTA's Office of Congressional and Public Affairs conducted intermittent staff seminars on Congress that proved quite useful.

The TA organization could work with congressional committees and subcommittees--chairs
and ranking minority members--in formulating requests that would include a list of priorities and linkage to anticipated oversight, foresight, legislative, authorization, appropriations, and other activities. The requests also could explicitly specify the types of product or service needed, such as short report, briefing paper, technology workshop, or early warning report, as well as the desired or anticipated timing of the product or service to meet congressional needs. The TA organization could prepare an overall program of TA activities, integrating all requests. The overall program presumably would attempt to balance coverage across committee needs and across types of TA products and services. The program could be revised and adjusted as each session of Congress progressed, additional requests were received, and on-going work was completed and delivered.

REDUCING THE AVERAGE COMPLETION TIME FOR TA WORK PRODUCTS

The completion schedules of many OTA projects were not well synchronized with congressional time frames. To some extent, this was beyond anyone's control, as legislative issues and schedules are hard to predict, at best. Nonetheless, for Congress, the balance of TA work needs to shift significantly in the direction of shorter, quicker turnaround products and services. The entire TA process, and each step within it, needs to be rigorously scrutinized for possible time savings and efficiencies. Longer-term studies and reports could include multiple interim deliverables, to meet well-defined but more immediate client needs as the study proceeds. OTA had some success with this approach. The long report probably would be the exception, however, conducted only when both the assessing organization and requesting congressional entity strongly agree on the need. The average time to completion of a report might be in the nine-to-twelve or twelve-to-fifteen month range. Early warning reports, briefing papers, technology workshops, and the like might be produced in three to six months, on the average. Testimony, staff briefings, and issue memos could be provided even faster, in a matter of weeks, when in the TA organization's areas of established expertise. The typical TA report length could be redefined from long to short, with "short" meaning 50-100 printed pages. These reports would cut back on background and contextual information and give priority to technology issues, trends, and findings, stakeholder perspectives, and evaluated policy issues, options, and impacts. Should a congressional TA unit be re-established, some of the longer-term studies that were traditionally conducted by OTA could be shifted to the Institute for Technology Assessment, National Research Council, and other private sector organizations.

STREAMLINING AND ENHANCING PUBLIC PARTICIPATION

Public participation was one of the bedrock principles of the OTA assessment process. "Public" means the range of persons, organizations, perspectives, and values relevant to a particular study. The number of participants ranged into the several hundreds for a major study and exceeded 5,000 per year for OTA as a whole in recent years. Yet this aspect of OTA's methodology could be time consuming and still fall short of attaining fully balanced participation, while leaving some interested persons or organizations unsatisfied.
The TA organization needs to experiment with alternative forms of public participation, including use of the Internet and web sites, focus groups, public opinion surveys, site visits, and grass roots citizen meetings, as well as the traditional advisory panels and workshops (for which video conferencing may now be a viable option). Again, the key is flexibility, because there is a limit to how much input
and how many reviews any TA team can meaningfully handle. Use of templates or frameworks of the types of "publics" relevant to a given study could be helpful.

FURTHER DEVELOPING MID-LEVEL TA METHODOLOGY

As noted earlier, OTA made only limited progress in developing so-called mid-level assessment methodology. While an eclectic approach has merit, TA organizations would benefit from somewhat greater emphasis on developing and refining prototype frameworks for identifying the relevant stakeholders, steps in technology creation and deployment, range of relevant policy options and impact areas, and the like. These frameworks would complement the overall TA process (high-level methodology) and also help guide the selection of specific techniques for data collection, analysis, and synthesis (tactical-level methodology). This effort would likely help expedite the study process and reach closure on a time frame and at a level of detail required to meet congressional needs. TA organizations could build on the OTA experience and that of other TA groups in the United States and abroad. The National Science Foundation (NSF) could re-engage in this arena, as was envisioned in the Technology Assessment Act and was the case in the early years of technology assessment.³³ TA organizations, perhaps jointly and with support from NSF, could produce a series of TA handbooks on TA methodology and management. Much of the raw material to prepare such handbooks is already available, prepared over the years as a result of OTA's self-improvement efforts, by academic scholars on TA methodology, and at other TA centers in the United States and abroad [34].

INCREASING THE FLEXIBILITY OF THE TA PROCESS

One of OTA's lasting contributions was its development and refinement of the OTA assessment process, as described earlier. The process provided much needed consistency as OTA matured, and served as a benchmark against which other TA activities could be compared. However, in recent years, several other promising approaches to conducting TAs have been developed and implemented, such as the "consensus conference" process³⁴ for citizen involvement in TA used in the Netherlands and Denmark [34-36]. Also, various future studies groups have experimented with alternative assessment methodologies. TA organizations need to be flexible in finding the best match between process and methodology, on one hand, and the specific assessment needs and requirements, on the other. TA organizations need to do this while keeping in mind, and not sacrificing, their basic mission and operating principles. TA organizations also need to stay attuned to new ways to reach out for diverse views on any given topic. The anticipatory or early warning function envisioned by the Technology Assessment Act implies a process that seeks over-the-horizon, cutting-edge ideas and perspectives that, by definition, are not yet mainstream or consensus thinking. In addition, TA organizations need to give considerable attention to the selection and training of project staff. As more flexibility is built into the assessment process, the organizational culture needs to shift as well.

³³ P.L. 92-484, Sec. 10.
34One form of the consensus conference includes a four-day public forum: day 1, experts and public and private sector interest groups testify before a lay panel; day 2, the lay panel cross-examines the experts and interest groups; day 3, the lay panel deliberates privately and writes their own report summarizing their judgment on the issues; and day 4, the lay panel releases their report at a press conference and responds to questions from the media, legislators, and others. See Sclove, R. E., Loka Alert 3(4), June 7, 1996, and 3(5), July 12, 1996, an occasional series of electronic postings, on science, technology, and democracy, published by the Loka Institute, P.O. Box 355, Amherst, MA 01004, URL = http://www.amherst.edu/~loka
160
F.B. WOOD
applies to the length and format of reports or other products, and their dissemination, as well as to the assessment process and the use of both mid-level and tactical-level assessment methods and techniques. EMPHASIZING ELECTRONIC DISSEMINATION OF TA PRODUCTS In its final months, O T A was gearing up for electronic dissemination. Ironically, O T A OnLine was developed in the months before OTA's closure and preceded the wave of interest in the Internet and World wide Web now sweeping Capitol Hill. The historical work products of OTA, including all 750 published reports, are now available at several web sites and on CD-ROM. Other TA organizations need to carry on this line of development, and make appropriate use of the Internet and web sites for electronic dissemination. Over time, electronic formats will be increasingly accepted among members of Congress and staff, although there is likely to be a place for hard copy printed documents, in addition to the all important face-to-face discussions and briefings, for years to come.
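To make the earlier points about "publics" templates and prototype mid-level frameworks more concrete, the sketch below shows one way a TA team might encode such a framework as a simple, reusable checklist-style data structure. This is a minimal, purely illustrative sketch, not an OTA method or product; the field names, the telemedicine example, and the outreach methods are assumptions introduced here only for exposition.

# Illustrative sketch only (not an OTA artifact): encoding a reusable
# "publics"/stakeholder template and mid-level framework so that each new
# study starts from a checklist rather than a blank page. All names and
# example entries below are assumptions for illustration.

from dataclasses import dataclass, field
from typing import List


@dataclass
class StakeholderGroup:
    """One category of 'public' relevant to an assessment."""
    name: str                        # e.g., "State regulators"
    interest: str                    # why this group cares about the technology
    outreach: List[str] = field(default_factory=list)  # candidate outreach methods


@dataclass
class MidLevelFramework:
    """A prototype mid-level framework for scoping a single assessment."""
    technology: str
    deployment_steps: List[str]      # stages from research through diffusion
    impact_areas: List[str]          # e.g., economic, environmental, legal
    policy_options: List[str]        # candidate congressional actions
    publics: List[StakeholderGroup] = field(default_factory=list)

    def outreach_plan(self) -> List[str]:
        """List each group with its proposed outreach methods for project team review."""
        return [
            f"{g.name}: {', '.join(g.outreach) or 'to be determined'}"
            for g in self.publics
        ]


if __name__ == "__main__":
    framework = MidLevelFramework(
        technology="Telemedicine",
        deployment_steps=["research", "pilot trials", "rural deployment"],
        impact_areas=["health care access", "privacy", "state licensure"],
        policy_options=["fund demonstration projects", "clarify reimbursement rules"],
        publics=[
            StakeholderGroup("Rural patients", "access to specialists",
                             ["focus groups", "site visits"]),
            StakeholderGroup("State medical boards", "licensure across state lines",
                             ["workshops", "written review"]),
        ],
    )
    for line in framework.outreach_plan():
        print(line)

In practice, such a template would simply serve as a scoping checklist that project staff tailor to each study; the value lies in prompting the team to consider every category of stakeholder, impact area, and policy option, not in any particular encoding.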
Conclusion
Making these kinds of changes in assessment process, methodology, and management could have significant, and perhaps profound, implications for staffing, training, and professional development in any TA organization. The TA unit needs to consider carefully which subject areas, disciplines, and methodologies should be part of the core staff expertise. The right mix of established staff expertise will be a prerequisite to producing credible assessment results more quickly and flexibly. No matter how significant the TA improvements look on paper, selecting, nurturing, motivating, managing, and leading the all-important human resources will prove the greatest challenge, and the surest route to success, for TA organizations.

Whether or not Congress decides to reinvent OTA, the lessons learned from OTA's experience will benefit organizations such as the Institute for Technology Assessment35 that are committed to the OTA principles of balanced, objective analysis, broad participation of interested persons and stakeholders, and open review. Various other public and private sector organizations, in the United States and other countries, that conduct technology assessment could more aggressively adopt or adapt elements of the OTA methodology and lessons learned. TA organizations that count elected officials, and especially legislative branch officials, among their primary clients would be well advised to study and learn from the OTA experience.

The concept of technology assessment continues to be vitally important to enlightened public policy making in modern society. The OTA experience should help technology assessors at home and abroad advance the theory and practice of technology assessment as we enter the 21st century.
The author accepts full responsibility for the views expressed in this article, but gratefully acknowledges the review comments of the following colleagues: David Guston and Bruce Bimber (special issue co-editors), John Andelin, Audrey Buyrn, Pat DeLeon, Bob Friedman, Stephanie Gajar, Tom Hausken, Roger Herdman, Mary Lou Higgs, Kerry Kemp, Bill Norris, Peter Sharfman, Jim Turner, Chris Waychoff, Pat Windham, Joan Winston, Andy Wyckoff, and two anonymous reviewers. The author benefited from discussion of ideas from earlier versions of this article presented at a July 17, 1996, World Future Society General Assembly session, co-led by Vary Coates and Don Kash, and at June 21-22, 1996, sessions of the International Symposium on Science and Technology (sponsored by the IEEE Society on Social Implications of Technology and Princeton University), at which Vary Coates, David Jensen, Todd LaPorte, Bill Creager, Chris Hill, Dick Sclove, Phil Bereano, Robert Margolis, Joe Herkert, and David Guston provided useful comments.

35The Institute for Technology Assessment was organized as a not-for-profit corporation by former OTA staff. ITA "strives to maintain the same standards of objectivity, soundness, and credibility achieved by OTA in its work for the U.S. Congress, while meeting the analytical needs of American industry as well as federal and state governments," ITA brochure, September 1996. ITA has a Board of Legislative Advisors that includes U.S. and State Senators and Representatives. The ITA web site URL is http://www.mtppi.org/ita

References
1. U.S. Congress, House of Representatives: Technology Assessment Act of 1972, Conference Report, Rep. No. 92-1436, 92d Cong., 2d sess., Washington, DC, Sept. 25, 1972.
2. U.S. Congress, Senate, Committee on Rules and Administration, Subcommittee on Computer Services: Office of Technology Assessment for the Congress, Hearing, 92d Cong., 2d sess., Washington, DC, Mar. 2, 1972.
3. U.S. Congress, Senate, Committee on Rules and Administration: Technology Assessment Act of 1972, Report, Rep. No. 92-1123, 92d Cong., 2d sess., Washington, DC, Sept. 13, 1972.
4. Armstrong, J. E., and Harman, W. W.: Strategies for Conducting Technology Assessment, Department of Engineering-Economic Systems, Stanford University, 1977.
5. Breslow, M., Brush, N., Giggey, F., and Urmson, C.: A Survey of Technology Assessment Today, Peat, Marwick & Co., Washington, DC, 1972.
6. Cetron, M. J., and Bartocha, B., eds.: Technology Assessment in a Dynamic Environment, Gordon and Breach, New York, 1973.
7. Coates, J.: The Role of Formal Models in Technology Assessment, Technological Forecasting and Social Change 9, 139-190 (1976).
8. Coates, V. T.: A Handbook of Technology Assessment, Final Report, Office of Energy Programs, School of Engineering and Applied Science, The George Washington University, March 1978.
9. Hetman, F.: Society and the Assessment of Technology, Organization for Economic Cooperation and Development, Paris, 1973.
10. Jones, M. V.: A Technology Assessment Methodology, Mitre Corporation, McLean, VA, 1971.
11. Mayo, L. H.: Some Legal, Jurisdictional, and Operational Implications of a Congressional Technology Assessment Component, Program of Policy Studies in Science and Technology, The George Washington University, December 1969.
12. Mayo, L. H.: Some Implications of the Technology Assessment Function for the Effective Public Decision-Making Process, Program of Policy Studies in Science and Technology, The George Washington University, May 1971.
13. Mayo, L. H.: Social Impact Evaluation, Program of Policy Studies in Science and Technology, The George Washington University, November 1972.
14. Mayo, L. H., et al.: Readings in Technology Assessment, Program of Policy Studies in Science and Technology, The George Washington University, Washington, DC, September 1975.
15. Office of Technology Assessment, U.S. Congress: OTA Processes: Program Manager Workshops on Technology Assessment Processes, Washington, DC, April 1977, draft, mimeo.
16. Office of Technology Assessment, U.S. Congress: Technology Assessment in Business and Government, Washington, DC, January 1977.
17. Coates, J., and Amin-Arsala, B.: Setting Priorities at the U.S. Office of Technology Assessment, World Future Society Bulletin, March-April, 15-26, 1979.
18. Office of Technology Assessment, U.S. Congress: Priorities 1979, Washington, DC, January 1979.
19. Office of Technology Assessment, Task Force on TA Methodology and Management: Report on Task Force Findings and Recommendations, Washington, DC, Aug. 13, 1980, mimeo.
20. Wood, F. B.: The Status of Technology Assessment: A View from the Office of Technology Assessment, Technological Forecasting and Social Change 22, 211-222 (1982).
21. Gibbons, J. H.: Technology Assessment for the Congress, The Bridge, Summer, 2-8 (1984).
22. Gibbons, J. H.: Technology Assessment and Governance, Remarks before the Technology Assessment Study Group of the Alpbach European Forum of the Austrian College, Alpbach/Tyrol, Austria, Aug. 23, 1985, mimeo.
23. Office of Technology Assessment, U.S. Congress: Policy Analysis at OTA: A Staff Assessment, Washington, DC, May 1993.
24. Office of Technology Assessment, U.S. Congress, Task Force on Long Range Planning: Report to the Director, Washington, DC, Jan. 24, 1994, mimeo.
25. U.S. Congress: Legislative Branch Appropriations Act, Fiscal Year 1996, Congressional Record House 141(102), H6168-6200 (June 21, 1995), and 141(103), H6213-6214 (June 22, 1995).
26. Herdman, R. H.: Testimony on OTA's FY96 Appropriations Request, before the U.S. Senate, Committee on Appropriations, Subcommittee on the Legislative Branch, Washington, DC, May 26, 1995.
27. Hill, C. T.: The Congressional Office of Technology Assessment: A Retrospective and Prospects for the Post-OTA World, in Technical Expertise and Public Decisions, Proceedings, 1996 International Symposium on Technology and Society, The Woodrow Wilson School of Public and International Affairs, Princeton University, Princeton, NJ, June 21-22, 1996, pp. 4-12.
28. Margolis, R. M.: Losing Ground: The Demise of the Office of Technology Assessment and the Role of Experts in Congressional Decision-Making, in Technical Expertise and Public Decisions, Proceedings, 1996 International Symposium on Technology and Society, The Woodrow Wilson School of Public and International Affairs, Princeton University, Princeton, NJ, June 21-22, 1996, pp. 36-54.
29. Office of Technology Assessment, U.S. Congress: Annual Report to Congress, Fiscal Year 1995, Washington, DC, January 1996.
30. Coates, V. T.: An OTA Handbook on Technology Assessment Methods, prepared for the Office of Technology Assessment, Washington, DC, June 7, 1982, draft.
31. Kunkle, G. C.: New Challenge or the Past Revisited? The Office of Technology Assessment in Historical Context, Technology in Society 17(2), 175-196 (1995).
32. U.S. Congress, House of Representatives, Committee on Science and Technology, Subcommittee on Science, Research, and Technology: Review of the Office of Technology Assessment and Its Organic Act, Report, 95th Cong., 2d sess., Washington, DC, November 1978.
33. Office of Technology Assessment, U.S. Congress: Publishing Style and Procedures, Washington, DC, 1994.
34. Smits, R., Leyten, J., and Hertog, P. D.: Technology Assessment and Technology Policy in Europe: New Concepts, New Goals, New Infrastructure, Policy Sciences 28, 271-299 (1995).
35. Sclove, R. E.: Democracy and Technology, Guilford Press, New York, 1995.
36. Sclove, R. E.: Town Meetings on Technology, Technology Review 99(5), 24-31 (1996).

Received 30 May 1996; accepted 31 October 1996