Edward Wenk, Jr., keeping tabs on technology


John Kettle interviews Edward Wenk, Jr., keeping tabs on technology

Following many years in the politics of science and technology in Washington DC, Edward Wenk, Jr., started up the Program in the Social Management of Technology at the University of Washington, Seattle, one of the few of its kind. In this interview, he outlines some principles that have guided (and emerged from) his thinking.

Dr. Edward Wenk, Jr., took his engineering degrees at Johns Hopkins University, Baltimore, Maryland, and Harvard University, Cambridge, Massachusetts. He spent 16 years with the US Navy as a specialist in submarine structural design ("where I first understood many of the implications of risk") and then, in 1959, immediately after the NASA space programme was established, became the first science advisor to the US Congress. In 1961, he moved to the White House as an assistant to President Kennedy's science advisor, then alternated between Congress and the White House, serving under Presidents Johnson and Nixon, before moving to the west coast and the Program in the Social Management of Technology. The programme is now, unhappily, threatened by the financial squeeze in the state funding for the university. Address: Program in the Social Management of Technology, University of Washington, 316 Guggenheim FS-15, Seattle, Washington 98195, USA. John Kettle is the publisher and editor of The FutureLetter, 2249 Queen Street East, Toronto, Ontario, Canada M4E 1G1.

The major threat I see in our future comes from the possibility of a nuclear exchange. This strikes me as being at the head of an inventory of threats to survival, because it is unprecedented. The scale of disaster is global; the number of innocent bystanders who would be involved is staggering; and finally, there is the great tragedy that the natural evolution of the human species - and I would include even up to its

FUTURES February 1983

spiritual evolution - might be, if not snuffed out, at least so radically altered that it might be a long, long time before we would see a return of human civilization as we know it today.

We have been successful in not having a nuclear exchange since the first demonstration of the potency of these weapons in the US bombing of Japan in World War II. The fact that there has not been an accidental or deliberate use of these weapons subsequently is a tribute both to the political leadership that controls these weapons, and to the technological capabilities that are indeed designed to minimize the possibility of accidents. But I cannot see this life on the edge of a volcano continuing indefinitely, without the possibility, or even the probability, of an exchange.

What has happened in the last 35 years is a near-perfect example of the social management of technology. I regard the nuclear test ban as one of the few global agreements that merits definition as technology assessment. I mean that it was an appraisal of the state of threat that existed at the time, a catalogue of alternatives, the choice of one - the nuclear test ban - and then the negotiation of international agreement. The cynics will say that there were other reasons besides the concern for possible genetic hazards resulting from the fallout that was accumulating in the atmosphere. But in my view, it still reveals itself as an almost perfect model of what can be done internationally.

The public clientele

I would say that we now need a different approach to the social management of this technology, in several regards. First, there has been an abdication of the


decision process by those who would be most intimately affected by it, and who ought to have a say. In a democracy, the public itself is a clientele that should put on its own agenda a study of these major threats to survival; and this nuclear possibility we are discussing is not the only one that I feel deserves this type of attention. People have to care, and we don't have a very convincing history of them acting as though they cared. An awful lot of people feel, first of all, that the subject is too technically complicated for them to get a grasp of the implications, and therefore they have to leave it to the experts. Second, they feel impotent in the face of the political machinery, in that decisions of this scope seem to be made at such high levels of government that communication to them through our social strata seems unlikely.

In the scale of concerns I have about the future, I define survival as being more than biological, and this was highlighted in the title of my book, Margins for Survival (Pergamon Press, 1978). My inventory of global dangers and threats to survival includes not only the dangers of nuclear warfare, but the dangers of widespread famine and the resulting disorder, global environmental poisoning, large-scale local environmental poisoning, inadvertent climate modification, urban deterioration, resource depletion, global disorder from increasing economic disparity among nations, institutional and policy system failures, loss of freedom, and pathological shifts in values. To me survival means being alive and free. I see threats to freedom as one of the serious threats down the line which, like all of these that we are referring to, have a high technological content. That indeed is how I entered the whole question of dealing with the future, looking at the mix of technology and politics with regard to the beneficial effects for society as a whole, but also the threats to survival.

When we talk about the overwhelming size of many technologies, what we are really talking about is the size of the institutions that are involved in technology. When we talk about the multinational corporations, the global communication networks, the major oil companies in the US with their connections world-wide, what we are often referring to is really a large and potentially unmanageable social system. Now, the question, "Are these unmanageable?" boils down to one simple question (maybe oversimple), and that is, "Are these systems accountable?" And if they are, or can be made so, then it seems to me that the question of size is not itself serious.

Contrasting society and industry

I want to draw a distinction between the social management of technology and the industrial management of technology. The concept of industrial management or industrial engineering is widely understood. It is taught in universities, in schools of business and engineering, so that we have an understanding of industrial management as being focused on decision-making by the industrial manager. The social management of technology is an analogue to industrial management, but on a much broader scale, dealing with overarching questions; more than that, it arises in all institutions and sectors of our society, where the ultimate products of technology, goods and services, are tested for the benefits derived, but also for the disbenefits. And this is where the whole notion of the social management of technology seems to me important. The management of what might be alluded to as large technology should not be left only in the hands of the industrial managers or technologists.

That means people will have to change both their attitudes and their aptitudes. Put another way, being a citizen in a democracy in the 21st century is going to be tougher, more


demanding, than it has been previously. People are going to have to do their homework; they are also going to have to be committed to political action. I don't mean radical action, but the use of all communication channels in our very complex political system to make their voices heard. And these should be the voices, to every extent possible, of informed people.

Today technology has become more political than ever before, though it hasn't been widely recognized. Because of the ubiquitous use and influence of technology, the question of who wins and who loses becomes more potent. Settling that question of who wins and who loses is a political matter, at least in our form of government (and, I think, in most others). And therefore, questions of the goals of technology, of strategies, of priorities, of whether money is spent on eg military hardware or civilian-related research and development, these are political questions and they are in the main settled by government.

To put this another way, I have the view that the major decisions on technology are no longer made in the marketplace as they may have once been. They are made by government. The major decisions are made by government, and this is because the government is involved in different ways in every technology: in providing incentives, in regulating it, in paying for social overhead (as eg in funding R and D, training technicians, etc), and finally as a customer for technology, as in the case of the military. The government, therefore, stands at the crossroads of these technological decisions. In trying to understand what is going on, I have referred to the US President as 'systems manager'.

What might happen. . .?

The one thing that I would claim is missing in the system today is an improved technique of looking ahead. Technology assessment is shorthand for that notion. I don't want to overuse the term, but it seems to be a convenient label for a systematic way of asking key questions, starting with "What might happen if. . .?" One then asks, "What are the circumstances that affect the use of technology?", and this means using not only facts but imagination. Everybody today is involved in what might be called technological delivery systems. Not just the mechanic at the bench, not just the industrial manager, but the politicians who necessarily make decisions that affect that technology, and ultimately the person in the street (who is not only a voter but also in most cases the ultimate recipient of that technological artifact).

One might suppose in a dangerous situation, such as the world is in today, that our heightened sensitivity to the futures we do not want would lead to more attention being paid to the future. One doesn't see this happening. It is partly a cultural problem. In our culture, there is a dedication to instant coffee and instant music and the instant turn-on and the whole notion of "me now". The swift pay-off on investment, and the capitulation to political expediency for reasons of re-election, are two of what I have identified as about 15 pathologies of the short run. Put another way, there are some very significant and powerful currents in our society today that predispose people to be concerned about what is going to happen tomorrow morning, rather than looking ahead, and therefore there isn't the appetite for or the commitment to looking at emerging technologies or large systems that might be threatening. Until our society builds this more fully into the culture, I'm afraid that we are not going to have the necessary instrumentalities to do it, even if a few have the vision.


Technology assessment

Technology assessment was born in 1965 as a political tool of the US Congress. By circumstances, I was a midwife in that process. That was a long time ago, and I have to say that I have not seen this field mature, either professionally or from the point of view of an appetite for the use of these tools by people in political office. The Congress established an Office of Technology Assessment in 1972, and it went through about seven years of turmoil before it finally began to fulfill a few of the objectives of its initial advocates. The reason it failed is partly internal leadership, but partly that the Congress itself wanted this particular office (which was committed to looking ahead) to deal only with Monday morning's problems. So it was distorted.

At the very pinnacle of government, there are very few navigation aids close at hand that can provide signals of storms ahead, either of threatening technologies or threatening institutions. There are a lot of other signals in the system, very loud and demanding ones, but they all have self-interest at their heart. What is needed is an independent appraisal or early warning of what might lie ahead.

In the Program in the Social Management of Technology at the University of Washington, Seattle, there is a heavy emphasis on the theatre of decision-making and on scenarios of the

behaviour of participants: the behaviour of presidents, of congressmen, of lobbies, of citizen activist groups, of industrial managers, and a recognition that one has to take all of this into account in trying to decipher the code, so to speak, in understanding really how the system works.

Over the years, I have developed some grasp of what I think is the way it works, by using case studies. The making of a decision is the flash point when all of these different parties are at least temporarily connected together, in some coherent way. I have used the case technique with about 70 major decisions, all involving technology, some involving high technology like the US space programme, some which might be thought of as low technology (as in the building of highways). Seventy different cases, and what I have been looking for is commonality. Even though the technologies are very different, the system is the same, and therefore what I am looking for are indicators of the behaviour of this system. The belief, of course, is that, if we can understand the system better, then we can improve the probability that future decisions will give us the desired results.
