Half a century of progress in science and technology

Trevor I. Williams

The past half century has seen remarkable progress in science and technology. New fields have been opened up which did not even exist before the war, and new concepts have had to be devised and assimilated to describe new phenomena. This article reviews some of the salient developments.

In some ways, this would be an appropriate occasion on which to review the progress of science as a whole over the past half century, especially as the Second World War was a very definite watershed. However, this is already well documented and is very adequately reflected in the review articles - some three thousand in all - which have appeared in Endeavour over the years. It is perhaps more interesting and instructive to consider only those lines of development which have no pre-war counterparts. As a preliminary, however, a brief caveat must be entered. Rather few new discoveries are without antecedents, however tenuous, which the diligent historian can uncover. Thus while the Atomic Age can logically be dated from the first test explosion in New Mexico on 16 July 1945, Frederick Soddy - father of isotopes - repeatedly urged in the 1920s that radioactivity might be the source of 'power beyond the dreams of even the scientific novelist'. Indeed, he might have been listened to more attentively had he not been so vociferous a prophet. Again, the Computer Age is, for all practical purposes, a post-war event, even though calculators have an ancestry that can be traced back as far as Blaise Pascal's adding machine of 1642, or even earlier to the oriental abacus. Arguably, the computer was as revolutionary in the 20th century as the steam engine was in the 19th. Additionally, we are looking at a period of time long enough to allow second thoughts to emerge. Initially, for example, tetraethyl lead as a fuel additive and a range of synthetic pesticides and herbicides seemed to have immense long-term benefits: today, both are in eclipse.

Trevor I. Williams, M.A., D.Phil., F.R.S.C., F.R.Hist.S. After graduating in chemistry in 1942, he did postgraduate research in antibiotics in the Sir William Dunn School of Pathology, Oxford, 1942-45. He was appointed Deputy Editor of Endeavour in 1945 and Editor in 1954. He has a particular interest in the history of science and technology.

The changing pattern of science

Before proceeding to the particular, some generalisations are appropriate. Science, like most human activities, does not advance uniformly on all fronts. For most of the 19th century chemistry was in the forefront but by its end physics - especially at the atomic level - began to yield the most dramatic results, leading on to the harnessing of atomic energy, and tremendous advances in particle physics and our understanding of the nature of matter and the origin of the Universe. Meanwhile, the biological sciences had been relatively quiescent, though the work of Darwin and the early geneticists had begun to offer a logical explanation of the observed facts of inheritance. Post-war, however, biology suddenly surged ahead on a new course with the advent of molecular biology. When biological processes began to be studied at the molecular level the mechanism of heredity began to become clear and the causes of some hereditary diseases were identified as chromosomal defects. Genetic engineering emerged as a means of imparting to living organisms new characteristics - in biosynthesis in particular - which they do not naturally possess. Such advances have had far-reaching consequences: the organization of the scientific world of today bears little resemblance to that of fifty years ago. Then, funds allocated to research were negligible in terms of national budgets. Today, research and development in the western nations is a significant fraction of gross national product: a few individual projects - such as the Hubble space telescope and the superconducting supercollider - alone call for billions of pounds. Elaborate machinery has had to be established to allocate funds and scientists have to compete fiercely for a share of them. The days of altruism are past. The intrinsic interest of the research has to be weighed against potentially useful results - a process which purists decry. Increasingly, major projects have to be organized on the basis of international co-operation - as in the case of the European Molecular Biology Laboratory in Germany;
CERN in Switzerland; and the Joint European Torus (JET) project in the U.K. The International Geophysical Year (1957/8) was an example of a massive short-term international collaboration. Another important development is in the attitude of scientists to making profitable use of their discoveries. A century and a half ago Sir Joseph Banks, one of the most illustrious presidents of the Royal Society, ruled - in a thinly veiled reference to W. H. Wollaston, who was making money from his malleable platinum - that 'the keeping of secrets among men of science is not the custom here, and those who enter into it cannot be considered as holding the same situation in the scientific world as those who are open and communicative...'. In 1941 the then President of the Royal Society, Sir Henry Dale, took exactly the same attitude - as did the Secretary of the Medical Research Council - when E. B. Chain urged on them the desirability of protecting by patents the extraction and use of penicillin. Today, in a totally different social and economic climate, it would be regarded as a dereliction of duty if universities failed to protect their commercial interests in respect of new discoveries. Very large sums indeed can be involved. At the lower level of the individual laboratory, too, the scene has changed dramatically. Where once measurements had to be made laboriously by individual research workers or skilled technicians, a whole range of new techniques has emerged, operating on the 'black box' principle without the need to know how they work. Research, like manufacturing processes, has become highly automated and the rate at which work can be done enormously accelerated. The history of penicillin strikingly illustrates this point. During the war there was naturally much interest in the possibility of preparing penicillin synthetically rather than in huge fermentation plants, and this demanded elucidation of its structure. With the methods then available this was a task for an Anglo-American team of 430 scientists working in 32 groups for some two
years. Today, given a few micrograms of pure material, the structure could be elucidated within days. The role of technicians in research has changed correspondingly: fewer are required but they need to be highly skilled.

At the personal level, too, the last fifty years have seen a conspicuous change, which is reflected in the scientific journals. Increasingly, original research came to be published in commercial journals, posing a challenge to those of the learned societies, for which such journals are an important source of revenue. Whereas previously papers were submitted largely by individuals or quite small groups, today considerable numbers of collaborators are commonly involved, reflecting the increasingly interdisciplinary nature of science. The nature of journals, too, has changed dramatically. In the 1950s a universal topic of discussion among scientists of all disciplines was the problem of keeping abreast of new developments in their own field: the complaint was that if they located and read everything relevant that was published they would have no time for their own research and, in academic life, teaching. The problem was aggravated by the fact that professional advancement in the academic world depended to a considerable extent on an individual's record of publication, inevitably leading to a certain amount of repetition. An additional factor was that of language. While English was already the most widely used language it did not then hold the commanding position it does today. The advent of computer-based information systems greatly simplified the storage and retrieval of information - not only in science but in all fields. More recently, considerable progress has been made in computer-based translation systems, though these still have a long way to go to achieve unambiguous interpretations. Conventional means of written communication have also improved. It is hard now to imagine a world without photocopiers or, more recently, FAX facilities, freely available. Yet the first commercial Xerox photocopier did not appear until 1947.

With the internationalization of science, it has inevitably assumed an increasingly important political dimension. National prestige has become an important factor in funding major projects. The dramatic American Apollo programme of the 1960s is a case in point. How far the enormous expense of landing men on the Moon - as distinct from using unmanned space probes as the Russians did - was justified in terms of the scientific results obtained is still a matter for controversy. At the same time, one cannot ignore the boost to public morale in the U.S.A., notably depressed after some spectacular firsts by Soviet space probes, such as the first sight of the dark side of the Moon.

Today, a manned mission to Mars is still in the U.S. programme, though prudently without a target date. There are also plans, still unscheduled, to establish a permanently manned base on the Moon.

After the war, its dramatic conclusion with the use of the atomic bomb, together with the release of much detailed information about other major developments such as radar and penicillin, ensured that science was held in high public esteem. If it could achieve so much in time of war, how much might it not contribute to rebuilding a better world? For a time hopes were high, but gradually disillusionment set in. What science could achieve and what was achievable within the constraints of hard everyday economics were two very different things. Political decisions had to be made which excited much public controversy. Decisions advantageous to one group could be disastrous for another, especially in disrupting the pattern of employment or local amenity. Science, it was feared, would increase the number of jobless: in fact, tens of millions now find employment in industries which did not exist before the war. In the 1960s one major factor - the environment - began to emerge in this area as being of paramount importance and public interest. Just when disillusionment set in is not readily discernible, but in the public mind it is closely identified with the publication of Rachel Carson's Silent Spring in 1962. In this, she drew attention to the far-reaching consequences of the widespread use of new agrochemicals, notably herbicides and insecticides. These could disrupt ecologically significant food chains, with disastrous effects on wildlife and, in some cases, pose a serious threat to human health. Equally, certain new drugs, whose beneficial effects had not been challenged, began to be identified with dangerous side-effects. The most publicized example was thalidomide, a seemingly harmless and effective sedative which proved capable of producing serious congenital defects in the children of mothers who had taken it during pregnancy. Worldwide, some 10 000 children were involved. The overall consequence was a radical tightening up of the requirements to be fulfilled before new products could be licensed. It may be that this has been carried too far and begun to be counterproductive: widespread public demand for no-risk products is not realistic. Chemical manufacturers face such enormous costs in conforming to current statutory requirements, which may vary from country to country, that potentially useful new products are not being developed as fast as they might be. Moreover, even when all the statutory requirements are met, there is always the possibility that wholly
unsuspected adverse effects may emerge. When thalidomide was introduced in the 1950s there was no requirement to test it as a teratogen and this property came as a complete surprise. Even penicillin - the miracle drug of the 1940s which at first seemed to have no vices at all - was eventually found to have adverse side-effects in some individuals. It is interesting to reflect what the consequences would have been had it been obliged to conform to modern statutory requirements before being released for clinical use. Certainly it would not have been available in quantities sufficient to treat all casualties after the D-day landings in France in 1944, the prime reason for the crash programme of production initiated in the U.S.A. after Pearl Harbour.

Today, the environmental consequences of the application of science dominate all others, in particular in respect of changes in the upper atmosphere leading to long-term climatic changes. While global warming is certainly a hazard not to be ignored, it is clear that it is governed by an interaction of events far more complex than was originally assumed. Apart from other considerations, it provides an interesting example of how science feeds on itself. The present intense programme of atmospheric and oceanic monitoring, designed to reveal the true nature and extent of the problem, is possible only because of major developments in other fields of science and technology, in particular those that make possible the extensive use of observational satellites.

Having thus briefly surveyed in a general way how the scope and organization of science has changed worldwide over the past half century, it is appropriate to conclude by recalling some of the major events which have stood out from the evolutionary process as a whole.

Physics: big science, little science

In physics there was intense activity at both ends of the research spectrum, in the microworld of atomic particles and the macroworld of the ultimate Universe. At mid-century the structure of atoms seemed to be satisfactorily contained within the Rutherford-Bohr system consisting of protons, electrons, and neutrons: the positron had also been identified but did not appear to be a universal building brick. From mid-century onwards, however, it became apparent, as the number of new particles identified grew rapidly, that this simplistic view was untenable. Quarks and leptons have emerged as two basic types of particles, subsuming all others. Their properties and interactions have to conform to certain basic conservations and symmetries. Strong, weak, and electromagnetic forces -
'gauge forces' - have been accommodated within the last decade in what has become known as the 'standard' model depicting the behaviour of particles. The fourth basic force, gravity, is still difficult to fit into the pattern but, as shown in an article elsewhere in this issue, the experimental detection of gravity waves now seems within sight, something believed impossible 30 years ago. A complex esoteric vocabulary has evolved as particle physicists sought to give verbal identity to essentially abstract concepts. Thus 'charm' was invoked to characterize 'quarks', of which six types (and an equal number of antiquarks) are now postulated. Unexpectedly, quark itself is an invented word not of physicists, but of the Irish novelist James Joyce in his Finnegans Wake: 'Three quarks for Muster Mark'. At the experimental level a major advance was the invention of the bubble-chamber as a far more effective alternative to the cloud-chamber for detecting particles. Without this it would have been impossible to realize the potential of the enormously powerful accelerators - the ultimate manifestation of Big Science - which are the main source of the new particles, many with almost infinitesimally brief lives.

At the other end of the scale, science looked outward to the extreme limits of the Universe, and here new windows opened. For more than three hundred years the main observational instrument of astronomers had been the optical telescope, built with ever increasing apertures to increase light-gathering power. The six-metre reflecting telescope completed at Mt Semirodriki in the Caucasus in 1976 could in theory detect the light of a candle at 24 000 km. New design techniques have made it possible to increase the effective aperture by using an array of smaller units.

But information from extra-terrestrial sources is conveyed by radiation other than that within the narrow band of visible light. The adjacent infra-red, for example, has revealed further sources of radiation and thrown new light on physical processes taking place deep in space. The major new field that developed, however, is that of radioastronomy, stemming from the discovery in 1932 of a source of radio waves in the centre of the Galaxy. Not until after the war, however, was this discovery followed up intensively, beginning with the 76 m steerable bowl of the radio telescope completed in 1957 at Jodrell Bank, near Manchester. It was diverted from its original purpose by being used to track the series of satellites that followed the launch of Sputnik 1 in the same year. Since then radio telescopes have identified many objects in space not revealed by optical telescopes - by 1965 the catalogue had reached 50 000. These included quasars - though not all of these emit radio waves - and pulsars: the new vocabulary grew steadily, as it did in atomic physics. From 1975 to 1982 the satellite COS-B unfalteringly explored space through the medium of gamma radiation, discovering new stars unknown at other wavelengths.

The advent of satellites able to carry an array of telescopes opened up further opportunities. Observations could be extended deeper into space and with increasing accuracy. Their justification was that they made it possible to deduce something about the behaviour of atomic particles under conditions of temperature and pressure not conceivably attainable in the laboratory, so providing an unexpected link between the micro and the macroworld. It also threw light on the basic question of the origin of the Universe, a subject dear to the heart of every cosmologist. Initially, two theories held the stage. One was the steady-state theory, advanced in the 1950s, according to which the Universe had no beginning and no end. Its expansion, which Hubble had demonstrated in 1929, was accepted but it was postulated that new matter was created from empty space to maintain an overall uniform density. Its rival was the Big Bang theory, according to which a point source exploded in a fraction of a second some 10⁹ years ago and matter was thrown outwards in all directions. Particles and anti-particles annihilated each other, but a sufficiency of protons, electrons, and neutrons survived to form the expanding Universe as we now know it. This theory gained some experimental support in the 1960s because it predicted that some of the radiation initially released should have permeated the Universe and then progressively cooled. Moreover, it should have cooled to a temperature of about 5° above absolute zero. In 1964 radiation with a wavelength of 7 cm was discovered to be reaching the Earth equally from all directions: this corresponds to the radiation emitted by a black body at 3.5 K. Today, the Big Bang theory is generally accepted, though undeniably it presents difficult conceptual problems for non-cosmologists. It postulates that when the Universe began, all matter - and indeed all space - was concentrated in an infinitesimal volume. Comparable situations now apparently occur in Black Holes, collapsed stars so dense that no matter or radiation can escape from them. In such 'singularities' the ordinary laws of classical physics cannot apply. There is thus a challenge to the basic concept - first explicitly stated by Avicenna a thousand years ago - that the laws of nature are immutable, regardless of both time and place. A disturbing thought!

Progress in physics at the atomic level paid many practical dividends, of which
the most important was the transistor, in the late 1940s. This made possible an almost incredible miniaturisation of all kinds of electronic circuits - in computers, radio, television, and many domestic appliances, among others. The effect on the development of computers has been particularly significant. Expectations that the way forward would be through mainframe computers serving a network of substations were not fulfilled. The capacity and versatility of personal computers, and a remarkable lowering in cost, have completely changed the picture. Curiously enough, the same false assumption was made in television. The giant-screen Eidophor assumed that the public would watch television en masse, as they did the cinema. However, it came into its own in NASA's space control rooms, where large-scale monitoring screens were essential.

The maser appeared in 1954 and the laser in 1960. Once described as an answer looking for a problem, the laser has since found many important applications. Not the least of these is in time measurement: a laser clock is accurate to one second in 50 million years. By contrast, the then advanced quartz crystal clocks installed at Greenwich Observatory in 1945 were accurate only to one second in 30 years. By providing an intense beam of strictly monochromatic light the laser revolutionized the scope of holography, invented in 1947.

Particularly because of the need for very powerful electromagnets for particle accelerators and other technical purposes, the post-war years saw new interest in superconductors, discovered as long ago as 1911. Their practical limitation was that they were effective only at temperatures very near to absolute zero. In 1986 a sensation was caused by the announcement of superconductors effective above 30 K, and this was soon raised to 90 K. Such temperatures, above the boiling-point of liquid nitrogen (77 K), are relatively easily achieved and maintained. However, the dream of a room-temperature superconductor has yet to be realised.

Almost from the first achievement of energy release by atomic fission processes there was great interest in the possibility of achieving the same result more advantageously by fusion. In the 1980s sensational reports of 'fusion in a test-tube' proved unfounded, but very recently the Joint European Torus (JET) project in the U.K. has achieved a two-second pulse of energy release from the fusion of deuterium and tritium. However, effective power generation by this means cannot be feasible until the next century.
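
By way of illustration, the clock accuracies quoted above can be compared as fractional errors; the figures below are only a rough conversion of the stated rates (assuming about 3.15 × 10⁷ seconds in a year), not values given in the original:

\[
\frac{1\ \text{s}}{5\times 10^{7}\ \text{yr}\times 3.15\times 10^{7}\ \text{s/yr}} \approx 6\times 10^{-16}
\quad\text{(laser clock)},
\qquad
\frac{1\ \text{s}}{30\ \text{yr}\times 3.15\times 10^{7}\ \text{s/yr}} \approx 1\times 10^{-9}
\quad\text{(1945 quartz clock)},
\]

an improvement of roughly six orders of magnitude in half a century.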

The chemical world

Advances in physics were closely linked with those in chemistry, which increasingly relied on a wide range of new
techniques and corresponding instrumentation. These included infra-red and X-ray spectroscopy, mass spectroscopy, nuclear magnetic resonance, electron-spin resonance, X-ray crystallography, and flash photolysis. The availability of a range of artificial radioactive isotopes opened up new possibilities for investigating reaction mechanisms. The Periodic Table, which since 1868 had seemed to set a limit to the chemical world, as Ptolemy's map did to the ancient world, was shown to be incomplete by the discovery of many transuranium elements, beginning with plutonium. While the latter was prepared in substantial quantities, for industrial and military use, others - such as elements 108 and 109 - first became known literally as only a few atoms. Even so, a surprising amount of information could be gleaned about their chemical properties. Another surprise lay with the inert gases of the atmosphere, which similarly had had to be incorporated in the Periodic Table after their discovery in the 1890s. These gases were, of course, so named because they seemed incapable of chemical reaction, a property seemingly in accordance with their electronic configuration. Nevertheless, in 1962 a whole range of inert gas compounds was announced, prepared by quite simple means, beginning with those of xenon. It is a measure of Mendeleeff's achievement that his Table could accommodate two unsuspected groups of elements so easily.

The plastics industry was well established before the war, with a wide range of thermosetting and thermoplastic compounds. Their manufacture was, however, on a largely empirical basis and it was not until after the war that polymer chemistry began to be put on a sound theoretical basis. Concurrently the industry expanded and new products - both fibre and solid - were introduced. One of the most unexpected was polythene, discovered just before the war, whose role seemed destined to be no more than that of a specialist electrical insulator produced in small quantities. Instead it blossomed - with polypropylene - into a constructional material produced in millions of tonnes. Nearly all polymers contain carbon chains - the ultimate being carbon fibre - but an exception is the silicones, organosilicon compounds, which are inert, water-repellent, and good electrical insulators.

The biological sciences

While the biological and medical sciences advanced in their own right, they owed much to the adoption of contemporary advances in the physical sciences. Molecular biology, for example, was much indebted to the use of various forms of chromatography and of the ultracentrifuge to purify materials, and of X-ray crystallography to elucidate
their structures. Milestones on this new road were research on bacterial genetics in the early 1940s; the identification ten years later of DNA as the carrier of genetic information and the elucidation of its double-helix structure; and the discovery of transfer RNA. The 1970s saw the discovery of reverse transcriptase, for transcribing RNA into DNA - until then supposed not to be possible - and the production of the first monoclonal antibodies. The 1990s saw the first constructive move in a massive international collaboration - the Genome Project - to map human chromosomes in their entirety. This points the way to clearer understanding, and potentially to effective treatment, of a variety of diseases resulting from inborn genetic defects. The project has yet to be adequately funded, and there is some controversy as to the immediate need to map certain areas of apparently minor significance, but there seems little doubt of its ultimate completion. Nevertheless, opposition to it has been voiced on ethical grounds.

The success of penicillin triggered off a search for other antibiotics suitable for clinical use, but out of hundreds discovered only a handful met the strict criteria imposed. Outstanding among these were the cephalosporins, discovered in the 1950s. It is argued that one reason for Fleming's failure to pursue penicillin with the energy it was subsequently shown to deserve was that at that pre-sulphonamide time the medical profession pinned its hopes on immunisation techniques. These certainly continued to pay a good dividend, as with the development of anti-polio vaccines in the 1950s and of the antiviral interferons. Around the same time recognition of slow virus infections - taking perhaps a year to manifest themselves - opened up new understanding of certain viral diseases.

From the earliest days of medicine diagnosis has been of prime importance, and since the beginning of this century increasing reliance has been placed on physical methods, such as X-rays, electrocardiography, and electroencephalography. To these, four important new techniques were added in the 1960s and early 1970s: thermography; computerised axial tomography (CAT); ultrasonic scanning; and nuclear magnetic resonance (NMR) scanning.

One consequence of the cheap mass treatment of many major diseases resulting from the advent of drugs such as antibiotics and synthetic antimalarials, coupled with the availability of insecticides to control insect vectors of disease, was an alarming increase in world population, with inadequate resources to sustain it. The Green Revolution produced more food - though not necessarily available in the areas
needing it most - and, on the other hand, improved techniques of contraception provided a brake. Of outstanding significance here were the oral contraceptives, introduced in the mid-1950s. In the western world these had far-reaching social effects. But the new sexual freedom to which oral contraception contributed did not long remain unclouded. In the early 1980s the appearance of AIDS (Acquired Immune Deficiency Syndrome) raised a major international health problem which is still increasing and to which there is no satisfactory answer. The origin of the causative HIV virus is unknown. One effect was to bring the old-fashioned condom back into favour; interesting, because it was thus restored to its original 18th-century use of providing protection against venereal disease. AIDS is a social as well as a medical problem and governments have had to allocate very large sums not only to research and treatment but to propaganda explicitly defining the nature of the risk and how to minimize it.

The human body can survive a remarkable degree of mutilation but certain organs are essential for life - notably lungs, kidneys, brain, and heart. When these organs fail a variety of techniques have been devised to provide alternative life support. An artificial kidney was devised in 1943, primarily to give short-term relief until healing occurred or remedial therapy could be applied. Since 1964, however, home dialysis machines have made it possible for patients with chronic conditions to survive indefinitely by regular courses of treatment. At about the same time pacemakers became available to control automatically irregularities in the beat of the heart. Until the 1960s an alternative approach, the transplantation of healthy organs to replace those made inoperative by disease, made little progress. The main reason was that the body's normal defence mechanisms cause it to reject foreign implants, much as it would marshal its defences against invasive bacteria. However, research on the nature of immune reactions has revolutionized the situation and since the 1960s heart, kidney, and lung transplants have been carried out with an increasing rate of success. Indeed, kidney transplants have now reached the level of routine operations and the main problem is to maintain a sufficient supply of donor organs. How these can be acquired has raised important ethical considerations.

The scientific periphery

Progress in medicine was largely dependent on a multidisciplinary approach and this proved of great value also in many peripheral sciences. Archaeology, for example, profited greatly from the development of new dating techniques
such as radiocarbon, palaeomagnetic, and isotope dating (a simple radiocarbon calculation is sketched at the end of this section). In geophysics, the use of laser techniques and satellites made it possible to measure the exceedingly slow rate of continental drift with great precision, giving new credence to the controversial theory proposed as early as 1912. Improved techniques in seismology have given new knowledge of the structure of the earth's crust. In meteorology the range and accuracy of weather forecasting have been enormously improved with the benefit of satellite observation and powerful computers for the swift processing of a huge volume of recorded data. Satellites have also opened up new possibilities in cartography and in the mapping of natural resources, both vegetable and mineral.

The breadth and depth of materials science increased enormously as quite new materials - metals and, increasingly, polymers - came into general use and there was greater understanding of the relationship between structure and properties.
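
By way of illustration of the radiocarbon method mentioned above, the age of a sample follows from the first-order decay law; the sample fraction below is hypothetical, chosen only to show the arithmetic, and the carbon-14 half-life of about 5730 years is the conventionally used value:

\[
t \;=\; \frac{t_{1/2}}{\ln 2}\,\ln\frac{N_{0}}{N} \;=\; \frac{5730\ \text{yr}}{0.693}\,\ln\frac{1}{0.25} \;\approx\; 11\,460\ \text{yr},
\]

so a sample retaining a quarter of its original carbon-14 is about two half-lives old.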

The way ahead

If this brief review of some of the salient developments that are new, or very largely new, to the post-war world shows anything, it shows the hazards of prophecy: Heisenberg's uncertainty principle is as applicable in the world of human affairs as it is in particle physics. Few of them could have been foreseen by those peering into a crystal ball in 1940. New discoveries were not only of intrinsic interest but often their ripples had far-reaching and unexpected consequences. Some, like computers, became all-pervasive: others breathed new life into particular fields which had become stagnant through lack of techniques to advance them. What does seem certain, however, is that economic constraints will become increasingly important: what can be done and what is affordable are two very different things, and this is something that scientists and laymen alike must accept.

Nothing more emphatically epitomises the element of the unexpected than our cover picture and first article. When our first issue was published in 1942 the German V-2 rockets - the first effective demonstration of long-range rocketry - still lay two years in the future. That the exploration of space and landings on the Moon would be achieved within a quarter of a century would have seemed speculative indeed. Still less predictable was the U.S. shuttle spacecraft, taking men into space for long periods and in considerable comfort. We are proud that the latest shuttle derives its name, as does our journal, from the bark Endeavour in which Captain Cook undertook the world's first voyage of scientific exploration. Yet we must look even further ahead, for this is probably the last of the shuttles, to be succeeded by a new generation of hypersonic spacecraft which will take off and land like conventional aircraft.

After recording his observation of the transit of Venus in Tahiti on 3 June 1769, Cook recorded in his log: 'we very distinctly saw an Atmosphere or dusky shade around the body of the Planet which very much disturbed the times of the Contacts'. Little more than two centuries later space probes directly confirmed the existence of such an atmosphere, consisting largely of carbon dioxide. Currently, the Magellan probe has kept the surface of the planet under constant surveillance during several thousand orbits and the topography of Venus is now as well known as that of the Earth.
