Book reviews

The Neglect of Experiment. By Allan Franklin. Pp. 280. Cambridge University Press. 1987. £30.00 ($42.50).
As its title suggests, the book aims to end the neglect of one of the most important yet least understood aspects of science. Franklin republishes several valuable case studies and revised versions of philosophical papers on the role of experiment in theory choice and the rationality of beliefs about observations. He aims to strike a blow against the textbook mythology of experiment, which has led most non-scientists to adopt a naive view of experimental practice. This book is meant to answer those who 'deny, or minimize, the role that experiment plays in theory choice' and who 'cast doubt on the validity of experimental results' (p. 244). His studies show how properly conducted experiments enable scientists to formulate rationally grounded beliefs about the world. Systematic experimentation does, of course, explain scientists' confidence in the outcome of their experiments. Yet Franklin largely ignores the empirical findings of recent case studies of experimentation which challenge that same naive view of experimenters' competence and of the quality and significance of their work. These studies point to a more subtle view than Franklin's of the grounds for scientists' confidence in experiment and for our confidence in science. It is a pity that Franklin chose to neglect the 'sociological' challenge as unworthy of a considered reply. His real interest lies with philosophically proper relations of experiment to theory, not with the still neglected problem of how experimentalists actually engage the natural world.
D. C. Gooding

The Advancement of Science and its Burdens. By Gerald Holton. Pp. 351. Cambridge University Press. 1986. Hard cover £27.50 ($39.50), paperback £9.95 ($12.95).

The title of this book of collected essays was the subject of Holton's 1981 Jefferson Lecture. He sees the advancement of science as the fulfilment of two distinct, although closely related, programmes. First, there is what Holton calls the Newtonian Programme, the bringing of ever widening areas of natural phenomena under increasingly unified explanatory schemes; then there is what he calls the Baconian Programme of extending our technological control over natural processes for the supposed benefit of mankind. But there's the rub. As Holton puts it: 'The old (and sometimes blind) faith in the benign efficacy of technological progress has been waning, even yielding to its very opposite, the fear of an autonomous technology'.
The main burden of science is the inability of the ordinary man or woman to understand the increasingly esoteric arguments of pure theoretical science, and to join in informed debate about the effects on ordinary people's lives of the escalating pace of technological advance. Holton believes the solution must lie in the improvement of mass education. In one of the other essays in the book Holton quotes the alarming statistic that 50 per cent of North American seventeen-year-olds do not know how to calculate the area of a square! Apart from the themes of limiting technology and improving education, a third of the book is taken up with Einstein, essentially as a case study in the development of the Newtonian Programme. The book is accurate in its details and deals with important issues, but it is rather dully written, with too much repetition in different essays of the same salient points.
M. Redhead

Computer Software Applications in Chemistry. By Peter C. Jurs. Pp. 253. Wiley, Chichester. 1987. £33.75.

Peter Jurs provides a survey of the uses of computers in chemistry, with emphasis on statistical and numerical methods (curve fitting, regression analysis, matrix algebra, Monte Carlo methods) and on non-numerical data handling (structures, databases, artificial intelligence). He gives a good overview of these areas, starting at a student level and giving copious further references. The relevance to chemical problems is emphasised and appropriate illustrations are used, including example programs. The newcomer will be able to grasp the concepts and appreciate the use of the computer. Jurs has placed most emphasis on the less well documented applications, omitting the computing of theoretical structures, X-ray crystallography, and even the Fourier transform. The brief introductory section shows the problem of defining the target reader - is he a total novice or not? - and the 'graphics' chapter is not long enough to analyse the real importance of visual interaction, though it does provide a useful source of references. The claim that the book provides a 'comprehensible overview' is hardly justified, but it does earn its place on the library shelves; regrettably, it is unlikely to be recommended as a course book for student purchase at this price.
John S. Littler

Annual Review of Computer Science, Vol. 1, 1986. Edited by J. F. Traub, B. J. Grosz, B. W. Lampson and J. J. Nilsson. Pp. 459. Annual Reviews, Inc., Palo Alto. 1986. $39.00 (USA), $42.00 elsewhere.

This first Annual Review of Computer Science comprises fourteen chapter-length articles, mostly by well established North American academics. The topics addressed fall mainly in the areas of artificial intelligence, computer architecture, and the mathematical analysis of computation, an emphasis which the prospectus for Volume II of the series suggests will continue. A few of the articles are rather specialised, but most take a significant computer science problem domain, summarise the development of that area and the main results to date, and indicate current research issues. The ensuing literature citations are extensive and commendably up-to-date, and the standard of presentation and clarity of discussion consistently high. Such material constitutes a valuable information source for advanced undergraduate and PhD students and teaching staff (amongst others). Representative titles are 'Natural Language Interfaces', 'Advances in Compiler Technology', 'Dataflow Architectures', 'Knowledge Representation and Reasoning', and 'Information-based Complexity'. This book has strong content, is well produced, and by modern standards reasonably priced; even so, few individuals are likely to be sufficiently involved in enough of the topics covered to feel that purchase is justified. It is, however, the kind of widely useful contribution to the serious computing literature which no well-found technical library should exclude.
E. F. Elsworth

Introduction to Computer-assisted Experimentation. By Kenneth L. Ratzlaff. Pp. 438. Wiley, Chichester. 1987. £43.25.

Most computer users eventually move beyond word processing or numerical computation. It's more fun to link your PC with the real world of instruments and sensors - and more productive too. But commercial software insulates the user from computer jargon. Any attempt to interface requires the user to be on chatting terms with buses, interrupts, handshaking, and more. For anyone who doubts their ability to step into this unfamiliar world, Ratzlaff's book is a treasure. Filters, transducers, timers, ADCs, switches, encoders, even simplex optimisation, convolution techniques, and digital smoothing are all there. All are explained lucidly and succinctly. Now such a list of topics looks like dull fare for all but the most devoted computer buff. In reality, though, this book is comprehensive, well illustrated, up-to-date and, above all, readable. There is insufficient space for topics to be covered in great detail, and the mathematics is largely superficial, but by the end the reader has encountered - and will surely now understand - every important concept likely to be met in PC interfacing. The book also has an over-generous supply of typographical mistakes. However, these do little to distract one from the excellent text.