1959-1984  Data Processing's first editor looks back to 1959
25 years of Data Processing by PETER J FARMER
The first issue of Data Processing was published in January 1959 with the subtitle 'in business and industry'. It proclaimed itself to be 'a quarterly for top management and chief executives'. The idea was to explain these new machines, computers, to the nonspecialist, which in those days meant 99.99% of the population. In the sense that it was directed at business management, Data Processing presaged the future, because the first computers had been built as mathematical tools and, at that time, commercial applications were rare. In fact, only a few years before, IBM had declared that no role could be envisaged for the electronic computer in business; an idea soon discarded, after which IBM quickly established itself as the world's leading manufacturer of computers for the commercial user, a position it retains to this day.

Abstract: Twenty-five years ago, computers were first entering the commercial scene. Before that they had been envisaged solely as mathematical tools. Commercial computers in the 1960s were built with peripherals already attached, punched cards were the major form of input, and programming was usually carried out in machine code. The article describes the general state of computing in the 1960s and the role of Data Processing in communicating information about the new computers to the commercial and business world.

Keywords: data processing, computer manufacture, computer industry.

Peter Farmer was the first editor of Data Processing and is now a technical journalist.
© 1984 Butterworth & Co (Publishers) Ltd.

Early UK computer manufacturers

Twenty-five years ago the computer scene was very different from what it is today. Few computers existed. Those that did were expensive, not too reliable and difficult to program. But at that time every major electrical company in Britain was keen to get involved in the business. All were developing machines, many in conjunction with British universities, which had been quick to begin research into the new engineering techniques involved. Thirteen companies were listed in the first issue of Data Processing as suppliers of computers in the UK suitable for commercial work. These were: British Tabulating Machine Company, Burroughs, Elliott Brothers in association with National Cash, EMI, English Electric, Ferranti, IBM, Leo Computers, Metropolitan-Vickers, Powers-Samas, Univac, and Standard Telephones. The American-based companies have survived, but none of the British pioneer organizations has, except as predecessors of ICL.

The odd one out of the group was Leo Computers, the unlikely subsidiary of J Lyons & Co Ltd of tea shop fame. (Unfortunately, the delightful Lyons tea shops are also now a thing of the past.) After the war, Lyons management was actively examining all possible means of increasing administrative efficiency, and during this search heard of an exceptional new machine, the electronic digital computer. Enquiries made about it in America brought the response that they should visit Cambridge University. The upshot was Leo 1 (Lyons Electronic Office 1), built specifically for commercial data processing, its design being based on Cambridge University's EDSAC (Electronic Delay Storage Automatic Calculator). From its earliest days the computer industry has been the driving force behind the acronym as well as the information revolution. Many other UK universities and research institutes were associated with UK companies in mutual endeavours to build working computers: Birkbeck College with British Tabulating (Hollerith 1200); National Physical Laboratory with English Electric (Deuce); Manchester University with Ferranti (Mark 1 and Mercury).
Data input and output in the early 1960s

The feature that distinguished commercial from mathematical computers in the 1960s was the peripherals offered with them. No such things as standard interfaces existed; the computer had to be built with its peripherals already attached to the system. There were three ways of getting data into a computer: via a keyboard, punched paper tape or punched cards (of which there were two standards). Results could be printed out by rather slow line printers, or even slower character printers, or punched into paper tape or cards for later transcription offline on another item of machinery.

Cambridge University's EDSAC.

Punched cards were so important as basic universal business documents that the logo adopted for Data Processing incorporated one in its design, together with a symbolic piece of punched paper tape. For the sophisticated business of that era, punched cards for input and output data were perfectly acceptable, as they were familiar documents. Until the arrival of the computer, punched card machinery provided the primary means of manipulating data automatically. Short records that could be contained on individual cards could be selected from a file, collated with other records, sorted into sequence and then printed out for subsequent manual action. While the commercial computer was slowly evolving, the punched card equipment manufacturers were designing and building punched card calculators, machines of limited ability capable of performing a short calculation sequence involving the four basic arithmetical operations. For example, a price and a quantity read from a punched card could be multiplied together and the result punched back into the same card or, in some cases, another card. The sequence of operations needed was wired on a control panel inserted in the machine. The Powers-Samas EMP (electronic multiplying punch) was one such device, described in the first issue of Data Processing.

The punched card persisted in commercial computing for many years and is still in use today. It takes a long time to supplant a document that has achieved such universal acceptability. This is even reflected in company names. In 1959, International Computers and Tabulators Limited was created as the result of the first of many mergers in Britain to establish a viable national computer company. The tabulator in the title referred to that vital item of punched card machinery which translated holes in cards into print on paper. When it was perceived that the day of the punched card was coming to an end, the T was dropped and International Computers Limited (ICL) was born.

vol 26 no 2  march 1984
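The fixed calculation sequence wired on a calculator's control panel can be illustrated with a short modern sketch (this is an illustration, not period code; the 80-column layout and the field positions used here are invented for the example): a price and a quantity are read from fixed columns of a card image, multiplied, and the result is 'punched' back into spare columns of the same card.

```python
# Sketch of an EMP-style punched card calculation: fields occupy fixed
# card columns, the 'wired program' is a single multiply, and the result
# is punched back into spare columns of the same 80-column card.
# (The column assignments below are assumed for illustration only.)

CARD_WIDTH = 80
PRICE_COLS = slice(0, 5)      # price in pence, columns 1-5 (assumed)
QTY_COLS = slice(5, 9)        # quantity, columns 6-9 (assumed)
RESULT_COLS = slice(70, 80)   # result punched into columns 71-80

def emp_pass(card: str) -> str:
    """One pass through the calculator: read fields, multiply, punch result."""
    assert len(card) == CARD_WIDTH, "not an 80-column card"
    price = int(card[PRICE_COLS])
    qty = int(card[QTY_COLS])
    total = price * qty
    width = RESULT_COLS.stop - RESULT_COLS.start
    result = str(total).rjust(width, "0")
    # Return the same card image with the result columns overwritten.
    return card[:RESULT_COLS.start] + result + card[RESULT_COLS.stop:]

# A card carrying price 00125 (pence) and quantity 0012:
card = "001250012".ljust(CARD_WIDTH)
print(emp_pass(card)[70:])  # → 0000001500
```

As in the original machines, the 'program' is entirely fixed: changing the calculation meant rewiring the panel, which here corresponds to editing the column slices and the arithmetic line.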
Leo 1, built for commercial data processing.

Thermionic valves and reliability

To those on the periphery of computer technology in the 1950s it was always a surprise that computers ever worked at all. Thermionic valves were used in their electronic circuits, many hundreds of them in the larger machines. Thermionic valves are not particularly robust, and many noncomputer engineers were happy to explain that their reliability was such that, because some must fail every time the computer was switched on, the machine would be permanently unserviceable. But this dire calamity never happened, though users were wary of switching machines on and off, preferring to leave them on all the time. Computer designs based on the transistor began appearing in the late 1950s and early 1960s, and reliability ceased to be the problem it once was.
Advances in programming

Programming is another area in which great changes have occurred. In the 1960s programmers were an elite. Autocodes had been introduced, but in those heady pioneer days, to achieve maximum efficiency in execution, program sequences were prepared in
The changing style of IBM computing.
machine code. In fact, it was considered effete to program in anything other than machine code. In the first issue of Data Processing an article entitled 'Automatic programming' held out the hope that plain-language programming was a possibility, and could be developed to enable users to prepare computing routines themselves without the assistance of programmers.

Heuristic is another good computer-type word. 'A machine that learns from experience' was the title of an article in the second issue that described an automatic keyboard instructor, a teaching machine used to train card punch operators. A novel feature of this device was that as the pupil became more efficient at carrying out exercises, the tempo was increased automatically. Conversely, if difficulties were being encountered, the speed of presentation was reduced and visual cues given of the keys to be operated. In the later stages of tuition the trainer forced the pupil to concentrate on keying sequences with which particular difficulties were being experienced.

Many ideas were around in the 1950s about how computing techniques could be exploited. The problems were the eternal ones of converting bright ideas into practical facts. While the ideas were there 25 years ago, no one foresaw how cheap and universally accessible computing power would become by the 1980s. Playing games on computers in the 1950s, preferably chess, was an exercise performed to demonstrate the remarkable capabilities of the machine. That children in the 1980s would be playing games on home computers linked to TV sets was an idea that never entered anyone's head. In fact, it would have been viewed with disdain, as the trivialization of the capabilities of a unique machine. After all, not so many years had elapsed since the establishment of the first computing centre in Europe which, when envisaged, was thought capable of providing all the computing power that would ever be required by European scientists in the pursuit of their work.

In the 1950s, the subject that intrigued me as a technical journalist working for the magazine Aircraft Production was the numerical control of machine tools: tools capable of machining complex shapes entirely automatically. A device called a computer was used to prepare the control tapes driving these machines. After a while I became interested in the computer as well as the machine tool, and the idea of publishing a magazine on computing was conceived; Data Processing was the result.

Centralized computer services

The idea of a computer centre providing general computing services for users was soon adopted by computer manufacturers. First, it provided a means of demonstrating the attributes of a particular computer to potential purchasers. Second, such centres were part of an educational process directed at making the computer acceptable to commerce, by proving it could perform useful work at an economic cost, with the added merit that the user did not need to become too closely involved, unless he/she wanted to, in the complexities of what was then a very 'black art'. It also generated a cashflow badly needed to sustain research and development in the many rapidly evolving aspects of computer engineering.

Data Processing - then and now

In 1959 Data Processing covered all methods of handling data automatically, not just computer techniques. This was partly because computer applications were few, and the real tool of administrative, and especially financial, management was the punched card. Data Processing's particular claim to fame is that it was the first computer magazine to be published in the UK on a regular basis. It had a rival, Automatic Data Processing, which began three months later.

Reading Data Processing today, the applications do not seem so very different to those mooted in 1959. Maybe the emphasis then was on integrating procedures, and the opportunities this presented in achieving centralized administrative control with computers, whereas today the trend appears to be towards distributed processing and decentralization. But then, this is just part of the eternal cyclical pattern of business development: centralization, decentralization, centralization, ad infinitum. However, the technology today is very different, and so is the jargon. The computing power now available, and its low cost, would have seemed unbelievable 25 years ago. As would the idea of children getting computers in their Christmas stockings. What will it be like in 2009? Truly 'intelligent' machines and the cashless society may have arrived by then. Some ideas are realized, others are not. The fascination is that no one can predict which. Anyway, here's to the next 25 years.