THE USE OF A MICRO-MINI-MAINFRAME COMPUTER NETWORK IN THE ANALYTICAL CHEMICAL LABORATORY

Analytica Chimica Acta, 141 (1982) 13-23
Elsevier Scientific Publishing Company, Amsterdam - Printed in The Netherlands

R. P. J. DUURSMA, H. STEIGSTRA, R. G. LOGCHIES and H. C. SMIT*

Laboratory for Analytical Chemistry, University of Amsterdam, Nieuwe Achtergracht 166, 1018 WV Amsterdam (The Netherlands)

(Received 18th May 1982)

SUMMARY

A network of communicating and cooperating computers, consisting of a central host computer and one microcomputer per experiment, is described. The network is intended to be used as a universally applicable tool in analytical chemical research. A specially developed programming language, easy to learn, offers the possibilities of real-time clock-controlled data acquisition, control of experiments, simple logical decisions, and cooperation with the host computer, which simultaneously runs a program in a conventional higher language such as FORTRAN. The analytical application used for illustration relates to inductively-coupled plasma emission spectrometry.

In the evolution of analytical chemistry, three distinct phases can be distinguished: the period of classical analysis up to the 1950s, the rapidly increasing importance of instrumental analysis from about 1955, and the still increasing application of digital computers (from around 1970). The introduction of instrumental analysis implied the generation of analog signals. These (electrical) signals served mainly as a source of information obtained with relatively simple signal-handling techniques. Other signals were generated for the control of temperature, flow, etc. In practice, not much attention was given to optimal signal handling. More recently, digital electronics, including computers, have been introduced. However, even simple applications require some basic knowledge of signal theory, in addition to computer science, numerical mathematics, etc. This has retarded the efficient application of computers in analytical chemistry. Lack of knowledge and underestimation of the problems have often led to dubious results. Nevertheless, the possibilities are great: not only can the usual analytical procedures and instruments be automated and optimized, but a completely new approach, chemometrics, becomes accessible. Computers are essential to chemometrics: multivariate analysis, correlation techniques, optimum parameter and filter estimation, library search, artificial intelligence, pattern recognition, and Fourier and Hadamard transforms are viable only with computers.

In classical chemical analysis, there are two main features: the human factor and the experiment. The experiment can be defined as a process with

0

1982

Elsevier

Scientific

Publishing

Company

14

technical aids for measuring and controlling the relevant parameters. The interaction between the human factor and the experiment may be called the "human interface", which is usually heuristic in classical and instrumental analysis. The computer replaces this human interface by two new kinds of interface: the interface with the experiment, and another human interface, in this case by means of terminals, printers, plotters and, in some applications, other optical or acoustic aids (Fig. 1). The composition of the experiment interface shows a greater variety, but in general there are four basic features: measuring voltages, generating voltages, setting on/off conditions, and testing on/off conditions. The introduction of a computer also offers the possibility of data storage; data acquired from the experiment and the worker can be stored permanently in a mass-storage device, e.g., a (floppy) disc or magnetic tape, for later processing or for use in search routines.

Choice of computers

A choice can be made from a wide variety of computers with different capabilities and costs. Usually, (digital) computers are classified as minicomputers, microcomputers and main-frame computers. In this paper, some attention is paid to selection criteria for computer and data-handling systems in an analytical laboratory where research in different fields, including chemometrics, is conducted; many of the arguments are also valid for routine or diagnostic laboratories. It is obvious that the needs of widely divergent analytical techniques make it impossible for one standard device to fulfil all the requirements optimally. For instance, even two kinds of spectroscopy, fast Fourier transform and u.v.-visible, require largely different approaches. Apart from the usual micro-mini-mainframe division, roughly corresponding to price, considerations can also be based, for instance, on centralized/decentralized data handling.
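The four basic operations of the experiment interface listed above (measuring voltages, generating voltages, setting on/off conditions, and testing on/off conditions) can be pictured as a minimal abstraction. The Python sketch below is illustrative only; the class and method names are not from the original system, and real access to A/D and D/A converter hardware is simulated by a loopback:

```python
class ExperimentInterface:
    """Sketch of the four basic interface operations: measuring voltages,
    generating voltages, setting on/off conditions, and testing on/off
    conditions. Hardware access is simulated here by simple dictionaries."""

    def __init__(self):
        self._analog = {}    # channel -> last voltage written (loopback stand-in)
        self._digital = {}   # line -> on/off state

    def measure_voltage(self, channel):
        # A real system would trigger an A/D conversion here.
        return self._analog.get(channel, 0.0)

    def generate_voltage(self, channel, volts):
        # A real system would load a D/A converter here.
        self._analog[channel] = volts

    def set_condition(self, line, on):
        # Setting an on/off condition, e.g., opening a valve.
        self._digital[line] = bool(on)

    def test_condition(self, line):
        # Testing an on/off condition, e.g., a limit switch.
        return self._digital.get(line, False)


iface = ExperimentInterface()
iface.generate_voltage(0, 2.5)       # drive an actuator on channel 0
iface.set_condition("valve", True)   # open a (hypothetical) valve
```

Any real implementation would of course replace the dictionaries with converter and port I/O; the point of the sketch is only that four primitive operations suffice.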
If various measuring instruments have to be computerized, several solutions are possible: a central computer can be connected with each instrument, or each instrument can have its dedicated computer. An intelligent terminal, preprocessing the data for use in calculations in the central computer, is an intermediate solution. There is some controversy over the relative merits of central vs. dedicated computers. An

Fig. 1. Representation of interfaces.


important drawback of a central computer is the complexity of the required real-time programming, which increases overheads. Physical distances prohibit high-speed data transmissions or require special and complex interfaces. Of course, the computing capabilities of a central computer are superior. However, recent developments frequently allow the use of easily adaptable microcomputers close to the experiment. Simple microcomputers like the personal computer may need extensions with, for example, A/D and D/A converters, but are playing an increasingly important role. Nevertheless, present-day microcomputers do not meet all needs for memory and computer-time-demanding calculations. In this paper, another approach is described: a network of communicating and cooperating computers consists of a central host computer and one microcomputer per experiment. The intention was to design a universally applicable system, capable of fulfilling the varying demands imposed by research in different analytical techniques. This system is not designed to perform only some special tasks in a particular analytical method, but is designed to meet the varying requirements of analytical practice. Another feature is the ability to operate independently of the host computer, increasing practical applicability. Of course, the large extensions of the possibilities offered by the host computer are inhibited in these cases.

ANALYTICAL PROCESSES AND SIGNALS

The specification of a data-handling and control system in analytical chemistry can only be based on knowledge of the signals to be expected from, or to be sent to, an analytical process. In particular, the power spectral density (PSD) of the signals, determining the minimum sampling frequency, and the dynamic range are important. For example, electroanalysis usually involves rather slow chemical processes, often controlled by diffusion. Relatively slowly fluctuating signals are produced. Special attention has to be paid to the rejection of hum, caused by the high impedance of many electrodes. A maximum sampling frequency of 100 Hz is certainly sufficient in the usual methods; in many cases, 10 Hz can be used. This allows dual-slope techniques to be used in the converters, offering inherent hum rejection. Chromatography can be a problem when the high dynamic range of the signals is taken into account. Voltage-to-frequency converters, logarithmic conversion, or (expensive) A/D converters with at least 16-bit resolution may be necessary. The minimum peak width in common chromatographic procedures is unlikely to be less than 0.5 s and is normally much more. A rough estimate of the required minimum sample frequency gives a value of about 40 Hz. Only high-pressure gas chromatography (h.p.g.c.) may require higher sample frequencies. Spectroscopic signals can be divided into two groups: basic physical signals, requiring very fast A/D conversion, and the analytical signals, based on these physical phenomena. In the latter cases, there is always a speed-reducing


step, determined by the manipulation of the sample under test. Examples are the delivery of atoms from the wall of a graphite furnace in flameless atomic absorption and sample nebulization for flame and inductively-coupled plasma experiments. The dynamic behaviour is governed by time constants in the range 10 ms-1 s [1]. A rough estimate of the sampling frequency gives a value of about 400 Hz. Similar considerations are valid for instrumental techniques like mass spectrometry and Fourier-transform spectroscopy. However, in these techniques, dedicated computers, optimally matched to the requirements, are applied. The general conclusion is that a sampling frequency of 1000 Hz is sufficient for most analytical techniques.

Chemometrics

Before chemometrics can be applied to already obtained analytical data, the following demands have to be fulfilled. A library of chemometric algorithms must be available; this usually requires access to a large computer, though a microcomputer may sometimes be sufficient. Further, the information must be available at a central site for administration, and there must be easy access to software-developing aids. The incorporation of chemometrics into an analytical technique extends the demands on the computer. Finally, accurate interfacing between the experiment and the artificial intelligence or computing power must be available physically close to the experiments.

COMPUTER NETWORK

A possibility for fulfilling these demands is the use of decentralized computers in a network environment. Editors, compilers and assemblers for program development are required for every computer; therefore, external memory like (floppy) discs and an operating system to coordinate the actions inside and between the systems must be present. If (simple) minicomputers are used, the costs are inevitably quite high. The most attractive way to meet all demands at reasonable cost is a computer network consisting of a central minicomputer and a dedicated microcomputer for each experiment (Fig. 2). These satellite microcomputers are equipped with two serial controllers, one connected to the (local) terminal and the other to the host computer, in the present case a minicomputer. This real-time minicomputer is, in turn, contained in a network of 200 UT batch stations, giving access to a number-crunching computer system. The connection with this mainframe is achieved by a separate microcomputer-based protocol processor, developed in this laboratory, which handles the time-consuming synchronizations and generation of responses according to the 200 UT protocol definitions. However, if a mainframe with a guaranteed time response is available, this can be used as a host as well. Nevertheless, in this configuration the tasks of the host are editing source files, running (cross-)compilers and (cross-)assemblers, and protection of user files. In general, this is a multi-task environment, supporting program development from different sites. Adequate software is commercially available.
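One role of a satellite microcomputer described above is to pass local terminal traffic through to the host unchanged, so that the micro is transparent when no measurement data are being sent. The Python fragment below sketches that forwarding task, modelling the two serial controllers as queues; the names and queue stand-ins are assumptions for illustration, not the original software:

```python
from collections import deque

# Stand-ins for the two serial controllers of one satellite microcomputer:
# one line from the local terminal, one line towards the host minicomputer.
terminal_line = deque(["R", "U", "N", "\r"])   # keystrokes typed at the terminal
host_line = deque()                            # characters on their way to the host

def forward_terminal_traffic():
    """Pass terminal characters through to the host unchanged, keeping
    terminal access to the host alive while the satellite runs its own task."""
    while terminal_line:
        host_line.append(terminal_line.popleft())

forward_terminal_traffic()
```

In the real system this loop would be one of several concurrent duties of the satellite, alongside data acquisition and experiment control.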


Fig. 2. Method of connecting the microcomputers between the host computer and the terminals in a laboratory computer network.

Fig. 3. General flow of software development applied to the computer network.

The hardware connection between the microcomputer and the host computer is established with cheap multiplexed asynchronous serial interfaces. One task for the microcomputer is the transfer of data between the micro-terminal and the host computer, thus maintaining terminal access to the host [2]. Mostly, the experiment and the host computer are physically far apart. Therefore, it is attractive to connect the two by means of an RS422 connection. This connection can also be used for the terminal access of the host. Sharing the same cable for both data transmission and normal terminal data transfer implies the need for a special protocol on an ASCII base. By definition of special control characters (i.e., escape sequences), the data can be recognized as information under program control and will be converted without affecting the terminal. Data sent to the terminal without these special control characters will not affect the memory contents of the microcomputer, so that a perfect separation is established between microcomputer and terminal. Implementing the protocol as described has the important advantage that even a public telephone line can be used to interface the experiment to the central host computer. A microcomputer without interrupts originating from floppy discs is considered fast enough to maintain data transmissions at 1200-9600 baud. In many cases, this is sufficient to maintain a constant data stream of about 100 samples/second to the host computer. If the host computer is busy and therefore limiting the transmission speed, synchronization is maintained by the echo of characters on terminal connections. A drawback is a halving of the effective transmission speed to the host, but transmissions in the other

direction are always maintained at maximum speed. If all lines of an RS422 serial connection can be used, in particular the Data-Ready and Request-to-Send signals, and the software excludes the echo of characters, transmissions in both directions are maintained at full speed. However, if the speed of data communications to the host is too low or if communication is impossible, an alternative must be available. The authors have opted for a digital cassette, because it is most appropriate for collection of serial data. The gathered data are then transmitted to the host when faster lines become available or when communications are established. This offers the possibility of complete independence from the host computer, provided that the programs themselves can also be stored on the cassette. Object modules from the host are, however, loaded directly and in a short time. If several object modules are available in the host, the host computer can be used as a program library. Development of programs is fast, because the large source files are kept on the hard disc of the minicomputer (see Fig. 3). Because the peripherals of the host are shared, they can be more expensive and therefore faster than the separate peripherals of each microcomputer. Also, effective use can be made of software libraries.
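The escape-sequence protocol that separates program data from ordinary terminal traffic on the shared serial line can be sketched in a few lines. The specific control characters (ESC followed by 'D' or 'E') are invented for illustration; the paper does not specify which characters were used:

```python
ESC = "\x1b"                     # illustrative control character; the original
                                 # protocol only requires "special control characters"
DATA_BEGIN, DATA_END = "D", "E"  # hypothetical escape-sequence markers

def frame(data):
    """Wrap program data so the shared line can recognise it as such."""
    return f"{ESC}{DATA_BEGIN}{data}{ESC}{DATA_END}"

def split_stream(stream):
    """Separate framed program data from ordinary terminal characters,
    so neither affects the other -- the 'perfect separation' of the text."""
    data, terminal, in_data, i = [], [], False, 0
    while i < len(stream):
        if stream[i] == ESC and i + 1 < len(stream):
            in_data = stream[i + 1] == DATA_BEGIN
            i += 2
            continue
        (data if in_data else terminal).append(stream[i])
        i += 1
    return "".join(data), "".join(terminal)

line = "ok> " + frame("42.7") + "ready"   # mixed traffic on one cable
samples, text = split_stream(line)
```

Because the framing is plain ASCII, the same scheme works over any character channel, including the public telephone line mentioned above.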

EXAMPLE OF NETWORK APPLICATION AND DESCRIPTION OF THE REQUIRED SOFTWARE

The use of microcomputers as described above may be illustrated by investigations on inductively-coupled plasma emission spectrometry (i.c.p.e.s.). This example can be considered as typical in analytical research. The results have been reported [3], but the philosophy behind the software and programs used is described here to give an idea of the possibilities offered by the network. This example shows distinct steps, which are fairly typical of any investigation: a preliminary inventory of all measured properties to check a hypothesis, the calculation of results and proof, and the presentation of these as graphs, tables, etc. In i.c.p.e.s., nebulized liquid sample is injected into the plasma and the emitted radiation is used to identify and quantify the elements. The mist produced showed visual density variations, and this effect was investigated. Speculative was the idea of using the measurable fluctuations in the mist (input of the system) for correction of the analytical signal (output), to obtain better reproducibility and signal/noise ratios. The first measurements concerned the simultaneous detection of the variations in the mist and the output of the spectrometer. From these data, the probability density functions (PDF) and the first three moments of these PDFs were calculated. The signals were sampled at a rate of 100 Hz (initiated by interrupts from timers) and the binary output from the A/D converter was used as an index in an array. The indicated element in the array was incremented by one. After the required number of interrupts (18000 samplings in 90 s) had been reached, the first and second moments were evaluated and reported. The result was a


relation between the averages of the input and output signals. This part of the investigation was done in assembler code and was developed in the host computer with a cross-assembler. Once the program had been tested and approved, the object was stored on disc for direct use. So that the program could be loaded in a short time, the time-sharing subsystem (TSS) in the minicomputer was extended with high-priority real-time routines, which transfer the object at high speed without interventions of the TSS.

Next, the influence of fluctuations in the mist on the output was examined. For this purpose, it was necessary to obtain the autocovariance function (ACVF) of the input and the cross-covariance function (CCVF) of the input and output. From the Fourier transforms of these data, an unambiguous relation between input and output is obtained, i.e., the impulse response h(t). This impulse response is calculated from

h(t) = F^-1 [F[CCVF(t)] / F[ACVF(t)]]

where F is the Fourier transform and F^-1 is the inverse Fourier transform. This problem was tackled in two steps. The first step was collecting about 10,000 data pairs on a digital cassette with a sampling frequency of 100 Hz. This was done with the Laboratory-Basic interpreter. The second step was the transmission of these data to the host computer, which did the complex calculations described above. Two conclusions were drawn: (1) as expected from the configuration (Fig. 4), there is a delay between fluctuations in the mist signal and the corresponding fluctuations in the output; (2) the magnitude and sign of this relation depend on the concentration of the sample in question. The hardware used is illustrated in Figs. 4 and 5. Figure 6 illustrates the procedure developed to reduce the uncertainty of the results. First, the microcomputer calculates a demodulation signal from the time delay, the impulse response and the input signal. If the concentration is approximately known, it is possible to correct fluctuations in the analytical signal with this demodulation signal. Because this yields a better estimation of the concentration, the variations are better corrected. The time delay and the impulse response need only be calculated once, and this is most suitably done by the minicomputer. In this way, the microcomputer, physically close to the experiment, constantly calculates the

Fig. 4. The hardware used in a study of fluctuations in the plasma.

Fig. 5. The software used in the study of i.c.p.e.s., corresponding to the hardware of Fig. 4.

Fig. 6. The system used to reduce uncertainty in the results from i.c.p.e.s.

corrected output of the experiment, but is controlled by the information from the host computer. As a consequence, two programs concerning one experiment must be running concurrently.
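The covariance-based impulse-response calculation used in this example can be reproduced on synthetic data. The numpy sketch below assumes a white-noise input, so that the covariance estimates are well behaved; the signal names, the simulated impulse response and the number of samples are all illustrative, not the experimental values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
mist = rng.normal(size=n)                 # input: fluctuations in the mist (white noise)
true_h = np.zeros(64)
true_h[10:20] = np.hanning(10)            # assumed delayed impulse response
output = np.convolve(mist, true_h)[:n]    # simulated spectrometer output

def cov(a, b, maxlag):
    """Biased covariance estimate for lags 0..maxlag-1."""
    a, b = a - a.mean(), b - b.mean()
    return np.array([a[: n - k] @ b[k:] / n for k in range(maxlag)])

acvf = cov(mist, mist, 256)               # ACVF of the input
ccvf = cov(mist, output, 256)             # CCVF of input and output

# h(t) = F^-1 [ F[CCVF(t)] / F[ACVF(t)] ]
h_est = np.fft.ifft(np.fft.fft(ccvf) / np.fft.fft(acvf)).real
delay = int(np.argmax(h_est))             # the delay seen in the experiment
```

With a white-noise input the ACVF spectrum is nearly flat, so the division is numerically stable; real mist signals would need a longer record or spectral smoothing before the same deconvolution is trustworthy.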

SOFTWARE

In order to achieve optimal interaction between the analytical experiment and the experimentalist, a programming language for microcomputers was developed. Both the typical sequence of events in the measurement and the control aspects are considered. First, an inventory is made of the frequently


used measurement and control procedures, and a limited number of universally applicable computer actions are listed, which are afterwards realized as language statements. The interactive aspect requires extensive consideration of the involvement of the experimentalist in the technical aspects of the experiment. The choice made calls for a minimum of knowledge of hardware aspects, such as signal transducers, data storage devices, and computers. Also, timing problems in the case of real-time instrumentation and development of control software are minimized. The programming language combines the knowledge of the computer specialist and the experimentalist in language statements that satisfy the hardware control demands. It offers the possibility of achieving fast and concise (routine) actions. The timing aspects are solved by implementing pause statements, separated from the program and synchronized with a real-time clock. The interpreter-like language, to be used for measurement and control programs, supplies on-line editing facilities with emphasis on self-correction. Only existing statements asking for the relevant variables can be introduced, so format errors cannot be introduced. After introduction, the statement is translated to a statement set, containing the subroutine address, the variables address and the address of the following statement set. The software is optimized for minimum run-time. Some examples of statements are:

Acquire <variable> from input <parameter> (analog input).
Pause <variable>.
Open <parameter>, Close <parameter>: I/O for controlling valves, etc.
Step <variable> to <parameter>: burst for a stepping motor-controlled buret.
PRB <variable>: generates a pseudo-random binary noise pattern.
Test <variable> from input <parameter>: I/O for testing conditions of switches, etc.
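The translation into "statement sets" described above (subroutine address, variable addresses, and the address of the following statement set) can be mimicked as a threaded dispatch loop. The Python sketch below is a stand-in for illustration only; the statement names and handlers do not come from the original 8080 implementation:

```python
# Each compiled "statement set" mirrors the paper's description: the handling
# subroutine, its variables, and the index of the next statement set.
env = {}

def acquire(args, nxt):
    var, value = args
    env[var] = value        # a real Acquire would read an A/D converter channel
    return nxt              # continue at the stored next-statement address

def goto(args, nxt):
    return args[0]          # Go To: jump to another statement set

def end(args, nxt):
    return None             # End: stop the interpreter loop

program = [
    (acquire, ("signal", 42), 1),   # 0: Acquire signal from input
    (goto, (3,), None),             # 1: Go To 3 (skips statement set 2)
    (acquire, ("skipped", 0), 3),   # 2: never reached
    (end, (), None),                # 3: End
]

pc = 0
while pc is not None:
    subroutine, variables, nxt = program[pc]
    pc = subroutine(variables, nxt)
```

Because each statement set carries the address of its successor, dispatch needs no parsing at run-time, which is the point of the "optimized for minimum run-time" remark above.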

In addition, a set of BASIC-like statements can be used to control the program flow. Examples are: Go To, Go Sub, Return from subroutine, End, etc. Statements for data handling are: Store <variable> in data buffer <parameter>, and Dump to tape <variable> (for cassette tape data storage). Statements for the console are: Write <text>, Print <variable>, New line, Bell.

An important part of the software concerns the exchange of data with the host computer. If the software to be developed is considered as an entity for problems where both host computer and microcomputer are required, it would be advantageous to assign a name to the same parameter in both computers, e.g., a time delay. This cannot be done with commercially available programming languages, because only values of parameters are transferred to subroutines. Consequently, when protocols are used together with


Fig. 7. The relations between the input of a hypothetical integrated compiler and the produced program codes.

high-level languages, only the values of parameters, and not the names, are transferred between microcomputer and host computer, because the protocols are called as subroutines. Although the use of subroutines as a method of creating modular programs is well known, the transfer of parameters is a common source of errors [4]. Therefore, in this work, the interpreter was designed for use on the (8080-based) microcomputers with a built-in interface to a protocol. This protocol transfers both variable names and their values. Programs written in high-level languages (ALGOL, FORTRAN, PASCAL, etc.), originating from mathematically well-defined algorithms, can control parameters in this interpreter irrespective of their current use. This is possible because an interpreter always keeps track of all used variable names and values. The transfer of the values is handled without intervention of the user. In this way, flexibility in experiments is optimal. Data in the interpreter are represented in a fixed-point format with 16 bits before and 16 bits after the decimal point, using 32 bits altogether. The times required were 0.2 ms for additions/subtractions and 5 ms for multiplications/divisions. The processor used is a standard 8080 with a 2 MHz clock frequency. If the instructions of a Z-80 (Zilog) or the hidden instructions of an 8085 (Intel) are used, multiplications and divisions are done within 1 ms. With special routines, it is possible to synchronize with the timers of the TMS 5501 chip (Texas Instruments). In this way, it is possible to sample a signal at exact time intervals without the use of hardware conversion commands.

DISCUSSION

The above procedures for directly communicating programs are certainly not the most advanced possible solution to the problem. It would be better if complete block structures and scopes of variable names were extended over two different computers working concurrently (Fig. 7). This requires a programming language which can be translated into executable code for both host computer and microcomputer. These codes can be produced by an integral compiler, which also incorporates a method of synchronization of the different computers. However, it should be clear that a researcher is capable of adding new techniques to his experiments if he has a micro-mini network at his disposal. The parallel computing of micro- and minicomputer needs a new approach to programming languages, although the above-described 1:1 correspondence of variable names and values between programs proved to be very useful, as has been shown by the example. The system proved to be equally useful in completely different experiments, such as automatic titrations.

REFERENCES

1 W. M. G. T. van den Broek and L. de Galan, Anal. Chem., 49 (1977) 2176.
2 H. C. Smit, Anal. Chim. Acta, 122 (1980) 201.
3 R. P. J. Duursma, H. C. Smit and F. J. M. J. Maessen, Anal. Chim. Acta, 133 (1981) 393.
4 D. Gries, Compiler Construction for Digital Computers, Wiley, New York, 1971.