Phys. Chem. Earth (A), Vol. 25, No. 12, pp. 813-817, 2000. © 2000 Elsevier Science Ltd. All rights reserved. 1464-1895/00/$ - see front matter
A Vision Towards Automated Real-Time VLBI

H. Schuh(1) and W. Schwegmann(2)

(1) Institute of Geodesy and Geophysics, University of Technology Vienna, Gusshausstr. 27-29, 1040 Wien, Austria
(2) Istituto di Radioastronomia (CNR), Via P. Gobetti 101, 40129 Bologna, Italy

Received 18 May 2000; accepted 19 September 2000

Abstract. The paper presents a general concept for very long baseline interferometry (VLBI) in near real-time and describes two important components of real-time VLBI: the direct transmission of the received signals from the radiotelescopes to the correlator or to a central computer for further processing, and the automation of the VLBI data analysis by methods of artificial intelligence (AI). The concept of a knowledge-based system which supports the data analyst is presented, and an example of automatic editing of the station log-files is given. © 2000 Elsevier Science Ltd. All rights reserved.

1 Introduction

Based on the inertial reference system of extragalactic radio sources, Very Long Baseline Interferometry (VLBI) is an extremely precise geodetic technique for monitoring the Earth rotation and for the realization of the global reference system. However, the costs are relatively high and there is still a delay of at least five days between the time of observation and the availability of the results. If this gap could be shortened considerably, the Earth rotation parameters (ERPs) and the relative station motions could be monitored in near real-time. This could be extremely useful for high-precision positioning and navigation. Results in near real-time could also be used as references for other 'on-line' observing methods or for an Earthquake warning system, as well as for observing transient astronomical phenomena.

2 Present status of geodetic VLBI

VLBI is the only space geodetic technique which allows simultaneous monitoring of all parameters which determine the orientation of the Earth, i.e. polar motion xp, yp, Universal Time 1 (UT1) and the nutation parameters Δε, Δψ. VLBI is important for the realization of the global terrestrial reference system because it is based on the quasi-inertial reference system of extragalactic radio sources. Thus, it is much less dependent on hypotheses than geodetic satellite techniques. In the past two decades the accuracy of the results of VLBI measurements has improved considerably due to improvements of the VLBI technology and due to refinements of the models used for data analysis. However, the VLBI procedure with four separate steps (preparation, observation, correlation, data analysis) has not changed since the end of the seventies. As can be seen in Fig. 1, the efforts and costs, and also the time till availability of the final results, mainly depend on the procedure itself rather than on the technology or on the models used for data analysis. In the last years information and communication technologies based on global communication networks have made fast progress with respect to computer power, data transmission rates and software technology. For VLBI there is a big chance to innovate the VLBI procedure by making use of these recent developments. The concept of VLBI in near real-time will be presented in the next chapter. It is mainly the aspect of automation which will streamline and accelerate the VLBI procedure.

Fig. 1. The components of the VLBI system: procedure, technology, models

Correspondence to: H. Schuh
3 A concept for VLBI in near real-time

All means which contribute to an automation and acceleration of the VLBI process are briefly given below (for more details see Schuh (1995)). These are the 'glue' between the four steps of 'classical' VLBI.

1. Scheduling:
   (a) automatic scheduling by optimization methods
   (b) electronic distribution of schedules

2. Observation:
   (a) by teleoperation, i.e. remote control of radiotelescopes (e.g. in remote areas)
   (b) direct data transmission from radiotelescopes to the correlator

3. Correlation:
   (a) software correlation on parallel computers
   (b) correlation supported by AI methods
   (c) feedback to radiotelescopes
   (d) results of correlation directly transmitted to VLBI analysis centers

4. Analysis:
   (a) supported by AI methods
   (b) global solution on parallel computers
   (c) continuous update of global data base
   (d) automatic transfer of results to Earth observation centers (e.g. International Earth Rotation Service (IERS), Earthquake research centers, ...) or to other users (e.g. International GPS Service (IGS))

Some parts of the concept for real-time VLBI have already been realized in the last years; some others are waiting for their practical application. A significant component is the direct transmission of the received signals from the radiotelescopes to the correlator or to a central computer for further processing. High-bandwidth communication networks using optical fiber links or communication satellites will allow intra- and intercontinental transmission of signals with hundreds of Mbit/s in the near future. The correlation process itself can be done by software only, rather than by special processors. Already at the beginning of VLBI in 1967 the first software-only correlation was carried out, and software correlation remained in use until the first Mark I hardware processor appeared (ca. 1971). First successful software correlations using a transputer-based parallel computer were reported by Fayard et al. (1993). But this requires high computer power, as the MkIII correlator performance is equivalent to 1 Gflops/baseline and that of the MkIV correlator to 10-100 Gflops/baseline. In the final data analysis many steps which till today are carried out manually can be supported by artificial intelligence (AI) methods. In a (semi-)automated data analysis more data can be processed faster and at lower costs. Many parts of the data analysis software can be parallelised. Thus, by highly parallel computers the time for a global solution combining some thousands of VLBI sessions will be reduced from presently a few hours to less than one minute. In the future the global data base of VLBI observables can be updated permanently, and a solution for the Earth rotation parameters and for global geodynamic parameters with 1 mm accuracy can be done in near real-time. A first successful realization of an on-line VLBI system is the Japanese Keystone project. This project, where four radiotelescopes in the Tokyo area produce geodetic results in near real-time, is part of the Japanese Earthquake Prediction Programme (Yoshino (1994), Koyama et al. (1998)). For the MkIII Data Analysis System, a widely used VLBI software system (Ryan et al., 1980), first steps towards real-time VLBI have already been taken, too. Beginning in 1979, for example, small amounts of MkIII data were transmitted over modem for fringe verification in real time (Levine and Whitney (1979)).

4 Benefits of the new VLBI concept

The main advantages of the innovation of the VLBI procedure are:
- High degree of automation: to reduce the costs, to be able to process a large amount of data, to be more independent, and to be safer against failures and data losses.

- Results in near real-time, i.e. in a few minutes after the VLBI session:

  - to provide the ERPs for prediction programmes. Precise predicted values are needed as input for high-precision terrestrial navigation and positioning, for space tracking, and as references for continuous 'on-line' observing techniques of ERPs (laser gyro, super-fluid helium gyro, ...).

  - to monitor crustal deformations in near real-time. Thereby local/regional measurements by conventional geodetic instruments or by GPS could be referred to the independent VLBI network, which starts to observe in the real-time mode when 'suspicious' regional deformations have been monitored. This could be an important contribution to any Earthquake warning system because so-called precursors usually occur within the last six days before the Earthquake, at a distance of up to 1000 km from the epicenter.

  - on the astrophysical side, real-time VLBI will have an impact on astronomy as transient phenomena in radio sources (super novae, rapid intra-day variations of the flux density of active galactic nuclei, ...) will be observable. After having detected such effects in near real-time by scanning known candidates, the network of radiotelescopes could immediately start intensive observations of the specific phenomenon.

5 Direct data transmission
At present the 'turn-around time' in VLBI is never shorter than five days. Most of this time delay is due to the shipment of the magnetic tapes from the individual radiotelescopes to one of the correlation centers, and due to the necessity to read the magnetic tapes there again for further processing. An important part of the concept presented in chapter 3 is the connection of the VLBI radiotelescopes with the correlation center by a high-bandwidth network. This would allow an on-line correlation, i.e. processing the recorded signals in real-time, and thus would be an important step towards automation and acceleration of the VLBI procedure. Further advantages are:

- No tape recording and playback of the observed radio signals are necessary. This avoids degradation and loss of data.

- The quality of the recorded data can be checked during the time of observation. Errors and failures can be reported back immediately from the correlation center to the radiotelescopes.

- The efforts for purchasing, handling and transporting the large number of big magnetic tapes can be saved.

Several studies have been carried out by the international VLBI groups on how to realize the broadband communication between the radiotelescopes and the correlators in global VLBI experiments. A goal of the first tests is the end-to-end real-time transmission of the received data at a bandwidth of 2 Mbit/s, i.e. one channel only. Later, the data rate will be increased to 28 Mbit/s (14 channels) to validate the new on-line method on the standard MkIII system. Upgrading to 56 Mbit/s became possible in 1999. Approaching the 112 Mbit/s (MkIII-A) and the Gbit/s range (MkIV) should be envisaged at a later stage. In Japan the radiotelescopes of the Keystone VLBI network were interconnected by optical fibers in 1996. The first successful on-line correlation of VLBI signals was carried out in May 1996 using Asynchronous Transfer Mode (ATM) with a data rate of 256 Mbit/s (Koyama, 1996).
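Whether the station signals arrive on tape or over a real-time network link, the core of a software correlator reduces to cross-correlating the digitized noise streams of two stations and locating the correlation ('fringe') peak, whose position gives the differential delay. The following sketch illustrates the principle on synthetic data; the signal model, sample counts and noise level are illustrative assumptions, not parameters of any real correlator:

```python
import numpy as np

def fringe_delay(a: np.ndarray, b: np.ndarray) -> int:
    """Delay (in samples) by which stream b lags stream a, taken from
    the peak of the cross-correlation function (the 'fringe' peak)."""
    cc = np.correlate(b, a, mode="full")      # lags from -(len(a)-1) to len(b)-1
    return int(np.argmax(cc)) - (len(a) - 1)

# Synthetic example: one wavefront ('quasar noise') seen by two stations,
# the second one 37 samples later, each with independent receiver noise.
rng = np.random.default_rng(42)
n, true_delay = 4096, 37
wavefront = rng.standard_normal(n + true_delay)
station_a = wavefront[true_delay:] + 0.3 * rng.standard_normal(n)
station_b = wavefront[:n] + 0.3 * rng.standard_normal(n)

print(fringe_delay(station_a, station_b))     # recovers the 37-sample offset
```

A real correlator does this per baseline and per frequency channel, with fringe rotation and fractional-sample interpolation on top, which is why the Gflops/baseline figures quoted above arise; the sketch only shows why the task is pure computation and therefore parallelises naturally in software.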
6 Use of Artificial Intelligence (AI) methods
In the past two decades the field of AI has been one of the most fascinating subjects in computer science. In particular, the applications of expert systems (XPS) have proved to be useful for various fields such as medical diagnosis or production process planning. An XPS for scientific data analysis such as VLBI data processing will also bring many benefits.

6.1 General remarks about XPS

A definition of expert systems similar to the following one can be found in Schnupp et al. (1989): An XPS is a computer system which can administer specific expert knowledge, and store and evaluate it in such a manner that it can provide targeted information to users, or can be used to dispatch certain tasks. It is an intelligent and active system which contains logic and knowledge representation by rules, facts etc. Usually it is also called knowledge-based software.

A basic classification of expert systems distinguishes between decision-support systems, diagnostic systems, analysis systems, planning systems and teaching systems. Typical applications of XPS are: medical diagnosis, computer-aided instruction, control of industrial enterprises (production process planning, information management), configuration of computer systems, maintenance and repair of technical devices (cars, machines, ...), control (air traffic control, ...), data analysis or image processing.

6.2 Benefits of an XPS for VLBI data analysis
The large number of VLBI observing sessions leads to the necessity of a faster and even (semi-)automated data flow during the data analysis. This can only marginally be achieved by faster computers, because the data analysis is very complex and often requires the analyst's (expert's) decisions. Already today there is a shortage of qualified experts, and the analyst's training takes a long time. Moreover, even experienced analysts make time-consuming errors. Wrong results could lead to misinterpretations and to incorrect conclusions. Thus, the goal should be to make the VLBI data analysis faster and safer by ensuring that fewer failures and fewer errors are made even by less experienced analysts.

6.3 Automation of VLBI data analysis
When looking at the work done by the analyst during the course of VLBI data analysis, it can be seen that almost all tasks can be supported by an XPS-like solution. Some of them can be realized by typical AI methods, others by a comfortable user interface or by additional algorithms or crosschecks conventionally programmed within the existing software. Methods of AI applied to VLBI were proposed by Schuh (1993a, 1993b). In 1997 a concept for the automation of the MkIII Data Analysis System was developed at DGFI, Munich within a research project called 'Applications of methods of AI for the VLBI data analysis'. The goal of the project is to accelerate and automate the VLBI data analysis. A knowledge-based system will be developed as a shell around the existing software.
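The kind of analyst know-how such a knowledge-based shell encodes can be pictured as a small rule system: facts about the current session trigger rules whose conclusions become advice to the analyst. The following minimal forward-chaining sketch uses Python as a stand-in for a dedicated AI language, and every fact and rule in it is an invented placeholder, not taken from the actual system:

```python
# Minimal forward-chaining rule engine: facts are strings, each rule maps
# a set of required facts to a conclusion. All rule contents are illustrative.
RULES = [
    ({"clock_break_detected", "single_station_affected"},
     "advise: estimate an extra clock offset for the affected station"),
    ({"wrms_above_threshold", "no_clock_break_detected"},
     "advise: inspect atmosphere parameterisation"),
    ({"met_data_sparse"},
     "advise: exclude meteorological data from this session"),
]

def derive(facts: set) -> list:
    """Apply every rule whose conditions are satisfied, chaining until
    no new conclusion can be derived (fixpoint)."""
    advice = []
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in advice:
                advice.append(conclusion)
                facts = facts | {conclusion}
                changed = True
    return advice

print(derive({"clock_break_detected", "single_station_affected"}))
```

The point of such a separation is that the knowledge-base can be extended by the expert without touching the analysis software itself, which is exactly the shell architecture described above.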
Table 1. Possibilities for the application of an XPS in the VLBI data analysis

Fig. 2. Flow chart of the standard MkIII Data Analysis System

The data flow inside the MkIII Data Analysis System has been subdivided into five blocks to examine the applicability of knowledge-based techniques. Every block contains those programs which are related to a superior task. For instance, the programs DBEDIT and APRIORI are used to generate and to modify a particular database. Figure 2 shows the structure of the MkIII Data Analysis System and the corresponding partition into the blocks: correlator output, apriori information, generation and modification of databases, log-file processing, data analysis. The demands on software which should automate the data analysis differ for each of the blocks. Table 1 shows which possibilities exist to make use of an XPS in the different blocks of the data analysis. An XPS can be used in almost all blocks as a teaching system and for special aspects of the data processing, and can thus relieve the analyst of routine work. The main application, however, will be for decision-support, diagnosis and analysis of results inside block V. After a thorough analysis of the data flow inside the MkIII Data Analysis System, a concept for the knowledge-based support of the VLBI data analysis was developed in the research project. The system contains a knowledge-based assistant for support and guidance of the analyst. It controls the operations done by the analyst and performs all necessary checks automatically where possible. The system gives detailed explanations to the analyst (decision-support system), in particular when he/she has to choose between different options or has to select specific models or parameters. When failures or problems in the process of data analysis have occurred, it advises the analyst how to overcome the problem (diagnostic system). The results are controlled automatically with respect to their plausibility (analysis system). The system contains a flexible and extensible knowledge-base which can be realized using Prolog, one of the typical AI languages, to represent rule-based knowledge.

6.4 Automatic processing of station log-files

An example for the automation of data analysis will be described now: the processing of station log-files. In the analysis of VLBI experiments the meteorological data and the cable calibration data have to be read from the station log-files and entered into the experiment database. For that purpose the program PWXCB extracts the meteorological and cable calibration measurements from the station log-files and stores the data in separate files (see Fig. 2). Creating these files can be very time-consuming because of wrong entries in the log-files, which have to be corrected manually. In addition, the extracted data have to be checked with respect to their plausibility. These tasks shall now be performed by an automated version of PWXCB called XLOG. Because the knowledge needed for these tasks is relatively simple, the automation could so far be done by conventional programming techniques, that is, by supplementing the existing program code written in Fortran77. All possible errors had to be considered, even if they appear quite implausible, to obtain a high reliability of the automation process. Such errors can be rather clear, e.g. obvious outliers or mixing up temperature and pressure data. Other problems can only be understood in connection with the procedure of the VLBI data analysis. For example, if meteorological data had been measured at too few points in time, then the further processing of the VLBI data should be done without these data, because the atmosphere could not be modelled properly from very few measurements for the duration of the whole VLBI experiment, which is in general 24 hours. The following possible errors of the data are considered by the program XLOG: wrong entries in the log-files (missing keywords, mixing up temperature and pressure, ...), single outliers, offsets and/or gaps, and too few data. The automatic processing of the log-files was checked on real data. For that purpose, about 100 station log-files which had already been processed manually were reprocessed with the automated version of the program, i.e. XLOG.
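The kinds of checks listed above can be sketched in a few lines. The thresholds, units and screening order below are illustrative assumptions, not the actual Fortran77 logic of XLOG:

```python
# XLOG-style screening of meteorological log-file data: samples are
# (hour, temperature in degrees C, pressure in hPa). All thresholds are
# illustrative placeholders, not values from the real program.
def screen_met_data(hours, temp, pres, min_samples=12):
    issues = []
    # A mean 'temperature' near 1000 together with a 'pressure' below 100
    # suggests the two columns were mixed up in the log-file: swap them.
    if sum(temp) / len(temp) > 500 and sum(pres) / len(pres) < 100:
        issues.append("temperature/pressure mixed up")
        temp, pres = pres, temp
    # Flag single outliers: points far from both neighbours.
    for i in range(1, len(temp) - 1):
        if abs(temp[i] - temp[i-1]) > 15 and abs(temp[i] - temp[i+1]) > 15:
            issues.append(f"outlier at hour {hours[i]}")
    # Gaps of more than ~3 h leave the atmosphere unconstrained in between.
    for a, b in zip(hours, hours[1:]):
        if b - a > 3:
            issues.append(f"data gap between hour {a} and hour {b}")
    # Too few samples to model the atmosphere over a 24 h session.
    if len(hours) < min_samples:
        issues.append("too few data: exclude met data from analysis")
    return issues, temp, pres

hours = list(range(0, 24, 2))                             # 12 samples over 24 h
temp = [10, 11, 12, 13, 40, 13, 12, 11, 10, 9, 9, 8]      # one obvious outlier
pres = [1000.0] * 12
issues, temp, pres = screen_met_data(hours, temp, pres)
print(issues)                                             # ['outlier at hour 8']
```

Checks of this purely numerical kind are exactly the part that conventional programming handles well; the ambiguous cases, such as deciding whether data after a long gap are still trustworthy, are what the knowledge-based extension is meant for.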
Fig. 3. Comparison of original temperature measurements (degrees centigrade) at a VLBI station from the log-file (a) and the data after automatic processing with XLOG (b)

Almost 20 percent of the 100 log-files had wrong entries and had to be edited manually. The comparison of the results showed a success rate of better than 90 percent of the automated version compared to manual editing, which means that basically the same results were achieved as by the analyst, whose manual editing is very time-consuming. An example is given in Fig. 3, which shows successful automatic editing of meteorological data by the present version of the program XLOG. The original temperature measurements were intentionally falsified beforehand (Fig. 3(a)). Figure 3(b) shows the temperature data after processing with the program. All obvious errors were detected by XLOG. Merely at the end of the experiment there is a problem when no data are available for three hours, which reveals some weaknesses of the conventionally programmed version of XLOG: there is a difference of almost eight degrees centigrade between the last data point before the gap and the first point after it. At present the program cannot yet decide if this is a measurement error or if the data after the gap are correct, because this requires the knowledge and the experience of the analyst. The next version of XLOG will contain methods for knowledge representation and knowledge processing to solve such problems automatically.

7 Conclusions

There is a good chance for innovations of the VLBI procedure which are necessary to obtain a higher degree of automation. This will decrease the time till availability of the final results considerably. Each part of the concept presented in this paper has a different potential concerning savings in cost and time. Similar developments in other techniques, for instance automation in GPS, are regarded as good examples. Besides the innovation of the VLBI procedure, further improvements of the VLBI technology and of the models used for data analysis should be aimed at.

8 Acknowledgements
The research project 'Applications of methods of AI for the VLBI data analysis' is supported by the Deutsche Forschungsgemeinschaft (DFG), project Schu 1103/2-1. The authors would like to thank two anonymous referees for their useful comments.

References

Fayard, T., Petit, G., Nogl, J. and Moueza, J., The New CNES Soft-Correlator, Proceedings of the 9th Working Meeting on European VLBI for Geodesy and Astrometry, Sept. 1993, Mitteil. a. d. Geodätischen Instituten der Rhein. Friedr. Wilh. Univ. Bonn, No. 81, pp. 123-128, 1993.

Koyama, Y., VLBI Data Analysis System for the Key Stone Project, Communications Research Laboratory, IERS TDC News No. 8, pp. 10-15, 1996.

Koyama, Y., Kurihara, N., Kondo, T., Sekido, M., Takahashi, Y., Kiuchi, H. and Heki, K., Automated Geodetic Very Long Baseline Interferometry Observation and Data Analysis System, Submitted to Earth, Planets and Space, 1998.

Levine, J. I. and Whitney, A. R., Mark III Real-Time Fringe Detection System, NASA Conference Publication 2115, 1979.

Ryan, J. W., Ma, C. and Vandenberg, N. R., The Mark III VLBI Data Analysis System, NASA Goddard Space Flight Center, publication X-945-80-25, 1980.

Schnupp, P., Huu, C. T. N. and Bernhard, L. W., Expert Systems Lab Course, Springer-Verlag, Berlin, Heidelberg, 1989.

Schuh, H., Design of an Expert System for VLBI Data Analysis, Proceedings of the 8th Working Meeting on European VLBI for Geodesy and Astrometry, Dwingeloo, June 1991, Report MDTNO-R-9243, Survey Department of Rijkswaterstaat, Delft, The Netherlands, pp. IV53-IV58, 1993a.

Schuh, H., An Expert System for the Selection of Parameters in Solve, Proceedings of the 9th Working Meeting on European VLBI for Geodesy and Astrometry, Sept. 1993, Mitteil. a. d. Geodätischen Instituten der Rhein. Friedr. Wilh. Univ. Bonn, No. 81, pp. 79-86, 1993b.

Schuh, H., A concept for real-time VLBI, Proceedings of the 10th Working Meeting on European VLBI for Geodesy and Astrometry, May 1995, edited by R. Lanotte and G. Bianco, Agenzia Spaziale Italiana, pp. 51-56, 1995.

Yoshino, T., Crustal Deformation Monitoring in Tokyo Metropolitan Area by Space Geodetic Observation, Journal of the Communication Research Laboratory, 41(3), pp. 1-9, 1994.