Acta Astronautica Vol. 49, No. 3-10, pp. 441-445, 2001
© 2001 Elsevier Science Ltd. All rights reserved. Printed in Great Britain
0094-5765/01 $ - see front matter
PII: S0094-5765(01)00118-7
www.elsevier.com/locate/actaastro
Medicine in Long Duration Space Exploration: The Role of Virtual Reality and Broad Bandwidth Telecommunications Networks

Muriel D. Ross, Ph.D.
The University of New Mexico Health Sciences Center, Telemedicine, 1005 Columbia NE, Albuquerque, New Mexico 87131-5042

Key words: virtual collaboratory, Telemedicine, CyberMedicine, multicasting, Cyberscalpel
ABSTRACT

Safety of astronauts during long-term space exploration is a priority for NASA. This paper describes efforts to produce Earth-based models for providing expert medical advice when unforeseen medical emergencies occur on spacecraft. These models are Virtual Collaborative Clinics that reach into remote sites using telecommunications and emerging stereo-imaging and sensor technologies. © 2001 Elsevier Science Ltd. All rights reserved.

1 INTRODUCTION
Maintaining the health of astronauts is critical to mission accomplishment during long duration space flight and exploration, whether the astronauts are on space station, on the moon or on Mars. Self-sufficiency on the part of the medical personnel on board the spacecraft will be necessary. However, those concerned with the health care of industrial staff engaged in exploration in remote locations on Earth know that, even with the utmost care in pre-selection of staff, emergencies arise for which on-site medical personnel are unprepared. On Earth in such a case, evacuation of the seriously ill or injured patient is carried out, often at great expense. In travel to the moon or to distant planets or asteroids, evacuation will not be possible. Thus, at the Center for Bioinformatics at NASA Ames Research Center and now at the University of New Mexico, there has been a concerted effort to advance 3D stereo-imaging and virtual environment technologies for use on space station and beyond. The goal is to enhance astronaut ability to deal with medical uncertainties that will arise. This report briefly describes the principles of Virtual Collaborative Clinics using advanced telecommunications and imaging technologies, and provides insights into future developments that will be critical to health care in space.
2 METHODS
The first Virtual Collaborative Clinic, sponsored by NASA and held on May 4, 1999, used broad bandwidth ground fiber networks (Abilene and CalREN, the California Research and Education Network), satellite communication and multicasting to link five sites for clinical discussion and diagnosis [Ross et al., 2000]. The sites were NASA Ames Research Center at Moffett Field, California; Salinas Valley Memorial Hospital in Salinas, California (through the University of California at Santa Cruz); Stanford University Medical Center in Palo Alto, California; the Cleveland Clinic Foundation in Cleveland, Ohio (through Glenn Research Center at Lewis Field in Ohio); and the Northern Navajo Medical Center at Shiprock, New Mexico (Figure 1). Communication between these sites was organized by staff of the NASA Research and Educational Network (NREN) of the Information Systems Directorate at NASA Ames Research Center, and by staff at Glenn Research Center. Software for multicasting, as well as for all 3D, stereo, interactive imaging and virtual surgery, was developed at the Center for Bioinformatics at NASA Ames Research Center. The demonstration utilized 35 Mb/s in ground links and 39 Mb/s by satellite for transmission of complex, patient-specific, stereo 3- and 4-D images and a virtual environment surgery. Visualizations were seen simultaneously in stereo at all sites on PCs using inexpensive polarized glasses (NuVision). The demonstration had the cooperation of industry, including Cisco Systems, Silicon Graphics, and PanAmSat.
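The multicasting, stereo imaging and virtual surgery software used in the demonstration was custom code developed at the Center for Bioinformatics and is not described at the implementation level in this paper. Purely as an illustration of the underlying mechanism (packetized image data delivered to every participating site over IP multicast), the following Python sketch shows a sender splitting a data buffer into 1 KB datagrams and a receiver joining the multicast group and reassembling them; the group address, port, header format and function names are assumptions made for the example, not details of the actual system.

    import socket
    import struct

    # Illustrative values only; the addresses, ports and packet format
    # used in the May 1999 demonstration are not given in the paper.
    MCAST_GROUP = "239.1.2.3"   # assumed multicast group address
    MCAST_PORT = 5004           # assumed port
    PACKET_SIZE = 1024          # the reduced data were broken into 1 KB packets

    def send_data(payload: bytes) -> None:
        """Multicast an already reduced data set as a stream of 1 KB datagrams."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        # TTL > 1 so the datagrams can cross routers toward the remote sites.
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 32)
        total = (len(payload) + PACKET_SIZE - 1) // PACKET_SIZE
        for seq in range(total):
            chunk = payload[seq * PACKET_SIZE:(seq + 1) * PACKET_SIZE]
            # 8-byte header (sequence number, packet count) so receivers can
            # detect loss and reassemble in order; the format is assumed.
            sock.sendto(struct.pack("!II", seq, total) + chunk,
                        (MCAST_GROUP, MCAST_PORT))
        sock.close()

    def receive_data() -> bytes:
        """Join the multicast group at a viewing site and reassemble one data set."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", MCAST_PORT))
        membership = struct.pack("4sl", socket.inet_aton(MCAST_GROUP),
                                 socket.INADDR_ANY)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)
        chunks, total = {}, None
        while total is None or len(chunks) < total:
            datagram, _ = sock.recvfrom(PACKET_SIZE + 8)
            seq, total = struct.unpack("!II", datagram[:8])
            chunks[seq] = datagram[8:]
        sock.close()
        return b"".join(chunks[i] for i in range(total))

A production system would also have to deal with packet loss, many simultaneous receivers and stereo image pairs; the sketch shows only the basic join-group, packetize and reassemble pattern that lets one data stream reach several sites at once.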
3 RESULTS

The Virtual Collaborative Clinic successfully demonstrated the use of broad bandwidth ground and satellite communication systems to link several clinical sites for medical diagnosis and discussion of patient-specific data. In particular, the Northern Navajo Medical Center at Shiprock, New Mexico was representative of a truly remote site. The Navajo are spread over four states (Arizona, New Mexico, Colorado and Utah), with many living in remote locations lacking telephone communication and electricity. Medical care for these communities is provided by variously distributed Navajo Medical Centers. The WAN (wide-area network) was uplinked via satellite through the courtesy of Glenn Research Center, which provided technical support and the correct antenna and modem for downlinking to the Northern Navajo Medical Center. Shiprock used a DS3 line (T3, 44.736 Mb/s) for outgoing interactions with the other members of the Clinic (see Figure 1).
Figure 1. Diagram of the network and satellite connections necessary for the first Virtual Collaborative Clinic.

Wide area networks were linked with local area networks (LANs) at Stanford University and the University of California at Santa Cruz, where members of the Salinas Valley Memorial Hospital participated in the demonstration. Cleveland Clinic Foundation participation was through network connections with NREN at Glenn (Lewis) Research Center. Moving participants to the University of California at Santa Cruz and to Glenn Research Center was necessary to access the broad bandwidth required for the demonstration. The number of connections and routers required to reach
the final destinations where the computers were located limited the bandwidth to only 35 Mb/s of the 2480 Mb/s available on the Abilene network. The satellite connection did better; 39 Mb/s of the 45 Mb/s made available by PanAmSat were utilized. Although the imaging software was designed to be platform-independent, all sites except NASA Ames used PCs with ELSA graphics cards for stereo viewing and manipulation of the visualizations. The server at NASA Ames was a Silicon Graphics Onyx Infinite Reality2, and a Silicon Graphics Infinite Reality desktop workstation was used for a virtual surgery demonstration conducted at Ames by surgeons from Stanford University. The three-dimensional, stereo images used in the demonstration were a heart with a graft made from CT data and dynamic reconstructions of echocardiograms of hearts, with Doppler effects, from patients with septal or valve defects. A 3-D virtual jaw surgery was demonstrated using reconstructed CT data from a patient with cancer of the jaw.

Since the original data sets could contain as much as 100 to 150 GB of data, they had to be reduced for transmission over the network and satellite. Mesh reduction software developed by Michael Garland at Carnegie Mellon University [Qslim, version 1.0, 1997] was used to reduce models from roughly one million to 20,000 polygons without losing topography. The reduced data were broken up into 1 KB packets for multicasting images to the sites. On a completely clean line, about 15 frames/sec could be achieved for manipulation of the data (i.e., turning the image around). To accommodate multicasting of image manipulation at a reasonable frame rate, with simultaneity of viewing at all sites, software was developed to transmit only coordinates as the models were moved; when manipulation stopped, the server multicast the original image, with its potentially million or more polygons, to all sites (a minimal illustrative sketch of such a scheme appears below). A pointer to aid interaction was provided in the software. Participants at all sites joined in the discussions at will and handed off use of the pointer during interactions. Simultaneously, all visualizations and discussions were transmitted to industry and government observers in an auditorium at the Numerical Aerodynamic Simulation Facility (NAS) at NASA Ames Research Center for their evaluation.

4 CONCLUSION

The Virtual Collaborative Clinic held on May 4, 1999 demonstrated the feasibility of utilizing broad bandwidth ground and satellite links to multicast patient-specific data to experts located at several sites. In particular, it was possible to link a truly remote site at Shiprock, New Mexico, and a clinic in an agricultural setting (Salinas Valley Memorial Hospital) with centers of expertise at Stanford and at the Cleveland Clinic Foundation. This demonstration was but the first step in planning telecommunication linkages and virtual reality technology for space applications. Certainly, the network connections will have to be simplified to achieve even higher bandwidth for transmission and interaction. This will be essential to permit manipulation at more than the 15 frames per second achieved on May 4, 1999, and for force feedback implementations in particular. In the future there will be monitors that provide stereo 3-D imaging without the use of special transmitters or glasses. Such technology is already largely realized in the 3-D stereo monitors developed by Dimensions Technologies that were displayed following the Virtual Collaborative Clinic.
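Returning to the coordinate-only update scheme described in the Results above, the following minimal sketch illustrates the idea of sending only a small transform message for every frame of manipulation and multicasting the full-resolution model once manipulation stops. The message layout, class and function names, and the 4 x 4 matrix representation are assumptions made for illustration; the paper does not specify how the original software encoded these updates.

    import struct
    from typing import Callable, Sequence

    # Message tags are assumed; the on-the-wire format of the 1999 software is not given.
    MSG_TRANSFORM = 1   # sent repeatedly while a participant rotates or moves the model
    MSG_FULL_MESH = 2   # sent once, after manipulation stops

    def pack_transform(matrix: Sequence[float]) -> bytes:
        """Encode a 4x4 viewing transform (16 floats, row-major) as a 65-byte message.

        Sending tens of bytes per frame instead of a million-polygon mesh is what
        makes simultaneous interactive viewing at all sites feasible at ~15 frames/s.
        """
        assert len(matrix) == 16
        return struct.pack("!B16f", MSG_TRANSFORM, *matrix)

    def pack_full_mesh(mesh_bytes: bytes) -> bytes:
        """Encode the full-resolution mesh for the one-off refresh after manipulation."""
        return struct.pack("!BI", MSG_FULL_MESH, len(mesh_bytes)) + mesh_bytes

    class ManipulationBroadcaster:
        """Server-side policy: coordinates while moving, full mesh on release."""

        def __init__(self, send: Callable[[bytes], None], full_mesh: bytes):
            self.send = send            # e.g. a multicast sender such as the one sketched earlier
            self.full_mesh = full_mesh  # the original, unreduced model

        def on_drag(self, matrix: Sequence[float]) -> None:
            self.send(pack_transform(matrix))

        def on_release(self) -> None:
            self.send(pack_full_mesh(self.full_mesh))

A receiver would apply each incoming transform to its locally cached, reduced copy of the model and swap in the full-resolution geometry when the final refresh arrives.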
In addition, such glasses-free stereo monitors can now be used with ordinary PCs and with a small immersive workbench (ReachIn Technology, invented by Dr. Timothy Poston of CieMed) for virtual environment manipulation of images projected onto them, with the inclusion of haptic (tactile, force) feedback. In the future, smaller computers with greater graphics and computational prowess but lower power requirements will become available. These can, for example, be combined with improved 3D ultrasound imaging to provide nearly instantaneous stereo images of an astronaut's heart daily on the spacecraft, to keep track of changes in heart muscle during flight. Other data collected from microsensors will
assess blood flow, oxygen levels, electrolytes, etc., with the data fused to provide continuous monitoring of heart and general health. These sensors can be embedded in a T-shirt or can be part of hand-held devices, much like the palm pilots now available, that will plug into any computer.

When unforeseen medical problems arise, such as a trauma to the head, hand or limb, or a heart problem that is beyond the expertise of the personnel on board, telecommunications and virtual environment technologies can augment astronaut responses to the emergency. In such cases, following stabilization of the patient, advice can be sought from Earth even though the lag-time in communications would be considerable. Three- and four-dimensional ultrasound images and other data transmitted securely back to Earth via broad bandwidth satellite can be downlinked immediately to secure ground lines and multicast to appropriate experts around the globe. Improvements in switches and in software for multicasting complex information will be required. Wireless technology will doubtless come to the forefront in such applications. Expert medical advice can be sent back to the spacecraft with virtual reality implementation to provide practice and/or direction as treatment begins. It should be noted that security during information flow is paramount. Here, federal laboratories as well as industry can make significant contributions.

The outline presented here has the advantage that procedures and technologies can be perfected through testing using remote sites on Earth and, later, using space station as a remote site. With this in mind, planning for a second Virtual Collaborative Clinic is underway at the University of New Mexico with colleagues across the country, using clinical cases to mimic possible events in space. Centers in Hawaii, Alaska, the Navajo Nation, and Salinas Valley Memorial Hospital will participate along with the University of New Mexico. The plan is to have a permanent virtual clinic in place following the second demonstration of the feasibility of such collaborative efforts, and to have permanent partnerships in place. Virtual reality training software and equipment will test the concept that a person at a remote site can use virtual reality technology to enhance learning and treatment. To accomplish these goals, we must learn from Telemedicine applications attempted or in place in truly remote sites such as the Arctic Circle or within the Navajo Nation. The demonstration will combine NASA's capabilities in broad bandwidth utilization and software development, medical and non-professional personnel located at remote sites, and Federal Laboratory and industry partners to push the telecommunications envelope further for CyberHealth purposes.

It is clear that as we develop technologies for space medical applications, people on Earth will benefit greatly. NASA's efforts in CyberMedicine will help to equalize health care for underserved communities in remote locations on Earth while promoting the use of broad bandwidth telecommunications in medical practice, education and training. But the long-range goal is to fulfill humankind's destiny to reach and populate distant sites in the universe, and to do so safely.

5 ACKNOWLEDGEMENTS

The Virtual Collaborative Clinic could not have been accomplished without the help of NREN, the Numerical Aerospace Simulation Facility (NAS) and Center for Bioinformatics staff at NASA Ames Research Center, and the generous help of staff at NASA Glenn Research Center at Lewis Field.
The many industry partners included Silicon Graphics, PanAmSat (which made 45 Mb/s of satellite bandwidth available to us for testing and for the demonstration) and Cisco Systems, whose staff attended to router problems.

6 REFERENCES

1. M. D. Ross, I. A. Twombly, C. Bruyns, R. Cheng and S. Senger, Telecommunications for health care over distance: The virtual collaborative clinic. In: Medicine Meets Virtual Reality 2000, J. D. Westwood, H. H. Hoffman, G. Mogel, R. Robb and D. Stredney (eds). IOS Press, Washington, D.C., pp. 286-291 (2000).

2. M. Garland, Qslim Simplification Software, version 1.0, first released October 1, 1997, downloaded from [email protected] (1997).