Comput. & Graphics, Vol. 20, No. 6, pp. 763-774, 1996
Copyright © 1996 Elsevier Science Ltd. Printed in Great Britain. All rights reserved. 0097-8493/96 $15.00 + 0.00
PII: S0097-8493(96)00054-4

Medical Visualization
USER-CENTERED DEVELOPMENT OF MEDICAL VISUALIZATION APPLICATIONS: FLEXIBLE INTERACTION THROUGH COMMUNICATING APPLICATION OBJECTS

JÜRGEN FECHTER†, THOMAS GRUNERT, L. MIGUEL ENCARNAÇÃO and WOLFGANG STRASSER

Interactive Graphics Systems Laboratory (WSI/GRIS), Wilhelm-Schickard-Institute of Computer Science, University of Tübingen, Auf der Morgenstelle 10, C9, D-72076 Tübingen, Germany
e-mail: juergen.fechter@uni-tuebingen.de

Abstract: Today's software systems for medical visualization still suffer from a shortage of practical applications due to their lack of usability with respect to a large variety of users with different computer skills and experiences. Furthermore, progress is often hampered by the lack of adequate software tools that allow research ideas to be rapidly accomplished, evaluated, and brought into routine. We developed an integrated software environment that is distinct because, on the one hand, it provides an application-builder toolbox for medical diagnosis and therapy and, on the other hand, it allows for the development, integration, and user-centered evaluation of existing and new interaction techniques in 2-D, 3-D, and Virtual Reality (VR). The environment's interaction support is based on communicating application objects that employ a sophisticated message-based inter-object communication. In addition, the system contains a component for user-adapted interaction and system support. The system has been used for several clinical applications which are briefly described at the end of this paper. Copyright © 1996 Elsevier Science Ltd
1. INTRODUCTION
In the field of medicine, the availability of powerful workstation hardware and the infrastructure of picture archiving and communication systems (PACS) have, on the one hand, brought substantial advancements for medical visualization. Yet these systems still suffer from a shortage of practical applications due to their lack of usability with respect to a large diversity of users with different computer skills and experiences (e.g. nurses, doctors, technicians, researchers): clinical researchers and staff from a variety of disciplines are increasingly confronted with the difficulty of making efficient use of digital equipment including complex image management, analysis, and visualization tools. Furthermore, progress is often hampered by the lack of adequate software tools that allow research ideas to be rapidly realized, evaluated, and efficiently brought into routine.

On the other hand, a variety of new techniques (2-D, 3-D, VR) for data presentation and human-computer interaction have evolved, thus creating the need for new user-interface styles and interaction techniques. While the discussion on the usefulness and adequacy of these techniques and user-interface styles, especially in medical environments, is still going on, it shows that there is no general answer to these questions. We are convinced that this adequacy not only has to be derived from the underlying data to be presented, but strongly depends on the application system, the application purpose, and, as a major factor, the type of user that has to deal with the user-interface: an adequate application system's user-interface has to consider all of these aspects of human-computer interaction. This is especially true for computer-visualization systems in medicine.

Due to these requirements, there is a strong need for development systems for medical applications that serve to support the user-centered generation of the user-interface and, at the same time, provide visualization algorithms and basic functionality, thus supporting the introduction and integration of the applications in clinical working environments. In order to meet the demands of heterogeneous user groups, user-centered reasoning implies a certain amount of adaptation functionality. For the same reason, application systems for clinical use need user-interfaces that support not only one, but several reasonable interaction techniques and styles. The actual configuration might be automatically defined by the system or adapted by the user at runtime. Systems are needed that integrate visualization, adaptation, and development functionalities. These systems have to be made accessible to different user groups through user-tailored user-interfaces and, as a major requirement, need to be easily extensible for new applications. Due to the complexity of such integrating systems, a fair amount of research has to focus on communication issues

† Author for correspondence.
among the different parts of the system and between the different kinds of interfaces.

There is an existing variety of stand-alone application systems, such as MagicView (Siemens AG), Advantage Windows (GE), Silhouette (ISG Techn. Inc.), and ANALYZE (CN Software, Southwater) [1]. These systems provide high functionality which is accessible through a solely clinically oriented user-interface. However, they all show a severe lack of extensibility and adaptability to different groups of users. Computer graphics (CG) and image processing (IP) libraries (e.g. [2]) and toolboxes for scientific visualization, such as AVS [3], Khoros, apE [4], and IRIS Explorer [5], provide high and extensible functionality to the developers of application systems, but they do not take clinical usability into account. User-interface management and development systems, on the other hand, exist, yet are mostly limited to a certain (2-D) interaction style (TeleUSE, GRITplus, Rapid++, etc.), and few support user-adapted interface development [6]. In addition, modern research directions focus on the exploration and propagation of Virtual Reality (VR) interfaces based on description languages [7]. The distributive usage of such interfaces through event-driven communication is the focus of innovative research (e.g. [8]). Systems that provide a certain amount of adaptation functionality to the user do exist (though none of them pertain to the area of medical applications); their functionality is, however, tailored and, therefore, restricted to the respective application system [9-11].

We present the integration and development environment UC-AID med.† for the area of medical diagnosis and therapy that was developed to meet the requirements stated above. It represents a distinctive integrated software environment for the user-centered development of medical application systems and innovative interaction techniques for clinical use.
It consists of an extensible development platform with standard visualization algorithms and basic functionality for medical applications. This platform has been developed in close cooperation with the University Clinics of Tübingen. For the purpose of flexible interaction support for this application field, it is equipped with a 2-D user-interface, 3-D functionalities, and a prototype Virtual Reality user-interface‡. This interaction support is based on a sophisticated message-based inter-object communication related to state-of-the-art research in distributed VR, as mentioned above. Adaptation
† UC-AID med. stands for User-Centered Adaptivity and Interaction Development for Medical Applications.
‡ A Virtual Reality user-interface, in contrast to 3-D user-interfaces, is defined through the additional functionality of immersion in and navigation through the user-interface and enabled experimentation with 2-D and 3-D objects (cf. also [23]).
functionality is provided through a separate component in order to support user-interface adaptation. This component, furthermore, leads to an adaptive hypermedia help system.

The structure of this paper is as follows. First, we describe the integrating development environment with its major modules and functionalities. Then we describe the integrating inter-object communication and the adaptation functionality. Following that, we present the supported interaction techniques and types of user-interface. Finally, we present some of the applications that have been realized within the development environment and introduced into clinical practice at the University Clinics of Tübingen.

2. THE DEVELOPMENT ENVIRONMENT FOR MEDICAL VISUALIZATION APPLICATIONS
To meet the demands and requirements stated above, we developed the UC-AID med. development environment for medical applications shown in Fig. 1. It consists of a three-layer development platform with a dedicated extension interface to application-specific modules, an interface layer integrating different types of interfaces, and an adaptation component to tailor the interface configuration and behavior to the user's needs and preferences. The different components of the three-layer development platform are described in the following sections.

2.1. Object class libraries
The object class libraries represent an application-builder toolbox as a key element of our development system. Visualization and image processing objects cover a broad spectrum of medical imaging and graphics functionality [24]. The architecture of this class library is based on the client-server model and the data-flow approach. Its structure represents a large amount of software development expertise in this area and, therefore, allows inexperienced application programmers to quickly achieve a professional program design. The visualization and image processing objects are wrapped into a communication shell, which allows the application programmer to simply connect the object instances to an object pipeline. The communication shell manages the processing of events in the pipeline (e.g. starting new data processing, all along the pipeline, after changing some parameters), thus relieving the application programmer from the necessity to program all the event-handling by himself. Furthermore, the client-server architecture supports distributed computing. The server is reconfigurable at runtime, which allows it to be used as an interpreter for visual programming.

A consistent integration of new applications into the development system may be performed by the use of management objects. These provide methods for user-interface access as well as study-handling and I/O control.

Common interface object libraries complete the basic layer of the development platform, providing
[Figure 1: the three-layer development platform (object class libraries; basic functionality and interface elements; application objects) with its extension interface, the application interfaces (2-D, 3-D, Virtual Reality, self-defined elements), and the adaptability/adaptivity components.]
Fig. 1. Architecture of the development environment.
standard dialogue elements to the user, such as buttons in 2-D, trackballs in 3-D, and drawers for VR. In this manner, a new application may be realized by simply combining already existing objects, ideally by visual programming.
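The pipeline mechanism described in Section 2.1 can be sketched as follows. This is a minimal Python sketch with hypothetical names (the actual system follows a client-server model and is not shown in the paper); it only illustrates the idea of a communication shell that propagates new data down an object pipeline:

```python
class CommShell:
    """Communication-shell sketch: wraps one processing step and re-triggers
    all downstream processing whenever new data is pushed in."""
    def __init__(self, process):
        self.process = process            # the wrapped visualization/IP object
        self.downstream = []              # connected pipeline successors

    def connect(self, shell):
        self.downstream.append(shell)
        return shell

    def push(self, data):
        result = self.process(data)       # run this stage
        for successor in self.downstream: # event: hand the result down the pipeline
            successor.push(result)
        return result

# Toy pipeline: window/level clamp, then greyvalue negation, then a sink.
window = CommShell(lambda img: [min(max(v, 40), 200) for v in img])
negate = CommShell(lambda img: [255 - v for v in img])
results = []
sink = CommShell(lambda img: results.append(img) or img)

window.connect(negate)
negate.connect(sink)

window.push([0, 100, 255])    # pushing new input re-runs the whole pipeline
# results now holds [[215, 155, 55]]
```

The application programmer only connects instances; the shells handle the event propagation, which is the relief from manual event-handling the text describes.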
2.2. Basic functionality and interface elements
One of the primary goals of the system is to address the needs of a broad community of clinical clients. Therefore, an analysis of the requirements for a medical workstation has to cover various disciplines. We performed such an analysis at the University Clinics of Tübingen. From the results we derived a functionality specification for a universal medical graphics workstation. We distinguished between five categories for which we implemented the following functionality [24]:

• Access and Display: image storage/retrieval, data compression, interpretation of file formats (in particular ACR-NEMA, DICOM 3.0), communication with PACS systems and other imaging devices on the basis of the DICOM 3.0 protocol, handling of multiple studies, multiple image display.
• Manipulation: image processing operations (e.g. zoom, pan, mirror, contrast/brightness adjustment, negate, filter (e.g. smoothing, edge-enhancement, noise reduction), histogram modification, image arithmetics).
• Evaluation: local/global greyvalue statistics in user-definable Regions- or Volumes-Of-Interest (ROI, VOI); geometric properties (2-D/3-D distance, angle, area, volume, profile); evaluation of time-series.
• Documentation: image annotation and hardcopy (via PostScript and DICOM 3.0).
• Analysis: advanced image processing and graphics methods (e.g. various segmentation methods, registration, classification, multiplanar reconstruction, standard volume rendering methods, modeling functionality).

These methods and algorithms represent the basic functionality of our development environment, thus providing all the tools commonly in use in PACS workstations.

Furthermore, this layer of the development platform contains methods and functionality to access basic standardized interface elements, such as those provided by the X/Motif widget set [13]. In addition, we provide higher-level interface elements, such as dialogue boxes in 2-D. These interface elements are built on top of the basic elements in a well-defined way that is already partly standardized through
Table 1. Comparison between conventional 2-D user-interfaces and 3-D virtual environments

Presentation
  2-D interfaces: 2-D objects (e.g. buttons, sliders, pull-down menus, cursors); standardized object sets.
  3-D virtual environments: 3-D objects as virtual representations of real-world objects (e.g. alternator); no standardized sets of objects.
Dialogue control
  2-D interfaces: standardized object behavior and simple inter-object relationships (e.g. set button sensitive).
  3-D virtual environments: non-standardized object behavior and complex inter-object relationships (e.g. change alternator panels).
Application program interface
  2-D interfaces: clear separation between application program and user-interface possible.
  3-D virtual environments: objects include application program functionality.
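One way to picture the proposed remedy for the missing standard behavior in virtual environments (basic VR objects with exchangeable representations and extensible behavior) is the following sketch; all class and method names here are hypothetical, not taken from the system:

```python
class VRObject:
    """A basic virtual-environment object: an interchangeable representation
    plus a per-instance table of named behaviors that can be extended."""
    def __init__(self, name, representation):
        self.name = name
        self.representation = representation   # e.g. "box", "folder", "drawer"
        self.behaviors = {}

    def add_behavior(self, token, action):
        self.behaviors[token] = action         # attach new (special) behavior

    def act(self, token, *args):
        if token not in self.behaviors:
            raise ValueError(f"{self.name} has no behavior {token!r}")
        return self.behaviors[token](self, *args)

# The same token ("open") can drive objects with different representations:
folder = VRObject("records.folder1", "folder")
folder.add_behavior("open", lambda obj: f"{obj.representation} opens flat")
drawer = VRObject("archive.drawer3", "drawer")
drawer.add_behavior("open", lambda obj: f"{obj.representation} slides out")

folder.act("open")   # "folder opens flat"
drawer.act("open")   # "drawer slides out"
```

The point of the sketch is that behavior is data attached to instances, so a basic object can be given standard behavior once and specialized behavior later, without a fixed widget set.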
widely accepted style guides [14, 15]. In 3-D, such interface elements might, e.g., include database-access cones or perspective viewing walls [16]; in VR-interfaces, real-world objects such as complete desks [12]. The main problem of Virtual Environments compared to conventional 2-D interfaces is that a standardized set of objects with standard behavior is not available (see Table 1). In order to remedy this problem we propose basic objects for Virtual Environments (see Table 2) which can have different representations and the possibility to add new (special) behavior to the basic one.

The higher-level elements also include units that were newly developed and that are application-dependent. In our application system, these units are highly specialized to serve the medical field, thus considering the special needs and requirements from this domain.

2.3. Extension interface
A medical imaging and graphics workstation suitable for clinical research must contain mechanisms for application development support and the integration of new algorithms. Just providing the source code of the system is insufficient in this context. The extension interface of our system is used to extend the functionality layers of our development platform in order to allow for the consistent integration of new application modules. Such modules were used to realize the applications described in Section 6.

In addition to the presented visualization and basic functionality, we propose an elaborate extension interface. Here, the class study plays a central role from the programmer's point of view: a patient study may contain textual elements (e.g. patient name, medical report, anamnesis), graphical objects, images, audio, and video. The study provides methods for data access and visualization based on the functionality previously described (cf. Section 2.2), relieving the application programmer from dealing with these complex tasks.

In this sense, a study represents a specialized medical application object providing mechanisms for event-handling in order to maintain the dataflow through the application, and performing the management of user input in the screen area, such as painting, picking, and movements in 2-D, 3-D, and VR.

2.4. Application objects
The top layer of the development platform is represented by different application objects that unify the interface elements' features with the basic functionality of the underlying layer of the architecture.
Table 2. Classification of objects in virtual environments
(columns: class; type of data; representation; interaction task; type of user interaction; example)

Trigger: logical; 3-D button; execute; push button; example: quit application.
Switch: logical, binary; switch, lever; enable/disable; switch to on/off; example: turn light on/off.
Symbol: any/3-D; real-world symbol; execute, call; select; example: camera: connect to colleague.
Indicator: status, warning; highlighting, emphasizing; take note of; (observation only); example: red light: stop, access denied!
Selector: discrete, exclusive; multi-position lever; set mode; selection 1-of-n; example: set sorting mode to patient names.
Container: discrete objects; box, folder, drawer; open/close, get/put; drag, arrange; example: arrange patient records.
Hierarchy: discrete objects; tree; organize structure; arrange; example: arrange stages of an illness.
Prospect: discrete objects, scattered organization; wall projections, blackboard; viewing, sorting, give overview; browse, select, multi-selection; example: virtual lightbox with CT images.
History: continuous data; time-tunnel; back-and-forth tracking; follow, focus-on; example: traverse patient history.
User: (the user him-/herself); body (avatar), camera, invisible; change position, navigate; go to; example: move towards the lightbox.
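As an illustration of how one row of Table 2 might map onto a concrete interaction class, here is a hedged sketch of a Container (discrete objects; box/folder/drawer representation; Open/Close and Get/Put interaction tasks); all names are invented for illustration and do not come from the system:

```python
class Container:
    """One interaction class from Table 2: holds discrete objects and
    supports Open/Close and Get/Put."""
    def __init__(self, representation="drawer"):
        self.representation = representation   # box, folder, or drawer
        self.is_open = False
        self.contents = []

    def open(self):
        self.is_open = True

    def close(self):
        self.is_open = False

    def put(self, item):
        if not self.is_open:
            raise RuntimeError("container must be opened first")
        self.contents.append(item)

    def get(self, item):
        if not self.is_open:
            raise RuntimeError("container must be opened first")
        self.contents.remove(item)
        return item

# Usage in the medical scenario of Table 2: arranging patient records.
archive = Container("drawer")
archive.open()
archive.put("patient-record-0042")
record = archive.get("patient-record-0042")
archive.close()
```

The class fixes the interaction task (open/close, get/put) while the representation stays exchangeable, which is the separation the classification aims at.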
In order to generate such application objects, abstract interaction classes that summarize different dialogue and interaction types of the interface have to be defined. The criteria for the definition of such classes are:

• the type of data that can be presented;
• the interaction task that has to be completed, i.e. how the data is and might be handled using this interaction class;
• the type of user interaction.

Table 2 shows our approach to this classification [17]. We took a medical application scenario to show examples of the different types of interaction classes that can be established for a VR interface. Although this table may not be complete, it shows how a series of different dialogue and interaction media and techniques can be efficiently summarized by introducing abstract application objects with behavior. This behavior might also include attributes of the objects that influence their behavior, such as moveable, editable, hideable.

It seems reasonable that the inter-object relationships should also be generalized by taking often-used interactions as the basis. Such relationships might focus on positioning (e.g. drag-and-drop), editing/modifying (e.g. pen and blackboard), timing (e.g. synchronization), and observing (e.g. retrieving objects' states). The latter is especially interesting for implementing techniques for user-interface adaptation (cf. Section 4), thus providing basic mechanisms to protocol and evaluate the user's behavior at the interface to the application.

Due to these demands, the application objects need the additional ability to communicate with each other, thus allowing for the integration of different application objects in certain interfaces and, consequently, the combination of different interface techniques. We realized this inter-object communication through a sophisticated message-concept described in Section 3.

3. INTEGRATION THROUGH MESSAGE PASSING

The message concept that realizes the communication among the different application objects supports the sending and reception of messages within or between different objects, thus allowing for the integration of different interface objects and types.

3.1. Types of messages
We distinguish between two types of messages: (1) messages that execute actions and (2) messages containing status information about the application, the scene, or single application objects. The messages containing status information can also be used to trigger other events or to modify the behavior of objects (see Section 3.3). This message-based communication concept provides a high level of flexibility, also due to the fact that the inter-object relationships are not hard-wired: new objects can be integrated into the scene and existing objects can be deleted.

3.2. Object names
Our communication model assumes that each object is addressable. This is realized through naming the objects. In order to support the grouping of objects in logical or natural entities, name-parts are separated by periods. The use of hierarchical names is essential for the construction of, and message-passing in, large VR scenes. We furthermore allow the use of wildcards like "*" or "?" for parts of objects' names. This way, through a construct like "alternator.panels.*", all the alternator panels in a scene can be addressed. In the special case that all objects are subject to the reception of a message, just a single asterisk "*" has to be used.

This naming scheme can be enhanced to support relative addresses, special objects like avatars for multi-user applications, and interfaces to external service programs as proposed in [8].

3.3. Architecture
Our standard message format consists of five parts: the receiver, the sender, the message itself, message-describing parameters, and timestamps (cf. Fig. 2). On the reception of a message the receiving object executes four steps: First, the fulfillment of one or more pre-condition(s) is checked. In the next step, the message is registered in a list, which thus represents the message-memory unit of the application object. Two modes are available: one mode registers each message if its frequency is important (append-mode); the other mode replaces the last received message of the same kind (replace-mode) and is the most commonly used. Then the action/behavior itself is activated. Finally, a so-called post-action can be associated with the message; when a post-action is executed, one or more messages are sent to other objects, which might include the object itself.

A simple example of the communication between two application objects within a medical Virtual Environment is shown in Fig. 2.

Each application object has a list of inherited basic behavior which can be extended. For this reason, all objects have to have common basic functions, such as adding and removing behavior or the ability to disclose their names to others. By adding behavior to an object, we can take advantage of polymorphism, i.e. the circumstance that different objects can use the same token for similar actions (e.g. behavior open for the objects folder and desk). Moreover, it is possible to define macros, i.e. sets of fixed actions with determined and known object addresses and corresponding messages. The advantage is that frequently needed behavior only has to be defined once and then
[Figure 2: a remote-control object (Alternator.RemoteControl.PanelManipul.P2upldown) sends a message that is parsed by the receiving panel object.]
Fig. 2. Message-based communication: when the user presses a button on the remote-control device, a message will be sent to the corresponding panel-object and trigger the associated action (cf. Fig. 5).
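Combining the hierarchical naming with wildcards (Section 3.2) and the reception steps of the architecture (Section 3.3), the message flow of Fig. 2 might look roughly like the sketch below. The Python is illustrative only, not the system's implementation; in particular, the use of fnmatch-style wildcard matching and all class names are assumptions:

```python
import fnmatch

class Message:
    """The five parts of the standard message format."""
    def __init__(self, receiver, sender, message, params=None, timestamp=0):
        self.receiver, self.sender = receiver, sender
        self.message, self.params, self.timestamp = message, params, timestamp

class AppObject:
    def __init__(self, name, action, precondition=lambda m: True, mode="replace"):
        self.name, self.action = name, action
        self.precondition, self.mode = precondition, mode
        self.memory = []                      # message-memory unit

    def receive(self, msg):
        if not self.precondition(msg):        # step 1: check pre-condition(s)
            return
        if self.mode == "append":             # step 2: register the message
            self.memory.append(msg)
        else:                                 # replace last message of same kind
            self.memory = [m for m in self.memory if m.message != msg.message]
            self.memory.append(msg)
        self.action(self, msg)                # step 3: activate the behavior
        # step 4 (post-action) would send follow-up messages from here

class Scene:
    def __init__(self):
        self.objects = {}

    def register(self, obj):
        self.objects[obj.name] = obj

    def send(self, msg):
        # wildcard addressing: "alternator.panels.*" reaches every panel
        for name, obj in self.objects.items():
            if fnmatch.fnmatch(name, msg.receiver):
                obj.receive(msg)

scene = Scene()
shown = []
for i in (1, 2):
    scene.register(AppObject(f"alternator.panels.p{i}",
                             lambda obj, m: shown.append((obj.name, m.message))))

# cf. Fig. 2: a remote-control button addresses one panel, then all panels
scene.send(Message("alternator.panels.p2",
                   "alternator.RemoteControl.PanelManipul.P2upldown", "up"))
scene.send(Message("alternator.panels.*", "alternator.RemoteControl", "reset"))
```

Because the receiver is a name pattern rather than an object reference, relationships stay un-hardwired: panels can be added or removed without touching the sender.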
runs on different instances of objects independent of the determined name of the sending object. In addition, the set of behavior can be re-used by objects with different representations.

In order to realize a flexible inter-object communication, both synchronous and asynchronous message-passing is supported. Furthermore, the application objects have the ability to send queries to other single objects that then return the corresponding information. But an application object can also send a query to a set of objects; in return it will receive a list of the requested information which it can use to react appropriately.

3.4. Access management
Access management in our decentralized environment has not been fully realized yet. An early implementation provides a simple locking mechanism. This prevents an object from sending messages to itself, thus stopping deadlock situations and infinite loops from occurring. For a server-based distributed VR multi-user system, some other access mechanisms are discussed in [18].

3.5. Communication between applications
We developed two special objects which can send messages between two applications running on the same machine, the so-called router and bridge: a router can only pass a message in a one-way direction, while a bridge allows for the two-sided exchange of messages. Moreover, it is possible to distribute the
messages on a network; in this case the forwarding of messages is performed by a server. All three objects (router, bridge, and server) have the ability to filter messages: only the messages for appointed sets of senders or receivers are passed through. Considering the limited bandwidth of a network, this mechanism reduces the transferred data to the relevant information, which might ameliorate the performance of applications by minimizing the network traffic.

3.6. Special problems
A major problem that had to be solved throughout the realization of the message concept was the consideration of constraints among objects, implying an a priori knowledge of the objects at creation-time (e.g. position) and changes of object-hierarchies at runtime (e.g. drag-and-drop). Furthermore, we had to introduce specialized management objects that take care of the ambiguities in the creation, moving, copying, and deletion of our application objects.

4. ADAPTATION
FUNCTIONALITY

As stated before, adaptation is a major requirement for the usability of interfaces. This adaptation at runtime might be performed by the user (adaptability, configurability) or, automatically or semi-automatically, by the system. In the latter case, the system has to provide functionality to protocol the user's interaction behavior, to establish a model of the user, and to adapt itself to the user's requirements. To meet these demands, we developed a
flexible adaptation framework [19] that provides this functionality. It protocols the user's interaction with the system on the basis of the earlier described application objects, also exploiting the communication functionality of the underlying message concept (cf. Section 3). As a result, it influences or modifies the behavior of those application objects, adapting them to the user's needs. Hereby, the adaptation functionality can relate to all parts of direct-manipulative, graphical-interactive human-computer interfaces, such as dialogue, interaction, and presentation.

In a first step, we considered presentation adaptation by providing different users with different kinds of application objects for the same data. Users that are familiar with computer applications might prefer standardized interface elements, whereas computer novices (who may of course be experts in their working field, such as medical diagnosis) will certainly go for more intuitive presentations: in a medical application scenario and for the representation of patients' records, this could mean providing common 2-D file-selection dialogue elements to the advanced computer-user and supporting the novice with 3-D drawers and folders (cf. Section 2.4). This kind of adaptation is not always possible due to the limitations of the currently chosen interface type: while 2-D application objects might still be realizable in a Virtual Environment, 3-D objects are not possible within a pure 2-D interface. Furthermore, this kind of adaptation might not be reasonable in some cases where, e.g., the underlying data or the application purpose strongly defines the presentation: representing medical image data through symbols or icons for the purpose of diagnosis certainly misses the aim of the application.

To observe and develop the adaptation through the system, we implemented an adaptive hypermedia help component [20] as another application object that can be integrated in the different types of interface.
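A minimal sketch of this protocol-and-adapt loop: interactions are logged per user, a crude user model is derived, and a presentation object for the same data is chosen accordingly. The scoring heuristic and all names below are invented for illustration; the actual framework [19] is far more elaborate:

```python
class AdaptationComponent:
    """Hypothetical sketch: protocol the user's interactions and derive
    which presentation object to offer for the same data."""
    def __init__(self):
        self.log = []

    def protocol(self, user, obj_name, message):
        # fed by the message-based inter-object communication (cf. Section 3)
        self.log.append((user, obj_name, message))

    def experience(self, user):
        # crude user model: number of distinct interface objects used so far
        return len({o for (u, o, _) in self.log if u == user})

    def presentation_for(self, user):
        # novices get intuitive VR objects, experienced users 2-D dialogs
        if self.experience(user) >= 3:
            return "2-D file-selection dialogue"
        return "3-D drawer/folder"

adapt = AdaptationComponent()
for obj in ("lightbox.panel1", "menu.zoom", "menu.annotate"):
    adapt.protocol("dr_mueller", obj, "activate")
adapt.protocol("nurse_a", "archive.drawer1", "open")

adapt.presentation_for("dr_mueller")   # experienced: 2-D dialogue
adapt.presentation_for("nurse_a")      # novice: 3-D drawer/folder
```

The design point is that the adaptation component is only an observer of the existing message traffic, so no application object needs adaptation-specific code.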
Focusing on a certain type of interface, however, the provided adaptation mechanisms can also be used to adapt the layout-configuration of the user-interface [21], i.e. the actual configuration of a predefined set of application objects.

Moreover, the protocoling of the user's dialogue with the application system on the basis of communicating application objects also provides powerful means to adapt the dialogue itself through enabling/disabling and emphasizing dialogue elements and through the offering of macros for reasonable subsequent dialogue steps at runtime.

A further target of our adaptation component is the adaptation of the interaction of the user with the system by providing different tools for different users, as far as the data to be manipulated and the application purpose allow for it.

We are aware of the fact that the consideration of distributed applications and cooperative scenarios raises new ambiguities that have to be addressed with
respect to the adaptation issues. These are, e.g., the costs of modeling distributively located users and the disadvantages that may result from adapting the data presentation and interaction of users' interfaces while the cooperating partners still need to be able to discuss the data: individual adaptation will most probably lead to confusion among the users, so that some kind of group adaptation and, therefore, group modeling has to be introduced.

5. INTERACTION TECHNIQUES AND INTERFACE STYLES
On the basis of the application objects described so far and exploiting the functionality of the underlying message concept and adaptation component described earlier, we are now able to combine different interface techniques in a common user-interface.

To demonstrate the applicability of our approach of mutually exchangeable application objects and interfaces in a common medical scenario, we introduced a lightbox metaphor to all user-interfaces developed and described in the following. This metaphor allows us to extend the characteristics of conventional lightboxes, i.e. the 2-D viewing of a limited number of images that are panel-wise exchangeable through an alternating functionality, by the additional functionalities available in the respective type of user-interface.

5.1. 2-D interaction for medical diagnosis
We developed a 2-D interface for medical diagnosis and therapy on all different kinds of digital medical image data that has already found its way into practice (cf. Section 6). A large part of the workstation screen is used for image display in a compound viewing area; only a small part is reserved for a menu field. In analogy to the observed characteristics of conventional lightboxes, the following look and feel is supported (see Fig. 3):

• An image or a patient study can easily be selected from the database panel and fixed to an arbitrary lightbox segment via a drag-and-drop mechanism.
• Unrestricted maneuvering on the lightbox panel is performed by realtime zoom and pan, thus controlling the panel section actually displayed on the screen and its spatial resolution.
• The logical link between images and lightbox segments can be graphically reorganized.
5.2. 3-D interaction support
The 2-D functionality is supplemented by 3-D interaction support, also based partly on the lightbox metaphor. Many additional features enhance the interpretation process, such as fast skimming through image series, arbitrary slicing through volume data sets, or volume rendering. Furthermore, a virtual manipulator that mimics a trackball allows for the maneuvering of objects in 3-D (see Fig. 4).

For medical applications, one can imagine further integration of other (artificial) 3-D objects, such as cone trees to visualize hierarchical data like patient
Fig. 3. Diagnosis on conventional X-ray images in 2-D.
records or archive systems [16]. The advantage of such a system is the increase of the density of information and a faster access to the desired information. Furthermore, there are attempts to visualize the results of database queries with the aid of architectural metaphors [22].
5.3. Interaction in a Virtual Environment
As mentioned before, different users prefer different user-interfaces; e.g. a beginner would possibly choose a VR environment, since it is self-explanatory, while an expert will tend to use a time-saving, powerful, and productive user-interface.

VR user-interfaces or Virtual Environments intend to provide the user with the feeling that (s)he is interacting directly with real objects. In this manner the user may forget about the presence of a computer. Comprehensiveness can be accomplished by creating a virtual world which resembles the "real world" with which the user is familiar, allowing actions to be performed in a natural way.

The digital lightbox metaphor applied to our VR environment [12] allows for the following typical functionalities (see Fig. 5):
Open an archive drawer and take out a patient folder.
Fix imagesto an alternator panel Magnify/highlight image sections and analyze imagesvia rulers. Annotate images. Changealternator panels. Dictate report into voice recorder/microphone. Consult a referencebook or a colleaguevia phone or video conference(multimedia).
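Functionalities such as these are carried by the message-based inter-object communication between application objects that underlies the whole environment. A minimal sketch of such a bus (class and message names are our own assumptions, not the system's API):

```python
class MessageBus:
    """Minimal stand-in for message-based inter-object communication:
    application objects subscribe to message types and react to them."""

    def __init__(self):
        self.listeners = {}           # message type -> list of callbacks

    def subscribe(self, msg_type, callback):
        self.listeners.setdefault(msg_type, []).append(callback)

    def send(self, msg_type, **payload):
        for cb in self.listeners.get(msg_type, []):
            cb(**payload)

class ImageObject:
    """An application object whose state depends on its container."""

    def __init__(self, bus, container_id):
        self.container_id = container_id
        self.annotations = ["arrow", "label"]
        bus.subscribe("container_closed", self.on_container_closed)

    def on_container_closed(self, container_id):
        # A related object resets itself when its container closes.
        if container_id == self.container_id:
            self.annotations = []

bus = MessageBus()
img = ImageObject(bus, container_id="patient_folder_42")
bus.send("container_closed", container_id="patient_folder_42")
```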
The interaction with the objects should proceed in a natural way. Navigation in such Virtual Environments therefore requires the use of advanced input devices, such as spaceball, head-tracker, dataglove, etc. Due to the still existing lack of user-friendliness of such devices (it still requires a lot of experience from a user to grab and move objects in the virtual 3-D space), we implemented some additional automatic (intelligent) behavior to minimize the amount of interaction effort for the user. For example, when the user closes a container object, all the objects that are related to it will be reset.

6. CLINICAL APPLICATIONS
The presented software environment is in the process of being evaluated through practical use at the University Clinics of Tübingen. First responses from the clinical users are very promising, especially with respect to adaptable types of user-interfaces and adaptive hypermedia help presentation. Some example applications from the University Clinics of Tübingen that show the system's applicability to different medical environments and working fields are described below.

Fig. 4. 3-D interaction support for 3-D-endosonography.

6.1. Diagnosis on conventional X-ray images in 2-D

Conventional X-ray produces high-resolution yet only 2-D image data, thus being an appropriate application for sole 2-D interaction support. Our 2-D user-interface as shown in Fig. 3 has been used for medical diagnosis on X-ray data at the department of diagnostic radiology. This technique, on the basis of X-ray image data, has also proven extremely adequate as a basis for discussion and demonstration at medical gatherings.

6.2. 3-D interaction for 3-D-endosonography
Endorectal sonography is generally accepted in pediatric surgery for the evaluation of the pelvic floor muscles. In order to enhance the meaningfulness of this diagnostic tool, especially in incontinence surgery, we developed a method to acquire the ultrasonic data in three dimensions [25]: a computer-controlled carriage which allows for the withdrawal of conventional endorectal ultrasonic probes from the rectum under constant rotation. This produces a sequence of images that are digitized and stored in our system.

The IP- and CG-tools provided by the presented development environment (cf. Section 2) contain all the methods necessary to perform the analysis part of a 3-D-endosonography. Examples of such methods are the reconstruction of arbitrarily oriented planes. This makes it possible to obtain planes that cannot be accessed by conventional examination, and to examine the data completely off-line, with all the advantages for the doctor and the patient. Accordingly, three-dimensional measurements such as distance, angle, area, and volume can be carried out. Different volume rendering algorithms can be used to give an insight into the spatial relationships.

In order to support an increased three-dimensional understanding of the complex anatomic relationships within the region of the sphincter apparatus and the pelvic floor, we introduced 3-D interaction support to the system in the form of a virtual trackball (see Fig. 4). This trackball provides navigation and 3-D viewing functionality to the user. An additional picking mechanism allows for the correlation of 2-D images and 3-D reconstructions. Furthermore, the implemented rendering algorithms support CSG operations, thus providing a sophisticated tool for cutting parts from volumina to give an insight into the data. 3-D-Endosonography, when done in the way presented, reduces the discomfort of the patient and improves the diagnostic impact. Application fields are diagnosis and treatment planning.

Further clinical applications of our development environment with 3-D interaction functionality in medicine are a general-purpose PACS workstation [24], the construction of a drilling device for dental implants [26], the construction of individual hip endoprostheses, and the interactive 3-D segmentation and visualization of vasculature [27].

Fig. 5. VR user-interface for PACS applications.

6.3. VR user-interface for PACS applications
As an example for the application of Virtual Reality to medicine we focused on the viewing and analysis of medical images, a task which is of importance not only in diagnostic radiology, for which it is currently evaluated at the department of neuroradiology, but in a large variety of other clinical disciplines as well. The main idea of our VR approach [12] is to create an intuitive virtual world on the workstation where real-world objects are represented by virtual counterparts in a three-dimensional scene (see Fig. 5). Real-world objects that physicians deal with in the context of image-based diagnostics and that we included in our VR approach are:
- Viewing room
- Lightbox or alternator
- Images on X-ray film
- Patient folders containing images and other documents
- Patient folder archive
- Magnifying glass
- Ruler and compasses
- Pencil
- Dictating machine/microphone
- Telephone/video camera (remote consultation)
- Reference library

The navigation through this Virtual Environment is supported by the use of advanced input devices, such as spaceball and head-tracker.

7. CONCLUSIONS
This paper presents an integrated development environment for medical visualization applications. Its aim is to support the rapid realization of application and research ideas and their clinical use and acceptance through extensible visualization algorithms and basic functionality. This unique approach to clinical usability is supported through the development of flexible interaction in 2-D, 3-D, and VR with and throughout the medical data. Different interaction techniques and interface types and their integration in a common development environment for medical applications are briefly introduced, along with an adaptation component and an underlying message-based inter-object communication concept. Furthermore, diverse application systems for diagnosis and therapy that have been put into practice at the University Clinics of Tübingen are presented.

Future research will include further refinement of the development environment, completion of the interface layer, and application of the adaptation component to a larger set of application objects in addition to the hypermedia help component. Due to the intense cooperation with the University Clinics of Tübingen, further medical applications on the basis of the presented development environment are planned, thus supporting the ongoing refinement of the system.

Another direction of research that has been focused on is the support of distributed cooperative work (CSCW). The major problems here are the aspects of synchronized communication among distributed application objects and the handling of the different views onto shared data discussed earlier. A prototype implementation towards CSCW is in the process of realization for 2-D interaction. Based on the paradigm of application objects described in this paper, a further development and implementation of the other interaction techniques (3-D, VR) towards CSCW should be realizable in the near future. With respect to VR user-interfaces, one can imagine the complete mapping of the entire infrastructure of a hospital to a Virtual Environment, thus providing communication and collaboration support to the diverse working areas within the medical field.
This reasoning raises new problems and challenges for research and technical enhancements: the communication load for updating the scene and for multi-user interaction in such huge virtual environments will definitely lead to a performance bottleneck at the side of the central communication server. Therefore, the load for the communication management needs to be shared among several servers. In order to support real-time navigation, the (extremely large) scene descriptions of Virtual Environments need to be split into sets of relevant and mutually connected sub-scenes. The multicasting approach (e.g. [28]) seems to provide reasonable mechanisms to solve these problems.
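As an illustration of the sub-scene idea (all names, group addresses, and coordinates below are hypothetical), each sub-scene could be bound to its own multicast group, so that a client only subscribes to updates for the part of the world near its current position rather than receiving every scene update from a central server:

```python
# Hypothetical partition of a virtual hospital into sub-scenes, each
# bound to its own multicast group address.
SUBSCENES = {
    "radiology":  {"group": "239.1.0.1", "center": (0.0, 0.0)},
    "archive":    {"group": "239.1.0.2", "center": (50.0, 0.0)},
    "conference": {"group": "239.1.0.3", "center": (0.0, 50.0)},
}

def groups_for(position, radius=30.0):
    """Multicast groups a client should join: those whose sub-scene
    centre lies within the area of interest around the user."""
    px, py = position
    return sorted(
        info["group"]
        for info in SUBSCENES.values()
        if (info["center"][0] - px) ** 2 + (info["center"][1] - py) ** 2
           <= radius ** 2
    )

# A user standing in radiology joins only the nearby group, so scene
# updates for distant sub-scenes never reach this client at all.
joined = groups_for((5.0, 5.0))
```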
Acknowledgements - The authors would like to thank, first and foremost, their partners at the University Clinics of Tübingen, the Drs. med. F. Dammann, M. Skalej, G. Stuhldreier, S. Se& and their teams, for the professional discussions and the friendly realization support. Furthermore, we would like to thank J. Wassermann and F. Weisser for their implementation support and the fruitful discussions on the development of the 2-D and 3-D interaction. Last but not least we would like to thank all students that participated in the realization of the presented development environment for their implementation efforts.

REFERENCES
1. Robb, R.A., Hanson, D.P., Karwoski, R.A., Larson, A.G., Workman, E.L. and Stacy, M.C., Analyze: a comprehensive, operator-interactive software package for multidimensional medical image display and analysis. Computerized Medical Imaging and Graphics, 13(6), 1989, 433-454.
2. Heffernan, P. and Dekel, D., Imaging applications platform: concept to implementation. In Visualization in Biomedical Computing, ed. R.A. Robb. SPIE, Bellingham, 1992, pp. 495-509.
3. Upson, C., Faulhaber, T., Kamins, D., Laidlaw, D., Schlegel, D., Vroom, J., Gurwitz, R. and van Dam, A., The application visualization system: a computational environment for scientific visualization. IEEE Computer Graphics and Applications, 9(4), 1989, 30-42.
4. Dyer, D.S., A dataflow toolkit for visualization. IEEE Computer Graphics and Applications, 10, 1990, 60-69.
5. Upson, C., et al., The Iris Explorer Technical Report. Silicon Graphics Inc., Mountain View, CA, 1991.
6. Sukaviriya, P. and Foley, J.D., Supporting adaptive interfaces in a knowledge-based user interface environment. In Proceedings of the 1993 International Workshop on Intelligent User Interfaces, Orlando, FL, ACM Press, 1993, pp. 107-113.
7. Bell, G., Parisi, A. and Pesce, M., The Virtual Reality Modeling Language: Version 1.0 Specification. URL: http://vrml.wired.com/vrml.tech/vrmllOc.htmI, 1995.
8. Broll, W., England, D., Fechter, J. and Koop, T., The power of dynamic worlds, VRML 2.0 proposal. URL: http://wintermute.gmd.de:8000/vrml/dynamicWorlds.html, February 1996.
9. Coté Muñoz, J., AIDA: an adaptive system for interactive drafting and CAD applications. In Adaptive User Interfaces: Principles and Practice, eds. M. Schneider-Hufschmidt, T. Kühme and U. Malinowski. North-Holland, 1993, pp. 225-240.
10. Malinowski, U., Kühme, T., Dieterich, H. and Schneider-Hufschmidt, M., Computer-aided adaptation of user interfaces with menus and dialog boxes. In Human-Computer Interaction: Software and Hardware Interfaces, Proceedings of the Fifth Conference on Human-Computer Interaction (HCI International '93), Orlando, Florida, Volume 2, Elsevier Science Publishers B.V., 1993, pp. 122-127.
11. Thomas, C.G. and Krogsæter, M., An adaptive environment for the user interface of EXCEL. In Proceedings of the 1993 International Workshop on Intelligent User Interfaces, Orlando, Florida, ACM Press, 1993, pp. 123-130.
12. Grunert, T., Ehricke, H.-H., Fechter, J., Kolb, R. and Skalej, M., PACS man-machine communication via virtual environments. In EuroPACS 94, Geneva, Switzerland, 22-24 September 1994.
13. Heller, D. and Ferguson, P.M., OSF/Motif Programming Manual, OSF, 2nd edn, 1994.
14. Open Software Foundation, OSF/Motif Style Guide, Rev. 1.2, Prentice Hall, Englewood Cliffs, NJ, 1993.
15. Fowler, S.L. and Stanwick, V.R., The GUI Style Guide, AP Professional, 1995.
16. Robertson, G.G., Card, S.K. and Mackinlay, J.D., Information visualization using 3D interactive animation. Communications of the ACM, 36, 1993.
17. Fechter, J. and Encarnação, M., A classification scheme for interface objects to support user modeling in virtual environments. Research Report, WSI-GRIS, University of Tübingen, Germany, November 1994.
18. Broll, W. and England, D., Bringing worlds together: adding multi-user support to VRML. In Proceedings of the VRML'95 Symposium (VRML'95), ACM SIGGRAPH, 1995.
19. Encarnação, L.M., Adaptivity in graphical user interfaces: an experimental framework. Computers & Graphics, 19, 1995, 873-884.
20. Encarnação, L.M., An adaptive hypermedia help component for graphical user interfaces. Technical Report WSI-95-26, Wilhelm-Schickard-Institut für Informatik, University of Tübingen, Germany, November 1995.
21. Encarnação, L.M. and Stork, A., Adaptionsmöglichkeiten in modernen CAD-Systemen: Bewertung, Konzeption und Realisierung. In CAD'96, Kaiserslautern, Germany, 1995.
22. Clay, G., Database visualization and VRML. URL: http://reality.sgi.com/employees/clay-corp/article/article.html, 1995.
23. Gigante, M.A., Virtual reality: definitions, history and applications. In Virtual Reality Systems, eds. R.A. Earnshaw, M.A. Gigante and H. Jones, Academic Press, 1993, pp. 3-14.
24. Grunert, T., Fechter, J., Stuhldreier, G., Ehricke, H.-H., Skalej, M., Kolb, R. and Huppert, P.E., A PACS workstation with integrated CASE tool and 3D-endosonography application. In Computer Assisted Radiology CAR'95, Berlin, Germany, 21-25 June 1995, pp. 293-298.
25. Stuhldreier, G., Schweitzer, P., Kirschner, H.-J., Fleiter, T. and Grunert, T., Three-dimensional endosonography of the rectum. Coloproctology, 17, 1994, 1-5.
26. Bauer, J., Grunert, T., Schaich, M., Niemeier, R., Fleiter, Th. and Kaus, T., CT-based construction of a dental drilling device. In Computer Assisted Radiology CAR'95, Berlin, Germany, 21-25 June 1995, pp. 958-963.
27. Welte, D., Grunert, T., Klose, U., Petersen, D. and Becker, E., Interactive 3D segmentation and visualization of vessels. In Computer Assisted Radiology CAR'96, Paris, France, 1996, pp. 329-334.
28. Macedonia, M.R., Zyda, M.J., Pratt, D.R. et al., Exploiting reality with multicast groups: a network architecture for large-scale virtual environments. In Proceedings of the 1995 Virtual Reality Annual International Symposium (VRAIS '95), Los Alamitos, CA, IEEE Computer Society Press, 1995, pp. 2-10.