nonequilibrium thermodynamic systems obey the principle of maximal entropy production [5-7]. The last piece of this nonequilibrium thermodynamics puzzle arises from the observation that fractal structures occur ubiquitously in nature, arise as self-organized, spontaneously forming structures, and appear where energy is being dissipated. Examples abound, both in space (eg, river deltas, coastlines, trees, lightning bolts, waterfalls) and in time (eg, the dynamics of solar flares, earthquakes, weather). Thus, the following hypothesis is presented: the fractal properties of variability appear as a self-organized phenomenon precisely because they are more efficient at dissipating energy gradients. In medicine, loss of temporal fractal structure in life support is associated with worsened oxygen delivery [8-11], and loss of the spatial fractal structure of pulmonary anatomy in emphysema is an example in which loss of fractal structure leads to an impaired ability to dissipate energy gradients. Thus, fractal measures of variability may index the ability of the underlying system to dissipate energy and produce entropy, and may themselves arise as a spontaneous, self-organizing phenomenon.

In summary, this theoretical exploration of the meaning and origin of complex fractal variability leads to several hypotheses. Overall degree of variation signifies adaptability, defined as the ratio of maximal possible work output to resting work output. Measures of information complexity may signify the informational content of the signal and the ability of the underlying system to compress information. Fractal measures of variability may signify intrinsic thermodynamic system complexity, defined as energy dissipation per unit mass per unit time. Fractal variability is a self-organized phenomenon, occurring spontaneously because of the principle of maximal entropy production. This theoretical work now requires careful and rigorous experimental evaluation.

doi:10.1016/j.jcrc.2011.02.037
References
[1] Goldberger A. Lancet 1996;347:1312.
[2] Goldberger AL, Peng CK, Lipsitz LA. Neurobiol Aging 2002;23:23.
[3] Chaisson EJ. BioSystems 1998;46:13.
[4] Chaisson EJ. Cosmic Evolution: The Rise of Complexity in Nature. Cambridge: Harvard University Press; 2001.
[5] Dyke J, Kleidon A. Entropy 2010;12:613.
[6] Lorenz R. Science 2003;299:837.
[7] Dewar RL. J Phys A Math Gen 2003;36:631.
[8] Mutch WA, et al. Ann Thorac Surg 1998;65:59.
[9] Mutch WA, et al. Ann Thorac Surg 2000;69:491.
[10] McMullen MC, et al. Anesthesiology 2006;105:91.
[11] Mutch WA, et al. Crit Care Med 2007;35:1749.
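To make the notion of a fractal measure of variability concrete, the sketch below estimates a detrended fluctuation analysis (DFA) scaling exponent for a heart rate (R-R interval) series; an exponent near 1 corresponds to the 1/f-like, fractal fluctuations discussed above. The window scales, function name, and synthetic data are illustrative assumptions only and are not drawn from the abstract or its references.

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Estimate the detrended fluctuation analysis scaling exponent.

    alpha ~ 0.5 for uncorrelated noise, ~ 1.0 for 1/f (fractal)
    fluctuations, ~ 1.5 for Brownian-like dynamics.
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())            # integrated (profile) series
    fluctuation = []
    for s in scales:
        f2 = []
        for w in range(len(profile) // s):
            seg = profile[w * s:(w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            f2.append(np.mean((seg - trend) ** 2))
        fluctuation.append(np.sqrt(np.mean(f2)))
    # Scaling exponent = slope of log F(s) versus log s
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuation), 1)
    return alpha

# Synthetic R-R interval series (seconds); real data would come from
# continuous heart rate monitoring.
rr = 0.8 + 0.05 * np.random.randn(2048)
print(f"DFA alpha: {dfa_alpha(rr):.2f}")   # ~0.5 for this uncorrelated series
```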
Collection and distribution of continuous physiologic data across the academic enterprise
John A. Morris Jr
Department of Surgery, Vanderbilt University Medical Center
The vision: Valuable physiologic information is being wasted across the enterprise. We hypothesize that continuous physiologic data, densely collected, analyzed, and distributed in real time, can be linked to clinical databases (medication, laboratory, microbiology, and genetics) to create new bedside biomarkers (new vital signs). These biomarkers can be used to stratify individual patient risk, reflect patient trajectory (improvement or deterioration), and may ultimately illuminate underlying mechanisms of disease.

The implementation strategy for our vision requires simultaneous execution across 2 domains: geography and complexity. Geographic development introduces our vision as a proof of concept in multiple patient care venues, both inpatient (intensive care unit [ICU], stepdown, floor) and outpatient (emergency department); here, the barriers to execution are a function of capital allocation. Increasing the functionality of the system is more difficult. Consequently, we have adopted a 4-phase implementation design. Phase 1 consists of displaying and distributing (on an electronic status board and handheld devices) trend data on old vital signs continuously in real time across a single inpatient domain (ICU). Thirteen physiologic variables have been chosen for display and distribution (including heart rate, pulmonary artery catheter pressures, SaO2, and intracranial pressure). Phase 2 will incorporate new vital signs, such as integer heart rate variability, into the real-time display. Phase 3 will add nursing, laboratory, and microbiology data, as well as protocol deviations, to the status board. Phase 4 will incorporate new variables, such as waveform or electrocardiogram morphology, and new methods of analysis (complexity, fractal dimension, and others) as they are validated in an offline research database.

Progress to date: Signal Interpretation and Monitoring contains information on over 8000 ICU patients and more than 700 000 continuous hours of physiologic monitoring captured, linked, and stored in a research repository. This dense repository grows by more than a billion data points per day. Currently, we are collecting both integer and waveform data from the Philips IntelliVue system, Baxter pulmonary artery catheter device, and Siemens ventilators. In addition, we are collecting less dense data (one integer value every 30 seconds) from all inpatient monitored beds via HL7 feeds from the networked Philips system. Our goal is to link this information with activations of the rapid response team and to understand the cost-benefit issues of expanding new vital signs across the enterprise.

doi:10.1016/j.jcrc.2011.02.038
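The abstract cites integer heart rate variability as an example of a new vital sign. As a hedged sketch only, the fragment below computes a rolling standard deviation of integer heart rates over a fixed window; the once-per-second sampling, the 5-minute window, and the synthetic feed are assumptions for illustration and are not the definition used in the Signal Interpretation and Monitoring repository.

```python
from collections import deque
import statistics
import random

def rolling_integer_hrv(heart_rates, window_size=300):
    """Yield a rolling standard deviation of integer heart rate values.

    heart_rates: iterable of heart rates assumed to arrive about once
    per second; window_size=300 approximates a 5-minute window.
    """
    window = deque(maxlen=window_size)
    for hr in heart_rates:
        window.append(int(hr))
        if len(window) == window.maxlen:
            yield statistics.pstdev(window)

# Synthetic monitor feed standing in for a real-time device or HL7 stream.
feed = (72 + random.randint(-3, 3) for _ in range(1000))
values = list(rolling_integer_hrv(feed))
print(f"latest integer HRV: {values[-1]:.2f}")   # would be pushed to the status board
```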
Synergies of the complexity continuum and the stream computing paradigm
Daby Sow, Carolyn McGregor
IBM T.J. Watson Research Center, Hawthorne, NY
Faculty of Business and IT/Faculty of Health Sciences, University of Ontario Institute of Technology, Oshawa, Ontario, Canada
Intensive care units (ICUs) worldwide offer support for patients in need of critical care. They boast a range of state-of-the-art medical devices that monitor patients' physiologic parameters such as blood oxygen, blood pressure, electrocardiograms, and electroencephalograms. Although these monitoring systems aim at improving patient care and staff productivity, they clearly have the potential to introduce a data overload problem. In fact, most of the data collected by these monitoring systems in ICUs is transient, and the typical practice is for physicians to eyeball representative readings and manually record summaries of these readings in the patient record once every 30 to 60 minutes. The rest of the data remains in the devices' buffers, typically for up to 96 hours, before it times out and is lost forever. Hospitals are simply not equipped with the IT infrastructure to cope with most of the data collected on their patients. Recent clinical research has shown that physiologic parameters seemingly unrelated to a given diagnosis can show changes in behavior before clinical diagnosis, demonstrating the potential of systems that process these data in real time as they are received from distributed sources.

Complexity and variability analysis of physiologic time series data has proven to be a very effective approach for the early detection of health complications in ICU patients. For example, work performed at the University of Virginia [1] has shown that changes in heart rate variability may precede the clinical symptoms of sepsis in neonatal ICUs. In other settings, researchers have shown that the onset of seizures can be detected in real time by processing electroencephalograms. In neurologic ICUs, several efforts are under way to detect delayed cerebral ischemia episodes early [2]. Although these analytic techniques are being developed, there is a lack of generic, programmable systems that facilitate their implementation for real-time monitoring in production environments. Most existing solutions tend to be developed from the ground up and in silos, limiting reusability and extensibility. In addition, there is a lack of exploration systems supporting physicians in the search for interesting patterns buried in these physiologic data sets.

Traditional computing paradigms for data analysis are challenged by the real-time requirements of critical care. They were developed under the assumption that data needed to be stopped and persisted before being analyzed. In ICU environments, streams of data flow continuously from monitoring devices, resulting in voluminous data rates and stressing the performance of traditional data warehouses. These systems are often not equipped with tools capable of analyzing data quickly enough for physicians to extract information within the windows of opportunity for clinical intervention. We present a computing paradigm known as stream computing that addresses these deficiencies and provides a means to implement architectures for the real-time analysis of multiple physiologic data streams, watching for medically significant events. In contrast with traditional data analytic approaches, this paradigm does not attempt to stop continuous flows of data for analytic purposes; it analyzes data on the fly, in real time. We present a stream computing platform developed at IBM Research called InfoSphere Streams (Streams) [3] and describe how we have tailored it for ICU environments. The resulting platform accelerates the development and deployment of real-time monitoring solutions by exposing developers to an intuitive programming model in which they focus on the business logic of their application while being shielded from the complexity of the underlying real-time stream computing infrastructure.

doi:10.1016/j.jcrc.2011.02.039
References
[1] Griffin P, Moorman R. Toward the early diagnosis of neonatal sepsis and sepsis-like illness using novel heart rate analysis. Pediatrics 2001;107(1):97-104.
[2] Wartenberg KE, Schmidt M, Mayer SA. Multimodality monitoring in neurocritical care. Crit Care Clin 2007;23:507-38.
[3] IBM InfoSphere Streams. http://www-01.ibm.com/software/data/infosphere/streams/.
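As a conceptual sketch of the stream computing idea described above (analyze data in flight rather than persisting it first), the toy operator below applies a sliding-window mean and threshold alert to a stream of heart rate samples. It is written in plain Python rather than the InfoSphere Streams programming model, and the window length, threshold, and synthetic feed are assumed values.

```python
from collections import deque
from statistics import mean
from typing import Iterable, Iterator

def windowed_alerts(samples: Iterable[float],
                    window: int = 60,
                    threshold: float = 120.0) -> Iterator[str]:
    """Emit an alert whenever the mean of the last `window` samples
    exceeds `threshold`; data are analyzed as they arrive and never
    persisted, mirroring the stream computing paradigm."""
    buf = deque(maxlen=window)
    for i, value in enumerate(samples):
        buf.append(value)
        if len(buf) == window and mean(buf) > threshold:
            yield f"t={i}s: windowed mean heart rate {mean(buf):.0f} exceeds {threshold:.0f}"

# Synthetic once-per-second heart rate feed containing a tachycardic episode.
feed = [80.0] * 300 + [140.0] * 120
for alert in windowed_alerts(feed):
    print(alert)        # a deployed operator would route alerts downstream
    break
```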
Session VIII: Clinical Applications of Variability Analysis

What lies beneath: Physiology and pathology of electroencephalogram synchrony
Vera Nenadovic
Critical Care Unit, Toronto Hospital for Sick Kids, Toronto, Ontario, Canada
EEG phase synchrony¹, entropy, and variability are being used to evaluate brain function and predict neurologic outcome in both adults and children. Increased EEG phase synchrony has been associated with pathologies such as epilepsy and Parkinson disease. Phase synchrony among brain regions also increases during normal physiologic events (eg, slow-wave sleep or cognitive tasks). EEG phase synchrony in both normal and pathologic conditions may therefore have a common underlying mechanism, and understanding the mechanisms that produce EEG phase synchrony in the brain will affect the clinical interpretation and significance of this phenomenon. We have developed a program of research evaluating EEG phase synchrony in normally developing children and in critically ill children in coma. Recently, we have found that (1) during hyperventilation (an EEG activation method), phase synchrony increases globally with hypocarbia; (2) maximum phase fluctuations (variability) occur before the appearance of EEG slow waves during hyperventilation, a bifurcation in the system similar to that observed before the onset of a seizure in patients with epilepsy; (3) conversely, EEG phase synchrony decreases during periods of sleep apnea with hypercarbia; and (4) in children with normal sleep architecture, there is a temporary increase in EEG phase synchrony in the transition periods between sleep stages, from drowsiness to deep sleep. These findings point to an interaction among the respiratory, cardiovascular, and central nervous systems in the phenomenon of EEG phase synchrony and suggest that synchrony is involved in changes in brain state. Further studies of EEG phase synchrony require simultaneous evaluation of cerebral blood flow and PCO2 levels in a multimodal approach to a complex system.
¹ EEG phase synchrony. The raw EEG waveforms are quantified using the analytic signal concept. EEG phase synchrony assumes that if 2 or more brain regions are synchronized, then for the specified time epoch, those regions are transiently coordinated [1]. In the accompanying figure, part A shows 2 raw EEG channel recordings: one from the left temporal (T5) area and the other from the left occipital (O1) region. In panel B, the 2-channel recordings are converted into phase vectors, and the circular variance of the phase difference is calculated. The amplitude of phase synchrony is quantified as the synchrony index R.
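As a minimal sketch of the computation the footnote describes (analytic signal, circular variance of the phase difference, synchrony index R), the code below extracts instantaneous phases with the Hilbert transform and reports the mean resultant length of their difference, which equals 1 minus the circular variance. The band-pass range, sampling rate, and the synthetic channels standing in for T5 and O1 are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_synchrony_index(x, y, fs=256.0, band=(8.0, 12.0)):
    """Estimate the phase synchrony index R between two EEG channels.

    Each signal is band-pass filtered (band in Hz), its instantaneous
    phase is taken from the analytic signal (Hilbert transform), and
    R is the mean resultant length of the phase difference:
    R = 1 for perfect phase locking, R = 0 for no consistent phase
    relationship (R = 1 - circular variance of the phase difference).
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Synthetic 10-second channels standing in for T5 and O1 recordings.
fs = 256.0
t = np.arange(0, 10, 1 / fs)
t5 = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
o1 = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)
print(f"Synchrony index R: {phase_synchrony_index(t5, o1, fs):.2f}")
```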