
Journal of Critical Care (2010) xx, xxx–xxx

Toward optimal display of physiologic status in critical care: I. Recreating bedside displays from archived physiologic data☆,☆☆

Anton Burykin PhD a,b,⁎, Tyler Peck BA b, Vladimir Krejci MD d, Andrea Vannucci MD c, Ivan Kangrga MD, PhD c, Timothy G. Buchman PhD, MD a,b

a Emory Center for Critical Care (ECCC) and Department of Surgery, School of Medicine, Emory University, Atlanta, GA 30322, USA
b Department of Surgery, School of Medicine, Washington University in St Louis, St Louis, MO 63110, USA
c Department of Anesthesiology, School of Medicine, Washington University in St Louis, St Louis, MO 63110, USA
d Department of Anesthesiology, University Hospital of Bern, CH-3010 Bern, Switzerland


Keywords: Data display; Dynamic visualization; Scientific visualization; Patient monitoring; Visualization of physiologic signals; Medical education

Abstract
Background: Physiologic data display is essential to decision making in critical care. Current displays echo first-generation hemodynamic monitors dating to the 1970s and have not kept pace with new insights into physiology or the needs of clinicians who must make progressively more complex decisions about their patients. The effectiveness of any redesign must be tested before deployment. Tools that compare current displays with novel presentations of processed physiologic data are required. Regenerating conventional physiologic displays from archived physiologic data is an essential first step.
Objectives: The purposes of the study were to (1) describe the SSSI (single sensor single indicator) paradigm that is currently used for physiologic signal displays, (2) identify and discuss possible extensions and enhancements of the SSSI paradigm, and (3) develop a general approach and a software prototype to construct such “extended SSSI displays” from raw data.
Results: We present the Multi Wave Animator (MWA) framework, a set of open source MATLAB (MathWorks, Inc., Natick, MA, USA) scripts aimed at creating dynamic visualizations (eg, video files in AVI format) of patient vital signs recorded from bedside (intensive care unit or operating room) monitors. Multi Wave Animator creates animations in which vital signs are displayed to mimic their appearance on current bedside monitors. The source code of MWA is freely available online together with a detailed tutorial and sample data sets.
© 2010 Elsevier Inc. All rights reserved.

Abbreviations: GUI, graphical user interface; HRV, heart rate variability; ICU, intensive care unit; MODS, multiple organ dysfunction syndrome; MWA, Multi Wave Animator; OR, operating room; SICU, surgery intensive care unit; SSSI, single sensor single indicator

☆ Financial support: This work was generously supported by grants from the James S. McDonnell Foundation (220020070) and the Defense Advanced Research Projects Agency (DARPA) (49533-LS-DRP and HR0011-05-1-0057).

☆☆ Conflict of interest: none declared.
⁎ Corresponding author. Department of Surgery, Emory University, Atlanta, GA 30322, USA. Tel.: +1 314 761 5422.
E-mail addresses: [email protected]; www.burykin.com (A. Burykin).

0883-9441/$ – see front matter © 2010 Elsevier Inc. All rights reserved.
doi:10.1016/j.jcrc.2010.06.013


1. Introduction

Bedside presentation of physiologic data is central to modern critical care. Current-generation bedside monitors are designed according to the “single sensor single indicator” (SSSI) display paradigm [1]. That is, a single indicator is displayed separately for each individual sensor connected to the patient. Waveform displays (such as the electrocardiogram [ECG]) and simple time-averaged data (such as heart rate) have been the basis for clinical decision making for at least 4 decades. However, whether these displays provide the optimal data synthesis is still unknown.

As computer storage has become less expensive, waveform data have begun to be archived in numerous databases such as Multiparameter Intelligent Monitoring in Intensive Care (MIMIC) [2], MIMIC-II [3] (most of these data are freely available on PhysioNet; see Moody et al [4] for details), IMPROVE (improving control of patient status in critical care) [5], IBIS (improved monitoring of brain function in intensive care and surgery) [6], and the Complex Systems Laboratory (CSL) database [7]. The number of patients and records in each varies from a few dozen to several thousand, and the length of records varies from a few hours to several days (for a review, see Korhonen et al [8]; for an approach using modern health information technology [IT] standards, see Eklund et al [9]). We, too, have collected and archived physiologic data for postprocessing and analysis (see Burykin and Buchman [10] and Lu et al [11] for details).

Special computer programs are required to visually display these signals. Although many of these programs are freely available (eg, on PhysioNet [12]), significant effort may be required for a clinician to learn how to use them (because many of these programs work only under Unix-like environments). Moreover, the displays do not emulate bedside monitors. This shortcoming compromises development and comparison of new displays with current technologies. The technology of recording and archiving physiologic signals has advanced faster than the technology of playing them back, which has attracted only limited attention (see, eg, Kreuzer et al [13] and Stockmanns et al [14]).

To address the technology gap, we developed the Multi Wave Animator (MWA), a set of open source MATLAB scripts that allows one to create animations (eg, AVI video files) of recorded signals and thus present clinically relevant information in a format familiar to clinicians. It can also be used for construction and testing of novel, advanced types of physiologic signal display (before new display algorithms are hard coded into actual bedside monitors).

The paper is organized as follows. In the next section, we describe the displays of typical SSSI bedside monitors and provide an overview of the archiving procedure. Then, in Section 3, we provide a detailed description of the MWA software as well as the process of animation creation that MWA uses. In Section 4, we discuss several advanced features (such as heart rate variability [HRV] indices, indices of cardiorespiratory synchronization, or vital sign sonification) that can be incorporated through MWA into advanced displays. We also outline current limitations and future development of the software. Finally, we provide information regarding MWA availability.

2. Display and recording of patient vital signs

2.1. Display of a typical bedside monitor

The patient data displayed by a bedside monitor belong to 2 distinct types: waveforms (such as ECG), which are renewed “continuously,” and numeric data (such as heart and respiration rates), which are renewed only, for example, once per second. Generally, there are 2 ways in which waveform dynamics can be displayed on the monitor, sometimes called a steady trend line and a moving trend line [15]. In the first case, a waveform starts at the left border of the screen and proceeds with time to the right border. Once the waveform has reached the right border, it returns to the left border and begins to overwrite previously displayed values with the new ones. In the second case, a waveform starts at the right border and constantly proceeds to the left border, where it drops off the screen. Moreover, the timescales of the displayed waveforms are frequently not the same despite the “continuous sweep” across the screen: intervals of different length are displayed on the monitor screen for different types (eg, hemodynamic and respiratory) of waveforms. Such subtleties, although transparent to most clinicians, profoundly affect data display design.
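To make the two modes concrete, the following minimal MATLAB sketch (our illustration, not part of the MWA scripts; all names are assumed) animates a surrogate waveform in the “moving trend line” mode; a “steady trend line” implementation would instead overwrite a fixed-length buffer from left to right.

% Minimal sketch of the "moving trend line" mode: the newest sample enters at
% the right border and older samples drift left until they drop off the screen.
fs    = 100;                        % sampling frequency, Hz (assumed)
t     = 0:1/fs:60;                  % 60 s of data
wave  = sin(2*pi*1.2*t);            % surrogate "waveform"
win_n = 6*fs;                       % 6-s window shown on the screen
figure;
h = plot(nan(1, win_n));            % placeholder line object
ylim([-1.2 1.2]); xlim([1 win_n]);
for k = win_n:20:numel(wave)        % advance 20 samples per screen refresh
    set(h, 'YData', wave(k-win_n+1:k));   % rightmost point is the newest sample
    drawnow;                        % refresh the figure window
end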

2.2. Data recording

Bedside data are typically displayed in real time. Archiving is another matter. Depending on the hardware and software combination used for vital sign recording, different sampling frequencies are observed in different databases. Waveforms are usually recorded at sampling frequencies between 62.5 and 500 Hz (within one data set, different sampling frequencies are used for different waveforms), and the numeric data are typically recorded at sampling frequencies between 1 and 0.0167 Hz (from once per second to once per minute). Usually (but surprisingly not always), the sampling frequency is the same for all numeric data channels. In some cases, monitor alarms and alerts are also recorded.

The challenge of recreating a data display is apparent. We need to mention a typical problem that can occur during the animation of multichannel recordings. Namely, if too many signals were recorded simultaneously, gaps in the recorded signals (both in waveforms and in numeric data) can appear because of the fixed and limited transmission capacity of the monitor. Thus, the sampling frequencies cannot be assumed to be constant through the recording. Because these gaps occur randomly at different time moments for different signals, the signals become unsynchronized (this desynchronization can occur both between waveforms and numeric data and among different waveforms) if displayed together under the assumption of constant sampling frequencies. Depending on the particular archiving system, during an hour-long recording, gaps can cause a delay of about 10 seconds between waveforms and numeric data (eg, between the continuous arterial blood pressure [ABP] waveform and the corresponding systolic and diastolic numeric values). This artifact has substantial potential to confuse and confound and must therefore be corrected.

3. Multi Wave Animator framework description

The Multi Wave Animator has a modular structure and consists of 3 MATLAB scripts (signal reader, frame generator, and movie generator) that are applied sequentially. The choice of MATLAB (www.mathworks.com) reflects its emergence as a standard software package for biomedical signal processing and visualization, available for all major operating systems (MS Windows [Microsoft, Redmond, WA, USA], Mac OS [Apple Inc., Cupertino, CA, USA], and Linux [Linux Kernel Organization, Inc., San Francisco, CA, USA]). Multi Wave Animator was developed and tested with MATLAB release 2007b running on Dell workstations (Dell Inc., Round Rock, TX, USA) under the MS Windows XP OS (both 32- and 64-bit versions). It is expected to run under other supported platforms and newer (and possibly also earlier) versions of MATLAB. A flowchart of the process of movie creation with the MWA is shown in Fig. 1. The MWA components (signal reader, frame generator, and movie generator) are described below. A detailed step-by-step tutorial with an example run is available on the MWA Web page.

Fig. 1 Conceptual framework and the flowchart of the movie creation procedure using the MWA. The inset at the bottom shows the 3 modules of MWA: signal reader, frame generator, and movie generator (see Section 3 for details). The content of the box “advanced features” is discussed in Section 4.

3.1. Signal reader

The input data for the signal reader are prepared in text format as comma-separated values (CSV). Although different databases and waveform archives use different internal (mostly binary and sometimes also proprietary) formats for storage, utilities are provided with commercial archiving systems so the data can always be exported into the text format. Signal reader can therefore work with signals from any data library. Moreover, if the data are stored in a relational database, they can be imported directly into MATLAB (using the MATLAB Database Toolbox) without the need for intermediate text files. In this case, the signal reader code can be further simplified. Because MWA has a modular structure, no other modules have to be altered.
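As an illustration of the kind of import step the signal reader performs, the sketch below (with a hypothetical file name and column layout; the actual script is part of the MWA distribution) reads a comma-separated text export into MATLAB.

% Illustrative CSV import (hypothetical file name and column layout):
% column 1 = time stamp in seconds, remaining columns = exported channels.
raw     = csvread('or_export.csv', 1, 0);   % skip a single header row
t_raw   = raw(:, 1);                        % time stamps as recorded
ecg_raw = raw(:, 2);                        % ECG waveform samples
abp_raw = raw(:, 3);                        % arterial blood pressure waveform
hr_raw  = raw(:, 4);                        % heart rate numeric (sparse, eg, 1 Hz)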


To overcome the problems related to different sampling frequencies and possible gaps in the data, all signals (both waveforms and numeric data) are interpolated (not resampled) by the signal reader at a common sampling frequency (we usually use 100 Hz). The user is required to input the value of the sampling frequency, select the fragment of the data set to be animated, and choose a range (minimum and maximum limits of the vertical axis) for each waveform. If values of the waveform exceed the limits, they are “clipped” (that is, replaced by the constant that is equal to the minimum or the maximum limit) to be within the range, just as happens on physical monitors. Alternatively, the user can choose “autoscaling” for some waveforms (or for all of them). In this case, the vertical axis limits will be adjusted dynamically during the animation to the minimum and maximum of the displayed interval of the waveform. Finally, all signals together with the entered parameters are written by the signal reader to a single binary MATLAB file called signals.mat.
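A condensed sketch of these steps is shown below (variable names continue the illustrative import above; for brevity, a single shared time stamp vector is assumed, whereas real exports carry separate time bases per channel).

fs_common = 100;                                  % common sampling frequency, Hz
t_common  = (t_raw(1):1/fs_common:t_raw(end))';   % uniform time base

% Interpolate every channel (waveforms and numerics) onto the common time base;
% this also bridges short gaps and resynchronizes the channels.
ecg = interp1(t_raw, ecg_raw, t_common, 'linear');
abp = interp1(t_raw, abp_raw, t_common, 'linear');
hr  = interp1(t_raw, hr_raw,  t_common, 'linear');

% "Clip" a waveform to its display range, as a physical monitor would.
abp_lim = [0 200];                                % vertical axis limits, mm Hg
abp = min(max(abp, abp_lim(1)), abp_lim(2));

% Store everything the frame generator will need in a single binary file.
save('signals.mat', 'fs_common', 't_common', 'ecg', 'abp', 'hr', 'abp_lim');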

3.2. Frame generator

The frame generator script is responsible for the creation of individual movie frames. First, it reads the signals and previously defined parameters from the signals.mat file. Then the user specifies the exact location (relative coordinates) of every signal on the screen (frame), the time interval(s) to display for every waveform, and the frame rate. We found that a frame rate of 30 frames per second is appropriate for a visually pleasing animation.¹ Then the frame generator creates individual frames one by one (in a loop) and writes them to the hard drive. The loop counter (defined by the frame rate) determines the relative shift of the waveforms between consecutive frames. The fact that all signals have the same sampling frequency allows us to use only one parameter (a time shift that is inversely proportional to the frame rate) that determines the shift of all waveforms and the renewal of all numeric vital signs at every frame. A typical frame created by the frame generator is shown in Fig. E1a (online supplemental material). It is possible to define different time intervals for different displayed waveforms, as is common at many bedsides where the respiratory traces are “compressed” (see Fig. E1b). However, in this case, different time shifts must be used for different signals. All frames have an extra space at the bottom for subtitles (which are added

¹ Our goal was to “imitate” physiologic waveform display by a real bedside monitor. Thus, the movie is animated “in real time” (ie, it takes 1 minute to animate 1 minute of the recorded data). So, our optimal frame rate (30 frames per second) corresponds to the “real-time” animation. However, most movie players have a “fast forward” option that controls the playback speed, so it is possible to play a long movie faster, for example, to find a clinically interesting fragment. At a higher playback speed, the frame rate of 30 frames per second is actually higher than necessary. Our animations are optimized for “real-time” viewing. Also, because our goal was to create animations playable by any standard movie player, it was not possible to implement a variable (speed-dependent) frame rate.

during movie postprocessing) that can contain any explanatory information (eg, “stable hemodynamics”). This space can also be used to display bedside monitor alarm messages, if they were recorded together with the vital signs. All frames also have a “timer” (at the bottom right corner) that displays the time since the beginning of the movie (the timer is based on the sampling frequency).

Every frame is saved to the hard drive as a file in MATLAB format (“.dat”) and additionally as a bitmap (“.bmp”) file (or any other graphical file format supported by MATLAB on a particular OS). This redundancy allows the user to visually monitor the movie creation process (because bmp files can be previewed, eg, with the built-in MS Windows Picture Viewer). Another reason to keep the individual frames as high-quality bmp files is that the user may need animation formats other than an AVI movie file (eg, animated GIF or SWF/Flash files). These files can be created directly from a sequence of bmp files and have relatively small sizes, so they can be easily included in a Web page or a PowerPoint presentation. The AVI movie file can also be created directly from the individual bmp images using most standard video-editing programs (see Section 3.4 below about postprocessing). This option (which can easily be disabled), however, has a significant drawback in the large size of a typical bmp file. For example, the size of a bmp file of a single movie frame (Fig. E1) is about 3 MB, so a 10-second movie at 30 frames per second generates 300 bmp files with a total size of about 1 GB. Thus, if bmp files are generated, a relatively large scratch (temporary) disk is required.
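A stripped-down sketch of the frame loop described above is given below (illustrative names only; the real frame generator also lays out the numerics, subtitles, and timer on each frame).

frame_rate = 30;                          % frames per second for "real-time" playback
shift      = round(fs_common/frame_rate); % samples to advance between frames
win_n      = 6*fs_common;                 % 6-s waveform window per frame

fig = figure;
for k = 1:floor((numel(ecg) - win_n)/shift)
    idx = (1:win_n) + (k-1)*shift;        % sample indices shown in this frame
    plot((idx - idx(1))/fs_common, ecg(idx), 'g');
    ylim([-2 2]); xlabel('time, s');
    title(sprintf('HR %d   t = %.1f s', round(hr(idx(end))), idx(end)/fs_common));
    F = getframe(fig);                    % capture the rendered frame
    save(sprintf('frame_%06d.dat', k), 'F');        % MATLAB-format frame file
    imwrite(F.cdata, sprintf('frame_%06d.bmp', k)); % bitmap for previewing
end
close(fig);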

On the basis of our experience, we elected to save individual frames and to separate the process of frame generation from the process of movie generation for the following additional reasons. First, we found that unless the system is managed efficiently, MATLAB processing can slow down significantly: frame creation time can increase 5-fold after the first 100 frames. Thus, we chose to automatically terminate and restart the frame generation process every 50 frames. Second, at any given time only one frame is stored in memory; thus, memory is not a limiting factor even for a very long movie (we ran the MWA scripts on a desktop computer with 4 GB of RAM; the maximum amount of RAM occupied during the run was about 150 MB). Also, if MATLAB or the OS crashes in the middle of a long movie creation, no results (frames) are lost, and the program can be restarted from the frame at which it crashed. Finally, with multiple processors increasingly common in desktop computing, a long data set can be split into multiple fragments, and multiple instances of the frame generator can run in parallel. This can be done using either several computers or a single computer with multiple processors (or cores) and multiple monitors, so that every MATLAB instance runs on its own processor (or core) and frames for each movie fragment are displayed on their own monitor. Thus, the frame generation time decreases roughly in proportion to the number of computers used. Parallel runs of the frame generator can significantly speed up the frame generation process, which is the most time-consuming part of the animation creation.
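As a simple illustration of how a long recording could be split for parallel frame generation (our own arithmetic, not code taken from the MWA scripts), the frame range for each independent MATLAB instance can be computed as follows.

% Split N_total frames evenly over n_workers independent MATLAB instances;
% each instance is then started with its own [first, last] frame range.
N_total   = 108000;                       % eg, 1 h of movie at 30 frames/s
n_workers = 4;                            % computers, processors, or cores
edges     = round(linspace(0, N_total, n_workers + 1));
for w = 1:n_workers
    fprintf('worker %d: frames %d to %d\n', w, edges(w) + 1, edges(w+1));
end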

3.3. Movie generator

Movie generator is a short and very simple MATLAB script that sequentially loads the binary files (“.dat”) for the individual frames and creates a movie file in AVI format. It uses the frame rate and the codec name as input parameters.
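A minimal sketch of such a script, using the avifile/addframe interface available in MATLAB release 2007b, is shown below (file names and codec choice are illustrative).

frame_rate = 30;
aviobj = avifile('mwa_movie.avi', 'fps', frame_rate, 'compression', 'None');
files  = dir('frame_*.dat');              % frames written by the frame generator
for k = 1:numel(files)                    % zero-padded names keep the order correct
    S      = load(files(k).name, '-mat'); % each .dat file holds one frame struct F
    aviobj = addframe(aviobj, S.F);       % append the frame to the AVI stream
end
aviobj = close(aviobj);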

3.4. Postprocessing

Once created, the movie can be edited using common video editing software, either free (eg, MS Movie Maker or VirtualDub under Windows OS, iMovie under Mac OS) or commercial (eg, Adobe Premiere). Editing can include adding annotations (extra frames with a movie title and interrogatory and/or explanatory text and subtitles) and audio (eg, narrations). The movie can also be mixed with other video fragments, for example, a short video lecture by a clinician who presents the case, or with a video recording of the surgical procedure, if the vital signs were recorded during an operation that was simultaneously videotaped (see, eg, Kanani et al [16]).

4. Advanced features

The Multi Wave Animator can be used to create interactive² dynamic displays that simply replicate the bedside monitor display. However, the modular structure of MWA (Fig. 1) facilitates construction of virtual displays that also include advanced features (some of which go beyond the current SSSI paradigm). These features are discussed below.

4.1. Vital sign variability

Movies of virtual displays can include new signals (in both “numeric” and “waveform” formats) that are not displayed by conventional bedside monitors but are derived

² Animation video is an interactive type of display because the user can stop, replay, reverse, or change speed and view and review different fragments of the movie in any sequence (for a formal discussion from the educational psychology and cognitive science point of view, see, for example, Hegarty [17] and references therein). We acknowledge that this is still a very limited interactivity (compared with what can be achieved with hypervideo or virtual reality technologies), but this is the maximum level of interactivity that can be achieved with a simple movie format (eg, AVI). Such a simple format is required because our goal was to create a movie that can be played by anyone using a regular desktop or laptop computer and virtually any media player without the need for any special software or hardware.

from the “raw” signals (recorded vital signs). These “extra” signals may include, for example, indices of HRV [18] or variability (“complexity”) indices of other vital signs (free software for variability analysis can be found online, eg, on PhysioNet). In this way, movies of conventional and novel virtual displays can be created and played back in parallel to assess the usefulness of the presentations before customized software and hardware is ever designed.
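As one deliberately simple example (our own illustration, not an established HRV method), a moving-window standard deviation of the heart rate numeric could be computed off-line and passed to the signal reader as one more channel.

% Moving-window variability of the heart rate numeric (simplistic illustration;
% established HRV indices should be computed with dedicated software, see [18]).
win_n  = 60*fs_common;                    % 60-s window, in samples
hr_var = nan(size(hr));
for k = win_n:numel(hr)
    hr_var(k) = std(hr(k-win_n+1:k));     % SD of heart rate over the last 60 s
end
% hr_var can now be saved alongside the raw channels and animated by the
% unmodified MWA scripts as just one more numeric signal (or waveform).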

4.2. Organ-organ interconnection

In a similar way, MWA can also display indices that measure interconnections between organ systems and thus overcome a conceptual limitation of the traditional SSSI-type monitors. Because every waveform represents the “state” or “dynamics” of a single organ (or organ system), there are no signals displayed on a conventional bedside monitor that would represent the degree of organ-organ or multiorgan connectivity (eg, indices of cardiorespiratory synchronization [19]) in real time. The importance of monitoring organ-organ interactions in the intensive care unit (ICU) (especially for multiple organ dysfunction syndrome patients) is discussed in [20]. This is also relevant to the animation of the recorded vital signs because, for example, the IBIS database contains vital signs of multiple organ dysfunction syndrome patients. Indices of cardiorespiratory coupling calculated from vital signs recorded from an ICU monitor are discussed in Burykin and Buchman [10].
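Purely to illustrate that any derived coupling index can be fed back into the animation, the toy computation below correlates the heart rate and respiration rate numerics in successive 2-minute windows (this is not the synchronization index of [10] or [19]; rr_raw is an assumed respiration-rate channel from the import step).

% Toy coupling index: zero-lag correlation between heart rate and respiration
% rate in consecutive 2-min windows (see [10,19] for proper synchronization indices).
rr  = interp1(t_raw, rr_raw, t_common, 'linear');  % respiration rate numeric
wn  = 120*fs_common;                               % 2-min window, samples
cpl = nan(size(hr));
for k = wn:wn:numel(hr)
    C = corrcoef(hr(k-wn+1:k), rr(k-wn+1:k));      % 2-by-2 correlation matrix
    cpl(k-wn+1:k) = abs(C(1, 2));                  % one index value per window
end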

Display of these coupling indices in addition to the vital signs can be viewed as a “compromise” between traditional SSSI-type displays and recently introduced advanced integrated (graphical or ecological) displays [21-23] (such monitors do not display waveforms but graphically represent organ systems as, eg, 2-dimensional or 3-dimensional objects and organ dynamics as changes in object size or shape). In our case, some interrelationships among different vital signs can be displayed, whereas all vital signs are still displayed in the traditional, familiar “waveform and numeric data” format. With this extra capacity implemented in animations, the dynamics of both the vital signs and their variability and coupling indices can be displayed and viewed at the same time on the same screen.

4.3. Mathematical models

Along with animations of the results of purely data-driven analysis of the vital signs (variability indices), MWA can also create movies with additional data channels that are the numerical solution (output) of a mathematical model that uses recorded vital signs as an input (see, eg, Kennedy et al [24] and Zenker et al [25]). For a movie observer, such a solution will be displayed effectively “in real time” together with the original waveforms (although the numerical simulations are performed off-line because they usually require a significant amount of computer time and power).
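As a sketch of how such a model output could be produced off-line and attached as an extra channel, the example below drives a deliberately trivial first-order model with the recorded heart rate (the model is only a placeholder, not one of the models cited above).

% Placeholder model: first-order response y to the recorded heart rate,
% dy/dt = (hr(t) - y)/tau, integrated off-line and attached as one more channel.
tau       = 30;                                         % time constant, s
dydt      = @(t, y) (interp1(t_common, hr, t) - y)/tau; % model right-hand side
[ts, ys]  = ode45(dydt, [t_common(1) t_common(end)], hr(1));
model_out = interp1(ts, ys, t_common);                  % back onto the common time base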

³ Currently used simulators (training manikins) have several important limitations. First, they use artificially generated signals (ie, ECG, arterial blood pressure [ABP]). Second, they always assume the same physiological patterns, for example, if the ABP goes down, then the heart rate goes up. It is known that this is not always true and that reality is much more complex. So, having the real physiologic waveforms and the sequence of real events would be an important additional training tool.


4.4. Sonification

Another opportunity to enhance MWA animations is to supplement and expand a visual display with an audio signal (sonification or auditory display of the vital signs [26]). Audio representation of the pulse oximetry signal with a variable tone is widely used in clinical practice in addition to its visual display on operating room (OR) bedside monitors [27]. It has been suggested that other vital signs could be sonified as well. Several experimental auditory displays that combine a traditional visual bedside monitor with sonification of multiple vital signs have been designed and successfully tested in laboratory environments (see Sanderson et al [21] for a review and Loeb and Fitch [28] and Sanderson et al [29] for details and for downloadable movie files with examples of simultaneous visualization and sonification of simulated waveforms). Sonification has been applied to the study of single-channel (HRV [30]) and multichannel (several electroencephalogram [EEG] channels [31] and EEG, electrooculography [EOG], and ECG [32]) waveform dynamics. Simulation studies have shown that sonification can be used to detect (possibly time-dependent) coupling and synchronization in weakly coupled oscillator models [33,34]. Thus, sonification may also be used to detect organ-organ interactions and, if included into the movie together with the animation, it may help to integrate the traditional SSSI-type representation of vital signs into a holistic picture of the physiologic state (or dynamics) of the organism.

4.5. Alarm sounds

Another class of audio signals that can be added to the animation is alarm sounds. Although many archiving software packages (eg, BedMasterEx, www.excel-medical.com) can capture alarm messages and many publicly available databases (eg, MIMIC-II) contain times of alarm events and messages (for specific alarm-focused data collections, see, eg, Zhang et al [35] and Siebig et al [36]), none currently record or store the actual alarm sounds. Recently, however, an alarm sound database and simulator software became freely available [37]. Alarm sounds can be taken from this database and added to the video animation (during postprocessing) at the indicated times. It is also possible to develop new alarms by applying recently proposed novel algorithms (see, eg, Zong et al [38] and Clifford et al [39]) to the recorded vital signs and then present these new sounds along with the vital sign animation for better testing and comparison.

In summary, using the advanced features described above, one can create a “virtual (or augmented) bedside monitor.” Such displays may enhance our understanding of the physiologic dynamics while at the same time keeping the format of the display as close as possible to the real bedside monitor. We emphasize that MWA was designed so that none of these advanced features requires any modification of its code or of the movie creation procedure (see Fig. 1). That is, as long as an additional signal is prepared in

the same format as the “raw” data sets (see the signal reader section), it will be displayed and animated by the MWA as just one more waveform (or numeric signal). Vital sign sonification must be done separately (software for signal sonification is freely available online; see, eg, Olivan et al [32]), and the resulting audio files (wav or mp3) can be added to the movie during the postprocessing stage, using any movie editing software (see above), as an extra audio channel.
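As a minimal sketch of one possible sonification (the pitch mapping is ours, chosen only for illustration), the heart rate numeric can be mapped to the pitch of a continuous tone and written to a wav file that a video editor can later mix into the movie.

% Map heart rate (beats/min) to tone pitch (Hz) and write an audio file.
fs_audio = 8000;                                    % audio sampling rate, Hz
t_audio  = (0:1/fs_audio:t_common(end)-t_common(1))';
hr_audio = interp1(t_common - t_common(1), hr, t_audio, 'linear', 'extrap');
pitch    = 220 + 4*hr_audio;                        % eg, 60 beats/min -> 460 Hz
phase    = 2*pi*cumsum(pitch)/fs_audio;             % integrate frequency to phase
tone     = 0.5*sin(phase);
wavwrite(tone, fs_audio, 'hr_sonification.wav');    % audiowrite in newer releases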

5. Possible applications

Animations made with the MWA can be used as an “off-line emulator” of a bedside monitor for teaching, research, and quality control purposes:

5.1. Medical education

Vital signs that correspond to clinically interesting events, when they occur in the OR or ICU, can be recorded and then animated. Movies may be used for medical education to demonstrate the vital sign dynamics, for example, at the onset of a particular complication and in response to a particular intervention.³ These movies can be made to achieve multiple educational goals. For example, 2 versions of the same movie can be created, one fully commented and annotated, and another containing only the “raw” vital sign animation. The first can be used for teaching and the other for testing of medical students and residents.

5.2. Clinical research

Vital sign animations (especially those created with the use of advanced features) can be used to study the system-level physiologic dynamics of the organism under different conditions and under the influence of various clinical procedures. Also, MWA can create video of purely synthetic (simulated) signals using a display identical to that of a typical bedside monitor. This may be used to represent the results of clinically relevant mathematical models (see, eg, Kennedy et al [24] and Tham and Sasse [40]) in a format familiar to clinicians. A familiar representation may help clinicians to better assess and evaluate the behavior of such models. Finally, MWA can also be used to develop and test new audiovisual formats (representations) of vital signs (new monitoring frameworks and display types). For example, it is important to test whether sonification (presented via an auditory display) can better characterize organ-organ interactions (eg, synchronization) than synchronization indices displayed visually. Videos created with MWA may eventually help us to understand how to monitor and display the dynamics of a complex system (the critically ill patient) and its responses to an external perturbation (eg, a surgical procedure).

5.3. Quality control and patient safety

Audiotaping and videotaping are frequently used during surgical procedures (see, eg, Kanani et al [16]) and as tools for clinical team performance evaluation (for a review, see, eg, Jeffcott et al [41]). It is sometimes possible to simultaneously record patient vital signs. We believe that retrospective analysis of vital sign dynamics, displayed in a format that mimics a real bedside monitor and synchronized with the audio-video stream, may enhance our understanding of clinical team actions. Possibly, vital sign animations may become a part of the electronic patient record in a hospital data system.

6. Comparisons with existing tools

Most of the currently available solutions have some limitations. The EEG player [13] is a hardware-based solution, which requires actual EEG monitors to display the signals. Also, it works only with EEG (waveform and derived numeric) vital signs. Arbiter [42] is only a front end to anesthesia simulators (BODY [Advanced Simulation Corporation, Point Roberts, WA, USA] and METI ECS [Medical Education Technologies, Inc., Sarasota, FL, USA]). To the best of our knowledge, NeuMonD [14] is probably the only software that implements dynamic visualization of both raw vital signs and an extendable set of advanced features (such as complexity or variability indices), although at present it focuses mainly on brain dynamics display.

7. Limitations of the MWA framework and future development

The Multi Wave Animator is still under development. In its current iteration, MWA is a working prototype rather than a user-friendly software product. Initially, MWA was developed as an internal tool for our group to be used in batch mode, so no graphical user interface was required or built. Thus, any modifications in the current version must be made directly in the program code, so some elementary knowledge of the MATLAB programming language is required (we discuss the details of our implementation in the appendix). This also means that MATLAB (which is commercial software) is required to modify and run the scripts. However, only the MATLAB base package is needed, and no additional toolboxes are required. Moreover, although the creation of an animation with MWA definitely requires some level of computer proficiency, once created, the movie can be played by anyone using a regular desktop or laptop computer and virtually any media player.

Future work includes development of a graphical user interface for fully visual creation of animations that will eliminate the need to manually edit MATLAB scripts. Users will be able to visually select the data channels they want to include in the animation and also visually define the locations of the waveforms and numeric data on the screen (user-configurable display). Also, currently, only the “moving trend line” waveform motion is implemented. The future version will include the “steady trend line” option as well.

8. Availability

Our discussions with other research groups that work with vital sign data make it apparent that there is a need for software with this functionality. Thus, we decided to make the current version of MWA available for further use and development. Moreover, to stimulate its future development, we decided to make MWA an open source project.

The complete set of MATLAB scripts is freely available for download from the following URL: www.burykin.com/mwa/. It will also be uploaded to PhysioNet (www.physionet.org), as well as the MATLAB Central File Exchange (www.mathworks.com/matlabcentral/fileexchange/). The ZIP file contains the full set of MATLAB scripts, as well as a detailed tutorial that describes step-by-step how to use the scripts to create an animation. It also contains sample input “.csv” data files (a 20-second fragment of vital signs recorded from an OR bedside monitor), all the intermediate results (the signals.mat file and “.dat” and “.bmp” files with several individual frames), and the final “.avi” movie file, so the user can verify the results of every step of the tutorial. The scripts are extensively annotated and commented, so it is straightforward for users to modify them and create new animations of their own data.

9. Conclusions

We described current physiologic signal displays within the SSSI paradigm, considered possible extensions and enhancements within the SSSI paradigm, and developed a software framework to reconstruct current and future generation displays from raw data. The MWA framework remains a working prototype (proof of principle); however, it can readily be used (alone as well as together with other software tools mentioned in this paper) to construct and experiment with different types of physiologic signal display within the extended SSSI paradigm. This paper reports the first phase of a larger project that focuses on optimizing display of physiologic status in critical care. This report focused on conventional displays of physiologic signals and their possible extensions within the traditional SSSI paradigm. The next report will go beyond the SSSI paradigm and deal with alternative data displays (such as displays based on a complex systems paradigm).

Acknowledgments

The authors would like to thank Drs Madalena D. Costa, Phyllis L. Stein, and Eizo Watanabe for useful discussions.

Appendix A. Supplementary data

Supplementary data associated with this article can be found, in the online version, at doi:10.1016/j.jcrc.2010.06.013.

References

[1] Goodstein LP. Discriminative display support for process operators. In: Rasmussen J, Rouse WB, editors. Human detection and diagnosis of system failure. New York: Plenum; 1981. p. 433-49.

[2] Moody GB, Mark RG. A database to support development and evaluation of intelligent intensive care monitoring. Comput Cardiol 1996:657-60.

[3] Saeed M, Lieu C, Raber G, Mark RG. MIMIC II: a massive temporal ICU patient database to support research in intelligent patient monitoring. Comput Cardiol 2002:641-4.

[4] Moody GB, Mark RG, Goldberger AL. PhysioNet: a Web-based resource for the study of physiologic signals. IEEE Eng Med Biol Mag 2001;20(3):70-5.

[5] Nieminen K, Langford RM, Morgan CJ, Takala J, Kari A. A clinical description of the IMPROVE data library. IEEE Eng Med Biol Mag 1997;16(6):21-4, 40.

[6] Thomsen CE, Cluitmans L, Lipping T. Exploring the IBIS data library contents: tools for data visualisation, (pre-)processing and screening. Comput Methods Programs Biomed 2000;63(3):187-201.

[7] Goldstein B, McNames J, McDonald BA, Ellenby M, Lai S, Sun Z, et al. Physiologic data acquisition system and database for the study of disease dynamics in the intensive care unit. Crit Care Med 2003;31(2):433-41.

[8] Korhonen I, van Gils M, Gade J. The challenges in creating critical care databases. IEEE Eng Med Biol Mag 2001;20(3):58-62.

[9] Eklund JM, McGregor C, Smith KP. A method for physiological data transmission and archiving to support the service of critical care using DICOM and HL7. Conf Proc IEEE Eng Med Biol Soc 2008;2008:1486-9.

[10] Burykin A, Buchman TG. Cardiorespiratory dynamics during transitions between mechanical and spontaneous ventilation in intensive care. Complexity 2008;13(6):40-59.

[11] Lu Y, Burykin A, Deem MW, Buchman TG. Predicting clinical physiology: a Markov chain model of heart rate recovery after spontaneous breathing trials in mechanically ventilated patients. J Crit Care 2009;24(3):347-61.

[12] Goldberger AL, Amaral LAN, Glass L, Hausdorff JM, Ivanov PC, Mark RG, et al. PhysioBank, PhysioToolkit, and PhysioNet: components of a new research resource for complex physiologic signals. Circulation 2000;101(23):E215-20.

[13] Kreuzer M, Kochs EF, Pilge S, Stockmanns G, Schneider G. Construction of the electroencephalogram player: a device to present electroencephalogram data to electroencephalogram-based anesthesia monitors. Anesth Analg 2007;104(1):135-9.

[14] Stockmanns G, Ningler M, Omerovic A, Kochs EF, Schneider G. NeuMonD: a tool for the development of new indicators of anaesthetic effect. Biomed Tech (Berl) 2007;52(1):96-101.

[15] Steimann F. Diagnostic monitoring of clinical time series [doctoral dissertation]. Vienna, Austria: Technical University of Vienna; 1995.

[16] Kanani M, Kocyildirim E, Cohen G, Bentham K, Elliott MJ. Method and value of digital recording of operations for congenital heart disease. Ann Thorac Surg 2004;78(6):2146-9.

[17] Hegarty M. Dynamic visualizations and learning: getting to the difficult questions. Learn Instr 2004;14(3):343-51.

[18] Camm AJ, Malik M, Bigger JT, Breithardt G, Cerutti S, Cohen RJ, et al. Heart rate variability: standards of measurement, physiological interpretation, and clinical use. Circulation 1996;93(5):1043-65.

[19] Schafer C, Rosenblum MG, Kurths J, Abel HH. Heartbeat synchronized with ventilation. Nature 1998;392(6673):239-40.

[20] Godin PJ, Buchman TG. Uncoupling of biological oscillators: a complementary hypothesis concerning the pathogenesis of multiple organ dysfunction syndrome. Crit Care Med 1996;24(7):1107-16.

[21] Sanderson PM, Watson MO, Russell WJ. Advanced patient monitoring displays: tools for continuous informing. Anesth Analg 2005;101(1):161-8.

[22] Drews FA, Westenskow DR. The right picture is worth a thousand numbers: data displays in anesthesia. Hum Factors 2006;48(1):59-71.

[23] Effken JA, Loeb RG, Kang Y, Lin ZC. Clinical information displays to improve ICU outcomes. Int J Med Inform 2008;77(11):765-77.

[24] Kennedy RR, French RA, Gilles S. The effect of a model-based predictive display on the control of end-tidal sevoflurane concentrations during low-flow anesthesia. Anesth Analg 2004;99(4):1159-63.

[25] Zenker S, Rubin J, Clermont G. From inverse problems in mathematical physiology to quantitative differential diagnoses. PLoS Comput Biol 2007;3(11):e204.

[26] Kramer G. Auditory display: sonification, audification, and auditory interfaces. Santa Fe Institute Studies in the Sciences of Complexity Proceedings, Vol. XVIII. Reading, MA: Addison-Wesley; 1994.

[27] Santamore DC, Cleaver TG. The sounds of saturation. J Clin Monit Comput 2004;18(2):89-92.

[28] Loeb RG, Fitch WT. A laboratory evaluation of an auditory display designed to enhance intraoperative monitoring. Anesth Analg 2002;94(2):362-8.

[29] Sanderson PM, Watson MO, Russell WJ, Jenkins S, Liu D, Green N, et al. Advanced auditory displays and head-mounted displays: advantages and disadvantages for monitoring by the distracted anesthesiologist. Anesth Analg 2008;106(6):1787-97.

[30] Ballora M, Pennycook B, Ivanov PC, Goldberger A, Glass L. Detection of obstructive sleep apnea through auditory display of heart rate variability. Comput Cardiol 2000.

[31] Baier G, Hermann T, Stephani U. Multi-channel sonification of human EEG. In: Scavone GP, editor. Proceedings of the 13th International Conference on Auditory Display (ICAD 2007), June 26-29, 2007. Montreal, Canada: Schulich School of Music, McGill University; 2007. p. 491-6.

[32] Olivan J, Kemp B, Roessen M. Easy listening to sleep recordings: tools and examples. Sleep Med 2004;5(6):601-3.

[33] Baier G, Hermann T, Muller M. Polyrhythmic organization of coupled nonlinear oscillators. Proceedings of the 9th International Conference on Information Visualisation. Los Alamitos, CA, USA: IEEE Computer Society; 2005.

[34] Baier G, Hermann T, Lara OM, Muller M. Using sonification to detect weak cross-correlations in coupled excitable systems. Proceedings of the International Conference on Auditory Display (ICAD 2005). Limerick, Ireland: International Community for Auditory Display; 2005.

[35] Zhang Y, Silvers CT, Randolph AG. Real-time evaluation of patient monitoring algorithms for critical care at the bedside. Conf Proc IEEE Eng Med Biol Soc 2007;2007:2783-6.

[36] Siebig S, Kuhls S, Imhoff M, Langgartner J, Reng M, Scholmerich J, et al. Collection of annotated data in a clinical validation study for alarm algorithms in intensive care: a methodologic framework. J Crit Care 2009 [in press].

[37] Takeuchi A, Hirose M, Shinbo T, Imai M, Mamorita N, Ikeda N. Development of an alarm sound database and simulator. J Clin Monit Comput 2006;20(5):317-27.

[38] Zong W, Moody GB, Mark RG. Reduction of false arterial blood pressure alarms using signal quality assessment and relationships between the electrocardiogram and arterial blood pressure. Med Biol Eng Comput 2004;42(5):698-706.

[39] Clifford GD, Aboukhalil A, Sun J, Zong W, Janz BA, Moody G, et al. Using the blood pressure waveform to reduce critical false ECG alarms. Comput Cardiol 2006;33:829-32.

[40] Tham RQY, Sasse FJ, Rideout VC. Large-scale multiple model for the simulation of anesthesia. In: Moller D, editor. Advanced simulation in medicine. New York: Springer; 1989. p. 173-93.

[41] Jeffcott SA, Mackenzie CF. Measuring team performance in healthcare: review of research and implications for patient safety. J Crit Care 2008;23(2):188-96.

[42] Watson M, Sanderson P, Lacherez P, Trentini M, Purtill T. Arbiter: a simulator for the design and evaluation of patient monitoring displays. Abstract for SimTect Healthcare Simulation Conference; 2005.