Mobile case-based decision support for intelligent patient knowledge management


Health Informatics Journal 2007; 13(3): 179–193. DOI: 10.1177/1460458207079839. © 2007 SAGE Publications (Los Angeles, London, New Delhi and Singapore). www.sagepublications.com


Dympna O’Sullivan, Eoin McLoughlin, Michela Bertolotto and David C. Wilson

Abstract

Hospitals everywhere are integrating health data using electronic health record (EHR) systems, and disparate, multimedia patient data can be input by different caregivers at different locations as encapsulated patient profiles. Healthcare institutions are also using the flexibility and speed of wireless computing to improve quality and reduce costs. We are developing a mobile application that allows doctors to efficiently record and access complete and accurate real-time patient information. The system integrates medical imagery with textual patient profiles as well as expert interactions by healthcare personnel using knowledge management and case-based reasoning techniques. The application can assist other caregivers in searching large repositories of previous patient cases. Patients' symptoms can be input to a portable device and the application can quickly retrieve similar profiles, which can be used to support effective diagnoses and prognoses by comparing symptoms, treatments, diagnoses, test results and other patient information.

Keywords

clinical decision support, information retrieval, medical image management, knowledge sharing, knowledge management

Introduction

The medical community has an obligation to the public to provide the safest, most effective healthcare possible. This is increasingly achievable with EHR systems, portable computing and new applications that can be delivered over wireless networks. Caregivers equipped with mobile devices now have levels of interaction at the bedside not possible with paper charts and can leverage accurate electronic real-time patient data at the point of care to make decisions and take actions more efficiently.

Electronic health record systems will eliminate current accessibility difficulties frequently experienced by caregivers where patient information is scattered across healthcare networks, often buried in inaccessible paper records. New EHR systems will allow disparate patient information (demographic and clinical information, prescription data, medical imagery, etc.) entered by different caregivers to be stored together as encapsulated patient cases in medical multimedia databases. Using portable devices such as personal digital assistants (PDAs) and tablet PCs, which provide secure access to EHR systems, the right healthcare professionals can access the right information at the right times.
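As a rough illustration of what such an encapsulated patient case might look like as a data structure, the sketch below groups the different kinds of data into a single record; the class and field names are hypothetical and not taken from the system described in this article.

```java
import java.util.List;

// Minimal sketch of an "encapsulated patient case" grouping disparate data
// entered by different caregivers. All class and field names are hypothetical.
public class PatientCase {
    public String patientId;
    public Demographics demographics;        // name, age, address, ...
    public List<String> clinicalNotes;       // free-text entries by caregivers
    public List<String> prescriptions;       // current medication
    public List<ImageReference> imagery;     // links to stored CT/MRI/X-ray files

    public static class Demographics {
        public String name;
        public int age;
        public String address;
    }

    public static class ImageReference {
        public String imageUri;              // location in the image store
        public String modality;              // e.g. "CT", "MRI", "X-ray"
        public List<String> annotations;     // caregiver annotations, if any
    }
}
```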

In addition, the introduction of new wireless medical technologies is providing many exciting opportunities. Caregivers now have constant access at the bedside to real-time patient information and other resources, such as up-to-the-minute laboratory test results; specialists located away from the hospital facility can help with diagnostics from a mobile device; and emergency healthcare workers can transmit X-rays to hospitals and receive analysis back on a mobile device.

Progress in techniques for medical image capture has led to many new forms of digital imagery such as computed tomography (CT) and magnetic resonance imaging (MRI). The marriage of digital imaging technology and high-speed networks is also providing new breakthroughs: for example, images from small regional hospitals can be read instantly at larger centres, maximizing limited resources (including radiologists themselves) and minimizing patient transfers. Improved methods for image capture, however, are giving rise to an information overload problem in the medical image domain, due to the increased variety and volume of imagery. New applications must provide methods to efficiently index, retrieve and integrate this information with other types of patient data in encapsulated patient profiles.

Medical diagnosis and decision-making involves interplay between vast numbers of medical knowledge resources [1, 2]. This can range from implicit knowledge held by healthcare workers to experiential and data-induced knowledge [3, 4]. Systems that can simultaneously access and combine relevant information from these various knowledge resources are crucial to the diagnostic process and subsequently the efficient treatment of patients. From a decision support viewpoint, healthcare workers need fast access to complete, contextually relevant information that is consistent with the patient's current medical state and that is appropriately presented at the correct level of abstraction.

We are developing an innovative application that allows doctors to efficiently input, analyse, query and compare electronic patient records, including associated medical imagery, on any mobile or desktop device. Our integrated EHR system can be used wirelessly by caregivers at any location in the healthcare environment to record and access important patient data, including clinical information, up-to-date status reports, medication and medical imagery. The functionality we provide includes a user interface that allows caregivers to input patient information in a straightforward manner while at the same time providing all specialized functionality required by expert personnel. We have developed dedicated multimedia annotation tools for medical imagery that support communication and collaboration between caregivers as well as management of medical image resources. Medical image annotations and other patient data are integrated into encapsulated profiles and used to support retrieval of patient case histories for comparison of diagnoses and treatment procedures. We are also developing techniques for effective integration of image data with other patient information, both within the database and in the user interface.

Our work attempts to leverage the insights from a caregiver's decisions and rationale as they diagnose and treat patients. As a caregiver interacts with an electronic patient profile in the course of a diagnosis, the system analyses their actions. This enables the capture of human expertise and proficiency in diagnosing and treating the particular illness, which in turn allows us to understand why relevant information was accessed and investigated in the course of that diagnosis. Once this expert domain knowledge is captured, it is stored as a resource in a knowledge base of encapsulated patient profiles. This information can then be used to filter, retrieve and display the most relevant similar patient case histories from huge repositories of patient data, as well as for diagnostic comparison with new patients. This avoids slow and tedious manual searching through thousands of paper records. Presenting complaints or other clinical information for new patients can be entered into the system, and previous diagnoses, treatments and outcomes for other similar patients can be instantly accessed by physicians. This differs from current practice, whereby most patient data in EHR systems are indexed only by demographic information rather than by medically relevant information such as diagnoses, which can be very useful for physicians. Our application does, of course, also index by demographic data, and relevant information can be retrieved using any combination of patient data. Such information can be interactively explored by caregivers and used to guide them towards appropriate and relevant information regarding diagnoses and treatments.

Effective use of this knowledge base of previous case histories is made possible by the application of case-based reasoning (CBR) techniques. CBR is a well-established method for building medical systems [5], and one of its intuitively attractive features in medicine is that the concepts of patient and disease lend themselves naturally to a case representation. Also, medical practitioners logically approach diagnosis from a case-based standpoint (i.e. previous specific patient interactions are as strong a factor as individual symptoms in making a diagnosis).

We identify three main advantages of our approach. First, by reusing collective knowledge in support of similar patient cases the time required to diagnose or treat a new patient can be significantly reduced. Second, the approach facilitates knowledge sharing (remote or otherwise) by retrieving potentially relevant knowledge from other experiences. Finally, from a knowledge management perspective, contextual expert knowledge relating to particular cases may now be stored and reused as an additional resource for support, training and preserving knowledge assets.

System description

Based on an investigation of current standards in Ireland for healthcare information exchange, we are designing our tools as software add-ons that integrate with electronic repositories of patient records. The application is based on a three-tiered architecture: client, server and database. The client and server are both implemented in Java and communicate with a MySQL database using JDBC. The system consists of two primary client components: a desktop application used by radiologists and a mobile component used by physicians.
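To make the three-tier arrangement concrete, a server-side data access class might talk to the MySQL database over JDBC roughly as follows; the connection URL, credentials, table and column names are illustrative assumptions rather than the actual schema.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Illustrative server-side data access over JDBC to MySQL.
// Database name, table and column names are hypothetical.
public class PatientRepository {
    private static final String URL = "jdbc:mysql://localhost:3306/ehr";

    public String findDiagnosis(String patientId) throws SQLException {
        try (Connection conn = DriverManager.getConnection(URL, "ehr_user", "secret");
             PreparedStatement stmt = conn.prepareStatement(
                     "SELECT diagnosis FROM patient_profile WHERE patient_id = ?")) {
            stmt.setString(1, patientId);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getString("diagnosis") : null;
            }
        }
    }
}
```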


The radiologist employs a suite of image processing tools provided by the desktop application to annotate imagery with relevant information regarding a patient's condition. The system can be queried to display previously annotated patient images from a knowledge base of previous patient profiles for comparative studies that aid more effective diagnosis and treatment. Once an interaction with a patient is complete, the current patient's images are added to their profile in a central repository, and the images and/or their annotations can be retrieved and updated afterwards by the radiologist or another doctor examining the patient.

Physicians use the mobile application on a PDA or a tablet PC to input, retrieve and view patient data and to quickly enter information about patient progress into electronic charts in real time. The PDA allows for improved mobility due to its reduced size and weight. The tablet PC provides a more comprehensive user interface and additional functionality, such as the ability to view and annotate medical imagery, due to its larger screen size and higher processing power. Both wireless devices can take advantage of any 802.11 WiFi-based wireless network for data transmission. All of the information for each patient is stored as an integrated patient profile in a database, and all interactions by healthcare workers are recorded by the system. Both the PDA and the tablet PC can be used wirelessly at different locations around the hospital or in other situations where such facilities are not normally available (e.g. in an ambulance). They may also be used to access other resources such as online drug references or medical encyclopaedias. The mobile devices may also be used to query the central repository of patient profiles with information specific to a particular illness, in order to retrieve similar patient case histories that may help with a diagnosis by providing comparative assistance. Figure 1 shows the scrollable tabbed interfaces implemented for the PDA for inputting, querying and viewing textual patient data, while Figures 2–4 show the same information as displayed on the tablet PC.

Image retrieval

Continuing advances in techniques for digital image capture and storage have given rise to a significant problem of information overload in the medical imagery domain. It has therefore become increasingly critical to provide intelligent application support for managing large repositories of medical imagery. The majority of current medical image retrieval techniques retrieve images by similarity of appearance, using low-level features such as shapes, or by natural language textual querying, where similarity is determined by comparing words in a query string against words in semantic image metadata tags.

Figure 1 Patient demographic, history and clinical data on the PDA


Figure 2 Patient demographics

Figure 3 Patient history


It has been recognized, however, that neither of these approaches is fully adequate for answering the complete range of user queries. In order to overcome these difficulties, our application aims to unite information about underlying visual data with more high-level concepts provided by healthcare professionals as they interact with medical imagery. For example, capturing a measure of the human expertise and proficiency involved in making a diagnosis from an X-ray image allows us to understand why relevant information was selected (e.g. highlighting a particular body organ) and also how it was employed in the context of that specific diagnosis (e.g. inferred from added annotations). The approach allows us to capture and reuse best practice techniques by automatically constructing a knowledge base of previous user interactions using CBR techniques. This knowledge base can be exploited to improve future query processing by retrieving and reusing similar expert experiences, as the illness or injury of a presenting patient can now be grounded in the context of previous similar patients.

The caregiver can retrieve relevant patient imagery by entering context about the current presenting patient to the retrieval system. For example, a radiologist may be viewing an X-ray image and may be having difficulty in diagnosing the problem from the particular image. The X-ray, however, may remind him of an image he viewed previously, and he may remember some of the details of the previous patient. In this scenario the radiologist could input the details of the previous patient as search parameters to the application. The application can filter out that patient's profile as well as any similar profiles from the EHR system. The radiologist can compare the current image with images from these previous cases to find any similarities. If any of the similar images had been annotated during diagnosis, the radiologist may study these notes for extra information regarding the specific injury or illness.

Figure 4 Clinical data

Figure 5 Searching for patient information

In another example, a radiologist may be diagnosing a patient from an MRI scan. The radiologist may be having difficulty in diagnosing the patient, as they may not have previously encountered the particular injury or illness. In this situation the radiologist may wish to view other patient images and diagnoses in order to make a confident assessment, so they could focus their search on the patient symptoms and retrieve information based on this parameter. The radiologist could compare the images and profiles with the current patient, using the application as a support for their eventual decision.

The caregiver may retrieve imagery by entering parameters, selecting associated weights and pressing ‘Search Patient Images’ on the interface depicted in Figure 5. The resulting images are displayed in Figure 6.

Image annotation tools

Many new radiology applications have integrated annotation tools where radiologists can annotate patient imagery (e.g. DICOM images) in an appropriate fashion while diagnosing patients. The tools we have developed are a subset of those normally found in image processing applications and have been specifically selected and designed for specialized radiography tasks (e.g. filters, highlighting, sketching and post-it type annotation tools).


Our annotation facility can accept and integrate with a number of common medical image formats (e.g. DICOM, X-ray), as all applied annotations are layered on top of the image and so do not alter the underlying raster. The annotation interface has been specifically designed to allow us to capture important contextual patient/diagnostic information by situating intelligent support for gathering it inside this flexible environment. This information is collected implicitly to shield the radiographer from the burden of explicit knowledge engineering. From their perspective, the image interaction tools support them in carrying out their task (e.g. producing a report on the current patient) by making it easier for them to select and highlight relevant features, to store insights and to summarize aspects of their work. However, from a system perspective they are employed to monitor and record the radiologist's actions and ultimately to capture contextual diagnostic knowledge to improve the ability of the application to recommend other profiles for comparing diagnostic information and treatment procedures.

The user can add media annotations to images as a whole or to particular highlighted aspects, as depicted in Figure 7. Currently, the system supports annotation by text (including a facility to upload web documents), audio and video. All textual, audio and video annotations can be previewed before being incorporated as part of the knowledge base, and once recorded can be saved and uploaded to the image as a knowledge parcel associated with the particular patient. The system also supports annotation by cut, copy and paste between a given image and other images in the dataset, as well as any application that supports clipboard functionality. Once the radiologist has finished interacting with the imagery, their entire work process is stored along with all other patient data as an encapsulated patient profile in the knowledge base.
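A minimal sketch of how such layered annotations could be represented, assuming hypothetical class and field names, is given below; the key property is that annotations are kept as overlay objects referencing the image rather than being drawn into the raster itself.

```java
import java.awt.Rectangle;
import java.util.ArrayList;
import java.util.List;

// Sketch of layered annotations: the underlying raster is never modified;
// each annotation records a region of interest plus attached media.
// Class and field names are illustrative only.
public class AnnotatedImage {
    public enum MediaType { TEXT, AUDIO, VIDEO, WEB_DOCUMENT }

    public static class Annotation {
        public Rectangle region;      // highlighted area, or null for the whole image
        public MediaType type;
        public String contentUri;     // text body or location of the media file
        public String author;         // caregiver who added the note
    }

    public final String imageUri;                      // original DICOM/X-ray file
    public final List<Annotation> layers = new ArrayList<>();

    public AnnotatedImage(String imageUri) {
        this.imageUri = imageUri;
    }

    // Adding an annotation only appends to the overlay list.
    public void annotate(Annotation a) {
        layers.add(a);
    }
}
```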

Figure 6 Retrieved patient imagery


Case retrieval

As the system builds up encapsulated user interactions, another type of retrieval is enabled: retrieving entire previous patient case histories. This enables a caregiver to look for previous patient analyses that are most similar to the current patient, both to find relevant information and to make prognoses by examining the decisions and rationale that went into diagnosing and/or treating previous patients. Relevant patient case histories can be retrieved on the PDA by entering patient details to the tabbed interface depicted in Figure 8, adjusting the importance of the search fields using associated slider bars, and clicking the 'Search Patients' button. They can be retrieved on the tablet PC in a similar fashion by clicking on the 'Search Patient Profiles' button as shown on the interface in Figure 5.
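As an illustration of how the slider settings might travel with a query, the following sketch pairs each search field with a caregiver-chosen weight; the field names and the 0.0–1.0 weight range are assumptions made for the example.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a weighted case-retrieval query assembled from the search form.
// Field names and the 0.0-1.0 weight range are illustrative assumptions.
public class CaseQuery {
    public final Map<String, String> fieldText = new LinkedHashMap<>();   // field -> query text
    public final Map<String, Double> fieldWeight = new LinkedHashMap<>(); // field -> slider weight

    public void setField(String field, String text, double weight) {
        fieldText.put(field, text);
        fieldWeight.put(field, weight);
    }
}

// Example usage: weighting symptoms more heavily than demographics.
// CaseQuery q = new CaseQuery();
// q.setField("symptoms", "patches of complete hair loss on scalp", 1.0);
// q.setField("age", "34", 0.2);
```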

For example, a physician may have diagnosed a patient as having a particular illness but may not be entirely sure what treatment to recommend. The physician can enter his diagnosis to the application and retrieve from a central medical repository a list of patients who were similarly diagnosed. He may then study these patients to find those whose diagnosis is most relevant to the current patient. He may access the full treatment planning processes for each of these patients as well as any recorded outcomes. He can also use the application to access medical reference resources about the medication prescribed as part of these treatment processes. In a different example, a physician may have diagnosed a patient and be aware that a new drug is currently being tested on patients with the particular illness. The physician may be interested to see if this patient qualifies for the new treatment based on information such as age and allergies. By querying the application she can view profiles of patients currently being prescribed the medication and see how they are responding to the treatment (contained in up-to-date patient status reports). All of this information can be quickly accessed at any location using wireless technologies through the one integrated application, thereby reducing the time and complexity involved in recommending the new treatment.

Figure 7 Image annotation

An obvious application of this facility to retrieve and reuse similar patient case histories is that of medical education. Medical students preparing to work with real patients in hospital wards could have access to this rich knowledge resource that offers actual experiential advice and instruction on how to diagnose and treat patients according to many kinds of patient data including symptoms, examinations, laboratory results, and medical imagery.

Figure 9 shows an example of retrieved case histories on the tablet PC. Each row represents a patient case history and is summarized to show the most important information for that patient. It includes a matching percentage score between the current query and the similar profile, as well as the symptoms, diagnosis, applied treatments and outcomes for the similar case. The user can click 'Open Profile' to view the full history of the patient.

Our application offers several advantages over existing EHR systems. First, by reusing case histories and collective knowledge in support of similar diagnoses or treatments, the time required to diagnose and treat a new patient can be significantly reduced. In addition, the approach facilitates knowledge sharing by retrieving potentially relevant knowledge from other patient case histories. Furthermore, contextual information relating to particular patients, diagnoses, symptoms, treatments and outcomes may now be stored to create accessible organizational knowledge.

Figure 8 Searching for patient profiles

Calculating patient diagnoses and treatment similarity

Retrieval within the system takes place in the context of an overall workflow. Some of the most important steps in this workflow (which may or may not be relevant to all patients) are: entering preliminary patient details, recording the results of an initial examination, inputting presenting conditions, uploading and annotating medical imagery, recording diagnoses and recommending treatments. Most patient profiles will consist of some if not most of the information described above. Given a textual representation of these patient profiles, we can match textual queries input by a caregiver to patient cases in the knowledge base, or case base, of previous patients.

Figure 9 Retrieved patient profiles


Our retrieval metrics are currently focused on text (patient information and image annotations), using information retrieval metrics (e.g. [6]) as a basis for similarity. The task-based retrieval system employs indexes in separate spaces for the constituent textual segments of the patient profile. When a caregiver enters a textual query, the parameters and weights that they specify are combined and compared to existing patient cases in the case base of previous patients, and a weighted average is used to compute the similarity between the current patient and other patients in the central medical database. These indexes are used to calculate similarity in both retrieval of medical images and retrieval of patient case histories.
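A minimal sketch of this weighted-average matching is given below; a simple token-overlap measure stands in for the actual information retrieval metrics used by the system, and all class, method and field names are illustrative assumptions.

```java
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Sketch of weighted-average matching: a per-field token-overlap score is
// combined using the caregiver-supplied weights. Names are illustrative only.
public class CaseSimilarity {

    // Jaccard-style overlap between two short text segments.
    static double textSimilarity(String a, String b) {
        Set<String> ta = tokens(a);
        Set<String> tb = tokens(b);
        if (ta.isEmpty() || tb.isEmpty()) return 0.0;
        Set<String> intersection = new HashSet<>(ta);
        intersection.retainAll(tb);
        Set<String> union = new HashSet<>(ta);
        union.addAll(tb);
        return (double) intersection.size() / union.size();
    }

    static Set<String> tokens(String s) {
        Set<String> result = new HashSet<>();
        for (String t : s.toLowerCase().split("\\W+")) {
            if (!t.isEmpty()) result.add(t);
        }
        return result;
    }

    // Weighted average over the query fields the caregiver filled in.
    static double score(Map<String, String> queryFields,
                        Map<String, Double> fieldWeights,
                        Map<String, String> storedCaseFields) {
        double weightedSum = 0.0;
        double totalWeight = 0.0;
        for (Map.Entry<String, String> e : queryFields.entrySet()) {
            double w = fieldWeights.getOrDefault(e.getKey(), 1.0);
            String caseText = storedCaseFields.getOrDefault(e.getKey(), "");
            weightedSum += w * textSimilarity(e.getValue(), caseText);
            totalWeight += w;
        }
        return totalWeight == 0.0 ? 0.0 : weightedSum / totalWeight;
    }
}
```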

Evaluation

In an evaluation of our approach we conducted testing with an online dataset of 1000 encapsulated patient profiles with associated annotated medical imagery from the dermatology domain [7]. Experiments were conducted to test the previous patient case retrieval capabilities of the application. In this approach we were interested in showing that the system is capable of capturing and deciphering expert medical knowledge, and we were aiming to show that recommendations made by the system based on similar patient data could be used to support physician decision-making, thereby saving healthcare providers valuable time. We were also interested in demonstrating the ability of the application to facilitate effective knowledge and data sharing within the medical community.

We initiated our evaluation by selecting a number of cases (10) corresponding to different skin conditions from the dataset. These cases were then removed from the dataset for the purposes of the evaluation. The remaining cases in the database were clustered as being either 'relevant' or 'not relevant' to each of the 10 selected cases. The symptoms, diagnosis, image annotations and treatments for each of the 10 selected cases were entered into the system and similar cases were retrieved using the 'Find Similar Cases' button. For example, for a patient suffering from alopecia areata the symptoms were: 'The patient has relapsing and remitting patches of complete hair loss involving the scalp, eyebrows and eyelashes. The skin was otherwise normal.' The cases retrieved by the application were then analysed and each returned case was marked as either 'relevant' or 'not relevant' to the search query. These ratings were then compared to the clusters outlined earlier to examine whether the results were appearing in the relevant categories. This process was repeated for each of the 10 cases.

In order to graph the results for our prior case retrieval, we employed precision and recall metrics. Figure 10 shows the average precision and recall values of the results for the 10 queries, and Figure 11 shows the corresponding F-scores (the harmonic mean of precision and recall).
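For reference, with Rel denoting the set of cases relevant to a query and Ret the set of cases retrieved, the standard definitions are: precision = |Rel ∩ Ret| / |Ret|, recall = |Rel ∩ Ret| / |Rel|, and F = 2 × precision × recall / (precision + recall).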

From the graphs we observe that the system is performing case retrieval accurately. The precision remains high for the first group of cases retrieved, showing that the most relevant results are being retrieved first by the application. From a decision support perspective this means that the most pertinent patient cases are quickly recommended to caregivers for fast comparison with presenting patients, and accurate diagnoses can be reached with greater speed and efficiency. We also observe a linear increase in recall as the number of results returned increases, indicating that all relevant cases are eventually retrieved for each query.


Related work

In traditional diagnostic systems, expert knowledge tended to be captured in the form of empirical classification rules [8]. Such diagnostic systems often had limited application in the medical domain, as highly professional caregivers were unwilling to accept such a master–slave relationship [9]. Other shortcomings of these knowledge-based systems are described in [10], where it is suggested that ignoring user interactions with these applications can frequently be a cause of failure. These systems tended to be system-centric rather than task-centric or user-centric, as is our application, which concentrates on deciphering and supporting complex work procedures in real time as they are performed by expert users. In this research we have drawn on medical knowledge management initiatives that promote the collection, integration and distribution of a single medical modality [11]. This allows us to build encapsulated patient profiles that are used both to effectively store patient data and for the purposes of comparison with new patient profiles for diagnosis and treatment.

Figure 10 Precision and recall

Figure 11 F-score


The system we are developing also has beneficial implications from a healthcare modelling viewpoint [12], as explanatory models from amassed patient data can easily be created that can identify trends as well as compare diagnoses, treatments and departments.

We have also drawn on CBR research that generates and uses practice cases from a knowledge base for medical training and physician decision support [13, 14]. In our intelligent tutoring system, the cases used to educate caregivers are actual experiential records that contain advice and instructions on how to diagnose and treat patients by integrating many kinds of patient data. In addition, we have investigated work performed by [15] on integrating patient case histories with associated medical imagery.

In this research we are working with large collections of experience and user context. As in [16], we believe that user interactions with everyday productivity applications provide rich contextual information that can be leveraged to support access to task-relevant information. We have drawn from work that extracts context-relevant information during document browsing to support users in fulfilling tasks [17]. Our methods for annotating multimedia are related to annotation for the semantic web [18] and multimedia indexing [19], where the focus is on developing annotated descriptions of media content. Multimedia database approaches such as QBIC [20] provide for image annotation but use the annotations to contextualize individual images. In this work we are concerned with a task-centric view of the annotations, where we employ annotations to elucidate how an image relates to a current domain task by using annotations to contextualize task experiences.

Conclusions

The traditional system in hospitals whereby doctors enter patient information using paper charts is cumbersome and time-consuming and does not facilitate knowledge sharing. Different types of information, including imagery, are stored in different locations, and valuable time is often lost trying to correlate data to diagnose and treat patients. New EHR systems are addressing these problems, and our application can provide healthcare personnel with instant access to accurate integrated information that allows them to make critical decisions with greater speed and efficiency. It facilitates knowledge sharing and supports effective communication about the most effective ways to treat patients by linking similar patient case histories using CBR techniques. It adds more value to medical imagery by combining it with patient records to support more thorough communication, examination and diagnosis.

Our initial system evaluation has produced some very promising results. The system can effectively capture important patient information and can successfully retrieve similar previous patient case histories that offer useful real-time decision support to physicians at the point of care. We intend to conduct trials with domain experts in the near future. We also intend to supplement our textual retrieval by performing more complex textual analysis (e.g. lexical chaining). We are interested in incorporating a facility to record relevant feedback from physicians to improve the usability of the application for expert users.

References

1 Abidi S S R. Knowledge morphing: towards case-specific integration of heterogeneous medical knowledge resources. Proceedings of CBMS 2005, Eighteenth IEEE Symposium on Computer-Based Medical Systems, 2005.

2 Abidi S S R. Knowledge management in healthcare: towards 'knowledge-driven' decision-support services. International Journal of Medical Informatics 2001; 63; 5–18.

3 Sim I, Gorman P, Greenes R A, Haynes R B, Kaplan B, Lehmann H, Tang P C. Clinical decision support systems for the practice of evidence-based medicine. Journal of the American Medical Informatics Association 2001; 8; 527–34.

4 Wyatt J C. Management of explicit and tacit knowledge. Journal of the Royal Society of Medicine 2001; 94 (1); 6–9.

5 Nilsson M, Sollenborn M. Advancements and trends in medical case-based reasoning: an overview of systems and system development. Proceedings of the 17th International FLAIRS Conference, 2004.

6 Salton G, McGill M. Introduction to Modern Information Retrieval, 1983.

7 http://dermatlas.med.jhmi.edu/derm/

8 Buchanan B G, Shortliffe E H. Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project, 1984.

9 Sacco G M. Guided interactive diagnostic systems. Proceedings of CBMS 2005, Eighteenth IEEE Symposium on Computer-Based Medical Systems, 2005.

10 Brezillon P. Context in problem solving: a survey. The Knowledge Engineering Review 1999; 14 (1); 1–34.

11 Jadad A R, Haynes R B, Hunt D, Browman G P. The Internet and evidence-based decision making: a needed synergy for efficient knowledge management in health care. CMAJ 2000; 162.

12 Ivatts S, Millard P H. Health care modelling: why should we try? British Journal of Health Care Management 2002; 8 (6); 218–22.

13 Bichindaritz I. Solving safety implications in a case-based decision-support system in medicine. Workshop on CBR in the Health Sciences, ICCBR, 2003.

14 Bichindaritz I, Sullivan K M. Generating practice cases for medical training from a knowledge-based decision-support system. Workshop on CBR in the Health Sciences, ECCBR, 2002.

15 Bradley F. Putting fun into function with QuizMed – an interactive medical application. Proceedings of CBMS 2005, Eighteenth IEEE Symposium on Computer-Based Medical Systems, 2005.

16 Budzik J, Hammond K J. User interactions with everyday applications as context for just-in-time information access. Proceedings of IUI, 2000.

17 Bauer T, Leake D. WordSieve: a method for real-time context extraction. Proceedings of CONTEXT, 2001.

18 Hollink L, Schreiber A, Wielemaker J, Wielinga B. Semantic annotation of image collections. Proceedings of K-CAP Workshop on Knowledge Capture and Semantic Annotation, 2003.

19 Worring M, Bagdanov A, van Gemert J, Geusebroek J, Hoang M, Schreiber A, Snoek C, Vendrig J, Wielemaker J, Smeulders A. Interactive indexing and retrieval of multimedia content. Proceedings of the 29th Annual Conference on Current Trends in Theory and Practice of Informatics, 2002.

20 Flickner M, Sawhney H, Ashley J, Huang Q, Dom B, Gorkani M, Hafner J, Lee D, Petkovic D, Steele D, Yanker P. Query by image and video content: the QBIC system. IEEE Computer 1995; 28 (9).

Correspondence to: Dympna O'Sullivan

Dympna O'Sullivan, School of Management, University of Ottawa, 136 Jean-Jacques Lussier, Ottawa, ON K1N 6N5, Canada. E-mail: [email protected]

Eoin McLoughlin, School of Computer Science and Informatics, University College Dublin, Belfield, Dublin 4, Ireland. E-mail: eoin.a.mcloughlin

Michela Bertolotto, School of Computer Science and Informatics, University College Dublin, Belfield, Dublin 4, Ireland. E-mail: [email protected]

David C. Wilson, Department of Software and Information Systems, University of North Carolina at Charlotte, 9201 University City Boulevard, Charlotte, NC 28223-0001, USA. E-mail: [email protected]
