Explorations in Gesture Input for PACS Workstation Control

Cliff Edwards
12/21/2011

McKesson Enterprise Medical Imaging
130-10711 Cambie Road
Richmond, BC, Canada V6X 3G5

[email protected]
1-604-279-5422, ext. 2496


Keywords Capacitive proximity sensing · Human-computer interaction · Multi-touch gesture · Natural user interface · PACS workstation


Purpose

Radiologists want their PACS diagnostic workstation interaction to be as efficient as possible, with significant interest in shaving even a few seconds off the completion time of reading a study. They also require the interface to be comfortable for long periods of use, to avoid fatigue that can lead to repetitive stress injuries (RSI). While the mouse has been a remarkably useful graphical user interface (GUI) interaction device for decades, it is not without issues for highly repetitive, image- and graphics-intensive use. Radiologists often perform the same image navigation and manipulation tasks, such as scrolling a stack of images, zooming and panning images, and adjusting image window and level, hundreds of times a day. This can lead to mouse-related fatigue and, in severe cases, RSI. Typical PACS diagnostic workstation software relies on standard GUI conventions such as drop-down menus, graphical tools, and keyboard and/or mouse shortcuts, which may not be the most efficient approach for these common image operations.

The widespread adoption of consumer mobile devices such as smartphones and tablets has shown there are powerful touch and gesture alternatives to mouse-driven GUIs. However, it is not immediately obvious how to translate mobile-device-style direct display interaction to the PACS diagnostic workstation in the reading room, where medical-grade displays must sit vertically on a desk or table. Any touch or gesture interaction should happen with the arms in a comfortable resting position on the desk, in the area around the typical placement of the mouse and keyboard. Touching the primary interpretation displays with the fingers is out of the question due to poor ergonomics and the fingerprint smudges that would rapidly accumulate.

Many techniques for sensing gestures were considered or evaluated, including optical or vision-based approaches (web cameras, time-of-flight depth-sensing cameras), projected capacitive proximity sensing, multi-touch sensing surfaces, and touch sensors. One constraint was the requirement that nothing be worn or held by the user. Another was that the technique must work in a reading room with reduced ambient lighting. In addition, an attempt was made with each evaluation to support eyes-free use of the device as much as possible.

It is encouraging that suitable consumer devices are beginning to appear on the market with software development kits (SDKs) and application programming interfaces (APIs), such that software can be written to support some of the specific interaction needs of the PACS workstation.

This paper includes descriptions of several noteworthy gesture interaction techniques and devices developed and prototyped in greater depth: a multi-touch sensing pad, and a keyboard augmented in two ways, with projected capacitive proximity sensors inside the case and with force sensing resistor strips on top of it.

Methods

The PACS GesturePad is a versatile multi-touch sensing pad with software to map finger taps and gestures on defined regions of the pad into image review control operations. The pad sits close at hand beside the keyboard, where one hand can reach it while the arm still rests comfortably. The functional regions are defined using a matte paper template overlaid on the sensing surface, providing a tactile sense of where the fingers are touching without the user always having to look. The large central region supports a gesture-enabled mouse trackpad, similar to consumer products on the market such as the Apple Magic Trackpad or the recent Wacom Bamboo multi-touch tablets. Multi-finger gestures such as the ubiquitous pinch zoom or a two-finger slide to scroll are performed here, as well as three- and four-finger sliding gestures to control pan and window/level respectively. The pad also has regions defined for discrete operations, essentially the equivalent of keyboard shortcuts. The shortcuts in this case are each labeled on the template overlay for ease of identification, eliminating the need to remember obscure keyboard key combinations. These “virtual buttons” are activated by a simple tap of the finger on the labeled region. Finally, the pad supports regions to each side of the central trackpad area dedicated to the most common interactive operations, such as fast scroll and zoom. These are easily controlled by sliding one or two fingers forward and back within the respective region of the pad to vary the operation.
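
As a rough sketch of how such a template-driven mapping could be structured in software (the region layout, the region names, and the viewer dispatch interface below are illustrative assumptions, not the prototype's actual implementation), a simple hit-test table can route taps on virtual buttons to discrete commands and slides within interactive regions to continuous adjustments:

```python
# Hypothetical sketch of a template-driven pad layout: each named region
# maps either to a discrete action (virtual button, fired on tap) or to a
# continuous operation driven by sliding within the region.
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float
    kind: str    # "button" for tap-activated shortcuts, "slider" for interactive ops
    action: str  # operation name understood by the (hypothetical) viewer layer

REGIONS = [
    Region("reset_wl",    0.00, 0.00, 0.20, 0.10, "button", "reset_window_level"),
    Region("next_series", 0.20, 0.00, 0.40, 0.10, "button", "next_series"),
    Region("fast_scroll", 0.00, 0.10, 0.15, 1.00, "slider", "scroll"),
    Region("zoom",        0.85, 0.10, 1.00, 1.00, "slider", "zoom"),
    Region("trackpad",    0.15, 0.10, 0.85, 1.00, "slider", "pointer"),
]

def hit_test(x, y):
    """Return the region under a touch point, or None if it misses all regions."""
    for r in REGIONS:
        if r.x0 <= x <= r.x1 and r.y0 <= y <= r.y1:
            return r
    return None

def on_tap(x, y, viewer):
    r = hit_test(x, y)
    if r and r.kind == "button":
        viewer.execute(r.action)     # discrete shortcut, e.g. reset window/level

def on_drag(x, y, dy, viewer):
    r = hit_test(x, y)
    if r and r.kind == "slider":
        viewer.adjust(r.action, dy)  # forward/back slide varies the operation
```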

Building on work with PACS GesturePad prototypes in 2009 and 2010, a revised prototype was developed in 2011; it utilizes a consumer Wacom Bamboo Create multi-touch and pen tablet as the sensor (see Fig. 1). A small custom application interprets several gestures and maps them to inputs for the PACS workstation software. A simple case made of clear plastic sheet was used to hold the matte paper template in place on the pad and to allow it to be easily slid out and changed.
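
The paper does not detail the custom application, but a minimal sketch of the interpretation step might classify each frame of touch motion by finger count, distinguishing a pinch from a parallel two-finger slide; the touch-frame format and the send_to_viewer hook here are hypothetical stand-ins for the tablet SDK and the PACS workstation input channel:

```python
# Minimal sketch of mapping multi-finger gestures to image operations,
# following the mapping described above: two fingers scroll, three pan,
# four adjust window/level, and a pinch zooms.
import math

FINGER_COUNT_OPS = {2: "scroll", 3: "pan", 4: "window_level"}

def pinch_delta(prev, cur):
    """Change in distance between the first two touches (positive = spread)."""
    def dist(t):
        (x0, y0), (x1, y1) = t[0], t[1]
        return math.hypot(x1 - x0, y1 - y0)
    return dist(cur) - dist(prev)

def mean_delta(prev, cur):
    """Average (dx, dy) motion across all touches between two frames."""
    n = len(cur)
    dx = sum(c[0] - p[0] for p, c in zip(prev, cur)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev, cur)) / n
    return dx, dy

def interpret(prev, cur, send_to_viewer, pinch_threshold=5.0):
    """Classify one frame of touch motion and forward it to the viewer.

    prev and cur are lists of (x, y) touch points from consecutive frames
    with the same finger count; send_to_viewer stands in for whatever
    channel drives the PACS workstation (e.g. synthesized shortcuts).
    """
    n = len(cur)
    if n == 2:
        d = pinch_delta(prev, cur)
        if abs(d) > pinch_threshold:   # spread/squeeze beats a parallel slide
            send_to_viewer("zoom", d)
            return
    if n in FINGER_COUNT_OPS:
        send_to_viewer(FINGER_COUNT_OPS[n], mean_delta(prev, cur))
```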


Fig. 1 Labeled regions of the PACS GesturePad identify areas where common image review workflow operations are performed. Discrete operations are mapped to virtual buttons on the upper and left sides of the pad. Power scroll and Zoom are mapped to interactive sliding regions to each side of the central trackpad area. The trackpad area can be used like a mouse trackpad to drive the cursor, as well as to perform zoom, pan, slow scroll, and window & level operations

From a hardware prototyping and consumer device perspective, a more futuristic research project involved the modification of a standard keyboard, augmenting it with sensors for gesture input in two different ways. One approach involved applying force sensing resistor (FSR) strip touch sensors from Interlink Electronics to the top of the keyboard case where no keys reside, providing control of image zoom, window and level adjustment, and series scrolling. A second approach involved embedding projected capacitive proximity (PCP) sensors from IDENT Technologies into the keyboard, with the sensing field projecting several centimeters into the air above the keyboard through the plastic case and keys. Engaging the invisible electric sensing field with a hand then provided control over image zoom, pan, window and level adjustment, and series scroll. Both approaches met the objective of allowing the hands to remain over the keyboard while performing interactive operations through gesture, without reaching for the mouse (see Fig. 2).
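
For the FSR approach, a linear-position strip read through an ADC could be polled and relative slides converted into parameter changes. The following is a minimal sketch under assumed hardware hooks (read_strip_position, the gains, and the operation binding are illustrative, not the prototype's actual firmware):

```python
# Minimal sketch of turning slides along a force sensing resistor (FSR)
# strip into continuous parameter changes. Assumes a linear-position FSR
# wired to an ADC so that read_strip_position() returns 0.0-1.0 while
# pressed and None when released.
import time

GAIN = {"zoom": 4.0, "scroll": 60.0, "window_level": 512.0}

def run_strip(operation, read_strip_position, send_to_viewer):
    last = None
    while True:
        pos = read_strip_position()
        if pos is not None and last is not None:
            # Relative motion along the strip drives the bound operation,
            # so it does not matter where on the strip the finger lands.
            send_to_viewer(operation, (pos - last) * GAIN[operation])
        last = pos                # resets to None when the finger lifts
        time.sleep(0.01)          # poll at roughly 100 Hz
```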


Results

Positive feedback was obtained from radiologists for two PACS GesturePad prototype iterations at two RSNA trade show demonstrations. The most recent PACS GesturePad iteration, utilizing a Wacom Bamboo Create for sensing, is now ready for use in field trials.

The FSR strip sensors worked well during demonstrations, providing very robust and positive sensing and parameter control close at hand to the fingers at the keyboard. Drawbacks of the technique are the need for significant keyboard case redesign to support the sensors without exposed traces and wires, and the need for an x-y matrix sensor for controlling operations such as pan and window leveling, which were not implemented on the prototype. Keyboards with embedded trackpads are available and may support this requirement for future prototyping. Because the FSR strip sensors have to be placed to avoid keys, sensor positions are constrained and may not be optimal. See the white and black FSR sensing strips on the keyboard case in Fig. 2a.

The concept of gesture control in the air over a PCP-sensing augmented keyboard was intriguing to many viewers at the RSNA trade show demonstration and holds further research promise, as the sensors for detecting the gestures are not visible and it is a “zero touch” technique (see Fig. 2a). Consumer hardware is not yet available for this technique, so research is more complex, involving more extensive hardware prototyping. We showcased a simple proof of concept utilizing a PCP evaluation board with only four sensors, in a square area 3.4 cm on a side (see Fig. 2b). The evaluation board had no difficulty sensing a hand or fingers placed over the sensor area of the keyboard, with the sensing fields easily projecting through the plastic layers of the keyboard case and several centimeters into the air above. With only four sensors in a small area, the prototype was able to detect hand approach in the z direction and hand position in the x and y directions, making it possible to control the zoom, pan, scroll, and window leveling operations. However, the small sensing area meant control of these operations was not precise or robust enough for commercial use. The approach requires and merits additional phases of prototype development to increase the sensing area throughout the keyboard and improve resolution, thereby improving gesture expression. Determining appropriate visual feedback techniques to help the user with effective control is another future research direction.
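
One simple way to recover coarse hand position from four corner sensors of this kind is a signal-weighted centroid, with the summed signal serving as a proximity (z) estimate. The sketch below assumes normalized readings and a corner layout; it is not the evaluation board's actual signal processing:

```python
# Sketch of estimating hand position over a 2x2 projected capacitive
# proximity sensor array via a signal-weighted centroid. The corner
# layout, normalization, and use of summed signal as a z (approach)
# estimate are illustrative assumptions.

# Sensor corner coordinates in cm for a 3.4 cm square array.
CORNERS = [(0.0, 0.0), (3.4, 0.0), (0.0, 3.4), (3.4, 3.4)]

def estimate_hand(signals, noise_floor=0.05):
    """signals: four normalized sensor readings (0..1), one per corner.

    Returns (x, y, strength) or None if no hand is present; a stronger
    total signal indicates a closer hand (a coarse z estimate).
    """
    total = sum(signals)
    if total < noise_floor:
        return None                  # nothing within sensing range
    x = sum(s * cx for s, (cx, _) in zip(signals, CORNERS)) / total
    y = sum(s * cy for s, (_, cy) in zip(signals, CORNERS)) / total
    return x, y, total
```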


Fig. 2 a – performing a free-space control gesture using the projected capacitive proximity (PCP) sensing circuit; b – the GestIC PCP evaluation board shown lifted from the hole it sits in underneath the keyboard

Conclusion

Through discussions with radiologists it is clear that many desire to try alternative interaction devices and techniques to the mouse and keyboard for controlling PACS diagnostic workstation software. Some are looking for efficiency, some for comfort, and a few for reduction of their RSI symptoms. This exploration of suitable mouse and keyboard alternatives has shown there are promising techniques to move forward with, especially those for which consumer hardware is already readily available.

We do not yet advocate fully replacing the mouse with a gesture interface. However, providing an alternative for the most common and repetitive operations may help by reducing reliance on the mouse and lowering overall mouse use.

The PACS GesturePad device described above is ready for field trials. Future research on this technique involves migrating to a thin multi-touch display that can lie flat on the desk. Using a display means the layout and supported functions can be changed programmatically in software. However, it also means a different technique is needed to provide eyes-free indication of where the fingers are on the display. We plan to address this in future work by exploring haptic feedback on a multi-touch display to give tactile guidance through the fingertips.

While they showed promise as limited-scope first prototypes, the augmented keyboard techniques require expanded proof-of-concept prototype development before they are ready for users.

Future research plans also include continuing to explore new interaction techniques and devices for controlling a PACS diagnostic workstation image viewing application, potentially including eye tracking technology, wearable devices such as touch display smart watches, and zero-touch, free-space gesture control for surgical use cases in the operating room.
