
Journal of Maps, 2009, 19-29

A Haptic Soundscape Map of the University of Oregon

MEGAN M. LAWRENCE 1, NICHOLAS MARTINELLI 1 and RACHEL NEHMER 2

1 Department of Geography, 1251 University of Oregon, Eugene, OR 97403, USA; [email protected].

2 Terra Firma Design, 1662 North Coast Highway, Newport, OR 97365, USA.

Abstract

A course jointly taught by the Departments of Geography and Computer Science designed the first haptic soundscape map of the University of Oregon campus. An electronic haptic soundscape map is a cartographic representation that is experienced through auditory cues and haptic (dynamic touch) feedback from the hand and fingers. The primary users of this map are blind and visually impaired persons. The goals of this student project were to create a tool that could be easily maintained, accessible at low cost, and flexible enough to be updated and changed in response to research feedback. The outcome of the project is a preliminary version of the University of Oregon campus haptic soundscape map, which continues to be improved by students.

(Received 20th May 2008; Revised 23rd February 2009; Accepted 9th March 2009)

Editorial Note: University campuses are complex and difficult to navigate for the tens of thousands of people that visit in a given year. The campus map is therefore a vital piece of information, challenging in its design in that it covers a large space, needs to incorporate tremendous detail about buildings, housing, recreation and access, and needs to be displayed on a relatively small computer screen. Interactive web mapping has provided the tool-set necessary to convert this complex, massive dataset into a manageable form (embedding information in mouse-overs, for example). In addition to navigation, campuses are storehouses for many layers of thematic datasets, such as historical information about land use, areas for community gardens, community plans, and recycling bin locations and routing, to name a few. Campus maps can therefore be both navigational and thematic in nature.

Universities act as a ‘living lab’ where students, faculty and staff have the opportunity to come together to work on projects that benefit the university and give the students applied project experience. Campus maps, for navigation and/or for specific themes such as historical representations of the landscape, are perfect examples of these integrative projects between students and the university. This paper, in addition to the two other papers in this series, is an example of one such project.

Sally Hermansen, Department of Geography, University of British Columbia
Mike J Smith, School of Geography, Geology and the Environment, Kingston University

ISSN 1744-5647


1. Introduction

Most, if not all, universities have campus maps readily available. These maps are often just a mouse click away, allowing students, staff and faculty to use them quickly and easily through a graphical computer interface. The ability to use and access campus maps is often taken for granted as information we need and expect. Over the years the University of Oregon has produced a variety of campus maps for a number of different and sometimes specialized purposes. For example, a map of the University of Oregon’s campus has been made to highlight lighted walkways and emergency telephones, providing information for safe travel after dark. But the University of Oregon did not have, as of 2007, a campus map for the special needs of the blind and visually impaired. Through a team-taught course in the Departments of Computer Science and Geography, a collaborative student project resulted in a haptic soundscape map, or a digital cartographic representation of the campus that uses both auditory and haptic cues to provide spatial information.

Like other map users, students, faculty, and staff who are blind or visually impaired need campus maps to support general exploration and route planning, and to serve as navigational aids. In general there are two main types of maps that can be provided: a non-electronic tactile map and a digital (or electronic) haptic soundscape map. Non-electronic tactile maps are similar to paper maps for the sighted in that they are produced, or “printed,” in a static format. They are read or experienced by using only the fingers, typically the fingers of both hands. Non-electronic tactile map features do not move or provide any vibrating or haptic feedback. Non-electronic maps have been studied from both a design perspective (Heath, 1958; Perkins, 2001; Jehoel et al., 2006; Lobben et al., 2008) and from the standpoint of usability (Eriksson, 2001; Ungar et al., 1995; Golledge et al., 1996; Espinosa et al., 1999). It has been concluded that tactile maps are helpful tools that provide the blind and visually impaired with spatial information and help with travel, navigation, and the creation of mental maps (Kitchin and Jacobson, 1997). Non-electronic tactile maps can, however, be expensive to produce and duplicate, require upkeep to incorporate new spatial information as campuses change, and use only one modality of input, touch, to deliver information.

An electronic haptic soundscape map, on the other hand, is experienced through auditory cues and haptic (dynamic touch) feedback from the hand and fingers. A haptic soundscape map has the ability to replace a conventional flat map with a computer interface that provides graduated continuous sound changes (Golledge, 1991) and varying vibration to represent geographic phenomena, and may thereby effectively incorporate both the auditory and the haptic modality domains. The term haptic soundscape map means that hardware and software technology have been brought together to create a multimodal geographic experience (Jacobson, 1998). The haptic mouse (which is commercially available) is used to explore the onscreen geographic information, giving vibrating feedback. The haptic feedback in conjunction with auditory cues creates the spatial layout of the map. By using multiple “channels” of modality, more spatial information may be able to reach the brain, and this could lead to a reduction in the errors and time needed to read haptic soundscapes (Shimoga, 1993; Rice et al., 2005). The interactions between touch and audio are complex (Jacobson, 2004), but through a number of research projects Jacobson (2000; 2004) has shown that a mixed-modality interface does lead to more comprehensive feedback with on-screen features.

This paper will present the goals of the team-taught course and describe how the University of Oregon haptic soundscape map was made (Figure 1). There will be a brief discussion of an informal user feedback meeting that took place with a group of blind and/or visually impaired individuals. There will then be a critique of the map, followed by suggestions for further development and testing.

Figure 1. The campus map shows the colored polygons and point symbols that have both sound and haptic feedback information attached to them. To experience all the sounds associated with each map feature please visit http://geography.uoregon.edu/geocog/projects/haptic.html


2. Project goals

One of the main course goals designed by the instructors was to create and engage in a semi-formal realistic relationship between a “client” (the geography students) and a product design team (the computer science students). The geography class provided the “problem”, or the request for a haptic soundscape map, along with input on the theoretical structure of the map and design considerations. The computer science students acted as a company that was designing a product based on the needs and wants of the client. They brought to the collaboration knowledge of hardware, software and programming.

The end result from the collaboration was a haptic soundscape map that could serve as both a research and a practical tool. There were three main goals for this project: (1) a tool that could be easily maintained; (2) accessible at a low cost; and (3) flexible enough for update and change after research feedback. By meeting the goals stated above, a haptic soundscape map was developed that can be used by blind and visually impaired students to explore campus geography. The map can also be used as a research tool to explore how the blind and visually impaired interact with such technology, as well as the effectiveness of a haptic soundscape in conveying geographic information, both alone and in comparison to non-electronic tactile maps.

The existing (visual) campus map infrastructure at the University of Oregon is built on an Adobe Flash and ArcGIS platform. Integrating the haptic soundscape map with this existing campus mapping infrastructure provides the opportunity for the resource to be maintained alongside the university’s other map products. A seamless connection of the haptic soundscape map within the university’s mapping system ensures that as the campus changes (i.e. new construction or renovation) the map will be automatically updated and will therefore remain a current, useful tool.

Adobe provides a free player for web-based Flash content, which supports the second goal, to create a financially accessible assistive map. In addition to the software cost concerns, there is a considerable range of hardware issues, such as haptic device costs and functions. Some haptic devices, such as the SensAble PHANTOM, make it possible for users to touch and manipulate virtual objects in three-dimensional space, but they are expensive. The cost keeps haptic devices such as these out of the hands of average blind and visually impaired users. A vibration mouse is another device available for haptic feedback; it emits vibrational responses that can be adjusted for intensity but cannot emulate three-dimensional space. The vibration mouse is less expensive than other devices and again supports the second goal. It was critical to the development of this project to keep the map accessible to as many students and staff as possible. To do that, the hardware must be affordable. Keeping to the goal of financial accessibility, the class chose a less expensive vibrational mouse to work with the prototype map, in the hope that a quality product could be created without high costs and without the finished map being confined to the Disabilities Office.

The ability to easily update and adjust the map for use as a research tool in a human subject testing environment (goal three) is very important for continued research into how haptic soundscapes are used and interpreted by the blind and visually impaired. This would require an interface that could be updated by non-programmers, providing geographic researchers with access to important visual, auditory and haptic cue controls.

3. Method

The computer science team developed a set of tasks based upon the constraints and final product goals of the project, as noted above. The tasks were divided amongst groups within the computer science class. The three main tasks were to:

1. Transform the campus GIS (Geographical Information Systems) data from XML (Extensible Markup Language) format to a format which could easily be drawn by Adobe Flash (a minimal sketch of this kind of transformation appears after the list);

2. Create a stylesheet language, similar to the Cascading Style Sheets (CSS) used in HTML (HyperText Markup Language) development, that would allow the geography team to make adjustments to visual, haptic and audio variables for experimental purposes. The custom stylesheet syntax was designed to resemble CSS so that it would be familiar to anyone with previous CSS experience;

3. Create a program that dynamically combines the HTML and Adobe Flash files for online display.
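The XSLT transformation used by the computer science team is not reproduced in this paper. Purely as an illustration of the idea behind the first task, the sketch below (in Python, using the third-party lxml library) applies a toy XSLT stylesheet to a hypothetical feature-class export; the file name, element names and output structure are invented for the example and do not reflect the project's actual data model.

```python
# Illustrative only: applying an XSLT stylesheet to exported GIS XML.
# The file name, element names and output format are hypothetical; the
# actual transform used in the project is not documented here.
from lxml import etree

# A toy XSLT that pulls feature IDs and coordinate strings out of an
# (invented) feature-class export and emits a flat, drawing-friendly list.
XSLT_SOURCE = b"""<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>
  <xsl:template match="/">
    <features>
      <xsl:for-each select="//feature">
        <shape id="{@id}" class="{@class}">
          <xsl:value-of select="coordinates"/>
        </shape>
      </xsl:for-each>
    </features>
  </xsl:template>
</xsl:stylesheet>
"""

def transform(xml_path: str) -> bytes:
    """Apply the toy stylesheet to an exported feature-class XML file."""
    transform_fn = etree.XSLT(etree.XML(XSLT_SOURCE))
    result = transform_fn(etree.parse(xml_path))
    return etree.tostring(result, pretty_print=True)

if __name__ == "__main__":
    print(transform("buildings_export.xml").decode())
```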

The campus GIS data were originally stored as feature classes in an ArcSDE geodatabase. The relevant feature classes were exported as XML files. There were nine features that the geography team chose to include that would most assist navigation through the campus. These included a grid overlay, buildings, athletic surfaces, bicycle racks, parking lots, streets, walks, walk underpasses, water features, and point features including call boxes, bus stops, stairs, and public building entrances. These features are drawn and symbolized based upon settings within the stylesheet. The stylesheet provides the geographers with the ability to easily adjust the color, vibration, and sound associated with given features.
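The custom stylesheet syntax itself is not documented here. As a hedged illustration only, the snippet below shows how a small CSS-like block of per-feature colour, sound and vibration settings might be declared and read; the property names and values are invented for this example and are not the project's actual symbology.

```python
# Illustrative only: a toy parser for a CSS-like map stylesheet.
# The property names and syntax are invented for this example; the
# project's actual stylesheet language may differ in both respects.
import re

EXAMPLE_STYLESHEET = """
building {
    color: #A0522D;
    sound: door_chime.wav;       /* discrete cue on enter/exit */
    ambient: building_hum.wav;   /* continuous cue while over  */
    vibration: pulse_strong;
}
walk {
    color: #CCCCCC;
    ambient: footsteps_loop.wav;
    vibration: buzz_light;
}
"""

RULE_RE = re.compile(r"(\w+)\s*\{(.*?)\}", re.DOTALL)

def parse_stylesheet(text: str) -> dict:
    """Return {feature_class: {property: value}} from a CSS-like block."""
    text = re.sub(r"/\*.*?\*/", "", text, flags=re.DOTALL)  # strip comments
    styles = {}
    for feature, body in RULE_RE.findall(text):
        props = {}
        for line in body.split(";"):
            if ":" in line:
                key, value = line.split(":", 1)
                props[key.strip()] = value.strip()
        styles[feature] = props
    return styles

if __name__ == "__main__":
    for feature, props in parse_stylesheet(EXAMPLE_STYLESHEET).items():
        print(feature, props)
```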


All features already existed as part of the University of Oregon campus Geographical Information Systems infrastructure except the grid overlay and walk underpasses. The walk underpasses feature class was created to account for the fact that building footprints sometimes hide an accessible path that runs either through or under a structure. The creation and use of a grid overlay on haptic soundscape maps is a technique previously developed and researched (Jacobson, 2000; Rice et al., 2005). By including a grid system, information can be given about what direction the user is moving (cardinal direction) and how far the user has gone (scale). The inclusion of a grid overlay aids in the perception of space, movement and orientation for the map user.
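The grid feedback is described above only in behavioural terms. The following sketch shows one possible way to turn successive cursor positions into grid-cell and cardinal-direction announcements; the 50-pixel cell size, the coordinate convention (screen y increasing downwards) and the message wording are assumptions made for the example, not the project's actual settings.

```python
# Illustrative only: deriving grid position and cardinal direction from
# successive cursor positions. Cell size and messages are assumptions.
from typing import Optional, Tuple

CELL_SIZE = 50  # pixels per grid cell (assumed value)

def grid_cell(x: int, y: int) -> Tuple[int, int]:
    """Map a screen coordinate to a (column, row) grid cell."""
    return x // CELL_SIZE, y // CELL_SIZE

def announce_movement(prev: Tuple[int, int], curr: Tuple[int, int]) -> Optional[str]:
    """Return an announcement when the cursor crosses into a new grid cell."""
    (pc, pr), (cc, cr) = grid_cell(*prev), grid_cell(*curr)
    if (pc, pr) == (cc, cr):
        return None  # still in the same cell: no announcement needed
    ns = "south" if cr > pr else "north" if cr < pr else ""  # screen y grows downwards
    ew = "east" if cc > pc else "west" if cc < pc else ""
    return f"moved {ns}{ew} into grid cell ({cc}, {cr})"

if __name__ == "__main__":
    print(announce_movement((40, 40), (60, 40)))   # crosses a cell edge moving east
    print(announce_movement((60, 40), (60, 120)))  # crosses a cell edge moving south
```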

Choices were also made regarding continuous versus discrete cues associated with certain features. Continuous auditory cues, or sounds that stay on continuously while the mouse is over the feature, were created for buildings, walks and streets. A continuous cue was used for streets and paths to provide constant feedback to the user that their mouse remained over the feature. Streets and paths are long, sometimes narrow features, and a continuous auditory cue was designed to aid a user’s ability to follow those features (perhaps to help in the task of route planning). Buildings were also given a continuous sound cue because they are objects that need to be explored. Each building has a different footprint, and a user may want to investigate the extent and shape of each building. Buildings also had discrete auditory cues, or a one-time, single sound that plays once the mouse cursor encounters the feature. The discrete building cues give feedback about when the mouse cursor rolls in and out of a building. These discrete auditory cues are meant to be an additional signal for the edges of objects. Discrete auditory cues were also created for the bicycle racks and other point features. As the mouse entered the shape defined in the stylesheet, a digitally encoded sound played.
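The rollover behaviour is described here only in prose. The sketch below illustrates the general logic of pairing a one-off “edge” cue with a looping “over” cue; the class names, sound identifiers and the play/loop/stop interface are invented stand-ins and are not the project's Flash/JSyn implementation.

```python
# Illustrative only: continuous vs. discrete auditory cue handling.
# Class names, sound identifiers and the player interface are invented.
class FeatureCues:
    """Cue settings for one map feature class (e.g. buildings, walks)."""
    def __init__(self, name, edge_sound=None, ambient_sound=None):
        self.name = name
        self.edge_sound = edge_sound        # discrete cue at the feature edge
        self.ambient_sound = ambient_sound  # continuous cue while hovered

class CueController:
    """Plays a discrete cue on enter/exit and loops a continuous cue while over."""
    def __init__(self, player):
        self.player = player  # any object with play(sound), loop(sound), stop()
        self.current = None

    def on_mouse_move(self, feature):
        if feature is self.current:
            return  # still over the same feature: keep the continuous cue running
        if self.current is not None:
            self.player.stop()  # end the continuous cue for the feature being left
            if self.current.edge_sound:
                self.player.play(self.current.edge_sound)  # roll-out edge cue
        if feature is not None:
            if feature.edge_sound:
                self.player.play(feature.edge_sound)        # roll-in edge cue
            if feature.ambient_sound:
                self.player.loop(feature.ambient_sound)     # continuous cue
        self.current = feature

if __name__ == "__main__":
    class PrintPlayer:  # trivial stand-in that just logs what would be heard
        def play(self, sound): print("play", sound)
        def loop(self, sound): print("loop", sound)
        def stop(self): print("stop continuous cue")

    building = FeatureCues("building", "door_chime.wav", "building_hum.wav")
    controller = CueController(PrintPlayer())
    controller.on_mouse_move(building)  # enter a building footprint
    controller.on_mouse_move(None)      # leave it again
```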

Haptic cues, or forced vibrations from the mouse to the user’s hand, proved more challenging to produce than the auditory cues. The haptic mouse device, paired with Immersion TouchSense software, allowed for the programming of a variety of vibrational settings. Again, as the mouse entered the shape defined in the stylesheet, a haptic cue (different for each shape object) was emitted and felt by the user. The haptic mouse device used in this project was originally designed to add extra haptic sensations when playing video games (a dominantly visual experience). While there were multiple vibrational settings to choose from with this device, the physical vibrations produced were subtle. The subtlety of the feedback may be considered a drawback of this type of technology for use with haptic soundscapes, but the device was able to deliver a paired feedback system including both haptic and auditory cues for all features. By pairing the feedback, each major feature gives the user multiple sensory signals. This ensures that users without haptic mouse devices still have access to the same spatial information in the absence of the haptic interface.


4. Map design

The map produced in this project used non-traditional cartographic design methods. At this time there are no standard audio and haptic “symbols” for haptic soundscape maps, although there are design suggestions (Rice et al., 2005). Therefore, the auditory symbol set was based on iconic, easy-to-interpret audio cues. The haptic feedback was based on the capabilities of the haptic mouse that was used. As noted above, all environmental features represented were accompanied by both an auditory and a haptic cue when the user has the mouse device over a map feature.

The types of environmental features that were included on the map were based upon results from a survey that was posted on a number of listservs used by visually impaired persons, Orientation and Mobility instructors, and others. The survey asked respondents to list what environmental features needed to be represented on tactile maps. From the 140 responses, a list of environmental features was compiled and a frequency analysis performed. The environmental features presented on the haptic soundscape map were those most often reported as important to include on maps (results reported in Lobben et al., 2008).
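The frequency analysis is reported in detail elsewhere (Lobben et al., 2008); purely as a simple illustration of the counting step (the response strings below are invented), such a ranking could be produced along the following lines.

```python
# Illustrative only: ranking environmental features by how often survey
# respondents mention them. The example responses are invented.
from collections import Counter

responses = [
    ["building entrances", "stairs", "bus stops"],
    ["bus stops", "crosswalks", "building entrances"],
    ["stairs", "building entrances"],
]

counts = Counter(feature for response in responses for feature in response)

# Most frequently mentioned features first, e.g. for selecting map content.
for feature, n in counts.most_common():
    print(f"{feature}: mentioned {n} times")
```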

Each environmental feature has been housed in a separate layer so that it can easily be turned on and off. This was done for two main reasons. Firstly, it allows researchers to control the number and types of environmental features a blind subject encounters. Secondly, it allows a blind user to make choices about what they would like to have included on the map.

5. User feedback

An informal user feedback meeting was conducted with a group of blind and/or visually impaired persons. The feedback meeting was not a formal experiment and thus will not be reported as such. But the user meeting did reveal some general patterns of use as well as potential problems with the map. Firstly, it was generally considered that the audio symbology was more easily and quickly understood than the haptic feedback. Many of the environmental features were symbolized by auditorily iconic symbols (such as the bicycle bell sound representing bike racks) and not earcons or synthetic sounds (a further discussion of earcons can be found in Brewster and Edwards (1993)). The iconic sounds were used to make it easier for the user to associate a map symbol with knowledge they already possessed and therefore eliminate the need to memorize a legend of both the audio cue for a feature and the haptic cue that accompanied it. The reason for the ease in understanding the audio symbols may be the larger available range of sounds to choose from, which led to sounds that were easy to differentiate from one another. Representing real-world environmental features with known sounds not only lowers the learning curve but may also lower the cognitive load it takes to interpret the auditory map symbols. A more formal experiment would be necessary to provide in-depth conclusions.

The haptic symbology was less easily understood and was even described as mildly annoying by some of the users. Again, the reason for the apparent lower level of success for the haptic symbols may be the more limited range of haptic symbology available (i.e. many of the symbols “felt” similar, making it difficult to remember what they represented).

Finally, the most significant results were noted in the use of the map grid system. The grid system was included to give map users a spatial frame of reference. However, even with the grid system, spatial updating proved difficult for most participants. The non-linear spatial association between mouse location and cursor location on the screen made it difficult for users to maintain a spatial reference of where they were on the map in comparison to other map features that they had explored. A Braille or tactile map allows the user to move over space and associate real hand position with spatial location. With a mouse, the common practice of lifting and repositioning the device as it gets farther away removes the direct relationship between the position of the hand and the position of the cursor on the map.

In comparison, a paper tactile map provides tactual spatial reference, or the ability for the map user to use his/her hands as a spatial reference. The user can use one hand to collect spatial information about the map while using the other hand as an ”anchor” to keep relative position to other map features.

6. Discussion

The choice of Adobe Flash for dynamic drawing of the map has proven less responsive than desired using the current program. The latency of the auditory feedback is inconsistent, which could cause confusion about the locations of features. The latency between the mouse rolling over or past an object and the cue being played is especially evident when the mouse is moved quickly. Further refinement of the programming may alleviate the strain that the dynamic drawing puts on the Flash interface.

The haptic symbology was not as effective as the team had planned. To address the haptic feedback system, there needs to be a comprehensive user testing experiment on the vibration settings available with the haptic mouse device used, in order to optimize what is possible and perhaps find settings that work better. It may also be useful to try different pairings between vibrational settings and certain sound cues; there may be optimal pairings that would enhance the haptic symbology. Furthermore, it would be useful to investigate the full range of haptic devices available and their cost/performance ratios.

A more formal, experimental user testing session is in order. It was discovered through the informal testing that iconic audio symbology is effective and that a direct link between actual hand position and screen position would assist in spatial learning from the haptic soundscape map.

7. Conclusions

The creation of the haptic soundscape map in a teaching setting provided an opportunity for students to work in groups and rely on each other’s strengths and abilities. The students in both classes gained real-world experience by outlining and creating a product with multiple goals, much like working in a production cartography lab or product design company. The haptic soundscape map made for this course is a preliminary version and is not currently being used by blind and visually impaired persons on campus. However, the haptic soundscape map project is being continued by another class in the Computer Science Department with the goal of overcoming some of the problems identified above. The University of Oregon haptic soundscape map will be made available to blind and visually impaired students and staff in the future.

8. Acknowledgements

We would like to acknowledge all the students in both the Geography and Computer Science courses who made this project successful. We would also like to thank Professor Amy Lobben and Professor Michal Young for their excellent instruction.

Software

The Geography team exported GIS data from ArcMap in XML format. The data were manipulated by the Computer Science team using an XSLT XML transformation into a format for use with Adobe Flash. Immersion TouchSense software was used to create the various vibration symbols for the surfaces on the map. JSyn software was used to manage and generate audio symbols. Original programs were created by the CIS team to transform and render the map in Flash.

References

BREWSTER, S., WRIGHT, P. and EDWARDS, A. (1993) An evaluation of earcons for use in auditory human-computer interfaces, In Proceedings of the Conference on Human Factors in Computing Systems, Amsterdam, Netherlands, pp. 222–227.

ERIKSSON, Y. (2001) How to make tactile maps discriminable for touch, In Proceedings of the 20th International Cartographic Conference, Beijing, China, pp. 2914–2922.

ESPINOSA, A., UNGAR, S., OCHAITA, E., BLADES, M. and SPENCER, C. (1999) Comparing methods for introducing blind and visually impaired people to unfamiliar urban environments, Journal of Environmental Psychology, 18, 277–287.

GOLLEDGE, R. (1991) Tactual strip maps as navigational aids, Journal of Visual Impairment and Blindness, 85, 296–301.

GOLLEDGE, R. G., KLATZKY, R. L. and LOOMIS, J. M. (1996) The Construction of Cognitive Maps, chap. Cognitive mapping and wayfinding by adults without vision, Dordrecht, the Netherlands, pp. 215–246.

HEATH, W. R. (1958) Maps and graphics for the blind: Some aspects of the discriminability of textural surfaces for use in areal differentiation, Unpublished PhD thesis, Department of Geography, University of Washington.

JACOBSON, R. D. (1998) Navigating maps with little or no sight: A novel audio-tactile approach, In Proceedings of the Content, Visualization and Intermedia Representation, University of Montreal, Canada, pp. 95–102.

JACOBSON, R. D. (2000) Exploring geographies of blindness: Learning, reading, and communicating geographic space, Unpublished PhD thesis, Department of Geography, Queen’s University of Belfast, Northern Ireland.

JACOBSON, R. D. (2004) Haptic soundscapes: Developing novel multi-sensory tools to promote access to geographic information, In JANELLE, D., WARF, B. and HANSEN, K. (eds.) WorldMinds: Geographical Perspectives on 100 Problems, Dordrecht, the Netherlands, pp. 99–103.

JEHOEL, S., MCCALLUM, D., ROWELL, J. and UNGAR, S. (2006) An empirical approach on the design of tactile maps and diagrams: The cognitive tactualization approach, British Journal of Visual Impairment, 24, 67–75.

KITCHIN, R. and JACOBSON, R. D. (1997) Techniques to collect and analyze the cognitive map knowledge of persons with visual impairment or blindness: Issues of validity, Journal of Visual Impairment and Blindness, 91, 360–376.

LOBBEN, A., LAWRENCE, M. and FICKAS, S. (2008) Standardized symbology for tactile navigation maps, Association of American Geographers Annual Meeting, Boston, Massachusetts.


PERKINS, C. (2001) Tactile campus mapping: Evaluating designs and production technology, In Proceedings of the 20th International Cartographic Conference, Beijing, China, pp. 2906–2913.

RICE, M., JACOBSON, D., GOLLEDGE, R. and JONES, D. (2005) Design considerations for haptic and auditory map interfaces, Cartography and Geographic Information Science, 32, 4, 381–391.

SHIMOGA, K. B. (1993) A survey of perceptual feedback issues in dexterous telemanipulation. Part I: Finger force feedback, In Proceedings of the IEEE Virtual Reality Annual International Symposium (VRAIS), Seattle, Washington, pp. 263–270.

UNGAR, S., BLADES, M. and SPENCER, C. (1995) Visually impaired children’s strategies for memorizing maps, British Journal of Visual Impairment, 13, 27–32.
