
Enhancing Learning in Distributed Virtual Worlds through Touch: A Browser-based Architecture for Haptic Interaction

Sylvester Arnab, Panagiotis Petridis, Ian Dunwell and Sara de Freitas

Serious Games Institute, Coventry University, United Kingdom

Abstract

As technology evolves to support an increasingly diverse range of pedagogic approaches, various kinds of technology-driven interventions have already been integrated into conventional teaching methods, including web-based tools and mobile devices, social networking environments and computer games. Such integration is capable of providing platforms to support formal and informal learning experiences, in both blended and standalone contexts. Innovations in teaching and learning, promoting a knowledge society, and supporting tremendous growth in demand for highly educated and skilled individuals, are influenced by advances in technology as well as by the emergence of a new generation for whom technology is an integral part of the learning process. This generation is able to migrate between existing and new technologies, building socially driven communities, and has high expectations of fidelity and dynamics from virtual learning environments.

However, as well as providing support for this fidelity, it is important to consider the pedagogic models that underpin innovative, technology-enhanced learning techniques. In order to increase and sustain the engagement and receptiveness of the learner and to ensure effective learning outcomes, immersive learning facilitates the complete involvement of learners in their learning environment, harnessing techniques of exploration to enhance understanding of particular subjects. Whereas an encyclopaedic approach to understanding the characteristics of an ancient artefact requires a degree of intrinsic motivation on the part of the learner, examining and exploring the artefact first-hand via visual, social or tactile interaction can prove a more compelling and engaging experience, reaching wider student audiences more effectively.

Supporting experiential and exploratory models of learning, this chapter describes a method to combine different sensory elements of an immersive experience, to create an engaging and immersive multimodal approach to learning. By introducing tactile interfaces alongside visual and social ones, it is envisaged that learners will be more immersed and that the gap between virtual and real spaces will be bridged by deeper involvement of the learner. By stimulating visual and tactile perceptions, real learning experiences may be more accurately replicated in a virtual world.

1.1 Introduction

With advances in technology-enhanced learning in education and the emergence of high-fidelity, engaging game-based experiences, innovations in teaching and learning have become a focus of research and development activities. This is increasingly relevant at a time when teachers run the risk of alienating their students through traditional means of education that do not capitalise on the fact that today's learners have grown up with digital technology as part of everyday life (Prensky 2001, Bialeschki 2007). In support of these needs, virtual learning environments have already been integrated into conventional teaching methods, which include the use of web-based tools and mobile devices, social networking environments and computer games to support formal and informal learning experiences.

However, experiences in a virtual learning environment ultimately represent abstract subsets of their real-world counterparts: regardless of how immersive the environment is, learners are still clearly able to distinguish between artificial and real, and this extends to their perceptions of action, reaction and consequence, resulting in gaps that may emerge in the experiential learning model between action, experience and reflection (Kolb 1984, Arnab et al. 2010). These are often evident as unfulfilled learning outcomes due to inadequate fidelity, or as reports of cognitive overload as learners struggle to address the additional demands required to reflect on virtual experiences in the context of real-world events (Warburton 2008, Parker and Myrick 2009). It is therefore essential to narrow the gap between virtual and real spaces, and thus allow experiential learning techniques (Kolb 1984) to be successfully applied (Arnab et al. 2010).

Multisensory teaching approaches mirror more closely evolved learning processes and promote experiential learning pedagogy, suggesting that unisensory approaches are sub-optimal and that their selection is based mostly on practicality rather than pedagogy (Shams and Seitz 2008). Multimodal interfaces can create a more immersive experience by stimulating various perceptions in a virtual environment (Chalmers et al. 2009).

In support of such pedagogic perspectives, this chapter explores a multimodal approach to learning, paying close attention to the deployment of tactile capability in a virtual learning environment to narrow the gap between virtual and real spaces. The chapter reviews the pedagogy and learning styles that motivate the need for tactile stimuli in a multisensory learning environment. The use of haptic technology is explored to provide insights into current trends and future work within this domain. The chapter also reports on the deployment of tactile capability as part of the Roma Nova project at the Serious Games Institute, UK.

1.2 Exploratory Learning Model

Under an experiential model of learning, individuals are encouraged to reflect on their actions and their consequences to foster understanding, and to reapply this understanding to future actions. In proposing the experiential model, Kolb (1984) put forth a refined, cyclical view of learning in which concrete experience, reflective observation, abstract conceptualization and active experimentation follow one another. There are four possible learning styles according to Kolb: (i) Diverging (feeling and watching), (ii) Assimilating (watching and thinking), (iii) Converging (doing and thinking) and (iv) Accommodating (doing and feeling). From these learning styles, the possible outcomes are Concrete Experience (feeling), Reflective Observation (watching), Abstract Conceptualization (thinking) and Active Experimentation (doing). The styles adopted depend on each individual's preference and result in different outcomes.

Fig. 1. Experiential learning based on Kolb (1984)

Reflective observation paired with abstract conceptualization is not uncommon in a typical classroom setting, in encyclopaedic learning approaches and in e-learning environments. In an attempt to address the need to emphasise the importance of concrete experience via active experimentation, stimulating tactile perception may enhance not only engagement with the learning process but also the understanding of experiment subjects and samples. This approach complements, or perhaps enhances, conceptualization and observation by placing a high emphasis on transforming the lessons learned into real applications, implementations and implications, as well as on analysis-oriented and hands-on skills. A first-hand, hands-on understanding of a subject matter may therefore narrow the gap between what we perceive as a virtual and a real object in a virtual learning environment.

In their model of exploratory learning, de Freitas and Neumann (2009) expand the experiential learning model to include virtual environments (VEs). Exploratory learning opens up the capability for learning through exploration of virtual environments, using virtuality as an active training dimension. Tied deeply to social interactive learning, the exploratory model aims to provide learning designers with better tools for exploiting three-dimensional VEs. Of particular relevance in this case is the merging of tactile interactions into a broader pedagogic toolkit for tutors. Learning in the exploratory model is choreographed, rather than designed, with the potential to include a myriad of sensory components, from touch to smell, taste to vision and hearing. Game design and experience design are therefore not simply aspects of the reflective cycle of experience, but active design tools and components of experience design, leading to wider possibilities for how virtual spaces can be designed and interacted with, towards a seamless transition between physical and virtual spaces. Exploration in this sense is then a design component that facilitates learning in different ways. Here we consider the role of tactile interfaces within the multisensory approach, and in particular the use of haptics to reinforce learning outcomes that are derived, rather than posited.

1.3 Tactile Stimuli

Real value exists in designing learning experiences to support an exploratory and open-ended model of learning, encouraging learners to make their own reflections and summations and to come to an understanding in their own way. Schiphorst (2007) noted that technology can play a central role in supporting such first-hand, embodied experience. A more engaging and involved learning environment may promote concrete experience via active experimentation as a complement to abstract conceptualization and reflective observation, thereby allowing, through exploration of virtual and physical elements, a unique set of learning outcomes that reinforces reflection and learning.

These desired outcomes can be supported by a multisensory approach towards a more active and interactive engagement with the learning environment. There is evidence that multisensory or multimodal interfaces can create such engagement with a virtual world (Chalmers et al. 2009). Other studies (see Shams and Seitz 2008) have also demonstrated that multisensory training can be more effective than similar unisensory training paradigms.

The need for learning experiences to be authentic is critical to experiential (and exploratory) learning (Bialeschki 2007), and authenticity remains one of the most difficult challenges in VE research and development (Brown 2006). The immersive experience of a learner could be enhanced by adding the ability not just to perceive objects visually and/or auditorily, as in current VEs, but to perceive them through touch (Laycock & Day 2003). The implication of touch for cognition, which advocates hands-on instruction, is recognized by many educators (Minogue and Jones 2006). In an attempt to address the need to emphasise the importance of concrete experience via active experimentation on a multimodal platform, stimulating tactile perception may enhance not only engagement with the learning process but also the understanding of experiment subjects and samples. This approach complements, or perhaps enhances, conceptualization and observation by placing a high emphasis on transforming the lessons learned into real applications and implications as well as on analysis-oriented, hands-on skills. Hands-on exploration of a virtual artefact may narrow the gap between what we perceive as a virtual and a real object.

Tactile feedback can be achieved by haptic technology, designed to communicate through the subtle and sensitive channels of touch (Katz 1989, as cited by Minogue and Jones 2006). The term is now commonly used to describe human interaction with the external environment via the sense of touch (Minogue and Jones 2006). The use of such technology has been reported in studies of presence within mixed-reality environments (Van Schaik et al. 2004), and a hands-on approach to learning is highly encouraged by this form of interaction. Haptic devices are also becoming more cost-effective, which opens up exciting possibilities for the inclusion of tactile perception in VEs.

Technological advancements in haptics have extended its applications across numerous disciplines, ranging from the military to medicine (see Stone 1992, Srinivasan & Basdogan 1997, Lieu et al. 2003, Minogue and Jones 2006). The benefits of using haptics in learning are demonstrated by its increasing deployment in flight training as well as medical training. For instance, military and commercial pilots carry out training using flight simulators, which require the application of force or pressure on the controls corresponding to the real experience of an actual flight (Minogue and Jones 2006). Similarly, haptic interfaces are widely used for medical simulation, particularly for laparoscopic and endoscopic surgery. Morris et al. (2007) suggested that, in conjunction with visual feedback, tactile perception in a virtual training environment may enhance the teaching of the sensorimotor skills essential in surgery. The potential of such devices has been reported in the training literature, demonstrating a wide range of virtual applications, from surgical simulators that train surgeons to perform procedures in virtual environments (Dawson et al. 2000, Satava 2001, Lieu et al. 2003, Laycock & Day 2003) to a breast simulator aimed at raising awareness of the importance of breast assessment (Arnab and Raja 2008, Arnab et al. 2008, Solanki and Raja 2010).

Furthermore, haptic devices have the potential to replicate real experiences. For instance, studies carried out by Chial et al. (2002) and Greenish et al. (2002) demonstrated that performance in real and virtual experiments, such as the cutting of tissues, was similar. Studies have also shown that when haptic feedback is available during the exploration of three-dimensional objects, individuals develop three-dimensional understandings closer to real experiences than when only visual feedback is available (Jones et al. 2003, Jones et al. 2005). Newell et al. (2001) reported that tactile interaction may increase understanding of the physical characteristics of objects in a virtual world: there was a preference for tactile/haptic exploration of every side of three-dimensional objects, including surfaces hidden from view, whereas visually there was a preference for exploring only the visible parts of objects. Minogue and Jones (2006) conclude from the existing literature that haptic interaction and feedback is superior to visual feedback for the perception of properties such as texture and microspatial properties of pattern, compliance, elasticity and viscosity.

There is indeed great potential for tactile stimuli to be introduced into a virtual learning environment. Bialeschki (2007) discussed a potential framework for the use of haptic devices in a classroom setting, in support of research and development that promotes experiential education. This framework encompasses the concepts of relevance, relationships, and real (authenticity). In conjunction with the evidence reviewed above, this theoretical framework serves as a motivation to explore the implementation of haptics (Bialeschki 2007, Gillenwater et al. 2007) to support a hands-on approach within a classroom setting.

1.4 Learning ancient history as a case study

Learning concepts about a subject matter that lies beyond immediate reach and understanding is a challenging task, requiring increasing scaffolding and guidance as the subject matter becomes increasingly abstracted from everyday experience. As a particular example, learning ancient history within the context of cultural heritage is traditionally dependent upon intangible, text-based narratives, often accompanied by illustrations and historical facts. To promote better engagement with the learning process and absorption of information, complete involvement of learners in their learning environment is essential to narrow the gap between these abstract concepts and real experiences, towards substantiating reflection, conceptualization and application without the need for extensive learner scaffolding.

A selection of reconstructed or replica artefacts may be purchased for teaching purposes, though this imposes a huge cost in obtaining every single artefact and scales poorly to large groups of learners. Driven by advances in technology, ancient artefacts that are physically unavailable in a classroom setting may instead be made tangible in a virtual environment (VE). Using virtual technologies, various artefacts can be visualised in a three-dimensional space. However, learning via observation without any real interaction with these artefacts may not be as engaging as physically touching a virtual artefact. Stimulating both visual and tactile perceptions in a virtual learning environment may narrow the gap between what we perceive as a virtual and a real object.

Existing works exploring such an approach commonly employ specialist technologies, which can be expensive and are not easily available to a general audience; furthermore, they are mostly driven by technology rather than pedagogy. Hence, to promote engagement, the deployed technologies have to support accessibility to a wider demographic, and the means of delivery should be simple, familiar, cost-effective and easily available for mass consumption.

The existing learning model based on reflective observation (watching) and abstract conceptualization (thinking) has to be enhanced by emphasising concrete experience via experimentation, towards a more involving process of learning. As a proof of concept, tactile interaction via a haptic device was implemented to simulate the behaviours of ancient objects upon interaction. Behaviours such as solid, deformable and textured impressions may enhance the conceptualization of the subject matter. Hence, with regard to the experiential model, concrete experience via active experimentation may potentially substantiate the outcome and process of observation and conceptualization.

This case study describes part of a development project (the Roma Nova project) built around the Rome Reborn model (Guidi et al. 2007, Petridis et al. 2010, Panzoli et al. 2010), working towards a multisensory learning platform within the domain of cultural heritage. The proof of concept advocates exploratory learning by incorporating tactile interfaces in a VE, and accessibility to a wider demographic by capitalising on the fact that internet access, coupled with graphical user interfaces and web browsers, is common in education and at home (Kaklanis et al. 2009), while an abstraction layer provided through middleware allows a wide range of haptic devices to be supported.


1.4.1 Related development

There have been developments in the domain of virtual spaces in which multimodal interfaces were explored to provide a more engaging learning experience. Research in human-computer interfaces aims to mirror the natural interfaces between human subjects and their surroundings, and to reproduce their characteristics in a VE.

VEs have also been employed within the domain of cultural heritage. Applications such as the Virtual Egyptian Temple (Troche and Jacobson 2010) and Virtual Gettysburg (Recker 2007) present users with virtual recreations of ancient artefacts, as well as of events such as a battle in a computer-generated scene, which users can move around and observe from any angle or location. First-hand interaction with artefacts is, however, not supported by these environments.

In conjunction with the multimodal approach, the stimulation of tactile perception is very useful in cultural applications, where ancient artefacts are mostly beyond reach in the physical world (Barbagli et al. 2002; Bergamasco et al. 2002; Dettori et al. 2002; Brewster 2005). A number of cultural heritage institutions have already embraced haptic technology. For instance, the Museum of Pure Form (Frisoli 2007) allows visitors to interact with 3D art forms and explore the museum via stereo vision and tactile stimuli when interacting with virtual sculptures. Figueroa et al. (2009) demonstrated a similar multimodal platform based on the Gold Museum in Bogota, where commercial devices were integrated in order to allow visitors to see in stereo, hear, and touch replicas of small objects. Haptic access to artworks has also been explored at the Interactive Art Museum (Brewster 2001; Barbagli et al. 2002). The visually impaired may also be empowered, as sight is no longer the only necessary sensory means to appreciate artwork or a virtual space in general (Kaklanis et al. 2009).

However, the deployed technologies are restricted to specific audiences and locations due to their sophistication, complexity and cost. Evaluation exercises such as those for the Gold Museum (Figueroa et al. 2009) confirmed the interest of using this technology in real-world setups, although there are deployment and usability issues with regard to the general public. To reach a wide demographic, more accessible media are crucial. Web and mobile platforms are common assets of most households in Britain and may allow beneficiaries in dispersed locations to be supported: some 73 per cent of UK households had Internet access in 2010 (OFNS, 2010).

Modern cultural heritage exhibitions have evolved from static exhibitions to dynamic and challenging multimedia explorations (White et al. 2008). The main factor for this has been the dominance of web technologies, which allow cultural heritage institutions and other heritage exhibitions to be presented and promoted online. VEs for cultural heritage can offer much more than what many current cultural heritage institution web sites offer, i.e. a catalogue of pictures and text in a web browser. White et al. (2008) introduced a new level of multimodal experience by implementing an augmented-reality platform for cultural heritage on the web. This platform allows users not only to view and interact with web-based cultural artefacts but also to experience a mixed-reality visualisation of the artefacts. By using special markers, a webcam and, optionally, mixed-reality eyewear, virtual artefacts can be viewed in a real environment.

To support tactile perception on a web platform, the network architectures needed to add haptic capability to the human-computer interface have to be explored. Applications such as the Hanoi Game and the Pool Game implement a simple haptic interface over the web, demonstrating the possibility of encouraging a first-hand learning experience over the web compared to a more encyclopaedic approach (Ruffaldi et al. 2006). However, the development of these applications is more technology-led than pedagogically driven.

1.4.2 Roma Nova

The Roma Nova project is based upon the Rome Reborn model (Guidi and Frischer 2005) of ancient Rome in AD 340, the highest-fidelity model of ancient Rome currently in existence, providing a three-dimensional digital model that may be explored in real time. Rome Reborn includes hundreds of buildings which are procedurally generated based on accurate historical knowledge, 32 of which are highly detailed monuments reconstructed from accurate archaeological data (Fig. 1).

Fig. 1. Three-dimensional reconstruction of the Forum

The aim of the project is to provide a distributed tutoring environment for children aged 11-14 years to support cross-disciplinary study, as part of an exploratory and social interactive learning model. The model has been integrated with various modules, including a serious game based upon the history curriculum in the UK, and current work is integrating text-to-voice software, virtual agents and haptics. The first critical aspect of the Roma Nova project considers the design of an intelligent tutoring system using gaming paradigms to explore issues related to pedagogic design, such as how desired learning outcomes may be broken down into different scenarios, how different classes of characters or personalities may help the learner to explore different perspectives on events, and ultimately how to assess the knowledge effectively learnt.

In order to integrate the interactions between the player and the Non-Player Characters (NPCs), as well as with artefacts within the scene, we propose a novel framework called the Levels of Interaction (LoI). The LoI model is a theoretical framework wherein interactions between a human user/player and background characters are simplified to three levels, each offering different interaction possibilities and playing a well-defined role in the game.

Graphically, the LoI can be represented as auras (Panzoli et al. 2010) (Fig. 2) based on a simple social-space metric. As an illustration, the LoI is divided into three levels. The background level aims to populate the scene with an authentic crowd in order to increase the immersion of the player. Characters located in the closer surroundings of the player belong to the interaction level. Finally, a character inside the dialogue level interacts with the player in a natural way, ultimately using speech recognition and synthesis. All NPCs belong to the background level by default; as the player moves through the environment, they come closer to or move away from the player and thus enter or exit the interaction or dialogue levels.

Fig. 2. The Levels of Interaction (LoI) framework (Panzoli et al. 2010)
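Since the description above is conceptual, a minimal sketch may help illustrate how a distance-based level assignment of this kind could be computed for each character every frame. The radii, class names and the use of plain Euclidean distance are illustrative assumptions rather than details of the Roma Nova implementation.

```python
import math
from dataclasses import dataclass
from enum import Enum


class Level(Enum):
    BACKGROUND = 0   # ambient crowd behaviour only
    INTERACTION = 1  # limited interaction possibilities
    DIALOGUE = 2     # full interaction: dialogue, tactile exploration


@dataclass
class Entity:
    """An NPC or artefact placed in the virtual scene."""
    name: str
    position: tuple  # (x, y, z) in scene units


# Illustrative aura radii; the actual social-space metric used by the
# LoI framework may differ.
DIALOGUE_RADIUS = 2.0
INTERACTION_RADIUS = 8.0


def loi_level(player: Entity, other: Entity) -> Level:
    """Assign an entity to a LoI aura from its distance to the player."""
    d = math.dist(player.position, other.position)
    if d <= DIALOGUE_RADIUS:
        return Level.DIALOGUE
    if d <= INTERACTION_RADIUS:
        return Level.INTERACTION
    return Level.BACKGROUND


player = Entity("player", (0.0, 0.0, 0.0))
statue = Entity("marble statue", (1.5, 0.0, 0.5))
print(loi_level(player, statue))  # Level.DIALOGUE: close enough to touch
```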

This approach can be extended to define the relationship between the player and the artefacts. Tactile interaction can be allowed within the dialogue level, indicating the potential to reduce the processing required to support both visual and tactile feedback. A diagram illustrating the interaction and visualization technologies employed in the multimodal mixed-reality system is shown in Fig. 3. The system comprises:

- The Interaction layer, supporting various interaction modes that can be grouped into three categories: Brain-Computer Interaction (BCI), natural speech interaction and interaction with haptic devices.

- The Middleware Communication Layer (MCL), an interactive framework supporting data sharing between distinct sets of devices.

- The Visualization layer, supporting different game and visualization engines such as Unity, X3D and TV3D.

Fig. 3. Multisensory framework to support various modes of interaction
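The chapter does not detail the internal API of the Middleware Communication Layer, so the following is only a hypothetical sketch of the decoupling role such a layer could play, routing events published by interaction devices (BCI, speech, haptics) to whichever visualization engine is attached; every name here is illustrative.

```python
from collections import defaultdict
from typing import Callable, Dict, List

Event = Dict[str, object]
Handler = Callable[[Event], None]


class MiddlewareCommunicationLayer:
    """Hypothetical event bus decoupling interaction devices from engines."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Handler]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Handler) -> None:
        """Register an engine-side handler for a class of device events."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: Event) -> None:
        """Forward a device event to every engine that subscribed to it."""
        for handler in self._subscribers[topic]:
            handler(event)


mcl = MiddlewareCommunicationLayer()

# A visualization engine (e.g. Unity or an X3D browser) listens for contacts.
mcl.subscribe("haptics/contact",
              lambda e: print("render contact at", e["position"]))

# A haptic-device driver publishes a contact event into the shared layer.
mcl.publish("haptics/contact",
            {"device": "Novint Falcon", "position": (0.1, 0.2, 0.0)})
```

Because the devices and engines only share event topics, either side can be swapped without changes to the other, which is the property the MCL is intended to provide.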

1.4.3 Tactile interaction with an ancient artefact

The existing role-play game prototype for ancient Rome allows users/learners/players to explore the virtual space and interact with virtual agents in order to learn about history, politics and geography. Incorporating tactile interaction with the surrounding artefacts will essentially enrich the learning experience and enhance understanding of ancient artefacts.

To support the inclusion of haptic functionality within a VE, H3D by SenseGraphics provides an open-source software development platform (the H3D API). To support real-time interactivity and behaviours, there are three levels of programming for H3D: C++, X3D and Python. X3D and Python are high-level interfaces to the API, which allow the definition of visual and haptic properties as well as of dynamic behaviour upon interaction. For instance, the deformable behaviour of a non-solid surface can be rendered as a response to tactile interaction. For more advanced programming, C++ provides access to the API itself, where new nodes and fields can be created depending on the requirements for new behaviours, objects and interactions. Unlike most other scene-graph APIs, the H3D API is designed chiefly to support a rapid development process.

H3D is essentially cross-platform and haptic-device independent. This implies that existing web-based games and e-learning platforms can be extended with haptic capability. Currently, H3D supports haptic devices such as the Phantom Desktop, Phantom Omni and Novint Falcon. The following section describes the scene development built on top of the H3D API, and the web deployment enabled by a plug-in for tactile interaction in a web browser.

1.4.3.1 Scene development using H3D

The H3D scene graph is based on X3D, which employs a structural division in the scene-graph concept: the use of nodes and fields. Nodes and fields are used to define the geometries, physical attributes and behaviours that make up the artefacts and the surrounding environment within a virtual space, including the visual and haptic properties of the scene and artefacts. Models from Rome Reborn, mainly in 3D Studio Max format, were repurposed and translated into X3D to provide the visual scene graph, extended with haptic functionality and embedded in an HTML script to be rendered in a web browser. Fig. 4 illustrates the general development process from the original model to the equivalent X3D scene graph with haptic definitions, rendered in a web browser.

Fig. 4. An overview of the scene development stages

The virtual scene and artefacts were based on the Rome Reborn model, whose complete collection consists of a digital terrain map of the city, 7,000 buildings and various ancient artefacts within the late-antique Aurelian Walls. Such digital assets complement the resources required to teach the ancient history and cultural heritage of this era. Fig. 5 displays a screenshot of a scene in ancient Rome.

Fig. 5. A screenshot of a scene in ancient Rome

Towards enhancing experience in a virtual environment, tactile feedback respecting the texture, shape and material of the artefact may promote engagement and understanding through such exploratory interaction. To promote cost-effectiveness and therefore increase accessibility, tactile interaction is achieved using an off-the-shelf Novint Falcon, which is affordable, usable and stable within the domain of games. It is also supported by H3D and does not require prior technical skills or experience.

Designed primarily for the games market, the Novint Falcon (Fig. 6(a)) not only supports navigation in three-dimensional space, it also allows users to experience high-fidelity three-dimensional force feedback representing texture, shape, weight, dimension and/or dynamics upon interaction with virtual artefacts (Fig. 6(b)).


Fig. 6. (a) The Novint Falcon and (b) interaction with an artefact
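Force feedback of the kind the Falcon renders is commonly computed with a penalty (spring) model, in which the reaction force grows with the probe's penetration depth into the virtual surface. The sketch below illustrates only that general idea and is not claimed to be the algorithm used by H3D or the Falcon driver.

```python
def penalty_force(probe, surface_point, normal, stiffness=800.0):
    """Spring-like reaction force for a probe penetrating a virtual surface.

    probe, surface_point: (x, y, z) positions in metres.
    normal: unit normal of the surface at the contact point.
    stiffness: illustrative spring constant in N/m.
    Returns the force vector (in newtons) to command on the haptic device,
    or zero when the probe is on the free side of the surface.
    """
    offset = tuple(p - s for p, s in zip(probe, surface_point))
    depth = -sum(o * n for o, n in zip(offset, normal))  # penetration depth
    if depth <= 0.0:  # no contact, no force
        return (0.0, 0.0, 0.0)
    return tuple(stiffness * depth * n for n in normal)


# A probe 2 mm below a horizontal surface is pushed back up with ~1.6 N.
print(penalty_force((0.0, -0.002, 0.0), (0.0, 0.0, 0.0), (0.0, 1.0, 0.0)))
```

Varying the stiffness, or adding tangential friction terms, is what lets the same device convey hard marble, soft tissue or textured surfaces.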

Haptic properties have to be defined before any interactions with artefacts can be rendered in the virtual environment. H3D comes with a full XML parser for loading scene-graph definitions of X3D extended with haptic functionality. With the haptic extensions to X3D via the H3D API, tactile definitions can be incorporated into the scene graph. Surface properties therefore have to be defined in order for an artefact, in this case an X3D shape, to be touchable. The surface can be smooth, have friction, have a magnetic characteristic and/or be influenced by the texture specified for the shape.

As an example, the artefact (a marble statue) was assumed to be solid and its surface was modelled with friction. Fig. 7 illustrates real-time rendering of a haptic cursor (in the shape of a three-dimensional hand) that represents tactile interaction with the solid artefact as well as navigation within the virtual scene.


Fig. 7. (a) Navigating the scene and (b) touching an artefact with a haptic cursor
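To make the surface definition more concrete, the fragment below sketches the kind of X3D markup involved, written out from Python as a plain string. The Shape/Appearance structure and the FrictionalSurface node follow H3D's haptics extensions to X3D, but the exact node and field names (and their defaults) should be checked against the H3D API documentation for the version in use; the geometry and field values are placeholders.

```python
# Illustrative X3D fragment for a touchable, rigid artefact with friction.
# A sphere stands in for the statue mesh converted from the Rome Reborn model.
friction_statue = """
<Shape>
  <Appearance>
    <Material diffuseColor="0.9 0.9 0.85"/>
    <!-- Haptic extension: a rigid surface rendered with friction.
         Field names follow H3D's surface nodes but may vary by version. -->
    <FrictionalSurface stiffness="0.6" staticFriction="0.3" dynamicFriction="0.2"/>
  </Appearance>
  <Sphere radius="0.1"/>
</Shape>
"""

# Write the fragment to disk so it can be included in the full scene graph.
with open("statue_fragment.x3d", "w", encoding="utf-8") as f:
    f.write(friction_statue)
```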

In H3D, Python scripting can be used to express additional behaviour in the scene, such as deformable behaviour upon interaction. Fig. 8 demonstrates haptic interaction with a non-solid surface, further exhibiting the potential to support the behaviours and properties of various artefacts from the era.


Fig. 8. Haptic interaction (hand cursor) with a soft artefact: (a) ready to touch, (b) visual and haptic feedback upon interaction
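Deformable response of the kind shown in Fig. 8 can be approximated with a surface mass-spring model along the lines explored in Arnab and Raja (2008). The routine below is a deliberately simplified, illustrative update step in plain Python (unit masses, springs to rest positions only), not the H3D Python binding or the actual model used in the prototype.

```python
def mass_spring_step(positions, rest_positions, contact_index, contact_force,
                     stiffness=50.0, damping=0.9, dt=0.01, velocities=None):
    """One explicit integration step of a toy surface mass-spring model.

    Each unit-mass vertex is pulled back towards its rest position by a
    spring, and the vertex at contact_index is additionally pushed by the
    force applied through the haptic probe. Returns updated positions and
    velocities as lists of (x, y, z) tuples.
    """
    if velocities is None:
        velocities = [(0.0, 0.0, 0.0)] * len(positions)
    new_pos, new_vel = [], []
    for i, (p, r, v) in enumerate(zip(positions, rest_positions, velocities)):
        force = [stiffness * (rc - pc) for pc, rc in zip(p, r)]  # restoring spring
        if i == contact_index:  # force exerted by the haptic probe
            force = [fc + cc for fc, cc in zip(force, contact_force)]
        vel = tuple(damping * (vc + fc * dt) for vc, fc in zip(v, force))
        new_vel.append(vel)
        new_pos.append(tuple(pc + vc * dt for pc, vc in zip(p, vel)))
    return new_pos, new_vel


# Pressing on vertex 0 for a few steps displaces it; releasing the force
# would let the spring pull it back towards its rest position.
rest = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)]
pos, vel = list(rest), None
for _ in range(5):
    pos, vel = mass_spring_step(pos, rest, 0, (0.0, -2.0, 0.0), velocities=vel)
print(pos[0])  # vertex 0 pushed below its rest height
```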


The scene-graph definition of the virtual environment was embedded in an HTML script, which can now be rendered in a web browser via the H3D-Web plug-in. Fig. 9 illustrates the proposed architecture for the deployment of a haptic-enabled scene in a web browser.

Fig. 9. A scene built on top of the architecture of a haptic-­enabled browser (Arnab et al. 2010)

By embedding the X3D scene graph within the HTML, a web page can be extended with haptic capability. Fig. 10 illustrates a haptic-enabled browser, where learners can experience tactile feedback from real-time interactions with ancient artefacts over the web using a Novint Falcon device. To optimise the learning environment, the platform is distributed as a downloadable installer that encapsulates the required components. Runtime processing, such as rendering, is thus delegated to the client, while the web content, including the haptic-enabled visualisation, may reside on the server side.
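The chapter does not reproduce the markup consumed by the H3D-Web plug-in, so the snippet below only illustrates the general pattern: a page that hands the X3D scene to a haptics-capable browser plug-in through an object element. The MIME type, parameter name and file path are placeholders, not the plug-in's actual interface.

```python
def haptic_page(scene_url: str, title: str = "Roma Nova artefact") -> str:
    """Return an HTML page delegating rendering of an X3D scene to a
    (hypothetical) haptics-capable browser plug-in on the client side."""
    return f"""<!DOCTYPE html>
<html>
  <head><title>{title}</title></head>
  <body>
    <!-- MIME type and parameter name are placeholders for the plug-in's own. -->
    <object type="application/x-h3d" width="800" height="600">
      <param name="src" value="{scene_url}"/>
      The haptic plug-in does not appear to be installed.
    </object>
  </body>
</html>"""


# The page itself is static; rendering and haptic processing happen client-side.
print(haptic_page("scenes/forum_statue.x3d"))
```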


Fig. 10. The ancient world rendered on a haptic-enabled web browser

1.4.4 Further work

The prototype has demonstrated the feasibility of stimulating tactile interaction and perception within a virtual environment, and the potential of promoting experiential and exploratory learning to complement existing classroom, encyclopaedic and e-learning approaches. The haptic-enabled web browser can be used to extend the level of engagement in learning from observation and conceptualization to a more concrete experience via active experimentation.

However, several limitations exist in the current development, which will prove important areas for further research and development. Firstly, scene development is dictated by the complexity of the Rome Reborn model, which requires considerable time to convert into equivalent X3D scene graphs. Even though the levels of detail of the geometries have been reduced, particularly for background artefacts, to simplify the conversion process, the huge number of artefacts means an extensive amount of time is required to define haptic properties for the artefacts within the scene. In line with the aims of experiential learning, it is more practical to assume that a user/player/learner will only closely examine a selected number of artefacts relevant to the learning objectives, though such limitations and preconceptions of learner behaviour limit the capacity for exploratory learning. Further work will include identifying the relevant artefacts through evaluations in schools. This implies that only a selected number of artefacts will provide tactile feedback upon interaction.

Secondly, challenges exist in providing high levels of realism, in terms of both visual and tactile fidelity, for deformable objects. Existing work such as Arnab and Raja (2008) can be adopted to address these concerns through an increase in the realism and accuracy of object behaviour upon interaction.

Thirdly, there are also possible latency and bandwidth issues when attempting to provide real-time, high-quality graphics to non-broadband users; however, this can be addressed by providing an option to download virtual scenes for local interaction.

1.5 Conclusions

In support of an emerging learner demographic that migrates rapidly between different technologies, it is essential to capitalise on the use of technologies in learning to engage learners and to increase receptiveness. In order to replicate real learning experiences, a multisensory platform for learning can therefore be more effective than similar unisensory training paradigms. However, much of the multimodality literature tends to overlook tactile stimuli and is instead dominated by research and development using stimuli in the auditory and visual modalities. Nevertheless, the underpinnings of this existing research may provide a framework for investigating the impact of haptics in teaching and learning. Some indirect evidence of how haptics may improve learning can be seen in various domains, such as medical education and military training. Moreover, the study of haptics has grown significantly with advances in the experimentation and implementation of touch in computing, as many researchers are involved in the development, testing, refinement and application of tactile and force-feedback devices in other domains.

This chapter has described an innovative approach to introducing tactile perception into learning over the web, seeking to reach a wider demographic in a cost-effective way. By employing both visual and tactile perceptions, a hands-on learning experience may be advocated. This development complements the existing Roma Nova project on cultural heritage and has demonstrated the potential of stimulating tactile perception to complement the visual representation of the subject matter. Future research and development work has been discussed; further work will include extending the existing multimodal game-based learning framework with the haptic-enabled web environment, and evaluating the level of engagement, motivation and cognitive benefit of the proposed learning platform.

By making a whole new category of sensations available on a web platform, haptic technology will open up various possibilities for developers and researchers of pedagogy and technology-driven intervention in learning, including game-based learning approaches and pedagogies.

1.6 References

Arnab S, Raja V (2008) Simulating a Deformable Object Using a Surface Mass Spring System. In Proc. of the 3rd International Conference on Geometric Modeling and Imaging (GMAI), IEEE Computer Society, London, 21-26. DOI: 10.1109/GMAI.2008.24

Arnab S, Solanki M, Raja V (2008) A Deformable Surface Model for Breast Simulation. 20th International Conference of the Society for Medical Innovation and Technology, Vienna, Austria, 28-30 August

Arnab S, Petridis P, Dunwell I, de Freitas S (2010) Touching artefacts in an ancient world on a browser-based platform. IADIS International Conference: Web Virtual Reality and Three-Dimensional Worlds 2010, Freiburg, Germany

Bergamasco M, Frisoli A, Barbagli F (2002) Haptics Technologies and Cultural Heritage Applications. In Proc. of Computer Animation, Geneva, Switzerland, IEEE Computer Society. DOI: 10.1109/CA.2002.1017503

Bialeschki M (2007) The three Rs for experiential education researchers. Journal of Experiential Education 29(3): 366-368

Brewster SA (2005) The impact of haptic 'touching' technology on cultural applications. In: Hemsley J, Cappellini V, Stanke G (eds) Digital Applications for Cultural Heritage Institutions, Ashgate Press, UK, 273-284

Brown J (2006) New learning environments for the 21st century: Exploring the edge. Change, September/October, 18-24

Butler M, Neave P (2008) Object appreciation through haptic interaction. In Proc. of Ascilite, Melbourne

Chial VB, Greenish S, Okamura A (2002) On the display of haptic recordings for cutting biological tissues. 10th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, IEEE Virtual Reality Conference, 80-87

Chalmers A, Debattista K, Ramic-Brkic B (2009) Towards high-fidelity multi-sensory virtual environments. The Visual Computer 25(12): 1101-1108. DOI: 10.1007/s00371-009-0389-2

de Freitas S, Neumann T (2009) The use of 'exploratory learning' for supporting immersive learning in virtual environments. Computers and Education 52(2): 343-352

Dettori A, Avizzano CA, Marcheschi S, Angerilli M, Bergamasco M, Loscos C, Guerraz A (2003) Art Touch with CREATE haptic interface. In Proc. of the 11th International Conference on Advanced Robotics, University of Coimbra, Portugal, June 30 - July 3, 2003. DOI: 10.1145/1180495.1180523

Dawson S, Cotin S, Meglan D, Shaffer DW, Ferrell M (2000) Designing a computer-based simulator for interventional cardiology training. Catheterization and Cardiovascular Interventions 51: 522-527

Figueroa P, Coral M, Boulanger P, Borda J, Londono P, Vega F, Prieto F, Restrepo D (2009) Multi-modal exploration of small artefacts: an exhibition at the Gold Museum in Bogota. In Proc. of the 16th ACM Symposium on Virtual Reality Software and Technology (VRST '09), ACM, New York, 67-74. DOI: 10.1145/1643928.1643945

Frisoli A (2004) The Museum of Pure Form. PERCRO. http://www.pureform.org/. Accessed 10 February 2011

Gillenwater C, Kumar A, Moynihan B, Van Drimmelen J (2007) Use of Haptics to Augment Presence in an Experiential HCI Environment. Haptics, Presence, & Experiential HCI Environments, University of North Carolina at Chapel Hill. White paper. http://vandfam.net/haptic/files/Use_of_Haptics_to_Augment_Presence_in_an_Experiential_HCI_Environment.pdf. Accessed 10 February 2011

Greenish S, Hayward V, Chial V, Okamura A (2002) Measurement, analysis, and display of haptic signals during surgical cutting. Presence 6(12): 626-651

Guidi G, Frischer B (2005) Virtualizing ancient Rome: 3D acquisition and modeling of a large plaster-of-Paris model of imperial Rome. In: Videometrics VIII, San Jose, California, USA, 119-133

Guidi G, Frischer B, Lucenti I (2007) Rome Reborn - Virtualizing the ancient imperial Rome. In: Workshop on 3D Virtual Reconstruction and Visualization of Complex Architectures, ETH Zurich, Switzerland, 12-13 July 2007

Jones G, Andre T, Negishi A, Tretter T, Kubasko D, Bokinsky A, Taylor R, Superfine R (2003) Hands-on Science: The Impact of Haptic Experiences on Attitudes and Concepts. National Association of Research in Science Teaching Annual Meeting, Philadelphia, PA

Jones M, Bokinsky A, Tretter T, Negishi A (2005) A comparison of learning with haptic and visual modalities. Haptics-e: The Electronic Journal of Haptics Research 4(0). http://albion.ee.washington.edu/he/ojs/viewarticle.php?id=44. Accessed 21 February 2011

Kaklanis N, Tzovaras D, Moustakas K (2009) Haptic Navigation in the World Wide Web. Universal Access in Human-Computer Interaction: Applications and Services. Lecture Notes in Computer Science 5616: 707-715

Katz D (1989) The world of touch (L. Krueger, Trans.). Lawrence Erlbaum, Hillsdale, NJ

Kolb D (1984) Experiential Learning. Prentice Hall, New Jersey

Lieu A, Tendick F, Cleary K, Kaufmann C (2003) A survey of surgical simulation: Applications, technology, and education. Presence 12(6): 599-614

Laycock S, Day A (2003) Recent developments and applications of haptic devices. Computer Graphics Forum 22(2): 117-132

Minogue J, Jones MG (2006) Haptics in education: Exploring an untapped sensory modality. Review of Educational Research 76(3): 317-348

Morris D, Tan H, Barbagli F, Chang T, Salisbury K (2007) Haptic Feedback Enhances Force Skill Learning. Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (World Haptics 2007), 21-26. DOI: 10.1109/WHC.2007.65

Newell FN, Ernst MO, Tjan BS, Bülthoff HH (2001) Viewpoint Dependence in Visual and Haptic Object Recognition. Psychological Science 12(1): 37-42

OFNS (2010) Internet Access: Households and individuals. Office for National Statistics. http://www.statistics.gov.uk/pdfdir/iahi0810.pdf. Accessed 10 February 2011

Panzoli D, Peters C, Dunwell I, Sanchez S, Petridis P, Protopsaltis A, Scesa V, de Freitas S (2010) Levels of Interaction: A User-Guided Experience in Large-Scale Virtual Environments. IEEE 2nd International Conference in Games and Virtual Worlds for Serious Applications (VS-GAMES'10), Braga, Portugal, March 26-27, IEEE, 87-90. ISBN: 978-0-7695-3986-7

Parker B, Myrick F (2009) A critical examination of high-fidelity human patient simulation within the context of nursing pedagogy. Nurse Education Today 29(3): 322-329

Petridis P, Dunwell I, de Freitas S, Panzoli D (2010) An Engine Selection Framework for High Fidelity Serious Games. The 2nd International Conference on Games and Virtual Worlds for Serious Applications, 25-26 March 2010, Braga, Portugal

Prensky M (2001) Digital Natives, Digital Immigrants. On the Horizon, MCB University Press, 9(5)

Recker S (2007) Virtual Gettysburg: Bringing the battlefield to life. http://www.virtualgettysburg.com/. Accessed 21 February 2011

Ruffaldi E, Frisoli A, Gottlieb C, Tecchia F, Bergamasco M (2006) A Haptic toolkit for the development of immersive and Web-enabled games. Symposium on Virtual Reality Software and Technology (VRST), ACM, Cyprus, 1-3 November 2006, 320-323. DOI: 10.1145/1180495.1180559

Shams L, Seitz A (2008) Benefits of multisensory learning. Trends in Cognitive Sciences 12(11): 411-417

Slater M, Khanna P, et al. (2009) Visual Realism Enhances Realistic Response in an Immersive Virtual Environment. IEEE Computer Graphics and Applications 29(3): 76-84. DOI: 10.1109/MCG.2009.55

Solanki M, Raja V (2010) Modelling Palpable Masses for a Virtual Breast Examination. 9th International Conference on Virtual-Reality Continuum and Its Applications in Industry, Seoul, South Korea, 12-13 December. DOI: 10.1109/BIYOMUT.2010.5479779

Srinivasan MA, Basdogan C (1997) Haptics in virtual environments: Taxonomy, research status and challenges. Computers & Graphics 21(4): 393-404

Stone RJ (1992) Haptic feedback: A potted history from telepresence to virtual reality. Robotica 10: 461-467

Satava RM (2001) Surgical education and surgical simulation. World J. Surg. 25: 1484-1489

Troche J, Jacobson J (2010) An Exemplar of Ptolemaic Egyptian Temples. Computer Applications in Archaeology (CAA), Granada, Spain, April 2010

Van Schaik P, Turnbull T, Van Wersch A, Drummond S (2004) Presence within a mixed reality environment. CyberPsychology & Behavior 7(5): 540-552

Warburton S, Garcia MP (2008) Defining a Framework for Teaching Practices Inside Virtual Immersive Environments: the Tension Between Control and Pedagogical Approach. In Proc. of