Exploring interaction design for mindful breathing support: the HU II design case

Tingye Zhong
[email protected]

Degree Project in Computer Science and Engineering, Second Cycle, 30 credits
Master of Science in Human-Computer Interaction

Supervisor: Tina Bin Zhu
Examiner: Anders Hedman

KTH Royal Institute of Technology
School of Electrical Engineering and Computer Science
SE-100 44 Stockholm, Sweden

2018-10-09

ABSTRACT
Mindfulness-based stress reduction (MBSR) is an effective method of stress treatment and prevention. One of the interactive systems aiming to support MBSR is HU. As a continuation of HU, a device aiding mindful breathing called HU II was developed and evaluated. This paper investigates the real-time interaction between the user and HU II, including the performance of the deployed sensor as well as the representation and mapping of the chosen modalities. Ten participants with different backgrounds and knowledge levels of mindfulness evaluated the device and gave their subjective opinions and feedback. The results indicate that the employed sensor and modalities support real-time interaction for mindful breathing. Based on these results, further research possibilities and guidelines that can inform the design of future mindfulness-based solutions are suggested.

SAMMANFATTNING
Mindfulness-based stress reduction (MBSR) is an effective method for treating and preventing stress. HU is one of several interactive systems that aim to support MBSR. HU II, a device that aids mindful breathing, was developed and evaluated as a continuation of HU. This work investigated the real-time interaction between HU II and the user, the performance of the sensor used, and the representation and mapping of the chosen modalities. Ten participants with different backgrounds and levels of knowledge about mindfulness evaluated the device and gave their subjective opinions and feedback. The results indicate that the chosen sensor and modalities support real-time interaction for mindful breathing. Based on these results, future research possibilities and guidelines that can support the design of future mindfulness-based solutions are proposed.

Exploring interaction design for mindful breathing support: the HU II design case

Tingye Zhong
KTH Royal Institute of Technology
Stockholm, Sweden
[email protected]


KEYWORDS
Interaction, design, user experience, breath, awareness, self-reflection

1 INTRODUCTION
With rapid social development, stress caused by work and information overload is affecting a growing number of people [4]. Thus, increasing effort has been made to develop supporting strategies for both treatment and prevention. One effective stress treatment is mindfulness-based stress reduction (MBSR). The MBSR program was founded by Kabat-Zinn, who introduced mindfulness to the western scientific community. He described mindfulness as the awareness cultivated by paying attention on purpose, being present and non-judgemental to the unfolding of experience moment by moment [14].

MBSR has been proven effective for reducing stress and anxiety [3, 15, 19]. In MBSR, focused attention on breathing is a widely used practice that fosters meditators' awareness of the present moment by helping them sustain their attention [16–18].

Interest in mindfulness and MBSR within Human-Computer Interaction (HCI) has been growing in recent years [26, 29].

Figure 1: HU: a case study of mindful breathing for stress reduction [35].

Biofeedback based on digital technologies can be used to sustain attention to breathing and to increase users' awareness of their breath. For instance, the Soma Mat and Breathing Light [27] demonstrated a somaesthetic approach to mindful breathing, utilising light and heat as output. Sonic Cradle [30] was designed to foster calmness and awareness by generating a soundscape based on breathing patterns. In the mixed-reality sandbox Inner Garden [13], which aims to promote self-reflection, breathing is directly mapped to the sea level.

Another interactive system which aims at supporting mindfulness-based stress reduction is a physical device named HU, which stands for "exhale" in Chinese [35]. HU uses a biosensor to measure the user's heart rate and then exploits the heart rate data to estimate breathing rate. It guides resonant frequency breathing by utilising vapour, light and sound as output modalities. The device was evaluated with 30 participants in lab and home settings. The results showed that such multimodal physical devices could be used to enhance stress reduction [35]. However, instead of gathering respiration data, HU estimates respiration rate by processing heart rate data, which means it only represents respiration indirectly. Hence, HU does not support direct interaction [35]. How would the device be received if HU provided direct real-time interaction? Would the system still support mindful breathing when the output modalities are closely related to and affected by the breathing rate?

As a continuation of that work, this study focuses on designing a new physical device, HU II, to improve HU. By utilising different biodata (i.e. respiration data), HU II provides direct, real-time visual and auditory biofeedback. The research question of this work is: how can we design an interaction to support mindful breathing? The research question encompasses two aspects: the performance of respiration monitoring technology in supporting mindful breathing, and the users' perception of the respiration representations in such an interaction. Thus, the main objectives of this study are: (1) designing and prototyping HU II, an improved version of HU, using a different breath monitoring method, and (2) evaluating the functionality of this device by exploring different combinations of respiration representations in a user study.

The rest of the paper is structured as follows. First, previous respiration monitoring technologies and related projects that support mindful breathing are presented. Second, the design and technical implementation of HU II are explained. Third, the user study and its results are presented. Finally, our conclusions and further possibilities are discussed.

2 RELATED RESEARCH
Respiration Monitoring
Generally, different approaches to respiration monitoring can be categorised as contact-based or non-contact. Contact-based methods employ sensing devices which are attached to the subject's body, while non-contact methods are accomplished by instruments that do not require any contact with the subject [1].

Contact-based Technology. Three variables are usually measured in contact-based respiration monitoring methods: respiration-related torso movements, airflow and sounds [1].

A large number of sensor-based projects are built upon strap-like breath sensors, including existing commercial products [11, 23, 25, 30] as well as prototypes assembled by researchers [2, 13, 20, 33]. These methods can indicate different phases of respiration by measuring thoracic and abdominal expansion and contraction [1, 31].

Methods detecting torso movement are regarded as non-invasive [1]. Most of them deploy belts with various embedded sensors, yet there are solutions using regular clothes, in which the sensor takes the form of a clip-on tag attached to the clothes [21].

Certain respiration monitoring methods directly detect airflow by sensing the temperature, humidity, or CO2 content, which indicates breathing rate [1].

Figure 2: The typical settings of non-contact methods.

These methods mostly rely on sensors attached to the airways [8]. The quality of data collected via airflow-detecting methods is highly related to the sensor installation and the collecting device. In methods which detect temperature using thermistors, there is a high incidence of displacement, which may lead to poor data quality [28]. Although pressure transducers and CO2 sensors are considered more accurate [1], their performance can be affected by the data collecting device [9].

Other parameters for monitoring breath patterns include respiratory sounds, which can be measured using a microphone that detects sound variation [1]. The microphone should be placed close to the respiratory airways, including the nasal passage and mouth, or over the throat [1, 6, 22, 32]. Usually, the microphone is embedded in hardware (e.g. a headset), which might be uncomfortable to wear and thus distract the user [22].

Some contact-based sensing methods are considered intrusive and to some extent affect the user experience. Such methods are implemented with masks on the user's face and sensors in the user's nostrils [1]. Furthermore, the performance of most sensors can be affected by design changes in the collecting device [9]. Therefore, there is a trend towards non-contact respiration monitoring technology and a non-invasive experience [1].

Non-contact Technology. Some monitoring methods do not require sensors to be in contact with the user's body. Some of them use sensors that detect chest movements from a distance, for example, radar-based breathing monitoring using the Doppler effect [10]. Additionally, infrared imaging, thermal imaging, and optical imaging have also been deployed to monitor respiration remotely with the aid of advanced computing [1].

However, non-contact methods generally require the subject to remain still, either sitting or lying, for accurate measurement. Therefore, most non-contact methods are used to monitor respiration rate during sleep [1].


Respiration Representation
Several modalities have been applied to showcase respiration patterns and status. They can be categorised as visual, auditory and tactile biofeedback.

Visual. Vision is one of the modalities that can be used for presenting respiration data, and visual user interfaces do more than present raw data in diagrams. One typical presentation is the animation of visual objects, which either move back and forth or change shape according to the breath pattern, such as a balloon inflating during inhalation [23], a circle growing and shrinking [33], or an object moving up and down with inhalation and exhalation [20].

A visual interface can become delicate and richly detailed when applying augmented reality. In the project Inner Garden [13], the subject's electroencephalogram (EEG) signal and breathing data are presented in an augmented sandbox, which displays a tiny, vivid world consisting of land and ocean. The motion of the waves is mapped to breathing, while the speed of the day-and-night cycle is controlled by breathing variability.

Another proposed visual output is lighting. Light can be perceived with closed eyes, which means it can provide guidance when the user practises mindfulness with eyes closed. The Breathing Light provides a changing ambient light that follows the rhythm of the user's breathing [12].

Auditory. Researchers have previously mapped sounds to breathing either by using specific sounds to represent inhalation and exhalation, e.g. water flowing back and forth [33], or by applying an algorithm that generates sounds based on the subject's breathing patterns [5, 11, 30]. Additionally, the auditory output can also act as a mentor that guides subjects practising mindfulness [25].

Tactile. Physical interaction provides a novel yet subtle user experience. The Soma Mat utilises heat feedback to aid meditation practice, fostering somatic awareness at the same time [27]. Both Breathing with Touch [34] and Breathe with Me [2] demonstrate a haptic interface with an inflatable air bag, which simulates the lungs' movement during respiration.


Interaction Pattern
The interaction patterns in the previously mentioned projects can be roughly categorised as guidance, mirror and personalised modes. In guidance mode, subjects follow the signals given by the application, which acts as a coach or tutor providing instruction [5, 13, 27, 30]. Mirror mode refers to applications or devices that simply represent the subject's respiration, mirroring their breathing [20, 23]. Applications in personalised mode give respiration guidance based on the user's real-time biodata; in other words, they adapt their instruction according to the user's breath pattern [33, 34]. The sketch below contrasts the three modes.
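To make the distinction concrete, the following hedged sketch contrasts how each interaction pattern could drive a single output channel. It is purely illustrative: the sine-shaped pacing signal, the 6-second default period and the function names are assumptions, not details of the reviewed systems.

```python
import math

def output_level(mode, t, user_breath=None, user_period=None, paced_period=6.0):
    """Return a normalised output level in [0, 1] for one interaction pattern.

    guidance:     the device emits a fixed pacing signal and the user follows it.
    mirror:       the device simply reflects the user's measured breath signal.
    personalised: the device paces the user, but the tempo is taken from the
                  user's own (estimated) breathing period.
    """
    if mode == "guidance":
        return 0.5 + 0.5 * math.sin(2 * math.pi * t / paced_period)
    if mode == "mirror":
        return user_breath(t)          # user_breath: callable returning 0..1
    if mode == "personalised":
        period = user_period if user_period else paced_period
        return 0.5 + 0.5 * math.sin(2 * math.pi * t / period)
    raise ValueError(f"unknown interaction pattern: {mode}")
```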

3 METHOD
Design and Implementation
The Research through Design (RtD) method [36] was applied throughout the design process. By exploring solutions for both breath monitoring and representation, we gained new knowledge which was used to support our development process. The design process consisted of three parts. First, the possible usage scenarios of the device were defined. Second, the most suitable method to collect and process biodata was investigated. In the last step, the interaction design was our main focus.

Defining the User Scenario. The first step of the design process aimed to define the project scope and envision the user scenario by taking the contextual environment, the functions of the device, and the possible interaction into consideration.

The scenario is that the user uses the device while doing breathing practice, since one of the aims of the device is to support mindful breathing. When practising mindful breathing, the most common posture is sitting on a chair or on the floor, e.g. in the full lotus posture (Figure 3). Therefore, the following contextual setting for the device was generated: the user would use it alone at home, sitting on a chair or cross-legged on the floor, with the device in front of them. This envisioned setting helped to define the interaction between the user and the device, and prompted requirements as well as restrictions on the system.

Designing Respiration Monitoring. A sensor-based belt was chosen as the respiration monitoring solution for the following reasons.

Posture. Most non-contact methods are sensitive to the subject's motion. Hence, such methods require subjects to keep still in a designated position, for example, sitting in front of the sensor or lying below it [1, 12, 27]. Furthermore, such detection methods usually place the sensors in front of the subject's chest or torso, which would not be fulfilled in the envisioned scenario. Thus, contact-based methods are more suitable.

Figure 3: Envisioned postures.

Comfort. Among the contact-based methods, there are different approaches that monitor respiration by detecting body movements, airflow and breathing sounds. These use different types of sensors that need to be attached around the chest or abdomen, in the nose, close to the nostrils or mouth (usually with a mask or headset), or near the throat, respectively. According to previous work on the implementation and performance of different methods and sensors, detecting body movements appeared to be an easily applied method with good performance [2, 7, 20, 24], and the most reliable one, since it has been used in several HCI projects.

For detecting body movements, several sensors could be employed on a sensing belt: stretch sensors [20], piezoelectric sensors [1], a light sensor with a light emitter [2], and accelerometers [7, 24]. After implementing and testing several sensors, the stretch sensor became the final choice because of its convenient installation and clean data with little noise.

The final version of the self-developed sensing belt consists of a stretch sensor (an elastic stretch gauge), a denim strap and some clips for length adjustment (Figure 4). The stretch sensor is a conductive rubber cord whose resistance increases when stretched: when it is pulled the resistance increases, and when it shrinks back without additional force the resistance decreases.

By connecting the sensing belt into a circuit with a controller, the voltage across the sensor can be measured, and it changes with the resistance of the stretch sensor. When the user inhales, the stretch sensor is pulled and its resistance increases, which results in a rise in voltage. When the user exhales, the voltage decreases accordingly. Hence, different phases of respiration can be recognised by judging the trend of the voltage change (Figure 5).
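As one concrete reading of this trend-based recognition, the sketch below classifies breathing phases from a short window of recent voltage samples. It is a minimal illustration rather than the prototype's actual firmware: the window size, the assumed sampling rate and the noise threshold are invented for the example.

```python
from collections import deque

# Sliding window of the most recent voltage readings (assumed ~10 Hz sampling).
window = deque(maxlen=5)

def classify_phase(samples, threshold=0.01):
    """Classify a breathing phase from the trend of recent voltage samples.

    Rising voltage (sensor stretched) is taken as inhalation, falling voltage
    as exhalation, and a near-flat trend as retention. The 0.01 V margin is an
    assumed noise threshold, not a measured value.
    """
    trend = samples[-1] - samples[0]
    if trend > threshold:
        return "inhale"
    if trend < -threshold:
        return "exhale"
    return "retain"

def on_new_reading(voltage):
    """Feed one reading from the sensing belt and return the current phase."""
    window.append(voltage)
    if len(window) < window.maxlen:
        return None          # not enough data yet
    return classify_phase(list(window))
```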

Figure 4: The sensor is embedded in the belt.

Figure 5: One of the visualisations of respiration: light blue refers to inhalation, dark blue to exhalation and purple to retention.

Figure 6: The sensing belt is worn around the chest.

After testing the sensing belt around the chest and abdomen respectively, it turned out that the belt worked best when worn around the chest (Figure 6).

Designing Respiration Representation. The whole HU II system is embedded in a humidifier. The output modalities used are vapour, mono-colour light and sound.

Vapour and light are both visual outputs. Given that colours are usually connected to different emotions and thus convey various information, white was chosen as the LED light colour to avoid any possible influence on subjects. Several mapping methods were designed and tested in the user study (Figure 7).


The sound mapping is simple: with a gentle water-flowing sound as ambient sound, two different wave sounds were played in turn to represent inhalation and exhalation, respectively. A sketch of how such a phase-to-output mapping could be wired up is given below.
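The following hedged sketch shows one way the breathing phase could be fanned out to the three output channels. The file names, the set_vapour/set_light/play_sound callbacks, and the idea of keying everything off phase changes are assumptions made for illustration; Figure 7 shows the mappings that were actually tested.

```python
# Hypothetical phase-to-output mapping: vapour and light follow the phase,
# and two wave sounds are played in turn over a looping ambient water sound.
WAVE_SOUND = {"inhale": "wave_in.wav", "exhale": "wave_out.wav"}

class OutputMapper:
    def __init__(self, set_vapour, set_light, play_sound):
        # The three callbacks stand in for whatever hardware/audio API is used.
        self.set_vapour = set_vapour
        self.set_light = set_light
        self.play_sound = play_sound
        self.last_phase = None

    def update(self, phase, level):
        """phase: 'inhale'/'exhale'/'retain'; level: normalised breath depth 0..1."""
        self.set_vapour(phase == "exhale")     # e.g. vapour only on exhalation
        self.set_light(level)                  # LED brightness tracks breath depth
        if phase != self.last_phase and phase in WAVE_SOUND:
            self.play_sound(WAVE_SOUND[phase])  # alternate the two wave sounds
            self.last_phase = phase
```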

The HU device uses a biosensor to collect heart rate data, which requires further analysis to calculate the corresponding breathing pattern [35]. HU II instead deploys a breathing sensor to provide breath data, which allows direct presentation and real-time interaction. Both HU and HU II offer a multi-sensory experience with vapour, light and sound. However, the respiration monitoring and representations differ between the two devices.

User Study
Participants. Ten participants took part in the user study, four male and six female, aged between 28 and 45 (mean 36.6). The participants represented diverse occupations including researchers, students, physicians, office workers, and mindfulness coaches. Three of the participants lacked experience in breath practice, two were beginners who did not practise regularly, and the remaining five were experienced in mindfulness.

Procedure. As shown in Figure 7, the user study consisted of eight parts which took about 50 minutes in total. The participants read and signed the consent form before starting the study. The guidance mode and the mirror mode were tested separately. The guidance mode was tested first, allowing the participant to get familiar with the device; it was conducted without the sensor and focused on investigating the mapping of output modalities. Afterwards, participants tried the mirror mode with the sensor applied. The testing of the mirror mode mainly aimed to examine the performance of the respiration monitoring technology.

1. Testing guidance mode (Parts 1-4 in Figure 7): the participants were asked to try the prototype and answer questions. There were three short tests with different output mappings. A short structured interview regarding the interaction experience was conducted afterwards.

2. Testing mirror mode (Parts 5-6 in Figure 7): the participants wore the breathing sensor for data collection in this session. They tried the prototype in mirror mode with vapour and light output.

3. Interview (Parts 7-8 in Figure 7): this section included a short presentation of two possible further designs and a semi-structured interview. The possible further designs focused on the appearance of the device: one design was large and stationary while the other was small and portable. The interview questions centred around five areas: 1) the experience of the three respiration representations, including vapour, light, and sound; 2) the experience of the two interaction methods, mirror mode and guidance mode; 3) possible daily usage of the device and its context; 4) possible social usage of the device, e.g. in a shared office or with relatives or friends; 5) the experience of wearing the sensor, and whether it affected breathing movements or the meditation procedure.

Data Collection and Analysis. The whole user study was audio and video recorded. The recordings were transcribed and coded. During coding, quotes were marked with labels, which included vapour, light, sound, interaction, experience, and scenario. Based on that, common themes, including user scenarios and interaction patterns, were identified and summarised.

4 RESULTS
All subjects were fond of the vapour as a representation of respiration. The light and sound outputs were each appreciated by six subjects (Figure 9).

User Scenarios
When asked about when and where they would use such a device, the participants gave various replies, ranging from indoor environments to outdoor nature settings, and from looking at and focusing on it to setting it aside as an ambient display while working. "I could imagine using it at work, to be honest. Like, during the day, sometimes when it just gets too much. Maybe if I had it, then I would like to take 5 minutes or something, sit in my office and just relax, and that would be really cool, actually, to do that." – P9. "I can envision that I am working in front of a computer and having it there. That's quite nice." – P10.

Before the interview, two possible further designs were introduced to the participants. One possibility was a small, portable device that the user could hold in their hands. One participant mentioned that this holding action might invite him to put more attention on the device. "This holding is so powerful. It's like holding the lives because breathing is vital to life." – P1.

The small scale of the portable device also encouraged some participants to consider taking it outside. "I like that it's small that you could hold it in your hands. Maybe it's my friend that helps me to breathe. And I can even take it to somewhere." – P3. "It's also very convenient that you can bring it. It's so versatile because you can use it in different ceremonies. There's a lot of possibilities, for rituals, connected to breathing, to the ambience, to the natural elements, also to communion, the sense of sharing with the community." – P1.

Some participants mentioned that they envisioned using the device with their families, or even putting it up in a public room so that everybody can start following it.


Figure 7: User study procedure

One of our participants, who is a mindfulness coach, talked about his experience of teaching breathing methods and envisioned that the mirror mode could help him illustrate his breathing while coaching students: "So first you are the one who is breathing with it, and when everybody is following the machine, they are actually following your breathing. It helps to train, for example, to explain a different pattern of breathing. Because usually, it's not easy to see. This could be a really easy way to show because people can see the vapour. Yeah, it could be used for teaching breathing techniques." – P1

It was also put forward that this device could be placed in a public room and affect people imperceptibly: "I would like to try it in a waiting room, where the patients sit. It could be nice just to have it light up, the vapour coming out with some soft music. Since when there is some music, people will walk with the rhythm, maybe this one will have the same effect. If you switch it on, everybody will breathe with this device without thinking about anything." – P3

Figure 8: Guidance mode and mirror mode during the test.

Figure 9: Result overview.

In summary, participants proposed various scenarios, from isolated to social settings and from indoor environments to public settings.

Respiration Monitoring
During the testing of the mirror mode, the device presented respiration representations according to the user's breathing by detecting their respiratory movements.

However, there was some embarrassment caused by the sensor. Two female participants expressed their displeasure with the sensing belt. When Participant 9 adjusted the belt, she said: "It is a very private place to put it. Should I do that? I feel weird. For me, as a person, I would feel like it was constraining me which is the exact opposite of what I want to do." During the testing, she also mentioned: "This belt is very uncomfortable for me. I have to keep it there, so I don't want to breath out completely because I'm scared of it going to fall down. And then when I'm breathing out, I control my breathing and now it's extended that I'm 'okay, now I keep it there'. This belt to me is really hard to work with. I can't breathe with it. I feel like it stops me from breathing in a natural way." Not only did the placement of the sensing belt make her reluctant to put it on, she also felt the belt distracted her from breathing normally.

Respiration Representation
Vapour. When trying the guidance mode with only vapour signals, all participants liked the vapour, and some of them commented that the vapour was suitable for representing breathing. "I think the most powerful thing is the vapour itself. The vapour is very powerful, inspiring you the feeling of breathing in and breathing out. Usually, we are not able to see the air breathing in and breathing out. But when we see the vapour, it's so powerful." – P1.

Moreover, some participants pointed out that the vapour helped them to focus on their respiration and increased body awareness. "Vapour was simple. It was intuitive, and it allowed me to focus only on my breathing. I can focus a lot more on my body than I usually would. And I'm thinking mostly about only just my feeling." – P5. "The steam helps me to calm down and not to think on other things." – P8.

Furthermore, the vapour did not only provide visual output; it brought a unique experience with its moisture as well. "I sense the humidity somehow from the vapour. So it was a very nice feeling, this feeling of humidity there." – P1.

The smell of the vapour was also mentioned by one of the participants, who suggested that she would like to have fragrance in the vapour. "In this steam, is there any aroma or something like that? I want to put some aroma, that would be perfect." – P8.

When participants tried the device for the first time, they were asked to follow the vapour in the way they felt comfortable. Five participants inhaled while the vapour came out, while the other five chose to exhale. There was no clear gender or experience difference in this preference.

Some participants stated several reasons for exhaling when the vapour came out. "I'm not afraid of the vapour coming out, but it feels wrong for me to inhale the vapour coming out of something. That's like a mental threshold until I know that this is just water and it feels quite nice. Otherwise, I don't want to intake. The vapour coming out does not invite me to breathe in, it invites me to shut my mouth and breath stops." – P4. "I don't actually want to breathe that in, so I don't feel like I relax in this way. If there's maybe some kinds of scent, I might like to have the vapour. But not when it's just sort of steam of water. And this is cold. Yeah, it might be nice if it was like body temperature as well." – P9. According to them, they tried to avoid breathing in the vapour through the nose, but one participant indicated that she might consider inhaling the vapour if it had a scent.

Light. According to six participants, the light effect worked well to indicate breathing movements. Some participants also mentioned that the light helped to establish a pleasant and relaxing environment. "I think the light is nice because it builds as I breathe. When I breathe in, the light grows. Then I can feel that my lung is filling up, so it's like a visual representation for me. With my preconceptions, it means to breathe and relax." – P9. "It (the light goes up and dims down) is quite appealing and attractive I think. Changing the light with the vapour, it gives a very special sense. It's not a very powerful light, but it's very warm." – P1.

Meanwhile, some participants stated that a changing light might be stressful for them and that they would prefer to keep the light at a constant brightness. "If I could use it in the way I want, I would keep the light constant, just like a lamp, because it could create some stress for me, to have it changing light itself. So I would like to have the option to have constant light." – P3.

In this study, the LED provided a warm white light with changing brightness. A few participants suggested that they might prefer a transition in colour with constant luminance. "I want to be in concentration when I breathe out. I should be invited to breathe in. I mean, it doesn't need to breathe, to fade to lower lightness. It could fade to a different nuance. I should have a stronger direct telling me to breathe in. So maybe it goes from white, then when you breathe out it may be bluish." – P4.

Sound. The sound received both positive and negative comments. One participant mentioned that sound brought him to a relaxing environment. Another participant stated that the sound supported deep breathing. "The sound helps a lot. I really like it. And it also gets you to a nice place, to the sea. It's fantastic." – P3. "The sound is nice. The sound makes me breathe a bit stronger. It made me breathe deeper, for some reason, deeper and stronger, actually." – P7.

Meanwhile, the sound distracted some other participants. According to their statements, they focused more on the sound than on their respiration. One participant pointed out that she did not want to hear any sound during mindfulness exercises. "The sound distracts me because I've been in the phase that I breathe to relax. And during relaxing, I don't want to hear anything." – P8.

There was an interesting phenomenon during the testing of the sound output. Some participants mentioned, before trying the guidance mode with sound, that they thought it would be great to have sound. However, some of them held the opposite opinion afterwards. For example, P1 suggested that a water sound might be a good match: "All the water moving sounds could be very appropriate because the vapour is related to the water." However, after he tried the guidance mode with the wave sounds, he changed his mind because the sound was a distraction to him: "I prefer without the sound. Because if I focus on the waves, somehow it distracts me from the breathing. The waves are much with the rhythm. This is a very subjective experience of course, from my side. But the vapour and the waves inspires me the breathing pattern. So it's quite experimental. I thought it could match, but I think the feeling I had is: okay, if the sound is in the background, it's fine, but nothing related to the vapour."

Mapping. During the user study, different mappings of the vapour were tested. In one mapping, the vapour referred only to either inhalation or exhalation; in another, the vapour indicated both. It turned out that most participants preferred the vapour referring only to either inhalation or exhalation. When the steam meant two different actions, the participants could get confused and thus found it hard to follow. "I have to focus more to be able to do this. The vapour means different things, depending on the order. Otherwise, you lose track of it. So I lost track of myself a couple of times in this session. I have to focus to do the right way. I prefer the first mapping where the vapour is only mapped to one thing." – P5.

After experiencing multi-sensory feedback with two or three different output modalities simultaneously, two participants suggested that these outputs should be coherent. "It (light) could be useful. But then you have to match it with the vapour. It could also be confusing to the brain to use two different elements to say breathing in and breathing out." – P1. "I don't think it's the fact that it has to be only one modality, I think it can be all of them if they are mapped in intuitive ways." – P5.

Interaction Patterns. While trying the mirror mode, some participants regarded it as an enjoyable experience. "It was really nice to have the feeling that you are breathing and the device was breathing with you, that was really nice to feel that it was responding to your breathing pattern. That could be really powerful when you breathe, and the device mirrors your breathing pattern, then you can use it as an element to breath altogether." – P1. "While breathing myself, it was really nice to see my own breath and make me reflect on it so that I can control it. It makes it easier to figure them out." – P10. They also suggested that such real-time feedback supported their self-reflection.


5 DISCUSSION
Interpretation of Results
In this section, the results from the study are discussed, based on participants' accounts during the user study.

Regarding the performance of the breath sensor for supporting real-time interaction for mindful breathing, the stretch sensor is feasible in such a context, yet it needs improvements in accuracy. According to observations during the user study, there might be a gender issue when applying such monitoring technology.

The second research question concerned the performance of the visual and aural outputs in representing respiration. Generally, vapour seems to be suitable for carrying respiration information, while light and sound each had six approvers.

As the most popular respiration representation among participants, vapour provides intuitive and aesthetic visual signals. However, it can carry much more information beyond visual signals, which encourages further exploration. As stated by some participants, they could sense the humidity and temperature of the vapour. This tactile experience makes it possible to use the device with closed eyes: by feeling the change in moisture and heat, the user can tell whether the vapour is coming out or not.

The third goal of the study was to learn more about how users envisioned such technology supporting their mindfulness practice. Most participants approved of its positive effect on their practice of mindful breathing. They also illustrated many situations where such technology could help, for example, reducing stress while working, assisting a tutor in group training, or calming patients in a waiting room. This indicates that such a device might help create socially sustainable places that promote wellbeing by supporting mindfulness practice.

Storytelling. During the interviews, it was noticed that a few participants tended to make up a story to help them better understand and get used to the mapping. Even though the perceived output signals were the same, they might still have different interpretations due to the different stories they made up. For example, when talking about the mapping of the light, they gave different stories: one took the device and the subject as a system, while others regarded the prototype as a direct presentation of the subject's status. "When I breathe in, my energy will arise so that I would have more light, so it means that the device will have less, so I'm taking the light from the device. When I breathe out, I exhale, and then the light fades out, and it's giving back the light to the device." – P1. "I think the intuitive feeling is that when there's no energy, there's no light, and there's no vapour; when there's energy, so you're moving and breathing, lights up and there's vapour, and then it disappears." – P5. "I think I would prefer the light going up when I breathe in. Because I wake up when I breathe in so that the light would go up. And when I breath out the light would dim. So it synchronises with my energy. I also feel like when the device is breathing out, I should breathe out." – P6.

All of this implies that users build their conceptual model based on their perception, which affects their understanding of the mapping. It also implies that any signals on the product, as well as the introduction given during the test, might be crucial clues to the user. As P1 mentioned, he imagined himself taking the energy from the device and then giving it back. What if designers told users a story from the start? Different users might take up the idea and build their conceptual models on it, which could result in identical behaviour. Designers might use storytelling to avoid potential ambiguity.

The multisensory output should avoid information overload. As mentioned in the results, some participants pointed out that the multi-sensory feedback should be coordinated to avoid potential ambiguity. Some also said that the amount of information might lead to cognitive overload when users have to deal with many different signals. "I think, to focus on retaining, understand that when should I retain when should I breathe in, and have the light, and have the sound, and you have the vapour, I try to understand and do it in right way, and it becomes just, like, too complicated. What I think is like, for me the most natural thing will be to breathe in when I see vapour and light and then I breath out, but if I have to focus on, like, retaining something and figuring out what light means and, like, if it's no vapour, it's vapour ... yeah, it's like too overwhelming. Then it becomes more a cognitive practice, than a mindfulness exercise." – P5. "The patients are very sensitive to inputs, both from lights and from the sound. It shouldn't be a lot of things happening, they have difficulty handling these too many things." – P3.

This indicates that designers need to be careful when designing multisensory feedback when the goal is not to create a challenge or a mental training program. Since the aim is to help users reach mental peace, designers should avoid information overload.

Further Research Possibilities and Suggestions
Personalised Interaction. In this project, personalised interaction, which gives adapted guidance based on real-time data, was not implemented. One participant suggested that he would like to have such a function. "It can mirror my breathing, right? However, it could also mirror but add on to it, for example, try to alter my breath while I'm doing it, trying to prolong it or ..." – P10.

Since some participants stated that the mirror mode helped them foster their awareness, could the device bring more benefits by imperceptibly guiding them towards calm breathing? Besides, if the device is used in an office, as mentioned by some participants, personalised guidance might be less distracting than simple real-time feedback [20].

Sensing Belt Redesign and Personalised Calibration. As mentioned in the results, there were some complaints about the sensing belt from two female participants. Those comments imply a potential ethical issue with the sensing belt. They also indicate the need to redesign the gadget to provide a better experience.

Moreover, personalised calibration of the sensor might improve its performance. One of the causes of wrong interpretations was that the strip sensor did not suit everyone well. Most people had a preferred way of breathing; some preferred abdominal breathing, in which case the sensor would work better when placed around the abdomen. Since people have different breathing patterns, a calibration and adjustment session before testing might improve the monitoring result. A minimal calibration sketch is given below.
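One way such a calibration session could work is sketched here: the device records a short stretch of normal breathing and derives per-user normalisation bounds from it. The 30-second window and the min/max normalisation are assumptions for illustration, not the study's protocol.

```python
def build_normaliser(calibration_samples):
    """Derive a per-user mapping from raw sensor voltage to the range 0..1.

    calibration_samples: voltage readings collected while the user breathes
    normally (e.g. for ~30 seconds) before the session starts.
    """
    lo, hi = min(calibration_samples), max(calibration_samples)
    span = max(hi - lo, 1e-6)        # guard against a flat, uninformative signal

    def normalise(voltage):
        return min(max((voltage - lo) / span, 0.0), 1.0)

    return normalise
```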

Not only does the sensor need calibration, the program should be adapted to the circumstances as well. In the study, the pre-set breathing pattern of the guidance mode was about 6 seconds each for breathing in and breathing out. Some participants reported that although they understood the mapping, they still could not follow and synchronise their breath with the device because the interval was far too different from their regular breathing pattern. "I didn't try to follow because first time when I try it was too long, so it was difficult for me to follow. I feel it was too long to get into the rhythm, to get into the same pattern." – P2. "The inhalations are too long, and the exhalations are too short. So it's hard for me. I can do it but it's hard." – P3. "It's a little bit too fast for me. I'm trying to breathe slowly so it's like, by the time I breathe out it's already stopped, and then I don't have time to breathe in. So I lose the rhythm, and I'm a little bit like: 'What's going on.'" – P9. Thus, guidance adapted to the user's initial breathing pattern might help to improve the user experience, as sketched below.
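A minimal sketch of such adaptation: the guidance period is taken from the average gap between the user's own inhalation onsets during a short observation window. The averaging and the fallback value (roughly 6 s in plus 6 s out, i.e. a 12-second cycle, mirroring the pre-set pattern above) are assumptions for illustration only.

```python
def adapted_guidance_period(inhale_onsets, default_period=12.0):
    """Estimate a comfortable full-breath pacing period (in seconds).

    inhale_onsets: timestamps (seconds) at which inhalation started during a
    short observation window. Falls back to the assumed ~12-second cycle
    (6 s in + 6 s out) when too few breaths have been observed.
    """
    if len(inhale_onsets) < 2:
        return default_period
    gaps = [b - a for a, b in zip(inhale_onsets, inhale_onsets[1:])]
    return sum(gaps) / len(gaps)
```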

Explore Possibilities of the Vapour. The vapour does not only provide visual cues. During the study, two participants mentioned that the temperature of the vapour made them feel cool or even cold. Moreover, the vapour is something that people can potentially breathe in and smell; adding fragrance to the vapour was suggested by one of the participants. "In this steam, is there any aroma or something like that? I want to put some aroma, that would be perfect." – P8.

Regarding the vapour mapping, one participant suggested that instead of directly mapping the vapour to inhalation or exhalation, its tempo could relate to the breathing rate. "If I don't breathe, then there's no vapour, so if I hold my breath, there should be no vapour. And if I suddenly breathe really fast, or really deep, then there's changing that the tempo of the vapour out. I would say. So maybe not necessarily that flow in and out exactly when I breathe in and out. That may be like if I breathe in and out really slowly then the flow loops really slowly somehow." – P7. A minimal sketch of such a rate-based mapping is given below.
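The rate-based mapping that P7 describes could look something like the following hedged sketch, where the vapour pulses scale with breathing rate and stop entirely when the breath is held; the linear scaling and the reference rate are assumptions for illustration only.

```python
def vapour_pulse_rate(breaths_per_minute, base_rate=1.0, reference_bpm=6.0):
    """Map breathing rate to a vapour pulse rate instead of to single breaths.

    Holding the breath (rate near zero) stops the vapour; faster breathing
    speeds the pulsing up proportionally. reference_bpm is an assumed
    reference rate, not a value from the study.
    """
    if breaths_per_minute <= 0:
        return 0.0               # no breath, no vapour
    return base_rate * breaths_per_minute / reference_bpm
```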

Overall, there are still many aspects that can be explored about the usage of the vapour.

More or Less Information in the Light. While some participants preferred the light to be constant instead of carrying guidance information, one participant suggested changing the hue of the light colour instead of the brightness to guide the user's breath. "I want to be in concentration when I breathe out, I should be invited to breathe in. I mean, it doesn't need to breathe, to fade to lower lightness, it could fade to a different nuance. I should have a stronger direct telling me to breathe in. So maybe it goes from white, and when you breathe out it may be bluish." – P4.

Environmental Setting. Nine participants stated that they would use the device in their daily life, and some would even bring it to the office and put it beside the screen. Thus, further user studies could be conducted in an office setting where the device acts as an ambient display, or at home under everyday usage.

No matter where a future study takes place, the ambient light should be controlled. It was a problem that for most of the time there was strong ambient light during the user test: sunlight came from the left side of the subjects, and even though the curtains were drawn, they did not do much to shield the light. Because the lighting effect was not obvious in such a bright environment, several participants mentioned that they could barely see the light dimming up and down. One participant complained about the vague signals: "I didn't see the light very clearly, so that kind of distracted me." – P5.

Thus, in future user studies, the brightness of the environment should be controllable, for example by using a thick dark curtain or by holding the user test in a dark room with an adjustable lighting system.

6 CONCLUSIONS
Based on the previous project HU, a new physical device, HU II, was designed and developed with a breath sensor, which enables real-time multisensory feedback. The device is used to support mindful breathing, utilising vapour, light and sound as representations of respiration. The sensor monitors the user's breathing by detecting body movements.

Based on the analysis and discussion of the study results, the following conclusions and further research possibilities are drawn:

(1) During the study, the stretch sensor worked and supported real-time interaction. However, the sensing gadget and the algorithm need improvement, for example calibration, to ensure that they work for most people.

(2) While personal preference influences the acceptance of light and sound, every participant agreed that the vapour is suitable for presenting breath. The vapour is a modality with abundant information, including visual, tactile and olfactory output. It can also generate aural output, which comes from the vapour generator.

ACKNOWLEDGMENTS
The author would like to thank all the participants for their time and valuable contribution to this study.

REFERENCES
[1] Farah Q AL-Khalidi, Reza Saatchi, Derek Burke, H Elphick, and Stephen Tan. 2011. Respiration rate monitoring methods: A review. Pediatr. Pulmonol. 46, 6 (2011), 523–529.
[2] Ilhan Aslan, Hadrian Burkhardt, Julian Kraus, and Elisabeth André. 2016. Hold my Heart and Breathe with Me. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction - NordiCHI '16.
[3] Ruth A Baer, James Carmody, and Matthew Hunsinger. 2012. Weekly Change in Mindfulness and Perceived Stress in a Mindfulness-Based Stress Reduction Program. J. Clin. Psychol. 68, 7 (2012), 755–765.
[4] David Bawden and Lyn Robinson. 2008. The dark side of information: overload, anxiety and other paradoxes and pathologies. J. Inf. Sci. Eng. 35, 2 (2008), 180–191.
[5] Sixian Chen, John Bowers, and Abigail Durrant. 2015. 'Ambient walk'. In Proceedings of the 2015 British HCI Conference - British HCI '15.
[6] Phil Corbishley and Esther Rodriguez-Villegas. 2008. Breathing Detection: Towards a Miniaturized, Wearable, Battery-Operated Monitoring System. IEEE Transactions on Biomedical Engineering 55, 1 (2008), 196–204.
[7] Atena Roshan Fekr, Majid Janidarmian, Katarzyna Radecka, and Zeljko Zilic. 2015. Development of a Remote Monitoring System for Respiratory Analysis. In Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. 193–202.
[8] M Folke, L Cernerud, M Ekström, and B Hök. 2003. Critical review of non-invasive respiratory monitoring in medical care. Med. Biol. Eng. Comput. 41, 4 (July 2003), 377–383.
[9] Mia Folke, Fredrik Granstedt, Bertil Hök, and Håkan Scheer. 2002. Comparative provocation test of respiratory monitoring methods. J. Clin. Monit. Comput. 17, 2 (Feb. 2002), 97–103.
[10] E F Greneker. 1997. Radar sensing of heartbeat and respiration at a distance with applications of the technology. (Jan. 1997), 150–154.
[11] J Harris, S Vance, O Fernandes, A Parnandi, and others. 2014. Sonic respiration: controlling respiration rate through auditory biofeedback. CHI '14 Extended (2014).
[12] Kristina Höök, Martin P Jonsson, Anna Ståhl, and Johanna Mercurio. 2016. Somaesthetic Appreciation Design. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems - CHI '16.
[13] Joan Sol Roo, Renaud Gervais, and Martin Hachet. 2016. Inner Garden: an Augmented Sandbox Designed for Self-Reflection. In Proceedings of TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction. ACM, 570–576.
[14] Jon Kabat-Zinn. 2006. Mindfulness-Based Interventions in Context: Past, Present, and Future. Clinical Psychology: Science and Practice 10, 2 (2006), 144–156.
[15] Bassam Khoury, Tania Lecomte, Guillaume Fortin, Marjolaine Masse, Phillip Therien, Vanessa Bouchard, Marie-Andrée Chapleau, Karine Paquin, and Stefan G Hofmann. 2013. Mindfulness-based therapy: a comprehensive meta-analysis. Clin. Psychol. Rev. 33, 6 (Aug. 2013), 763–771.
[16] Tatia M C Lee, Mei-Kei Leung, Wai-Kai Hou, Joey C Y Tang, Jing Yin, Kwok-Fai So, Chack-Fan Lee, and Chetwyn C H Chan. 2012. Distinct neural activity associated with focused-attention meditation and loving-kindness meditation. PLoS One 7, 8 (Aug. 2012), e40054.
[17] Dominique P Lippelt, Bernhard Hommel, and Lorenza S Colzato. 2014. Focused attention, open monitoring and loving kindness meditation: effects on attention, conflict monitoring, and creativity - A review. Front. Psychol. 5 (Sept. 2014), 1083.
[18] Antoine Lutz, Heleen A Slagter, John D Dunne, and Richard J Davidson. 2008. Attention regulation and monitoring in meditation. Trends Cogn. Sci. 12, 4 (April 2008), 163–169.
[19] William R Marchand. 2012. Mindfulness-based stress reduction, mindfulness-based cognitive therapy, and Zen meditation for depression, anxiety, pain, and psychological distress. J. Psychiatr. Pract. 18, 4 (July 2012), 233–252.
[20] Neema Moraveji, Ben Olson, Truc Nguyen, Mahmoud Saadat, Yaser Khalighi, Roy Pea, and Jeffrey Heer. 2011. Peripheral paced respiration. In Proceedings of the 24th annual ACM symposium on User interface software and technology - UIST '11.
[21] N Moraveji, E Santoro, E Smith, A Kane, A Crum, and others. [n. d.]. Using A Wearable Health Tracker to Improve Health and Wellbeing Under Stress. pdfs.semanticscholar.org ([n. d.]).
[22] Rakesh Patibanda, Florian 'floyd' Mueller, Matevz Leskovsek, and Jonathan Duckworth. 2017. Life Tree: Understanding the Design of Breathing Exercise Games. In Proceedings of the Annual Symposium on Computer-Human Interaction in Play. ACM, New York, NY, USA, 19–31.
[23] Yongqiang Qin, Chris Vincent, Nadia Bianchi-Berthouze, and Yuanchun Shi. 2014. AirFlow: designing immersive breathing training games for COPD. In Conference on Human Factors in Computing Systems - Proceedings.
[24] T Reinvuo, M Hannula, H Sorvoja, E Alasaarela, and R Myllyla. 2006. Measurement of respiratory rate with high-resolution accelerometer and emfit pressure sensor. In Proceedings of the 2006 IEEE Sensors Applications Symposium.
[25] Ameneh Shamekhi and Timothy Bickmore. 2015. Breathe with Me: A Virtual Meditation Coach. In Lecture Notes in Computer Science. 279–282.
[26] Jacek Sliwinski, Mary Katsikitis, and Christian Martyn Jones. 2015. Mindful Gaming: How Digital Games Can Improve Mindfulness. In Lecture Notes in Computer Science. 167–184.
[27] Anna Ståhl, Martin Jonsson, Johanna Mercurio, Anna Karlsson, Kristina Höök, and Eva-Carin Banka Johnson. 2016. The Soma Mat and Breathing Light. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA '16.
[28] K Storck, M Karlsson, P Ask, and D Loyd. 1996. Heat transfer evaluation of the nasal thermistor technique. IEEE Trans. Biomed. Eng. 43, 12 (Dec. 1996), 1187–1191.
[29] Ralph Vacca and Christopher Hoadley. 2016. Understanding the Experience of Situated Mindfulness Through a Mobile App That Prompts Self-reflection and Directs Non-reactivity. In Lecture Notes in Computer Science. 394–405.
[30] Jay Vidyarthi, Bernhard E Riecke, and Diane Gromala. 2012. Sonic Cradle. In Proceedings of the Designing Interactive Systems Conference - DIS '12.
[31] O L Wade. 1954. Movements of the thoracic cage and diaphragm in respiration. J. Physiol. 124, 2 (1954), 193–212.
[32] J Werthammer, J Krasner, J DiBenedetto, and A R Stark. 1983. Apnea monitoring by acoustic detection of airflow. Pediatrics 71, 1 (Jan. 1983), 53–55.
[33] Kanit Wongsuphasawat, Alex Gamburg, and Neema Moraveji. 2012. You can't force calm. In Adjunct Proceedings of the 25th annual ACM symposium on User interface software and technology - UIST Adjunct Proceedings '12.
[34] Bin Yu, Loe Feijs, Mathias Funk, and Jun Hu. 2015. Breathe with Touch: A Tactile Interface for Breathing Assistance System. In Lecture Notes in Computer Science. 45–52.
[35] Bin Zhu, Anders Hedman, Shuo Feng, Haibo Li, and Walter Osika. 2017. Designing, Prototyping and Evaluating Digital Mindfulness Applications: A Case Study of Mindful Breathing for Stress Reduction. J. Med. Internet Res. 19, 6 (June 2017), e197.
[36] John Zimmerman, Jodi Forlizzi, and Shelley Evenson. 2007. Research through design as a method for interaction design research in HCI. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI '07.

www.kth.se