
Research Article

Objectifying Facial Expressivity Assessment of Parkinson's Patients: Preliminary Study

Peng Wu,1 Isabel Gonzalez,1 Georgios Patsis,1 Dongmei Jiang,2 Hichem Sahli,1 Eric Kerckhofs,3 and Marie Vandekerckhove4

1 Department of Electronics and Informatics, Vrije Universiteit Brussel, 1050 Brussels, Belgium
2 Shaanxi Provincial Key Lab on Speech and Image Information Processing, Northwestern Polytechnical University, Xi'an, China
3 Department of Physical Therapy, Vrije Universiteit Brussel, 1050 Brussels, Belgium
4 Department of Experimental and Applied Psychology, Vrije Universiteit Brussel, 1050 Brussels, Belgium

Correspondence should be addressed to Peng Wu; pwu@etro.vub.ac.be

Received 9 June 2014; Accepted 22 September 2014; Published 13 November 2014

Academic Editor: Justin Dauwels

Copyright © 2014 Peng Wu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Hindawi Publishing Corporation, Computational and Mathematical Methods in Medicine, Volume 2014, Article ID 427826, 12 pages; http://dx.doi.org/10.1155/2014/427826

Patients with Parkinson's disease (PD) can exhibit a reduction of spontaneous facial expression, designated as "facial masking," a symptom in which facial muscles become rigid. To improve clinical assessment of facial expressivity of PD, this work attempts to quantify the dynamic facial expressivity (facial activity) of PD by automatically recognizing facial action units (AUs) and estimating their intensity. Spontaneous facial expressivity was assessed by comparing 7 PD patients with 8 control participants. To voluntarily produce spontaneous facial expressions that resemble those typically triggered by emotions, six emotions (amusement, sadness, anger, disgust, surprise, and fear) were elicited using movie clips. During the movie clips, physiological signals (facial electromyography (EMG) and electrocardiogram (ECG)) and frontal face video of the participants were recorded. The participants were asked to report on their emotional states throughout the experiment. We first examined the effectiveness of the emotion manipulation by evaluating the participants' self-reports. Disgust-induced emotions were significantly higher than the other emotions. Thus, we focused on the analysis of the data recorded while watching the disgust movie clips. The proposed facial expressivity assessment approach captured differences in facial expressivity between PD patients and controls. Differences between PD patients with different progression of Parkinson's disease were also observed.

1. Introduction

One of the manifestations of Parkinson's disease (PD) is the gradual loss of facial mobility and a "mask-like" appearance. Katsikitis and Pilowsky (1988) [1] reported that PD patients were rated as significantly less expressive than an aphasic group and a control group on a task designed to assess spontaneous facial expression. In addition, the spontaneous smiles of PD patients are often perceived to be "unfelt," because of the lack of accompanying cheek raises [2]. Jacobs et al. [3] confirmed that PD patients show reduced intensity of emotional facial expression compared to controls. To assess facial expressivity, most research relies on subjective coding by the involved researchers, as in the aforementioned studies. Tickle-Degnen and Lyons [4] found that decreased facial expressivity correlated with self-reports of PD patients as well as with the Unified Parkinson's Disease Rating Scale (UPDRS) [5]: PD patients who rated their ability to facially express emotions as severely affected did demonstrate less facial expressivity.

In this paper, we investigate automatic measurement of facial expressivity from video recordings of PD patients and control populations. To the best of our knowledge, few attempts have been made at designing a computer-based quantitative analysis of facial expressivity of PD patients. To analyze whether Parkinson's disease affects voluntary expression of facial emotions, Bowers et al. [6] videotaped PD patients and healthy control participants while they made voluntary facial expressions (happy, sad, fear, anger, disgust, and surprise). In their approach, the amount of facial movement change and its timing were quantified by



estimating an entropy score plotted over time. The entropy is a measure of the pixel intensity change that occurred over the face as it moved during an expression. They also computed the time it took an expression to reach its peak entropy value from the onset of each trial. Using both measures, entropy and time, the authors demonstrated that fewer movements occurred over the faces of PD patients than of controls when they were asked to mimic a target expression.

Despite its good results, the above-described amount of facial movements does not directly relate to measuring facial muscle activity. Indeed, facial expressions are generated by contractions of facial muscles, which lead to subtle changes in the area of the eyelids, eyebrows, nose, lips, and skin texture, often revealed by wrinkles and bulges. To measure these subtle changes, Ekman and Friesen [7] developed the Facial Action Coding System (FACS). FACS is a human-observer-based system designed to detect subtle changes in facial features; it describes facial expressions by action units (AUs). AUs are anatomically related to contractions of specific facial muscles. They can occur either singly or in combination. Simons et al. [2] used the FACS to analyze facial expressivity of PD patients versus controls. In their study, odor was used as a means of inducing facial expressions. Certified FACS coders annotated the facial expressions. A total facial activity (TFA) measure, estimated as the total number of displayed AUs, was used to assess facial expressions. The authors demonstrated that, according to the TFA measure, PD patients show a reduced level of facial activity in reaction to unpleasant odors compared to controls.

Estimating only the number of displayed AUs does not completely capture the dimensions of facial masking that are present in PD and defined in the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9]. The ICRP-IEB facial expressivity is based on the FACS. It defines expressivity in terms of (i) frequency, that is, how often a behavior or movement occurs, (ii) duration, that is, how long a behavior or movement lasts, and (iii) intensity or degree, being the strength, force, or level/amount of emotion or movement.

In this study, we propose a system that (i) automatically detects faces in a video stream, (ii) codes each frame with respect to 11 action units, and (iii) estimates facial expressivity as a function of the frequency, duration, and intensity of AUs. Although there is already a substantial literature on automatic expression and action unit recognition, it remains an active area of study due to the challenging nature of the problem [10]. Moreover, in contrast to AU detection, there is scarce work in the literature on AU intensity estimation. The proposed facial expressivity quantity makes use of the output of a previously developed automatic AU recognition system [11] based on support vector machines (SVM) and AdaBoost. To determine the AU intensity, the distance measure produced by the AU SVM classifier is mapped to an estimate of probability using the Platt scaling algorithm. Platt scaling [12] refers to a technique whereby a score-to-probability calibration curve is calculated using the training set. Frame-by-frame intensity measurements are then used to estimate facial expression dynamics, which were previously intractable by human coding.

Table 1: The selected movie clips listed with their sources.

Emotion | Excerpt's source 1 | Excerpt's source 2
Amusement | Benny and Joone | The Godfather
Sadness | An Officer and a Gentleman | Up
Surprise | Capricorn One | Sea of Love
Anger | Witness | Gandhi
Disgust | Pink Flamingos | Trainspotting
Fear | Silence of the Lambs | The Shining
Neutral | Colour bar patterns | Hannah and Her Sisters

Figure 1: Emotion elicitation protocol (SR indicates self-report): training clips, then Clip 1, SR, Clip 2, SR, ..., Clip 14, SR.

2. Methods

2.1. Pilot Study. Our study aims to quantify the facial expressivity dynamics of PD patients. Thus, gathering usable qualitative emotional data is the essential step prior to the analysis of facial behaviors. To voluntarily produce spontaneous facial expressions that resemble those typically triggered by emotions, six emotions (amusement, sadness, anger, disgust, surprise, and fear) were elicited using movie clips. During the movie clips, physiological signals and frontal face video of the participants were recorded. Fifteen participants, 7 PD patients and 8 healthy control persons, took part in the pilot study. After each movie, the participants were asked to rate the intensity of the dominant emotions they experienced while watching the movie clips. The evaluation of the participants' self-reports showed that the disgust-induced emotion was significantly higher than the other emotions. Thus, we focused on the analysis of the data recorded while watching the disgust movie clips.

2.2. Emotion Induction Protocol. Based on the studies of Gross and Levenson [13], Hagemann et al. [14], Lisetti and Nasoz [15], Hewig et al. [16], Westerink et al. [17], and Schaefer et al. [18], we composed a set of movie clips [19]. For each emotion (amusement, sadness, anger, disgust, surprise, fear, and neutral), two excerpts were used, as listed in Table 1. In the sequel, we will refer to the movie clips as emotion$_i$, $i = 1, 2$; for example, the two surprise movie clips will be denoted as surprise1 and surprise2.

The used protocol is depicted in Figure 1. The participants were told to watch 2 training movie clips and the 14 movie clips of Table 1. The movie clips were shown in random order, 2 successive ones having different emotions. After each video clip, the participants filled in a questionnaire (i.e., self-report) with their emotion-feeling (amusement, sadness, anger, disgust, surprise, fear, and neutral) and rated the strength of their responses on a 7-point scale.


Figure 2: Experimental setup.

The data recording took place in a sound-isolated lab under standardized lighting conditions. The participants watched the movie clips while sitting on a sofa facing a screen of 1.25 by 1.25 m (see Figure 2).

2.3. Participants. This pilot study considered seven PD patients (3 men and 4 women, between the ages of 47 and 76 years; durations of PD ranged from 1.5 to 13 years) and eight control participants (5 men and 3 women, between the ages of 27 and 57 years). The control participants were recruited from the VUB ETRO department without specific characteristics. The PD patients were selected with the help of the Flemish Parkinson League. The study was approved by the committees on human research of the respective institutions and was completed in accordance with the Helsinki Declaration.

Based on the medical dossiers as well as the feedback from the PD patients, we defined three PD categories. One patient had a very light form of PD and therefore was classified as the least severe case (denoted as LP). Another patient had the most severe form of PD of the whole group (MP). The rest of the PD patients were situated between those two extremes; we denote them as intermediate PD (IP).

2.4. Data Acquisition. During the movie clips, physiological signals and frontal face video of the participants were recorded. As physiological signals, we recorded electromyography (EMG) and electrocardiogram (ECG). All the data channels (i.e., EMG, ECG, and videotape) were synchronized.

The ECG measures the activity of heart contractions. The physical action of the heart is induced by a local periodic electrical stimulation, and as a result a change in potential of 10-20 μV is measured during a cardiac cycle between two surface electrodes [20]. Heart rate (HR) and heart rate variability (HRV) are the cardiovascular response features most often reported as indicators of emotions [21]. In this study, the ECG was measured at 512 Hz using a Shimmer device.

The EMG measures the frequency of muscle tension and contraction. Within each period of interest, the root mean square (RMS) and absolute mean value (AMV) are commonly used as features [22]. In this study, two facial muscles were measured at 2000 Hz using the EMG Biomonitor ME6000, namely, the levator labii superioris (LLS) and the orbicularis oculi (OO), as illustrated in Figure 3.

Figure 3: Locations of the EMG electrodes (ground, orbicularis oculi, and levator labii superioris).

The frontal face video of the participants was recorded for the purpose of quantifying facial expressivity. Following the ICRP-IEB [9], we selected 11 action units (AU1, AU2, AU4, AU6, AU7, AU9, AU12, AU20, AU23, AU25, and AU27) among the 41 defined by FACS. Table 2 lists the used AUs along with their associated facial muscles, as well as the AUs covered by the measured EMG.

Our purpose being the development of a nonobtrusive approach for facial expressivity assessment, by automatically recognizing facial action units (AUs) and estimating their intensity, the ECG and EMG measurements were included in our experimental protocol to quantitatively assess the emotional manipulation used for inducing facial expressions and to confirm facial muscle activity when the disgust expression is displayed. As our objective was not to physiologically investigate the effect of Parkinson's disease on facial EMG, only the LLS and the OO were measured by the EMG, as information complementary to the considered AUs (see Table 2).

2.5. Data Preparation. It is an accepted practice to code a proportion of the observations, or "thin slices," of recorded data as a representation of the analyzed behavior [23]. In our experiments, 30 s data extracts (ECG, EMG, and video record) were selected, corresponding to the last 30 s segments of the shown video clips. Intuitively, one may think that longer segments of expressive behavior in persons with PD would give more information and thus increase the accuracy of the analysis. However, Ambady and Rosenthal [24] found that judgment performance did not significantly improve when using 5-minute slices instead of 30 s slices. This was also confirmed in [25, 26]. Note that baseline data were selected from the stimulus neutral1.
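As an illustration, extracting such a thin slice is a simple indexing operation on the synchronized channels. The following minimal Python sketch assumes each channel is a NumPy array with a known sampling rate; the array below is a placeholder, not study data:

```python
import numpy as np

def thin_slice(signal, fs, seconds=30.0):
    """Return the last `seconds` of a recording; fs is the sampling rate in Hz."""
    n = int(seconds * fs)
    return signal[-n:]

# Placeholder 2-minute ECG recorded at 512 Hz during one movie clip
ecg_clip = np.zeros(512 * 120)
ecg_slice = thin_slice(ecg_clip, fs=512)   # last 30 s -> 15360 samples
```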

2.6. Physiological Data Processing. Physiological signals need to be preprocessed prior to feature extraction, in order to remove noise and enhance the signal-to-noise ratio (SNR) [27, 28]. In this work, we make use of a newly developed signal denoising approach based on the empirical mode decomposition (EMD) [8]. Different from state-of-the-art methods [27–33], our approach estimates the noise level of each intrinsic mode function (IMF), rather than estimating the noise level of all IMFs using Donoho's strategy [34], prior to the reconstruction of the signal using the thresholded IMFs. Please refer to [8] for more details. Figure 4 illustrates the denoising results.


Table 2: FACS AUs and related muscles.

AU | FACS name | Facial muscle | Videotape | EMG
AU1 | Inner brow raiser | Frontalis (pars medialis) | X | —
AU2 | Outer brow raiser | Frontalis (pars lateralis) | X | —
AU4 | Brow lowerer | Depressor glabellae, depressor supercilii, and CS | X | —
AU6 | Cheek raiser | OO (pars orbitalis) | X | X
AU7 | Lid tightener | OO (pars palpebralis) | X | X
AU9 | Nose wrinkler | LLSAN | X | —
AU10 | Upper lip raiser | LLS, caput infraorbitalis | — | X
AU12 | Lip corner puller | Zygomaticus major | X | —
AU20 | Lip stretcher | Risorius | X | —
AU23 | Lip tightener | Orbicularis oris | X | —
AU25 | Lips part | Depressor labii inferioris | X | —
AU27 | Mouth stretch | Pterygoids, digastric | X | —
AU45 | Blink | Contraction of OO | — | X
AU46 | Wink | OO | — | X

Figure 4: Example of physiological signal (ECG) denoising using the method proposed in [8] (top: the raw ECG signal; bottom: the denoised ECG signal).


2.6.1. Electrocardiogram (ECG). Heart rate (HR) and heart rate variability (HRV) are the cardiovascular response features most often reported as indicators of emotion [21]. HR is computed using the time difference between two consecutive detected R peaks of the QRS complexes (i.e., the RR interval) and is expressed in beats per minute. HRV is the variation of beat-to-beat HR. Thus, the detection of the QRS complex, in particular R peak detection, is the first step in extracting HR and HRV and the basis of ECG processing and analysis. Many approaches for R peak detection have been proposed [35]. However, most of them work off-line and target noiseless signals, which does not meet the requirements of many real-time applications. To overcome this problem, in [8] we proposed an approach based on a change point detection (CPD) algorithm for event detection in time series [36], which minimizes the error in fitting a predefined function using maximum likelihood. In our current implementation, polynomial fitting functions of degree 1 have been selected empirically. An example of the results is illustrated in Figure 5: all R peaks are detected (crosses), along with some irrelevant change points (circles) that can be filtered out using a predetermined threshold. Once the R peaks are detected, the HR and HRV features are estimated as the difference between the HR (HRV) estimated from the considered 30 s window (of the stimulus) and the one estimated from the neutral 30 s window.

Figure 5: Change point detection on a noisy ECG signal.
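A minimal sketch of this R peak detection scheme, assuming degree-1 (linear) fitting functions as in our implementation; the window size w and the amplitude threshold amp_thresh are illustrative values, not parameters reported in the paper:

```python
import numpy as np

def fit_error(seg):
    """Squared error of a degree-1 polynomial (line) fitted to a segment."""
    n = np.arange(len(seg), dtype=float)
    coef = np.polyfit(n, seg, 1)
    return float(np.sum((seg - np.polyval(coef, n)) ** 2))

def change_point_score(x, t, w):
    """Gain in fit error when the window around t is modeled by two lines
    split at t instead of a single line (CPD in the spirit of [36])."""
    return fit_error(x[t - w:t + w]) - (fit_error(x[t - w:t]) + fit_error(x[t:t + w]))

def detect_r_peaks(ecg, w=8, amp_thresh=100.0):
    """Detect R peaks as change points whose amplitude exceeds a
    predetermined threshold (discarding the irrelevant change points)."""
    ecg = np.asarray(ecg, dtype=float)
    scores = np.array([change_point_score(ecg, t, w)
                       for t in range(w, len(ecg) - w)])
    peaks = []
    for k in range(1, len(scores) - 1):          # local maxima of the score
        if scores[k] > scores[k - 1] and scores[k] >= scores[k + 1]:
            t = k + w
            if ecg[t] > amp_thresh:
                peaks.append(t)
    return np.array(peaks)

def hr_hrv(r_peaks, fs=512.0):
    """Mean HR (beats per minute) and, as one common HRV proxy,
    the standard deviation of the RR intervals (in seconds)."""
    rr = np.diff(r_peaks) / fs
    return 60.0 / rr.mean(), rr.std()
```

The features used in the paper are then the differences between the HR (HRV) of the stimulus window and of the neutral window.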

2.6.2. Electromyogram (EMG). The absolute mean value (AMV) is the most commonly used feature for identifying the strength of muscular contraction [22, 37], defined over the length $N$ of the signal $x(t)$ as follows:

\[ \text{AMV} = \frac{1}{N} \sum_{t=1}^{N} |x(t)|. \tag{1} \]

The AMV value during the considered 30 s window was expressed as a percentage of the mean amplitude during the 30 s neutral baseline. This percentage score was computed to standardize the widely different EMG amplitudes of individuals and thus to enable comparison between individuals and groups.
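Equation (1) and the baseline normalization translate directly into code; a small sketch assuming the EMG segments are given as NumPy arrays:

```python
import numpy as np

def amv(x):
    """Absolute mean value of an EMG segment x(t), eq. (1)."""
    return np.mean(np.abs(x))

def amv_percent(emg_stimulus, emg_neutral):
    """AMV of the 30 s stimulus window as a percentage of the AMV of the
    30 s neutral baseline, standardizing amplitudes across individuals."""
    return 100.0 * amv(emg_stimulus) / amv(emg_neutral)
```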

2.7. Design and Statistical Analysis. Preliminary analyses were performed to check the validity of the acquired data.

(1) Manipulation Check. We calculated the descriptive statistics (mean and standard deviation) of the self-reported emotional ratings to check whether the participants' emotional experience was successfully manipulated. Then, Wilcoxon rank sum tests were performed to compare the self-reports between groups (PD and control) and between the two clips of the same emotion.

(2) Analysis of Physiological Parameters. Physiological variables were tested for univariate significant differences via repeated-measures analysis of variance (ANOVA), between groups (PD versus control) and between the disgust1, disgust2, and neutral2 stimuli. Group and stimulus are the between-subjects and within-subject factors, respectively. Results were considered statistically significant at P < .05.
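For illustration, the between-group comparison can be run with SciPy's Wilcoxon rank sum test; the rating arrays below are hypothetical placeholders, not the study's data:

```python
import numpy as np
from scipy.stats import ranksums

# Hypothetical self-reported disgust ratings on the 7-point scale
ratings_pd = np.array([6, 7, 6, 7, 6, 6, 7], dtype=float)        # 7 PD patients
ratings_ctrl = np.array([7, 6, 7, 7, 6, 7, 6, 7], dtype=float)   # 8 controls

stat, p = ranksums(ratings_pd, ratings_ctrl)
print(f"Wilcoxon rank sum: stat = {stat:.2f}, p = {p:.3f}")      # significant if p < .05
```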

2.8. Facial Action Units Recognition. In this study, we make use of an automatic facial action unit recognition system developed at our department [11]. This system allows context-independent recognition of the following action units: AU1, AU2, AU4, AU6, AU7, AU9, AU12, AU20, AU23, AU25, and AU27. The overall recognition scheme is depicted in Figure 6.

The head and facial features were tracked using the constrained shape tracking approach of [38]. This approach allows automatic detection of the head and the tracking of a shape model composed of 83 landmarks. After the facial components have been tracked in each frame, geometry-based and appearance-based features are combined and fed to the AdaBoost (adaptive boosting) algorithm for feature selection. Finally, for each action unit, we used a binary support vector machine (SVM) for context-independent classification. Our system was trained and tested on the Kanade et al. DFAT-504 dataset [39]. The database consists of 486 sequences of facial displays produced by 98 university students from 18 to 30 years old, of which 65% are female. All sequences are annotated by certified FACS coders, start with a neutral face, and end with the apex of the expression. SVMs [11] have been proven to be powerful and robust tools for AU classification.
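This chain can be mirrored, as a rough sketch only, with off-the-shelf scikit-learn components: AdaBoost over decision stumps acts as the feature selector, and a binary SVM is trained per AU. The data and dimensions below are synthetic placeholders; the actual system [11] uses shape and Gabor phase features computed from the 83 tracked landmarks:

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 200))       # per-frame geometry + appearance features
y = rng.integers(0, 2, size=500)      # AU present (1) or absent (0) in the frame

# Each boosted stump tests a single feature, so the stumps' importances
# induce a feature ranking usable for selection.
ada = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)
selected = np.argsort(ada.feature_importances_)[::-1][:40]

# One binary SVM per AU, trained on the selected features
svm = SVC(kernel="linear").fit(X[:, selected], y)
h = svm.decision_function(X[:5, selected])   # h(z); its sign gives the class
```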

A test sample (a new image) $\mathbf{z}$ can be classified according to the following decision function:

\[ D(\mathbf{z}) = \operatorname{sign}(h(\mathbf{z})). \tag{2} \]

Table 3: Facial expressivity items defined in ICRP-IEB [9] and used AUs.

Item | Gestalt degree | Related AUs
(1) Active expressivity in face | Intensity | 11 AUs
(2) Eyebrows raising | Intensity + frequency | AU1 and AU2
(3) Eyebrows pulling together | Intensity + frequency | AU4
(4) Blinking | Frequency | AU45
(5) Cheek raising | Intensity + frequency | AU6
(6) Lip corner puller | Intensity + frequency | AU12

The output $h(\mathbf{z})$ of an SVM is a distance measure between a test pattern and the separating hyperplane defined by the support vectors. The test sample is assigned to the positive class (the AU on which the SVM was trained) if $D(\mathbf{z}) = +1$ and to the negative class if $D(\mathbf{z}) = -1$.

Unfortunately, we cannot use the output of an SVM directly as a probability: there is no clear relationship with the posterior class probability $P(y = +1 \mid \mathbf{z})$ that the pattern $\mathbf{z}$ belongs to the class $y = +1$. Platt [12] proposed an estimate of this probability, obtained by fitting the SVM output $h(\mathbf{z})$ with a sigmoid function as follows:

\[ P(y = +1 \mid \mathbf{z}) = \frac{1}{1 + \exp(A\,h(\mathbf{z}) + B)}. \tag{3} \]

The parameters $A$ and $B$ are found using maximum likelihood estimation from the training set. The above equation has a mapping range of $[0, 1]$.
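As a sketch, the maximum likelihood fit of (3) can be performed with a generic optimizer; this simplified version omits the regularized target probabilities of Platt's full procedure:

```python
import numpy as np
from scipy.optimize import minimize

def fit_platt(scores, labels):
    """Fit A, B of eq. (3) by maximum likelihood on training SVM
    decision values h(z) (scores) and binary labels in {0, 1}."""
    def nll(params):
        A, B = params
        p = 1.0 / (1.0 + np.exp(np.clip(A * scores + B, -500, 500)))
        eps = 1e-12
        return -np.sum(labels * np.log(p + eps)
                       + (1 - labels) * np.log(1.0 - p + eps))
    res = minimize(nll, x0=np.array([-1.0, 0.0]), method="Nelder-Mead")
    return res.x  # A, B

def platt_prob(h, A, B):
    """Map an SVM decision value h(z) to P(y = +1 | z) in [0, 1]."""
    return 1.0 / (1.0 + np.exp(A * h + B))

# Hypothetical training scores and labels for one AU classifier
scores = np.array([-2.1, -1.3, -0.4, 0.3, 1.1, 2.0])
labels = np.array([0, 0, 0, 1, 1, 1])
A, B = fit_platt(scores, labels)
print(platt_prob(0.5, A, B))  # per-frame AU intensity estimate
```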

2.9. Facial Expressivity. In this work, we follow the recommendation of the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9], where expressivity based on the FACS is defined in terms of (i) frequency, that is, how often a behavior or movement occurs, (ii) duration, that is, how long a behavior or movement lasts, and (iii) intensity or degree, being the strength, force, or level/amount of emotion or movement. In the ICRP-IEB, facial expressivity is coded according to the gestalt degree of intensity/duration/frequency of 7 types of facial expressive behavior (items) depicted in recorded videos of PD patients, using a 5-point Likert-type scale: 1 = low (low to no movement or change, or infrequent), 2 = fairly low, 3 = medium, 4 = fairly high, and 5 = high (very frequent, very active). Table 3 lists 6 of the ICRP-IEB facial expressivity items and the corresponding AUs used in this study for estimating them. The last item relates to active mouth closure during speech; it was not used in our model, as the participants were not asked to speak during the experiments.

As our current AU recognition system considers only 11 AUs (AU1, AU2, AU4, AU6, AU7, AU9, AU12, AU20, AU23, AU25, and AU27), we did not consider blinking (AU45) in our facial expressivity formulation.


Figure 6: The AU recognition system. Each frame I(0), I(1), ..., I(T) of the original video is processed by shape model tracking, yielding landmark sets P(0), P(1), ..., P(T); geometry-based and appearance-based features are extracted for the lower, middle, and upper facial regions; AdaBoost performs feature selection; and binary SVM classification outputs decision values for AU1 (inner brow raiser), AU2 (outer brow raiser), AU4 (brow lowerer), AU6 (cheek raiser), AU7 (lid tightener), AU9 (nose wrinkler), AU12 (lip corner puller), AU20 (lip stretcher), AU23 (lip tightener), AU25 (lips part), and AU27 (mouth stretch).

Following the criteria of the ICRP-IEB, we defined the facial expressivity of a participant as follows:

\[ \text{EFE} = \text{AMC}_e + \text{AI}_e - (\text{AMC}_n + \text{AI}_n), \tag{4} \]

where $\text{AMC}_e$ and $\text{AMC}_n$ are the amounts of movement change during the disgust and neutral facial expressions, respectively, and $\text{AI}_e$ and $\text{AI}_n$ are the intensities of the displayed AUs during the disgust and neutral facial expressions, respectively.

It has to be recalled that, for all these estimations, only the considered 30 s windows are used. The quantities AMC and AI refer to frequency and intensity, respectively. Their detailed formulation is given in the following sections. In order to assess the effectiveness of the proposed formulation, we also compared it to the following definition of facial expressivity, where the baseline of neutral emotion is not considered:

\[ \text{FE} = \text{AMC}_e + \text{AI}_e. \tag{5} \]

(1) Intensity of Displayed AUs. It has been shown in [40] that the output margin of learned SVM classifiers contains information about expression intensity. Later, Savran et al. [41] estimated AU intensity levels using logistic regression on SVM scores. In this work, we propose using Platt's probability, given by (3), as the action unit intensity at frame $t$: $I_t(\text{AU}_i) = P^t_{\text{AU}_i}$, with $P^t_{\text{AU}_i} = P(c = \text{AU}_i \mid \mathbf{z}_t)$ estimated using (3). The $I_t(\text{AU}_i)$ time series is then smoothed using a Gaussian filter. We denote by $\bar{I}_t(\text{AU}_i)$ the smoothed value. Figure 7 plots, for one participant, the smoothed intensity of facial AU7, also illustrating its different temporal segments: neutral, onset, apex, and offset.

Having defined the AU intensity, the intensity of displayed AUs during a voluntary facial expression (disgust or neutral) is given by

\[ \text{AI} = \sum_{i \in \text{DAUs}} \frac{1}{N_i'} \sum_{t \in T_i} \bar{I}_t(\text{AU}_i), \tag{6} \]

where DAUs is the set of displayed (recognized) facial action units during the considered 30 s window, $T_i$ is the set of frames where $\text{AU}_i$ is active, and $N_i'$ is the cardinality of $T_i$, being the number of frames where $\text{AU}_i$ is active.


Figure 7: Example of AU intensity: the smoothed intensity of AU7 (lid tightener) plotted over frames.


(2) Amount of Movement Change. The amount of movement change during a voluntary facial expression (disgust or neutral) is given by

\[ \text{AMC} = \sum_{i \in \text{DAUs}} \frac{1}{N_i'} \sum_{t \in T_i} \left| \bar{I}_t(\text{AU}_i) - \bar{I}_{t+1}(\text{AU}_i) \right|, \tag{7} \]

with DAUs, $N_i'$, and $T_i$ as defined above.

3. Results and Discussion

3.1. Manipulation Check. Participants' emotions were successfully elicited using the disgust movie clips. For both disgust1 and disgust2, most participants (14/15) self-reported the target emotion as their dominant emotion. For the other video clips, the reported emotions were as follows: amusement1 (9/15), amusement2 (12/15), surprise1 (11/15), surprise2 (2/15), anger1 (6/15), anger2 (9/15), fear1 (10/15), fear2 (11/15), neutral1 (13/15), and neutral2 (13/15). Annotation of the video records using the Anvil annotation tool [42] further confirmed that the disgust movie clips induced the most reliable emotional data. Therefore, for further analysis, we decided to use only the data recorded while watching the disgust movie clips.

The Wilcoxon rank sum tests were applied to the 14 disgust self-reports. Results showed no significant difference in self-reported emotional ratings between disgust1 (M = 6.50, SD = .76) and disgust2 (M = 6.64, SD = .63), or between the PD group (M = 6.51, SD = .65) and the control group (M = 6.64, SD = .74). This is what we expected, since Vicente et al. [43] also reported that PD patients at different stages of the disease did not significantly differ from controls in their self-reported emotional experience of presented movie clips.

3.2. Univariate Analysis of Group Effects. The EMG data from 3 participants (1 control and 2 PD) and the ECG data from 10 participants (4 control and 6 PD) were discarded because they were not well recorded, due to sensor malfunctions. Results of the repeated-measures ANOVAs on the single physiological variables showed no significant main effect of group on the LLS activity (F(1,12) = 3.38, P = .09), the OO activity (F(1,12) = 1.92, P = .19), HR (F(1,3) = .93, P = .41), or HRV (F(1,3) = .53, P = .83). Table 4 shows the descriptive data for the control and PD groups during exposure to the three movie clips disgust1, disgust2, and neutral2, together with the significance of the between-group comparisons using Wilcoxon rank sum tests. The Wilcoxon rank sum tests revealed that:

(1) comparable levels of baseline activity over both the LLS and the OO were found in the control and PD groups;

(2) although disgust1 elicited more muscle activity over the LLS and the OO in the control group than in the PD group, this difference did not reach statistical significance;

(3) disgust2 elicited significantly more LLS activity in the control group than in the PD group.

These results indicate that PD patients displayed less muscle activity over the LLS when expressing disgust than controls. In addition, disgust2 induced more muscle activity over the LLS and the OO than disgust1, which is consistent with the self-reports and may be due to the fact that disgust2 is slightly more disgusting than disgust1.

3.3. Univariate Analysis of Stimuli Effects. For the LLS, we found an effect of the stimuli on muscle activity (F(2,24) = 9.47, P = .001). Post hoc tests indicated significant differences between disgust1 and neutral2 (P = .01) and between disgust2 and neutral2 (P = .01). Both disgust1 (M = 282, SD = 192) and disgust2 (M = 537, SD = 543) elicited more muscle activity than neutral2 (M = 110, SD = 38). This main effect was qualified by a stimuli × group interaction (F(2,24) = 4.17, P = .028), which was consistent with the results of the Wilcoxon rank sum tests comparing the physiological responses between groups.

For the OO, we also found an effect of the stimuli on muscle activity (F(2,24) = 5.45, P = .012). Post hoc tests indicated a significant difference only between disgust1 and neutral2 (P = .002): disgust1 (M = 232, SD = 157) elicited more muscle activity than neutral2 (M = 135, SD = 111). No significant stimuli × group interaction effect (F(2,24) = 1.77, P = .192) was found.

We expected the disgust clips to elicit significantly more muscle activity over the LLS than the neutral clip, because the LLS is normally involved in producing the disgust facial expression [44]. The fact that the OO was also significantly different is probably due to the following:

(1) crosstalk [45]: the LLS and the OO lie in the vicinity of each other;

(2) disgust1 elicited not only disgust but also a bit of amusement, through the funny motion of the character and the background music, so that the OO was also involved in producing the facial expression of amusement.


Table 4: Statistical summary of physiological measures.

Variable | Stimuli | Control mean (SD) | PD mean (SD) | Sig.
LLS | Neutral2 | 104 (34) | 116 (43) | .71
LLS | Disgust1 | 336 (204) | 227 (177) | .26
LLS | Disgust2 | 804 (649) | 271 (225) | .04*
OO | Neutral2 | 135 (111) | 94 (54) | .81
OO | Disgust1 | 282 (177) | 181 (128) | .32
OO | Disgust2 | 520 (571) | 207 (148) | .13
HR | Neutral2 | 172 (277) | 276 (—) | —
HR | Disgust1 | −53 (264) | 512 (—) | —
HR | Disgust2 | 222 (553) | 565 (—) | —
HRV | Neutral2 | 153 (1296) | −765 (—) | —
HRV | Disgust1 | 801 (1145) | 2634 (—) | —
HRV | Disgust2 | 1486 (3564) | 1492 (—) | —

*The two groups are significantly different, that is, P < .05.
"—": We did not compare the cardiac responses (i.e., HR and HRV) between groups because, due to technical failures, we lost the ECG data of 10 participants; thus only 5 participants (4 controls and 1 PD patient) completed the ECG recording.

Table 5: Facial expressivity assessment based on different methods.

Var | C: D1 / D2 / N2 | LP: D1 / D2 / N2 | IP: D1 / D2 / N2 | MP: D1 / D2 / N2
TFA | 8 / 8 / 4 | 5 / 5 / 4 | 5 / 6 / 4 | 4 / — / 5
AMC◊ | 139 / 277 / 108 | 67 / 65 / 43 | 82 / 62 / 69 | 39 / — / 82
AI | 54 / 51 / 18 | 28 / 28 / 22 | 38 / 39 / 31 | 22 / — / 33
FE◊ | 722 / 751 / 246 | 209 / 187 / 142 | 444 / 356 / 262 | 135 / — / 322

*D1, D2, and N2 denote disgust1, disgust2, and neutral2; C, LP, IP, and MP denote the control and the least, intermediate, and most severe forms of Parkinson's patients, respectively.
◊ Presented in percentages.
"—": The face of the MP while watching D2 was blocked by his hand.

Note that disgust2 (M = 364, SD = 432) also elicited more muscle activity over the OO (because of crosstalk) than neutral2 (M = 114, SD = 87); however, unlike for disgust1, the difference did not reach statistical significance, which can be interpreted by the fact that disgust2 did not elicit amusement at all.

Moreover, no main effect of stimuli on the cardiac parameters was found, for either HR (F(2,6) = .37, P = .70) or HRV (F(2,6) = .84, P = .48). This is not consistent with what we expected, namely, unchanged [46, 47] or increased [48, 49] HR and increased HRV [21, 47]. This may be due to the fact that we did not have enough recorded ECG data for a statistical analysis.

3.4. Qualitative Analysis of Facial Expressivity. To qualitatively analyze facial expressivity, we used the total facial activity (TFA) measure of [2, 50], being the total number of displayed AUs in response to the stimuli. Compared to the control (C), the PD groups (LP, IP, and MP) showed attenuation of their facial activities (variable TFA) while watching the disgusting movie clips (see Table 5). However, comparable TFA values were found while watching the neutral movie clips.

Visual inspection of the displayed AUs, as illustrated in Figure 8, shows that AU1, AU2, AU6, AU9, and AU45 occurred more frequently for C, except that IP produced AU6 more frequently than C. A possible explanation is that the IP cannot deactivate AU6, even during the neutral state. As shown in Figure 9, during the neutral state the PD patients produced more continuously active action units, such as AU4 and AU25 (for all PD), AU6 (for IP), and AU9 (for MP). Moreover, certain AU combinations were considered likely to signify "disgust," on the basis of Ekman and Friesen's description of emotions [7]. "Disgust" AUs were considered as the combination of brow lowerer (AU4), cheek raiser (AU6), and nose wrinkler (AU9). Because the cheek raiser is very difficult to produce on demand without including other AUs, especially the lid tightener (AU7) [51], the lid tightener was also taken into consideration; that is, the expected AU pattern of "disgust" was the combination of AU4, AU6, AU7, and AU9. Consistent with what we expected, C displayed 98 and 87 "disgust" frames while watching disgust1 and disgust2, respectively. The PD group (LP, IP, and MP) did not display any "disgust" frames, except for IP, who had 4 "disgust" frames while watching disgust1. Furthermore, the IP displayed quite few frames with combinations of cheek raiser and lid tightener during the neutral state.


Figure 8: The facial activities (active AUs per frame) while watching disgust1, for C, LP, IP, and MP.

Figure 9: The facial activities (active AUs per frame) while watching neutral2, for C, LP, IP, and MP.


Figure 10: The quantified facial expressivity (FE) of C, LP, IP, and MP for disgust1 and disgust2.

Instead, the cheek raiser alone was mostly displayed (see Figure 9), which indicates that IP patients have problems controlling combinations of facial muscles.

Analyzing the defined quantities AMC, AI, and FE using the segments with active AUs, as can be seen from Table 5, the control C had higher facial expressivity than LP, IP, and MP during the disgust state, while MP showed the lowest facial expressivity. More specifically, compared to C, the PD patients (LP, IP, and MP) had smaller values of the variables AMC, AI, and FE while watching the disgusting movie clips, especially for the brow lowerer (AU4), nose wrinkler (AU9), and blink (AU45). C tended to show similar intensities of the brow lowerer (variable AI) with bigger variance (variables AMC and FE). In addition, C showed higher intensities of the nose wrinkler, with larger variance.

3.5. Quantitative Analysis of Facial Expressivity. Figure 10 depicts the values of the facial expressivity measure of equation (5) for the participants. We did not assess the facial expressivity of the MP patient during disgust2, as he had his hand in front of his face during the experiments. As can be seen, a significant difference between C and the PD patients is present: C got the highest score, while the lowest score was obtained by the MP patient. However, the score of the IP patient was slightly higher than that of the LP one, which is due to the fact that the LP and IP expressed "facial masking" in different ways: the LP patient showed attenuation of the intensities of facial movements; on the contrary, the IP patient produced high intensities of facial movements not only in his emotional state but also during the neutral state, that is, he cannot relax the muscles well. In order to take both kinds of "facial masking" into account, we computed the facial expressivity using (4). The results are shown in Figure 11. As can be seen, the proposed facial expressivity measure allows distinguishing between control and PD patients. Moreover, the facial expressivity decreases with increasing PD severity.

Figure 11: The quantified facial expressivity based on the improved function (4), for C, LP, IP, and MP during disgust1 and disgust2.

4. Conclusions

This study investigated the phenomenon of facial masking in Parkinson's patients. We designed an automated and objective method to assess the facial expressivity of PD patients. The proposed approach follows the methodology of the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9]. Based on the Facial Action Coding System (FACS), we proposed a methodology that (i) automatically detects faces in a video stream, (ii) codes each frame with respect to 11 action units, and (iii) estimates facial expressivity as a function of frequency, that is, how often AUs occur, duration, that is, how long AUs last, and intensity, being the strength of the AUs.

Although the proposed facial expressivity assessment approach has been evaluated on a limited number of subjects, it captures differences in facial expressivity between control participants and Parkinson's patients. Moreover, facial expressivity differences between PD patients with different progression of Parkinson's disease have been assessed. The proposed method can capture these differences and give a more accurate assessment of facial expressivity for Parkinson's patients than traditional observer-based ratings allow. This confirms that our approach can be used for clinical assessment of facial expressivity in PD. Indeed, nonverbal signals contribute significantly to interpersonal communication. Facial expressivity, a major source of nonverbal information, is compromised in Parkinson's disease. The resulting disconnect between subjective feeling and objective facial affect can lead people to form negative and inaccurate impressions of people with PD with respect to their personality, feelings, and intelligence. Assessing the facial expressivity limitations of PD in an objective way would allow developing personalized therapy to benefit facial expressivity in PD. Indeed, Parkinson's is a progressive disease, which means that the symptoms will get worse as time goes on. Using the proposed assessment approach would allow regular facial expressivity assessment by therapists and clinicians to explore treatment options.

Future work will (i) improve the AU recognition system and extend it to more AUs and (ii) consider clinical usage of the proposed approach in a statistically significant population of PD patients.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no. 2011629010) and the VUB interdisciplinary research project "Objectifying Assessment of Human Emotional Processing and Expression for Clinical and Psychotherapy Applications (EMO-App)". The authors gratefully acknowledge the Flemish Parkinson League for supporting the current study and the recruitment of the PD patients. They also acknowledge Floriane Verbraeck for the setup of the experiments and the data acquisition.

References

[1] M. Katsikitis and I. Pilowsky, "A study of facial expression in Parkinson's disease using a novel microcomputer-based method," Journal of Neurology, Neurosurgery and Psychiatry, vol. 51, no. 3, pp. 362–366, 1988.

[2] G. Simons, H. Ellgring, and M. C. Smith Pasqualini, "Disturbance of spontaneous and posed facial expressions in Parkinson's disease," Cognition and Emotion, vol. 17, no. 5, pp. 759–778, 2003.

[3] D. H. Jacobs, J. Shuren, D. Bowers, and K. M. Heilman, "Emotional facial imagery, perception, and expression in Parkinson's disease," Neurology, vol. 45, no. 9, pp. 1696–1702, 1995.

[4] L. Tickle-Degnen and K. D. Lyons, "Practitioners' impressions of patients with Parkinson's disease: the social ecology of the expressive mask," Social Science & Medicine, vol. 58, no. 3, pp. 603–614, 2004.

[5] J. J. van Hilten, A. D. van der Zwan, A. H. Zwinderman, and R. A. C. Roos, "Rating impairment and disability in Parkinson's disease: evaluation of the Unified Parkinson's Disease Rating Scale," Movement Disorders, vol. 9, no. 1, pp. 84–88, 1994.

[6] D. Bowers, K. Miller, W. Bosch et al., "Faces of emotion in Parkinson's disease: micro-expressivity and bradykinesia during voluntary facial expressions," Journal of the International Neuropsychological Society, vol. 12, no. 6, pp. 765–773, 2006.

[7] P. Ekman and W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, Palo Alto, Calif, USA, 1978.

[8] P. Wu, D. Jiang, and H. Sahli, "Physiological signal processing for emotional feature extraction," in Proceedings of the International Conference on Physiological Computing Systems, pp. 40–47, 2014.

[9] L. Tickle-Degnen, The Interpersonal Communication Rating Protocol: A Manual for Measuring Individual Expressive Behavior, 2010.

[10] G. Sandbach, S. Zafeiriou, M. Pantic, and L. Yin, "Static and dynamic 3D facial expression recognition: a comprehensive survey," Image and Vision Computing, vol. 30, no. 10, pp. 683–697, 2012.

[11] I. Gonzalez, H. Sahli, V. Enescu, and W. Verhelst, "Context-independent facial action unit recognition using shape and Gabor phase information," in Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction (ACII '11), vol. 1, pp. 548–557, Springer, Memphis, Tenn, USA, October 2011.

[12] J. C. Platt, "Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods," in Advances in Large Margin Classifiers, A. J. Smola, P. Bartlett, B. Schoelkopf, and D. Schuurmans, Eds., pp. 61–74, The MIT Press, 1999.

[13] J. J. Gross and R. W. Levenson, "Emotion elicitation using films," Cognition and Emotion, vol. 9, no. 1, pp. 87–108, 1995.

[14] D. Hagemann, E. Naumann, S. Maier, G. Becker, A. Lurken, and D. Bartussek, "The assessment of affective reactivity using films: validity, reliability and sex differences," Personality and Individual Differences, vol. 26, no. 4, pp. 627–639, 1999.

[15] C. L. Lisetti and F. Nasoz, "Using noninvasive wearable computers to recognize human emotions from physiological signals," EURASIP Journal on Advances in Signal Processing, vol. 2004, Article ID 929414, pp. 1672–1687, 2004.

[16] J. Hewig, D. Hagemann, J. Seifert, M. Gollwitzer, E. Naumann, and D. Bartussek, "A revised film set for the induction of basic emotions," Cognition and Emotion, vol. 19, no. 7, pp. 1095–1109, 2005.

[17] J. H. Westerink, E. L. van den Broek, M. H. Schut, J. van Herk, and K. Tuinenbreijer, "Probing experience," in Assessment of User Emotions and Behaviour to Development of Products, T. J. Overbeek, W. F. Pasveer, J. H. D. M. Westerink, M. Ouwerkerk, and B. de Ruyter, Eds., Springer, New York, NY, USA, 2008.

[18] A. Schaefer, F. Nils, P. Philippot, and X. Sanchez, "Assessing the effectiveness of a large database of emotion-eliciting films: a new tool for emotion researchers," Cognition and Emotion, vol. 24, no. 7, pp. 1153–1172, 2010.

[19] F. Verbraeck, Objectifying human facial expressions for clinical applications, M.S. thesis, Vrije Universiteit Brussel, Belgium, 2012.

[20] O. Alzoubi, Automatic affect detection from physiological signals: practical issues, Ph.D. thesis, The University of Sydney, New South Wales, Australia, 2012.

[21] S. D. Kreibig, "Autonomic nervous system activity in emotion: a review," Biological Psychology, vol. 84, no. 3, pp. 394–421, 2010.

[22] F. D. Farfan, J. C. Politti, and C. J. Felice, "Evaluation of EMG processing techniques using information theory," BioMedical Engineering OnLine, vol. 9, article 72, 2010.

[23] N. Ambady, F. J. Bernieri, and J. A. Richeson, "Toward a histology of social behavior: judgmental accuracy from thin slices of the behavioral stream," Advances in Experimental Social Psychology, vol. 32, pp. 201–271, 2000.

[24] N. Ambady and R. Rosenthal, "Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis," Psychological Bulletin, vol. 111, no. 2, pp. 256–274, 1992.

[25] K. D. Lyons and L. Tickle-Degnen, "Reliability and validity of a videotape method to describe expressive behavior in persons with Parkinson's disease," The American Journal of Occupational Therapy, vol. 59, no. 1, pp. 41–49, 2005.

[26] N. A. Murphy, "Using thin slices for behavioral coding," Journal of Nonverbal Behavior, vol. 29, no. 4, pp. 235–246, 2005.

[27] A. O. Andrade, S. Nasuto, P. Kyberd, C. M. Sweeney-Reed, and F. R. van Kanijn, "EMG signal filtering based on empirical mode decomposition," Biomedical Signal Processing and Control, vol. 1, no. 1, pp. 44–55, 2006.

[28] M. Blanco-Velasco, B. Weng, and K. E. Barner, "ECG signal denoising and baseline wander correction based on the empirical mode decomposition," Computers in Biology and Medicine, vol. 38, no. 1, pp. 1–13, 2008.

[29] A. O. Boudraa, J. C. Cexus, and Z. Saidi, "EMD-based signal noise reduction," Signal Processing, vol. 1, pp. 33–37, 2005.

[30] T. Jing-tian, Z. Qing, T. Yan, L. Bin, and Z. Xiao-kai, "Hilbert-Huang transform for ECG de-noising," in Proceedings of the 1st International Conference on Bioinformatics and Biomedical Engineering, pp. 664–667, July 2007.

[31] A. Karagiannis and P. Constantinou, "Noise components identification in biomedical signals based on empirical mode decomposition," in Proceedings of the 9th International Conference on Information Technology and Applications in Biomedicine (ITAB '09), pp. 1–4, November 2009.

[32] Y. Kopsinis and S. McLaughlin, "Development of EMD-based denoising methods inspired by wavelet thresholding," IEEE Transactions on Signal Processing, vol. 57, no. 4, pp. 1351–1362, 2009.

[33] F. Agrafioti, D. Hatzinakos, and A. K. Anderson, "ECG pattern analysis for emotion detection," IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 102–115, 2012.

[34] D. L. Donoho, "De-noising by soft-thresholding," IEEE Transactions on Information Theory, vol. 41, no. 3, pp. 613–627, 1995.

[35] B.-U. Kohler, C. Hennig, and R. Orglmeister, "The principles of software QRS detection," IEEE Engineering in Medicine and Biology Magazine, vol. 21, no. 1, pp. 42–57, 2002.

[36] V. Guralnik and J. Srivastava, "Event detection from time series data," in Knowledge Discovery and Data Mining, pp. 33–42, 1999.

[37] M. Hamedi, S.-H. Salleh, and T. T. Swee, "Surface electromyography-based facial expression recognition in bi-polar configuration," Journal of Computer Science, vol. 7, no. 9, pp. 1407–1415, 2011.

[38] Y. Hou, H. Sahli, R. Ilse, Y. Zhang, and R. Zhao, "Robust shape-based head tracking," in Advanced Concepts for Intelligent Vision Systems, vol. 4678 of Lecture Notes in Computer Science, pp. 340–351, Springer, 2007.

[39] T. Kanade, J. F. Cohn, and Y. Tian, "Comprehensive database for facial expression analysis," in Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, pp. 46–53, Grenoble, France, 2000.

[40] M. S. Bartlett, G. C. Littlewort, M. G. Frank, C. Lainscsek, I. R. Fasel, and J. R. Movellan, "Automatic recognition of facial actions in spontaneous expressions," Journal of Multimedia, vol. 1, no. 6, pp. 22–35, 2006.

[41] A. Savran, B. Sankur, and M. Taha Bilge, "Regression-based intensity estimation of facial action units," Image and Vision Computing, vol. 30, no. 10, pp. 774–784, 2012.

[42] Anvil: the video annotation tool, http://www.anvil-software.org/.

[43] S. Vicente, J. Peron, I. Biseul et al., "Subjective emotional experience at different stages of Parkinson's disease," Journal of the Neurological Sciences, vol. 310, no. 1-2, pp. 241–247, 2011.

[44] A. van Boxtel, "Facial EMG as a tool for inferring affective states," in Proceedings of Measuring Behavior 2010, A. J. Spink, F. Grieco, O. Krips, L. Loijens, L. Noldus, and P. Zimmerman, Eds., pp. 104–108, Noldus Information Technology, Wageningen, The Netherlands, 2010.

[45] C.-N. Huang, C.-H. Chen, and H.-Y. Chung, "The review of applications and measurements in facial electromyography," Journal of Medical and Biological Engineering, vol. 25, no. 1, pp. 15–20, 2005.

[46] R. W. Levenson, P. Ekman, K. Heider, and W. V. Friesen, "Emotion and autonomic nervous system activity in the Minangkabau of West Sumatra," Journal of Personality and Social Psychology, vol. 62, no. 6, pp. 972–988, 1992.

[47] S. Rohrmann and H. Hopp, "Cardiovascular indicators of disgust," International Journal of Psychophysiology, vol. 68, no. 3, pp. 201–208, 2008.

[48] O. Alaoui-Ismaili, O. Robin, H. Rada, A. Dittmar, and E. Vernet-Maury, "Basic emotions evoked by odorants: comparison between autonomic responses and self-evaluation," Physiology and Behavior, vol. 62, no. 4, pp. 713–720, 1997.

[49] J. Gruber, S. L. Johnson, C. Oveis, and D. Keltner, "Risk for mania and positive emotional responding: too much of a good thing?" Emotion, vol. 8, no. 1, pp. 23–33, 2008.

[50] W. Gaebel and W. Wolwer, "Facial expressivity in the course of schizophrenia and depression," European Archives of Psychiatry and Clinical Neuroscience, vol. 254, no. 5, pp. 335–342, 2004.

[51] P. Ekman, W. Friesen, and J. Hager, Facial Action Coding System: The Manual, 2002.


2 Computational and Mathematical Methods in Medicine

estimating an entropy score plotted over time. The entropy is a measure of pixel intensity change that occurred over the face as it moved during expression. They also computed the time it took an expression to reach its peak entropy value from the onset of each trial. Using both measures, entropy and time, the authors demonstrated that fewer movements occurred over the face of PD patients when they were asked to mimic a target expression, relative to controls.

Despite its good results, the above-described amount of facial movements does not directly relate to measuring facial muscle activity. Indeed, facial expressions are generated by contractions of facial muscles, which lead to subtle changes in the area of the eyelids, eyebrows, nose, lips, and skin texture, often revealed by wrinkles and bulges. To measure these subtle changes, Ekman and Friesen [7] developed the Facial Action Coding System (FACS). FACS is a human-observer-based system designed to detect subtle changes in facial features; it describes facial expressions by action units (AUs). AUs are anatomically related to the contraction of specific facial muscles. They can occur either singly or in combinations. Simons et al. [2] used the FACS to analyze facial expressivity of PD patients versus controls. In their study, odor was used as a means of inducing facial expressions. Certified FACS coders annotated the facial expressions. A total facial activity (TFA) measure, estimated as the total number of displayed AUs, was used to assess facial expressions. The authors demonstrated that, according to the TFA measure, PD patients have a reduced level of facial activity in reaction to unpleasant odors compared to controls.

Estimating only the number of displayed AUs does not completely capture the dimensions of facial masking that are present in PD and defined in the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9]. The ICRP-IEB facial expressivity is based on the FACS. It defines expressivity in terms of (i) frequency, that is, how often a behavior or movement occurs, (ii) duration, that is, how long a behavior or movement lasts, and (iii) intensity or degree, being the strength, force, or level/amount of emotion or movement.

In this study, we propose a system that (i) automatically detects faces in a video stream, (ii) codes each frame with respect to 11 action units, and (iii) estimates facial expressivity as a function of frequency, duration, and intensity of AUs. Although there is already a substantial literature on automatic expression and action unit recognition, it remains an active area of study due to the challenging nature of the problem [10]. Moreover, in contrast to AU detection, there is scarce work in the literature on AU intensity estimation. The proposed facial expressivity quantity makes use of the output of a previously developed automatic AU recognition system [11] based on support vector machines (SVM) and AdaBoost. To determine the AU intensity, the resulting AU distance measure from the AU SVM classifier is mapped to an estimate of probability using the Platt scaling algorithm. Platt scaling [12] refers to a technique whereby a score-to-probability calibration curve is calculated using the training set. Frame-by-frame intensity measurements are then used

Table 1: The selected movie clips listed with their sources.

Emotion     Excerpt 1                    Excerpt 2
Amusement   Benny and Joon               The Godfather
Sadness     An Officer and a Gentleman   Up
Surprise    Capricorn One                Sea of Love
Anger       Witness                      Gandhi
Disgust     Pink Flamingos               Trainspotting
Fear        Silence of the Lambs         The Shining
Neutral     Colour bar patterns          Hannah and Her Sisters

Figure 1: Emotion elicitation protocol (SR indicates self-report): training clips followed by SR, then clip 1 + SR, clip 2 + SR, ..., clip 14 + SR.

to estimate facial expression dynamics, which were previously intractable by human coding.

2. Methods

2.1. Pilot Study. Our study aims to quantify the facial expressivity dynamics of PD patients. Thus, gathering usable qualitative emotional data is the essential step prior to the analysis of facial behaviors. To voluntarily produce spontaneous facial expressions that resemble those typically triggered by emotions, in our study six emotions (amusement, sadness, anger, disgust, surprise, and fear) were elicited using movie clips. During the movie clips, physiological signals and frontal face video of the participants were recorded. Fifteen participants, 7 PD patients and 8 healthy control persons, took part in the pilot study. After each movie, the participants were asked to rate the intensity of the dominant emotions they experienced while watching the movie clips. The evaluation of the participants' self-reports showed that the disgust-induced emotion was significantly higher than the other emotions. Thus, we focused on the analysis of the data recorded while watching the disgust movie clips.

2.2. Emotion Induction Protocol. Based on the studies of Gross and Levenson [13], Hagemann et al. [14], Lisetti and Nasoz [15], Hewig et al. [16], Westerink et al. [17], and Schaefer et al. [18], we composed a set of movie clips [19]. For each emotion (amusement, sadness, anger, disgust, surprise, fear, and neutral), two excerpts were used, as listed in Table 1. In the sequel, we will refer to the movie clips as excerpt_i, i = 1, 2. For example, the two surprise movie clips will be denoted as surprise1 and surprise2.

The used protocol is depicted in Figure 1. The participants were told to watch 2 training movie clips and the 14 movie clips of Table 1. The movie clips were shown in random order, with 2 successive clips always targeting different emotions. After each video clip, the participants filled in a questionnaire (i.e., self-report) with their emotion-feeling (amusement, sadness, anger, disgust, surprise, fear, and neutral) and rated the strength of their responses using a 7-point scale.


Figure 2: Experimental setup.

The data recording took place in a sound-isolated lab under standardized lighting conditions. The movie clips were watched by the participants while sitting on a sofa facing a screen with dimensions of 1.25 by 1.25 m (see Figure 2).

2.3. Participants. This pilot study considered seven PD patients (3 men and 4 women, between the ages of 47 and 76 years; durations of PD ranged from 1.5 to 13 years) and eight control participants (5 men and 3 women, between the ages of 27 and 57 years). The control participants were recruited from the VUB ETRO department without specific characteristics. The PD patients were selected with the help of the Flemish Parkinson League. The study was approved by the committees on human research of the respective institutions and was completed in accordance with the Helsinki Declaration.

Based on the medical dossier as well as the feedback from the PD patients, we defined three PD categories. One patient had a very light form of PD and therefore was classified as the least severe case (denoted as LP). Another patient had the most severe form of PD of the whole group (MP). The rest of the PD patients were situated between those two extremes; we denoted them by intermediate PD (IP).

2.4. Data Acquisition. During the movie clips, physiological signals and frontal face video of the participants were recorded. As physiological signals, we recorded electromyography (EMG) and electrocardiogram (ECG). All the data channels (i.e., EMG, ECG, and videotape) were synchronized.

The ECG measures the activity of heart contractions. The physical action of the heart is induced by a local periodic electrical stimulation, and as a result a change in potential of 10–20 μV is measured during a cardiac cycle between two surface electrodes [20]. Heart rate (HR) and heart rate variability (HRV) are the cardiovascular response features most often reported as indicators of emotions [21]. In this study, the ECG was measured at 512 Hz using a Shimmer device.

The EMG measures the frequency of muscle tension and contraction. Within each period of interest, the root mean square (RMS) and absolute mean value (AMV) are commonly used as features [22]. In this study, two facial muscles were measured at 2000 Hz using the EMG Biomonitor

Figure 3: Locations of the EMG electrodes (ground, orbicularis oculi, and levator labii superioris).

ME6000, namely, the levator labii superioris (LLS) and the orbicularis oculi (OO), as illustrated in Figure 3.

The frontal face video of the participants was recorded for the purpose of quantifying facial expressivity. Following the ICRP-IEB [9], we selected 11 action units (AU1, AU2, AU4, AU6, AU7, AU9, AU12, AU20, AU23, AU25, and AU27) among the 41 ones defined by FACS. Table 2 lists the used AUs along with their associated facial muscles, as well as the ones defined by the measured EMG.

Our purpose being the development of a nonobtrusive approach for facial expressivity assessment by automatically recognizing facial action units (AUs) and estimating their intensity, the ECG and EMG measurements have been included in our experimental protocol to quantitatively assess the emotional manipulation for inducing facial expressions and to confirm facial muscle activity when expressing disgust. As our objective was not to physiologically investigate the effect of Parkinson's on facial EMG, only the LLS and OO were measured by the EMG, as complementary information to the considered AUs (see Table 2).

2.5. Data Preparation. It is an accepted practice to code a proportion of the observations, or "thin slices," of recorded data as a representation of the analyzed behavior [23]. In our experiments, 30 s data extracts (ECG, EMG, and video record) were selected, corresponding to the last 30 s segments of the shown video clips. Intuitively, one may think that longer segments of expressive behavior in persons with PD would give more information and thus increase the accuracy of analysis. However, Ambady and Rosenthal [24] found that judgment performance did not significantly improve when using 5-minute slices versus 30 s slices. This was also confirmed in [25, 26]. Note that baseline data were also selected from the stimulus neutral1.

2.6. Physiological Data Processing. Physiological signals need to be preprocessed prior to feature extraction in order to remove noise and enhance the signal-to-noise ratio (SNR) [27, 28]. In this work, we make use of a newly developed signal denoising approach based on the empirical mode decomposition (EMD) [8]. Different from state-of-the-art


Table 2: FACS AUs and related muscles.

AU     FACS name           Facial muscle                                       Videotape   EMG
AU1    Inner brow raiser   Frontalis (pars medialis)                           X           —
AU2    Outer brow raiser   Frontalis (pars lateralis)                          X           —
AU4    Brow lowerer        Depressor glabellae, depressor supercilii, and CS   X           —
AU6    Cheek raiser        OO (pars orbitalis)                                 X           X
AU7    Lid tightener       OO (pars palpebralis)                               X           X
AU9    Nose wrinkler       LLSAN                                               X           —
AU10   Upper lip raiser    LLS, caput infraorbitalis                           —           X
AU12   Lip corner puller   Zygomaticus major                                   X           —
AU20   Lip stretcher       Risorius                                            X           —
AU23   Lip tightener       Orbicularis oris                                    X           —
AU25   Lips part           Depressor labii inferioris                          X           —
AU27   Mouth stretch       Pterygoids, digastric                               X           —
AU45   Blink               Contraction of OO                                   —           X
AU46   Wink                OO                                                  —           X

Figure 4: Example of physiological signal (ECG) denoising using the method proposed in [8] (top: the raw ECG signal; bottom: the denoised ECG signal).

methods [27–33], our approach estimates the noise level of each intrinsic mode function (IMF), rather than estimating the noise level of all IMFs using Donoho's strategy [34], prior to the reconstruction of the signal using the thresholded IMFs. Please refer to [8] for more details. Figure 4 illustrates the denoising results.
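For concreteness, the following is a minimal sketch of per-IMF threshold denoising, assuming the PyEMD package (pip name EMD-signal) for the decomposition; the MAD noise estimate and the universal threshold used here are illustrative stand-ins, not the exact per-IMF estimator of [8].

```python
import numpy as np
from PyEMD import EMD  # pip install EMD-signal

def emd_denoise(x):
    """Per-IMF soft-threshold denoising (illustrative sketch)."""
    imfs = EMD().emd(np.asarray(x, dtype=float))
    clean = imfs[-1].copy()  # keep the residue (slow trend) untouched
    for imf in imfs[:-1]:
        sigma = np.median(np.abs(imf)) / 0.6745      # robust noise scale (assumption)
        thr = sigma * np.sqrt(2.0 * np.log(len(x)))  # universal threshold of [34]
        clean += np.sign(imf) * np.maximum(np.abs(imf) - thr, 0.0)
    return clean
```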

2.6.1. Electrocardiogram (ECG). Heart rate (HR) and heart rate variability (HRV) are the cardiovascular response features most often reported as indicators of emotion [21]. HR is computed using the time difference between two consecutive detected R peaks of the QRS complexes (i.e., the RR interval) and is expressed in beats per minute. HRV is the variation of beat-to-beat HR. Thus, the first step in extracting HR and HRV is the exact detection of R peaks in the QRS complex; the detection of the QRS complex, in particular R peak detection, is the basis for ECG processing and analysis. Many approaches for R peak detection have been proposed [35]. However, most of them are off-line and target a noiseless signal, which does not meet the requirements of many real-time applications. To

Figure 5: Change points detection (noisy ECG amplitude versus samples).

overcome this problem, in [8] we proposed an approach based on a change point detection (CPD) algorithm for event detection in time series [36] that minimizes the error in fitting a predefined function using maximum likelihood. In our current implementation, polynomial fitting functions of degree 1 have been selected empirically. An example of the results is illustrated in Figure 5, where all R peaks are detected (crosses) along with some irrelevant change points (circles), which can be filtered out using a predetermined threshold. Once the R peaks were detected, the HR and HRV features are estimated as the difference between the HR (HRV) estimated from the considered 30 s window (of the stimuli) and the one estimated from the neutral 30 s window.
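To make this last step concrete, a small sketch follows; it turns detected R-peak sample indices into HR and HRV values and forms the baseline-relative features. Defining HRV as the standard deviation of the beat-to-beat HR series, and the example peak arrays themselves, are our own assumptions.

```python
import numpy as np

def hr_hrv(r_peaks, fs=512.0):
    """Mean HR (bpm) and HRV from R-peak sample indices.

    HRV is taken as the standard deviation of the beat-to-beat HR
    series; the paper does not spell out its exact HRV statistic."""
    rr = np.diff(np.asarray(r_peaks)) / fs  # RR intervals (s)
    hr = 60.0 / rr                          # beat-to-beat HR (bpm)
    return hr.mean(), hr.std()

# Hypothetical peak indices from the CPD step, sampled at 512 Hz
r_disgust = np.array([100, 520, 950, 1390, 1840])
r_neutral = np.array([120, 600, 1080, 1560, 2040])
hr_e, hrv_e = hr_hrv(r_disgust)
hr_n, hrv_n = hr_hrv(r_neutral)
hr_feat, hrv_feat = hr_e - hr_n, hrv_e - hrv_n  # features as defined above
```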

2.6.2. Electromyogram (EMG). The absolute mean value (AMV) is the most commonly used feature for identifying the strength of muscular contraction [22, 37], defined over the length N of the signal x(t) as follows:

\[ \mathrm{AMV} = \frac{1}{N} \sum_{t=1}^{N} \lvert x(t) \rvert. \tag{1} \]

The AMV value during the considered 30 s window was expressed as a percentage of the mean amplitude during the 30 s neutral baseline. This percentage score was computed to standardize the widely different EMG amplitudes of individuals and thus to enable comparison between individuals and groups.
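A direct transcription of (1) and of the baseline-percentage score could look as follows (a sketch; the function names are ours):

```python
import numpy as np

def amv(x):
    """Absolute mean value of an EMG segment, as in (1)."""
    return np.abs(np.asarray(x, dtype=float)).mean()

def amv_percent(emg_stimulus, emg_baseline):
    """AMV of the 30 s stimulus window as a percentage of the 30 s
    neutral baseline, enabling comparison across individuals."""
    return 100.0 * amv(emg_stimulus) / amv(emg_baseline)
```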

2.7. Design and Statistical Analysis. Preliminary analyses were performed to check the validity of the acquired data.

(1) Manipulation Check. We calculated the descriptive statistics (the mean and standard deviation) of the self-reported emotional ratings to check whether the participants' emotional experience was successfully manipulated. Then, Wilcoxon rank sum tests were performed to compare the self-reports between groups (PD and control) and between the two clips of the same emotion (see the sketch after this list).

(2) Analysis of Physiological Parameters. Physiological variables were tested for univariate significant differences via repeated-measures analysis of variance (ANOVA) between groups (PD versus control) and between the disgust1, disgust2, and neutral2 stimuli. Group and stimuli are the between-subjects and within-subject factors, respectively. Results were considered statistically significant at P < .05.
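For the manipulation check of item (1), the rank sum test can be run with SciPy as sketched below; the rating arrays are hypothetical stand-ins for the recorded 7-point self-reports.

```python
import numpy as np
from scipy.stats import ranksums

# Hypothetical self-reported disgust intensities (7-point scale)
pd_ratings = np.array([7, 6, 7, 6, 5, 7, 7])
control_ratings = np.array([7, 7, 6, 7, 6, 7, 7, 6])

stat, p = ranksums(pd_ratings, control_ratings)
print(f"Wilcoxon rank sum: statistic = {stat:.2f}, P = {p:.2f}")
```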

2.8. Facial Action Units Recognition. In this study, we make use of an automatic facial action units recognition system developed at our department [11]. This system allows context-independent recognition of the following action units: AU1, AU2, AU4, AU6, AU7, AU9, AU12, AU20, AU23, AU25, and AU27. The overall recognition scheme is depicted in Figure 6.

The head and facial features were tracked using the constrained shape tracking approach of [38]. This approach allows automatic detection of the head and the tracking of a shape model composed of 83 landmarks. After the facial components have been tracked in each frame, both geometry-based features and appearance-based features are combined and fed to the AdaBoost (adaptive boosting) algorithm for feature selection. Finally, for each action unit, we used a binary support vector machine (SVM) for context-independent classification. Our system was trained and tested on the Kanade et al. DFAT-504 dataset [39]. The database consists of 486 sequences of facial displays produced by 98 university students from 18 to 30 years old, of which 65% are female. All sequences are annotated by certified FACS coders, start with a neutral face, and end with the apex of the expression. SVMs have been proven to be powerful and robust tools for AU classification [11].
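Structurally, the per-AU classifier bank amounts to one binary SVM per action unit, as sketched below; the random training arrays, the RBF kernel, and the omission of the AdaBoost feature-selection stage are assumptions made for brevity.

```python
import numpy as np
from sklearn.svm import SVC

AUS = [1, 2, 4, 6, 7, 9, 12, 20, 23, 25, 27]

# Hypothetical training data: one feature matrix and, per AU, a binary
# label vector (AU active / inactive in the frame)
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 40))
labels = {au: rng.integers(0, 2, size=200) for au in AUS}

# One context-independent binary SVM per AU
au_models = {au: SVC(kernel="rbf").fit(features, labels[au]) for au in AUS}
```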

A test sample (a new image) z can be classified according to the following decision function:

\[ D(\mathbf{z}) = \operatorname{sign}\bigl(h(\mathbf{z})\bigr). \tag{2} \]

Table 3: Facial expressivity items defined in ICRP-IEB [9] and used AUs.

Item                              Gestalt degree          Related AUs
(1) Active expressivity in face   Intensity               11 AUs
(2) Eyebrows raising              Intensity + frequency   AU1 and AU2
(3) Eyebrows pulling together     Intensity + frequency   AU4
(4) Blinking                      Frequency               AU45
(5) Cheek raising                 Intensity + frequency   AU6
(6) Lip corner puller             Intensity + frequency   AU12

The output h(z) of a SVM is a distance measure between a test pattern and the separating hyperplane defined by the support vectors. The test sample is classified to the positive class (the AU to which it was trained) if D(z) = +1 and to the negative class if D(z) = −1.

Unfortunately, we cannot directly use the output of a SVM as a probability; there is no clear relationship with the posterior class probability P(y = +1 | z) that the pattern z belongs to the class y = +1. Platt [12] proposed an estimate for this probability by fitting the SVM output h(z) with a sigmoid function as follows:

\[ P(y = +1 \mid \mathbf{z}) = \frac{1}{1 + \exp\bigl(A h(\mathbf{z}) + B\bigr)}. \tag{3} \]

The parameters A and B are found using maximum likelihood estimation from the training set. The above equation has a mapping range in [0, 1].
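With scikit-learn, a Platt-style calibration of one AU classifier can be sketched by fitting a logistic model on the SVM margins, reusing the hypothetical arrays of the previous snippet. Note that sklearn's sigmoid is 1/(1 + exp(−(w·h + b))), so A = −w and B = −b relative to (3); SVC(probability=True) also offers a built-in calibration of this kind.

```python
from sklearn.linear_model import LogisticRegression

# Margins h(z) of the AU7 classifier on the training set
h_train = au_models[7].decision_function(features).reshape(-1, 1)

# Logistic fit on the margins recovers the sigmoid of (3)
platt = LogisticRegression().fit(h_train, labels[7])

# Probability-scaled scores P(y = +1 | z) for some new margins
h_new = au_models[7].decision_function(features[:5]).reshape(-1, 1)
p_pos = platt.predict_proba(h_new)[:, 1]
```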

2.9. Facial Expressivity. In this work, we follow the recommendation of the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9], where the expressivity based on the FACS has been defined in terms of (i) frequency, that is, how often a behavior or movement occurs, (ii) duration, that is, how long a behavior or movement lasts, and (iii) intensity or degree, being the strength, force, or level/amount of emotion or movement. In the ICRP-IEB, facial expressivity is coded according to the gestalt degree of intensity/duration/frequency of 7 types of facial expressive behavior (items) depicted in recorded videos of PDs, using a 5-point Likert-type scale: 1 = low (with low to no movement or change, or infrequent), 2 = fairly low, 3 = medium, 4 = fairly high, and 5 = high (very frequent, very active). Table 3 lists 6 of the ICRP-IEB facial expressivity items and the corresponding AUs used in this study for estimating them. The last item is related to active mouth closure during speech, which was not used in our model as the participants were not asked to speak during the experiments.

As our current AU recognition system considers only 11 AUs (AU1, AU2, AU4, AU6, AU7, AU9, AU12, AU20, AU23, AU25, and AU27), we did not consider blinking (AU45) in our facial expressivity formulation.

Figure 6: The AU recognition system: the frames I(0), ..., I(T) of the original video are tracked with the shape model (landmark sets P(0), ..., P(T)); geometry-based and appearance-based features are extracted for the lower, middle, and upper face; AdaBoost selects features; and binary SVM classifiers output decision values for AU1 (inner brow raiser), AU2 (outer brow raiser), AU4 (brow lowerer), AU6 (cheek raiser), AU7 (lid tightener), AU9 (nose wrinkler), AU12 (lip corner puller), AU20 (lip stretcher), AU23 (lip tightener), AU25 (lips part), and AU27 (mouth stretch).

Following the criteria of the ICRP-IEB, we defined the facial expressivity of a participant as follows:

\[ \mathrm{EFE} = \mathrm{AMC}_e + \mathrm{AI}_e - \bigl(\mathrm{AMC}_n + \mathrm{AI}_n\bigr), \tag{4} \]

where AMC_e and AMC_n are the amounts of movement change during the disgust and neutral facial expressions, respectively, and AI_e and AI_n are the intensities of the displayed AUs during the disgust and neutral facial expressions, respectively.

It has to be recalled that, for all these estimations, only the considered 30 s windows are used. The quantities AMC and AI refer to frequency and intensity, respectively. Their detailed formulation is given in the following sections. In order to assess the effectiveness of the proposed formulation, we also compared it to the following definition of facial expressivity, where the baseline of neutral emotion is not considered:

\[ \mathrm{FE} = \mathrm{AMC}_e + \mathrm{AI}_e. \tag{5} \]
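In code, (4) and (5) are one-liners once AMC and AI are available for the disgust and neutral windows (a sketch; the function names are ours):

```python
def fe(amc_e, ai_e):
    """Uncorrected facial expressivity, (5)."""
    return amc_e + ai_e

def efe(amc_e, ai_e, amc_n, ai_n):
    """Baseline-corrected facial expressivity, (4): neutral-window
    activity is subtracted from disgust-window activity."""
    return fe(amc_e, ai_e) - (amc_n + ai_n)
```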

(1) Intensity of Displayed AUs. It has been shown in [40] that the output margin of the learned SVM classifiers contains information about expression intensity. Later, Savran et al. [41] estimated the AU intensity levels using logistic regression on SVM scores. In this work, we propose using Platt's probability given by (3) as the action unit intensity at frame t: $I_t(\mathrm{AU}_i) = P^t_{\mathrm{AU}_i}$, with $P^t_{\mathrm{AU}_i} = P(c = \mathrm{AU}_i \mid \mathbf{z}_t)$ estimated using (3). The $I_t(\mathrm{AU}_i)$ time series is then smoothed using a Gaussian filter. We denote by $\bar{I}_t(\mathrm{AU}_i)$ the smoothed value. Figure 7 plots, for one participant, the smoothed intensity of the facial AU7, also illustrating its different temporal segments: neutral, onset, apex, and offset.
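The smoothing step can be done with a one-dimensional Gaussian filter, continuing the previous snippets; the filter width sigma is an assumption, as the paper does not state it.

```python
from scipy.ndimage import gaussian_filter1d

# Frame-by-frame intensity I_t(AU_7) as the Platt probability of (3)
h_frames = au_models[7].decision_function(features).reshape(-1, 1)
intensity = platt.predict_proba(h_frames)[:, 1]
intensity_smooth = gaussian_filter1d(intensity, sigma=3.0)  # smoothed series
```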

Having defined the AU intensity, the intensity of the displayed AUs during a voluntary facial expression (disgust or neutral) is given by

\[ \mathrm{AI} = \sum_{i \in \mathrm{DAUs}} \frac{1}{N'_i} \sum_{t \in T_i} \bar{I}_t(\mathrm{AU}_i), \tag{6} \]

where DAUs is the set of displayed (recognized) facial action units during the considered 30 s window, $T_i$ is the set of frames where AU$i$ is active, and $N'_i$ is the cardinality of $T_i$, being the number of frames where AU$i$ is active.

Figure 7: Example of AU intensity: the smoothed intensity of AU7 (lid tightener) over the frames of one sequence.

(2) Amount of Movement Change. The amount of movement change during a voluntary facial expression (disgust or neutral) is given by

\[ \mathrm{AMC} = \sum_{i \in \mathrm{DAUs}} \frac{1}{N'_i} \sum_{t \in T_i} \bigl| \bar{I}_t(\mathrm{AU}_i) - \bar{I}_{t+1}(\mathrm{AU}_i) \bigr|, \tag{7} \]

with DAUs, $N'_i$, and $T_i$ as defined above.
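Both (6) and (7) reduce to a few lines over the smoothed per-AU intensity series; the container conventions in this sketch (dictionaries keyed by AU number) are our own.

```python
import numpy as np

def ai_amc(smoothed, active):
    """AI of (6) and AMC of (7).

    `smoothed` maps AU -> smoothed intensity series over the 30 s
    window; `active` maps AU -> boolean mask of the frames T_i in
    which that AU was recognized as active."""
    ai = amc = 0.0
    for au, s in smoothed.items():
        t = np.flatnonzero(active[au])
        if t.size == 0:
            continue  # AU never displayed: not in DAUs
        ai += s[t].mean()                 # (6): mean intensity over T_i
        tt = t[t < len(s) - 1]            # frames with a successor
        amc += np.abs(s[tt + 1] - s[tt]).sum() / t.size  # (7)
    return ai, amc
```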

3. Results and Discussion

3.1. Manipulation Check. The participants' emotions were successfully elicited using the movie clips of disgust. For both disgust1 and disgust2, most participants (14/15) self-reported the target emotion as their dominant emotion. For the other video clips, the reported emotions are as follows: amusement1 (9/15), amusement2 (12/15), surprise1 (11/15), surprise2 (2/15), anger1 (6/15), anger2 (9/15), fear1 (10/15), fear2 (11/15), neutral1 (13/15), and neutral2 (13/15). Annotation of the video records using the Anvil annotation tool [42] further confirmed that the movie clips of disgust induced the most reliable emotional data. Therefore, for further analysis we decided to use only the data recorded during the watching of the disgust movie clips.

The Wilcoxon rank sum tests were applied to the 14 disgust self-reports. The results showed that there was no significant difference in self-reported emotional ratings between disgust1 (M = 6.50, SD = .76) and disgust2 (M = 6.64, SD = .63), or between the PD group (M = 6.51, SD = .65) and the control group (M = 6.64, SD = .74). This is what we expected, since Vicente et al. [43] also reported that PD patients at different stages of the disease did not significantly differ from controls in the self-reported emotional experience to presented movie clips.

3.2. Univariate Analysis of Group Effects. The EMG data from 3 participants (1 control and 2 PD) and the ECG data from 10 participants (4 control and 6 PD) were discarded because they were not well recorded due to sensor malfunctions. The results of the repeated-measures ANOVAs on single physiological variables showed no significant main effect of group on the LLS activity (F(1, 12) = 3.38, P = .09), the OO activity (F(1, 12) = 1.92, P = .19), HR (F(1, 3) = .93, P = .41), or HRV (F(1, 3) = .53, P = .83). Table 4 shows the descriptive data for the control and PD groups during exposure to the three movie clips disgust1, disgust2, and neutral2, together with the tests of significance comparing the groups using Wilcoxon rank sum tests. The Wilcoxon rank sum tests revealed that

(1) comparable levels of baseline activity in the control and PD groups were found over both the LLS and the OO;

(2) although disgust1 elicited more muscle activity over the LLS and the OO for the control group than for the PD group, this difference did not reach statistical significance;

(3) disgust2 elicited significantly more LLS activity for control than for PD.

These results indicated that PD patients displayed less muscle activity over the LLS when expressing disgust than controls. In addition, disgust2 induced more muscle activity over the LLS and the OO than disgust1, which is consistent with the self-reports and may be due to the fact that disgust2 is slightly more disgusting than disgust1.

3.3. Univariate Analysis of Stimuli Effects. For the LLS, we found an effect of the stimuli on muscle activity (F(2, 24) = 9.47, P = .001). Post hoc tests indicated significant differences between disgust1 and neutral2 (P = .01) and between disgust2 and neutral2 (P = .01). Both disgust1 (M = 282, SD = 192) and disgust2 (M = 537, SD = 543) elicited more muscle activity than neutral2 (M = 110, SD = 38). This main effect was qualified by a stimuli × group interaction (F(2, 24) = 4.17, P = .028), which was consistent with the results of the Wilcoxon rank sum tests comparing the physiological responses between groups.

For the OO, we also found an effect of the stimuli on muscle activity (F(2, 24) = 5.45, P = .012). Post hoc tests indicated a significant difference only between disgust1 and neutral2 (P = .002). The disgust1 clip (M = 232, SD = 157) elicited more muscle activity than neutral2 (M = 135, SD = 111). No significant stimuli × group interaction effect (F(2, 24) = 1.77, P = .192) was found.

We expected that the disgust clips would elicit significantly more muscle activity over the LLS than the neutral clip, because the LLS is normally involved in producing the disgust facial expression [44]. The fact that the OO was also significantly different was probably due to the following:

(1) crosstalk [45]: the LLS and the OO lie in the vicinity of each other;

(2) disgust1 elicited not only disgust but also a bit of amusement, through the funny motion of the character and the background music, so that the OO was also involved in producing the facial expression of amusement.

Table 4: Statistical summary of physiological measures.

Variable   Stimuli    Control           PD                Sig.
                      Mean     SD       Mean     SD
LLS        Neutral2   104      34       116      43       .71
           Disgust1   336      204      227      177      .26
           Disgust2   804      649      271      225      .04*
OO         Neutral2   135      111      94       54       .81
           Disgust1   282      177      181      128      .32
           Disgust2   520      571      207      148      .13
HR         Neutral2   1.72     2.77     2.76     —        —
           Disgust1   −.53     2.64     5.12     —        —
           Disgust2   2.22     5.53     5.65     —        —
HRV        Neutral2   1.53     12.96    −7.65    —        —
           Disgust1   8.01     11.45    26.34    —        —
           Disgust2   14.86    35.64    14.92    —        —

*The two groups are significantly different, that is, P < .05.
"—": We did not compare the cardiac responses (i.e., HR and HRV) between groups because, due to technical failures, we lost the ECG data of 10 participants; thus only 5 participants (4 controls and 1 PD) completed the ECG recording.

Table 5: Facial expressivity assessment based on different methods.

Var     C                LP               IP               MP
        D1*  D2*  N2*    D1*  D2*  N2*    D1*  D2*  N2*    D1*  D2*  N2*
TFA     8    8    4      5    5    4      5    6    4      4    —    5
AMC◊    139  277  108    67   65   43     82   62   69     39   —    82
AI      54   51   18     28   28   22     38   39   31     22   —    33
FE◊     722  751  246    209  187  142    444  356  262    135  —    322

*D1, D2, N2, C, LP, IP, and MP denote disgust1, disgust2, neutral2, the control, and the least, intermediate, and most severe forms of Parkinson's patients, respectively.
◊Presented in percentages.
"—": The face of the MP while watching D2 was blocked by his hand.

Note that disgust2 (M = 364, SD = 432) elicited more muscle activity over the OO (because of crosstalk) than neutral2 (M = 114, SD = 87); however, unlike for disgust1, the difference did not reach statistical significance, which can likewise be interpreted by the fact that disgust2 did not elicit any amusement.

Moreover, no main effect of the stimuli on the cardiac parameters was found, for either HR (F(2, 6) = .37, P = .70) or HRV (F(2, 6) = .84, P = .48), which is not consistent with what we expected: unchanged [46, 47] or increased [48, 49] HR and increased HRV [21, 47]. This may be due to the fact that we did not have enough recorded ECG data for a statistical analysis.

3.4. Qualitative Analysis of Facial Expressivity. To qualitatively analyze facial expressivity, we used the total facial activity (TFA) measure of [2, 50], being the total number of displayed AUs in response to the stimuli. Compared to the control (C), the PD groups (LP, IP, and MP) showed an attenuation of their facial activities (variable TFA) while watching the disgusting movie clips (see Table 5). However, comparable TFA values were found while watching the neutral movie clips.

Visual inspection of the displayed AUs, as illustrated in Figure 8, shows that AU1, AU2, AU6, AU9, and AU45 occurred more frequently for C, except for IP, who produced AU6 more frequently than C. A possible explanation is that the IP cannot deactivate AU6, even during the neutral state. As shown in Figure 9, during the neutral state the PD patients produced more continuously active action units, such as AU4 and AU25 (for all PD), AU6 (for IP), and AU9 (for MP). Moreover, certain AU combinations were considered likely to signify "disgust" on the basis of Ekman and Friesen's description of emotions [7]. "Disgust" AUs were considered as the combination of brow lowerer (AU4), cheek raiser (AU6), and nose wrinkler (AU9). Because the cheek raiser is very difficult to produce on demand without including other AUs, especially the lid tightener (AU7) [51], the lid tightener was also taken into consideration; that is, the expected AU pattern of "disgust" was the combination of AU4, AU6, AU7, and AU9. Consistent with what we expected, C displayed 98 and 87 "disgust" frames while watching disgust1 and disgust2, respectively. The PD group (LP, IP, and MP) did not display any "disgust" frames, except for IP, who had 4 "disgust" frames while watching disgust1. Furthermore, the IP displayed rather few frames with combinations of cheek raiser and lid tightener during the neutral state. Instead,

Figure 8: The facial activities (active AUs over frames) while watching disgust1, for C, LP, IP, and MP.

Figure 9: The facial activities (active AUs over frames) while watching neutral2, for C, LP, IP, and MP.

Figure 10: The quantified facial expressivity, (5), for C, LP, IP, and MP during disgust1 and disgust2.

the cheek raiser alone was displayed mostly (see Figure 9), which indicates that IP patients have control problems for combinations of facial muscles.
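For reference, the "disgust" pattern test described above is a per-frame subset check; in this minimal sketch, the representation of a frame as the set of its active AU numbers is our own choice.

```python
# Expected "disgust" pattern: AU4 + AU6 + AU7 + AU9 jointly active
DISGUST_AUS = {4, 6, 7, 9}

def count_disgust_frames(frames):
    """Count frames whose active-AU set contains the full pattern."""
    return sum(DISGUST_AUS <= f for f in frames)

print(count_disgust_frames([{4, 6, 7, 9, 25}, {6, 7}, {4, 6, 7, 9}]))  # 2
```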

Analyzing the defined quantities AMC, AI, and FE using the segments with active AUs, as can be seen from Table 5, the control C had higher facial expressivity than LP, IP, and MP during the disgusting state, while MP showed the lowest facial expressivity. More specifically, compared to C, the PD patients (LP, IP, and MP) had smaller values of the variables AMC, AI, and FE while watching the disgusting movie clips, especially for brow lowerer (AU4), nose wrinkler (AU9), and blink (AU45). C tended to show similar intensities of brow lowerer (variable AI_e) with bigger variance (variables AMC and FE). In addition, C showed higher intensities of nose wrinkler with larger variance.

3.5. Quantitative Analysis of Facial Expressivity. Figure 10 depicts the values of the facial expressivity equation (5) for the participants. We did not assess the facial expressivity of the MP patient during disgust2, as he had his hand in front of his face during the experiments. As can be seen, a significant difference between C and the PD patients is present: C got the highest score, while the lowest score was obtained by the MP patient. However, the score of the IP patient was slightly higher than that of the LP patient, which is due to the fact that the LP and IP expressed "facial masking" in different ways: the LP patient showed an attenuation of the intensities of facial movements; on the contrary, the IP patient produced high intensities of facial movements not only in his emotional state but also during the neutral state, that is, he cannot relax the muscles well. In order to take both kinds of "facial masking" into account, we computed the facial expressivity using (4). The results are shown in Figure 11. As can be seen, the proposed facial expressivity allows distinguishing between control and PD patients. Moreover, the facial expressivity decreases along with the increase of PD severity.

Figure 11: The quantified facial expressivity based on the improved function, (4), for C, LP, IP, and MP during disgust1 and disgust2.

4. Conclusions

This study investigated the phenomenon of facial masking in Parkinson's patients. We designed an automated and objective method to assess the facial expressivity of PD patients. The proposed approach follows the methodology of the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9]. In this study, based on the Facial Action Coding System (FACS), we proposed a methodology that (i) automatically detects faces in a video stream, (ii) codes each frame with respect to 11 action units, and (iii) estimates facial expressivity as a function of frequency, that is, how often AUs occur, duration, that is, how long AUs last, and intensity, being the strength of AUs.

Although the proposed facial expressivity assessment approach has been evaluated on a limited number of subjects, it allows capturing differences in facial expressivity between control participants and Parkinson's patients. Moreover, facial expressivity differences between PD patients with different progressions of Parkinson's disease have been assessed. The proposed method can capture these differences and give a more accurate assessment of facial expressivity for Parkinson's patients than traditional observer-based ratings allow for. This confirms that our approach can be used for the clinical assessment of facial expressivity in PD. Indeed, nonverbal signals contribute significantly to interpersonal communication. Facial expressivity, a major source of nonverbal information, is compromised in Parkinson's disease. The resulting disconnect between subjective feeling and objective facial affect can lead people to form negative and inaccurate impressions of people with PD with respect to their personality, feelings, and intelligence. Assessing in an objective way the facial expressivity limitations of PD would allow developing personalized therapy to benefit facial expressivity in PD. Indeed, Parkinson's is a progressive disease, which means that the symptoms will get worse as time goes on. Using the proposed assessment approach would allow regular facial expressivity assessment by therapists and clinicians to explore treatment options.

Future work will (i) improve the AU recognition system and extend it to more AUs and (ii) consider clinical usage of the proposed approach in a statistically significant PD patient population.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no. 2011629010) and the VUB interdisciplinary research project "Objectifying Assessment of Human Emotional Processing and Expression for Clinical and Psychotherapy Applications (EMO-App)." The authors gratefully acknowledge the Flemish Parkinson League for supporting the current study and the recruitment of the PD patients. They also acknowledge Floriane Verbraeck for the setup of the experiments and the data acquisition.

References

[1] M. Katsikitis and I. Pilowsky, "A study of facial expression in Parkinson's disease using a novel microcomputer-based method," Journal of Neurology, Neurosurgery and Psychiatry, vol. 51, no. 3, pp. 362–366, 1988.
[2] G. Simons, H. Ellgring, and M. C. Smith Pasqualini, "Disturbance of spontaneous and posed facial expressions in Parkinson's disease," Cognition and Emotion, vol. 17, no. 5, pp. 759–778, 2003.
[3] D. H. Jacobs, J. Shuren, D. Bowers, and K. M. Heilman, "Emotional facial imagery, perception, and expression in Parkinson's disease," Neurology, vol. 45, no. 9, pp. 1696–1702, 1995.
[4] L. Tickle-Degnen and K. D. Lyons, "Practitioners' impressions of patients with Parkinson's disease: the social ecology of the expressive mask," Social Science & Medicine, vol. 58, no. 3, pp. 603–614, 2004.
[5] J. J. van Hilten, A. D. van der Zwan, A. H. Zwinderman, and R. A. C. Roos, "Rating impairment and disability in Parkinson's disease: evaluation of the Unified Parkinson's Disease Rating Scale," Movement Disorders, vol. 9, no. 1, pp. 84–88, 1994.
[6] D. Bowers, K. Miller, W. Bosch et al., "Faces of emotion in Parkinson's disease: micro-expressivity and bradykinesia during voluntary facial expressions," Journal of the International Neuropsychological Society, vol. 12, no. 6, pp. 765–773, 2006.
[7] P. Ekman and W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, Palo Alto, Calif, USA, 1978.
[8] P. Wu, D. Jiang, and H. Sahli, "Physiological signal processing for emotional feature extraction," in Proceedings of the International Conference on Physiological Computing Systems, pp. 40–47, 2014.
[9] L. Tickle-Degnen, The Interpersonal Communication Rating Protocol: A Manual for Measuring Individual Expressive Behavior, 2010.
[10] G. Sandbach, S. Zafeiriou, M. Pantic, and L. Yin, "Static and dynamic 3D facial expression recognition: a comprehensive survey," Image and Vision Computing, vol. 30, no. 10, pp. 683–697, 2012.
[11] I. Gonzalez, H. Sahli, V. Enescu, and W. Verhelst, "Context-independent facial action unit recognition using shape and Gabor phase information," in Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction (ACII '11), vol. 1, pp. 548–557, Springer, Memphis, Tenn, USA, October 2011.
[12] J. C. Platt, "Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods," in Advances in Large Margin Classifiers, A. J. Smola, P. Bartlett, B. Schoelkopf, and D. Schuurmans, Eds., pp. 61–74, The MIT Press, 1999.
[13] J. J. Gross and R. W. Levenson, "Emotion elicitation using films," Cognition and Emotion, vol. 9, no. 1, pp. 87–108, 1995.
[14] D. Hagemann, E. Naumann, S. Maier, G. Becker, A. Lürken, and D. Bartussek, "The assessment of affective reactivity using films: validity, reliability and sex differences," Personality and Individual Differences, vol. 26, no. 4, pp. 627–639, 1999.
[15] C. L. Lisetti and F. Nasoz, "Using noninvasive wearable computers to recognize human emotions from physiological signals," EURASIP Journal on Advances in Signal Processing, vol. 2004, Article ID 929414, pp. 1672–1687, 2004.
[16] J. Hewig, D. Hagemann, J. Seifert, M. Gollwitzer, E. Naumann, and D. Bartussek, "A revised film set for the induction of basic emotions," Cognition and Emotion, vol. 19, no. 7, pp. 1095–1109, 2005.
[17] J. H. Westerink, E. L. van den Broek, M. H. Schut, J. van Herk, and K. Tuinenbreijer, "Probing experience," in Assessment of User Emotions and Behaviour to Development of Products, T. J. Overbeek, W. F. Pasveer, J. H. D. M. Westerink, M. Ouwerkerk, and B. de Ruyter, Eds., Springer, New York, NY, USA, 2008.
[18] A. Schaefer, F. Nils, P. Philippot, and X. Sanchez, "Assessing the effectiveness of a large database of emotion-eliciting films: a new tool for emotion researchers," Cognition and Emotion, vol. 24, no. 7, pp. 1153–1172, 2010.
[19] F. Verbraeck, Objectifying Human Facial Expressions for Clinical Applications, M.S. thesis, Vrije Universiteit Brussel, Belgium, 2012.
[20] O. Alzoubi, Automatic Affect Detection from Physiological Signals: Practical Issues, Ph.D. thesis, The University of Sydney, New South Wales, Australia, 2012.
[21] S. D. Kreibig, "Autonomic nervous system activity in emotion: a review," Biological Psychology, vol. 84, no. 3, pp. 394–421, 2010.
[22] F. D. Farfán, J. C. Politti, and C. J. Felice, "Evaluation of EMG processing techniques using information theory," BioMedical Engineering OnLine, vol. 9, article 72, 2010.
[23] N. Ambady, F. J. Bernieri, and J. A. Richeson, "Toward a histology of social behavior: judgmental accuracy from thin slices of the behavioral stream," Advances in Experimental Social Psychology, vol. 32, pp. 201–271, 2000.
[24] N. Ambady and R. Rosenthal, "Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis," Psychological Bulletin, vol. 111, no. 2, pp. 256–274, 1992.
[25] K. D. Lyons and L. Tickle-Degnen, "Reliability and validity of a videotape method to describe expressive behavior in persons with Parkinson's disease," The American Journal of Occupational Therapy, vol. 59, no. 1, pp. 41–49, 2005.
[26] N. A. Murphy, "Using thin slices for behavioral coding," Journal of Nonverbal Behavior, vol. 29, no. 4, pp. 235–246, 2005.
[27] A. O. Andrade, S. Nasuto, P. Kyberd, C. M. Sweeney-Reed, and F. R. van Kanijn, "EMG signal filtering based on empirical mode decomposition," Biomedical Signal Processing and Control, vol. 1, no. 1, pp. 44–55, 2006.
[28] M. Blanco-Velasco, B. Weng, and K. E. Barner, "ECG signal denoising and baseline wander correction based on the empirical mode decomposition," Computers in Biology and Medicine, vol. 38, no. 1, pp. 1–13, 2008.
[29] A. O. Boudraa, J. C. Cexus, and Z. Saidi, "EMD-based signal noise reduction," Signal Processing, vol. 1, pp. 33–37, 2005.
[30] T. Jing-tian, Z. Qing, T. Yan, L. Bin, and Z. Xiao-kai, "Hilbert-Huang transform for ECG de-noising," in Proceedings of the 1st International Conference on Bioinformatics and Biomedical Engineering, pp. 664–667, July 2007.
[31] A. Karagiannis and P. Constantinou, "Noise components identification in biomedical signals based on empirical mode decomposition," in Proceedings of the 9th International Conference on Information Technology and Applications in Biomedicine (ITAB '09), pp. 1–4, November 2009.
[32] Y. Kopsinis and S. McLaughlin, "Development of EMD-based denoising methods inspired by wavelet thresholding," IEEE Transactions on Signal Processing, vol. 57, no. 4, pp. 1351–1362, 2009.
[33] F. Agrafioti, D. Hatzinakos, and A. K. Anderson, "ECG pattern analysis for emotion detection," IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 102–115, 2012.
[34] D. L. Donoho, "De-noising by soft-thresholding," IEEE Transactions on Information Theory, vol. 41, no. 3, pp. 613–627, 1995.
[35] B.-U. Köhler, C. Hennig, and R. Orglmeister, "The principles of software QRS detection," IEEE Engineering in Medicine and Biology Magazine, vol. 21, no. 1, pp. 42–57, 2002.
[36] V. Guralnik and J. Srivastava, "Event detection from time series data," in Knowledge Discovery and Data Mining, pp. 33–42, 1999.
[37] M. Hamedi, S.-H. Salleh, and T. T. Swee, "Surface electromyography-based facial expression recognition in bi-polar configuration," Journal of Computer Science, vol. 7, no. 9, pp. 1407–1415, 2011.
[38] Y. Hou, H. Sahli, R. Ilse, Y. Zhang, and R. Zhao, "Robust shape-based head tracking," in Advanced Concepts for Intelligent Vision Systems, vol. 4678 of Lecture Notes in Computer Science, pp. 340–351, Springer, 2007.
[39] T. Kanade, J. F. Cohn, and Y. Tian, "Comprehensive database for facial expression analysis," in Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, pp. 46–53, Grenoble, France, 2000.
[40] M. S. Bartlett, G. C. Littlewort, M. G. Frank, C. Lainscsek, I. R. Fasel, and J. R. Movellan, "Automatic recognition of facial actions in spontaneous expressions," Journal of Multimedia, vol. 1, no. 6, pp. 22–35, 2006.
[41] A. Savran, B. Sankur, and M. Taha Bilge, "Regression-based intensity estimation of facial action units," Image and Vision Computing, vol. 30, no. 10, pp. 774–784, 2012.
[42] Anvil: the video annotation tool, http://www.anvil-software.org/.
[43] S. Vicente, J. Péron, I. Biseul et al., "Subjective emotional experience at different stages of Parkinson's disease," Journal of the Neurological Sciences, vol. 310, no. 1-2, pp. 241–247, 2011.
[44] A. van Boxtel, "Facial EMG as a tool for inferring affective states," in Proceedings of Measuring Behavior 2010, A. J. Spink, F. Grieco, O. Krips, L. Loijens, L. Noldus, and P. Zimmerman, Eds., pp. 104–108, Noldus Information Technology, Wageningen, The Netherlands, 2010.
[45] C.-N. Huang, C.-H. Chen, and H.-Y. Chung, "The review of applications and measurements in facial electromyography," Journal of Medical and Biological Engineering, vol. 25, no. 1, pp. 15–20, 2005.
[46] R. W. Levenson, P. Ekman, K. Heider, and W. V. Friesen, "Emotion and autonomic nervous system activity in the Minangkabau of West Sumatra," Journal of Personality and Social Psychology, vol. 62, no. 6, pp. 972–988, 1992.
[47] S. Rohrmann and H. Hopp, "Cardiovascular indicators of disgust," International Journal of Psychophysiology, vol. 68, no. 3, pp. 201–208, 2008.
[48] O. Alaoui-Ismaïli, O. Robin, H. Rada, A. Dittmar, and E. Vernet-Maury, "Basic emotions evoked by odorants: comparison between autonomic responses and self-evaluation," Physiology and Behavior, vol. 62, no. 4, pp. 713–720, 1997.
[49] J. Gruber, S. L. Johnson, C. Oveis, and D. Keltner, "Risk for mania and positive emotional responding: too much of a good thing?" Emotion, vol. 8, no. 1, pp. 23–33, 2008.
[50] W. Gaebel and W. Wölwer, "Facial expressivity in the course of schizophrenia and depression," European Archives of Psychiatry and Clinical Neuroscience, vol. 254, no. 5, pp. 335–342, 2004.
[51] P. Ekman, W. Friesen, and J. Hager, Facial Action Coding System: The Manual, 2002.


Computational and Mathematical Methods in Medicine 3

Figure 2 Experimental setup

The data recording took place in a sound-isolated labunder standardized lighting condition The movie clips werewatched by the participants while sitting on a sofa that wasfacing a screen with dimensions 125 by 125m (see Figure 2)

23 Participants This pilot study considered seven PDpatients (3 men and 4 women between the ages of 47 and 76years durations of PD ranged from 15 to 13 years) and eightcontrol participants (5 men and 3 women between the age of27 and 57 years)The control participants were recruited fromthe VUB ETRO department without specific characteristicsThe PD patients were selected with the help of the FlemishParkinson LeagueThe studywas approved by the committeeson human research of the respective institutions and wascompleted in accordance with the Helsinki Declaration

Based on the medical dossier as well as the feedback fromthe PD patients we defined three PD categories One patienthad a very light form of PD and therefore was classified asthe least severe case (denoted as LP) Another patient had themost severe form of PD of the whole group (MP) The restof PD patients were situated between those two extremes wedenoted them by intermediate PD (IP)

24 Data Acquisition During the movie clips physiologi-cal signals and frontal face video of the participants wererecorded As physiological signal we recorded electromyo-graphy (EMG) and electrocardiogram (ECG) All the datachannels (ie EMG ECG and videotape)were synchronized

The ECGmeasures the activity of heart contractionsThephysical action of the heart is induced by a local periodicelectrical stimulation and as a result a change in potentialof 10-20 120583V is measured during a cardiac cycle betweentwo surface electrodes [20] Heart rate (HR) and heart ratevariability (HRV) are the cardiovascular response featuresmost often reported as indicators of emotions [21] In thisstudy the ECG was measured at 512Hz using a Shimmer

The EMG measures the frequency of muscle tensionand contraction Within each period of interest the rootmean square (RMS) and absolute mean value (AMV) arecommonly used as features [22] In this study two facial mus-cles were measured at 2000Hz using the EMG Biomonitor

Ground

Orbicularis oculi

Levator labii superioris

Figure 3 Locations of the EMG electrodes

ME6000 namely the levator labii superioris (LLS) and theorbicularis oculi (OO) as illustrated in Figure 3

The frontal face video of the participants was recordedfor the purpose of quantifying facial expressivity Followingthe (ICRP-IEB) [9] we selected 11 action units (AU1 AU2AU4 AU6 AU7 AU9 AU12 AU20 AU23 AU25 and AU27)among the 41 ones defined by FACS Table 2 lists the usedAUs along with their associated facial muscles as well as theones defined by the measured EMG

Our purpose being the development of a nonobtrusiveapproach for facial expressivity assessment by automaticallyrecognizing facial action units (AUs) and estimating theirintensity the ECG and EMG measurements have beenincluded in our experimental protocol to quantitatively assesthe emotional manipulation for inducing facial expressionsand confirm facial muscle activity when expressing disgustexpression As our objective did not aim at physiologicallyinvestigating the effect of Parkinsonrsquos on facial EMG only theLLS andOOweremeasured by the EMG as a complementaryinformation to the considered AUs (see Table 2)

25 Data Preparation It is an accepted practice to code aproportion of the observations or ldquothin-slicesrdquo of recordeddata as a representation of the analyzed behavior [23] Inour experiments 30 s data (ECG EMG and video record)extractswere selected corresponding to the last 30 s segmentsof the shown video clips Intuitively one may think thatlonger segments of expressive behavior in persons with PDwould give more information and thus increase the accuracyof analysis However Ambady and Rosenthal [24] foundthat the judgment performance did not significantly improvewhen using 5 minutes slices versus 30 s slices This was alsoconfirmed in [25 26] Note that baseline data were alsoselected from the stimuli neutral1

26 Physiological Data Processing Physiological signals needto be preprocessed prior to feature extraction in orderto remove noise and enhance signal-to-noise ratio (SNR)[27 28] In this work we make use of a newly developedsignal denoising approach based on the empirical modedecomposition (EMD) [8] Different from state-of-the-art

4 Computational and Mathematical Methods in Medicine

Table 2 FACS AUs and related muscles

AU FACS name Facial muscle Videotape EMGAU1 Inner brow raiser Frontalis (pars medialis) X mdashAU2 Outer brow raiser Frontalis (pars lateralis) X mdashAU4 Brow lowerer Depressor glabellae depressor supercilii and CS X mdashAU6 Cheek raiser OO (pars orbitalis) X XAU7 Lid tightener OO (pars palpebralis) X XAU9 Nose wrinkler LLSAN X mdashAU10 Upper lip raiser LLS caput infraorbitalis mdash XAU12 Lip corner puller Zygomaticus Major X mdashAU20 Lip stretcher Risorius X mdashAU23 Lip tightener Orbicularis Oris X mdashAU25 Lips part Depressor labii inferioris X mdashAU27 Mouth stretch Pterygoids digastric X mdashAU45 Blink Contraction OO mdash XAU46 Wink OO mdash X

0 500 1000 1500 2000 2500

0100200300

The raw ECG signal

minus100

0 500 1000 1500 2000 2500

0100200

The denoised ECG signal

minus100

Figure 4 Example of physiological signal (ECG) denoising usingthe method proposed in [8]

methods [27ndash33] our approach estimates the noise level ofeach intrinsic mode functions (IMFs) rather than estimatingthe noise level of all IMFs using Donohorsquos strategy [34] priorto the reconstruction of the signal using the thresholdedIMFs Please refer to [8] for more details Figure 4 illustratesthe denoising results

261 Electrocardiogram (ECG) Heart rate (HR) and heartrate variability (HRV) are the cardiovascular response fea-tures most often reported as indicators of emotion [21]HR is computed using the time difference between twoconsecutive detected R peaks of the QRS complexes (ieRR interval) and is expressed in beats per minute HRVis the variation of beat-to-beat HR Thus the first step inextracting HR and HRV starts from the exact detection ofR peaks in the QRS complex Thus the detection of QRScomplex in particular R peak detection is the basis forECG processing and analysis Many approaches for R peaksdetection have been proposed [35] However most of themare off-line and targeting the noiseless signal which do notmeet the requirements of many real-time applications To

0 500 1000 1500 2000 2500

0

50

100

150

200

250

300

350

Samples

Noi

sy E

CG

minus100

minus50

Figure 5 Change points detection

overcome this problem in [8] we proposed an approachbased on a change point detection (CPD) algorithm for eventdetection in time series [36] that minimizes the error infitting a predefined function using maximum likelihood Inour current implementation polynomial fitting functions ofdegree 1 have been selected empirically An example of resultsis illustrated in Figure 5 which detects all R picks (cross) andsome irrelevant change points (circle) which can be filteredout using a predetermined threshold Once the R peaks weredetectedHRandHRV features are estimated as the differencebetween the HR (HRV) estimated from the considered 30 swindow (of the stimuli) and the one estimated from theneutral 30 s

262 Electromyogram (EMG) The absolute mean value(AMV) is the most commonly used feature for identifying

Computational and Mathematical Methods in Medicine 5

the strength of muscular contraction [22 37] defined overthe length119873 of the signal 119909(119905) as follows

AMV = 1119873

119873

sum

119905=1

|119909 (119905)| (1)

The AMV value during the considered 30 s window wasexpressed as a percentage of the mean amplitude during the30 s neutral baseline This percentage score was computed tostandardize the widely different EMG amplitudes of individ-uals and thus to enable comparison between individuals andgroups

2.7. Design and Statistical Analysis. Preliminary analyses were performed to check the validity of the acquired data.

(1) Manipulation Check. We calculated descriptive statistics (mean and standard deviation) of the self-reported emotional ratings to check whether the participants' emotional experience was successfully manipulated. Then Wilcoxon rank sum tests were performed to compare the self-reports between groups (PD and control) and between the two clips of the same emotion.

(2) Analysis of Physiological Parameters. Physiological variables were tested for univariate significant differences via repeated-measures analysis of variance (ANOVA) between groups (PD versus control) and between the disgust1, disgust2, and neutral2 stimuli. Group and stimulus are the between-subjects and within-subject factors, respectively. Results were considered statistically significant at P < .05.
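Both checks map directly onto standard Python tooling; the sketch below uses scipy's Wilcoxon rank-sum test and pingouin's mixed_anova as an off-the-shelf mixed-design ANOVA. The toy data merely illustrate the expected long-format layout and are not the study's measurements.

```python
import numpy as np
import pandas as pd
import pingouin as pg
from scipy.stats import ranksums

rng = np.random.default_rng(0)
rows = []                      # long format: one row per participant x stimulus
for pid in range(15):
    group = 'PD' if pid < 7 else 'control'
    for stim in ('disgust1', 'disgust2', 'neutral2'):
        rows.append({'participant': pid, 'group': group, 'stimulus': stim,
                     'lls_amv': rng.normal(200 if stim != 'neutral2' else 110, 50)})
df = pd.DataFrame(rows)

# (1) Manipulation-check style comparison: Wilcoxon rank-sum between groups
stat, p = ranksums(df.loc[df.group == 'PD', 'lls_amv'],
                   df.loc[df.group == 'control', 'lls_amv'])

# (2) Mixed-design ANOVA: group between-subjects, stimulus within-subject
aov = pg.mixed_anova(data=df, dv='lls_amv', within='stimulus',
                     subject='participant', between='group')
print(aov[['Source', 'F', 'p-unc']])
```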

2.8. Facial Action Units Recognition. In this study we make use of an automatic facial action unit recognition system developed at our department [11]. This system allows context-independent recognition of the following action units: AU1, AU2, AU4, AU6, AU7, AU9, AU12, AU20, AU23, AU25, and AU27. The overall recognition scheme is depicted in Figure 6.

The head and facial features were tracked using the constrained shape tracking approach of [38]. This approach allows automatic detection of the head and tracking of a shape model composed of 83 landmarks. After the facial components have been tracked in each frame, both geometry-based and appearance-based features are combined and fed to the AdaBoost (adaptive boosting) algorithm for feature selection. Finally, for each action unit we used a binary support vector machine (SVM) for context-independent classification. Our system was trained and tested on the Kanade et al. DFAT-504 dataset [39]. The database consists of 486 sequences of facial displays produced by 98 university students aged 18 to 30 years, of whom 65% are female. All sequences are annotated by certified FACS coders; each starts with a neutral face and ends with the apex of the expression. SVMs [11] have been proven to be powerful and robust tools for AU classification.
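An illustrative scikit-learn analogue of this per-AU pipeline is sketched below: an AdaBoost model drives feature selection and a linear SVM makes the binary AU/not-AU decision. It is a stand-in for, not a reimplementation of, the system of [11]; X_train and y_train stand for the combined frame features and per-frame AU labels.

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# One binary classifier per action unit: AdaBoost importances select
# features, a linear SVM separates "AU present" from "AU absent".
au_classifier = make_pipeline(
    SelectFromModel(AdaBoostClassifier(n_estimators=100)),
    SVC(kernel='linear'),
)
# au_classifier.fit(X_train, y_train)
# au_present = au_classifier.predict(X_test)
```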

A test sample (a new image) z can be classified according to the following decision function:

\[ D(\mathbf{z}) = \operatorname{sign}\big(h(\mathbf{z})\big). \tag{2} \]

Table 3: Facial expressivity items defined in ICRP-IEB [9] and the AUs used.

Item                              Gestalt degree           Related AUs
(1) Active expressivity in face   Intensity                11 AUs
(2) Eyebrows raising              Intensity + frequency    AU1 and AU2
(3) Eyebrows pulling together     Intensity + frequency    AU4
(4) Blinking                      Frequency                AU45
(5) Cheek raising                 Intensity + frequency    AU6
(6) Lip corner puller             Intensity + frequency    AU12

The output h(z) of an SVM is a distance measure between a test pattern and the separating hyperplane defined by the support vectors. The test sample is classified to the positive class (the AU on which the SVM was trained) if D(z) = +1 and to the negative class if D(z) = −1.

Unfortunately, the output of an SVM cannot be used directly as a probability: there is no clear relationship with the posterior class probability P(y = +1 | z) that the pattern z belongs to the class y = +1. Platt [12] proposed an estimate for this probability by fitting the SVM output h(z) with a sigmoid function:

\[ P(y = +1 \mid \mathbf{z}) = \frac{1}{1 + \exp\big(A\,h(\mathbf{z}) + B\big)}. \tag{3} \]

The parameters A and B are found using maximum likelihood estimation on the training set. The above mapping has range [0, 1].
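Equations (2) and (3) are written out below for a fitted scikit-learn SVM; the explicit sigmoid mirrors (3), and the commented alternative notes that sklearn's CalibratedClassifierCV with method='sigmoid' fits the same A and B by maximum likelihood. The functions assume svm has already been trained.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.calibration import CalibratedClassifierCV

def decide(svm, z):
    h = svm.decision_function(z)   # signed distance to the hyperplane
    return np.sign(h)              # equation (2): +1 means the AU is present

def platt_probability(h, A, B):
    # equation (3); A, B fitted by maximum likelihood on the training set
    return 1.0 / (1.0 + np.exp(A * h + B))

# Equivalent off-the-shelf route (Platt scaling built in):
# clf = CalibratedClassifierCV(LinearSVC(), method='sigmoid').fit(X, y)
# p = clf.predict_proba(z)[:, 1]   # estimate of P(y = +1 | z)
```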

2.9. Facial Expressivity. In this work we follow the recommendation of the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9], where expressivity based on the FACS has been defined in terms of (i) frequency, that is, how often a behavior or movement occurs, (ii) duration, that is, how long a behavior or movement lasts, and (iii) intensity or degree, being the strength, force, or level/amount of emotion or movement. In the ICRP-IEB, facial expressivity is coded according to the gestalt degree of intensity/duration/frequency of 7 types of facial expressive behavior (items) depicted in recorded videos of PD patients, using a 5-point Likert-type scale: 1 = low (low to no movement or change, or infrequent), 2 = fairly low, 3 = medium, 4 = fairly high, and 5 = high (very frequent, very active). Table 3 lists 6 of the ICRP-IEB facial expressivity items and the corresponding AUs used in this study for estimating them. The remaining item relates to active mouth closure during speech and was not used in our model, as the participants were not asked to speak during the experiments.

As our current AU recognition system considers only 11 AUs (AU1, AU2, AU4, AU6, AU7, AU9, AU12, AU20, AU23, AU25, and AU27), we did not consider blinking (AU45) in our facial expressivity formulation.


Figure 6: The AU recognition system. Video frames I(0), I(1), ..., I(T) are processed by shape model tracking (landmark sets P(0), P(1), ..., P(T)); geometry-based and appearance-based features are extracted from the lower, middle, and upper face; AdaBoost performs feature selection; and binary SVM classifiers output decision values for AU1 (inner brow raiser), AU2 (outer brow raiser), AU4 (brow lowerer), AU6 (cheek raiser), AU7 (lid tightener), AU9 (nose wrinkler), AU12 (lip corner puller), AU20 (lip stretcher), AU23 (lip tightener), AU25 (lips part), and AU27 (mouth stretch).

Following the criteria of the ICRP-IEB, we defined the facial expressivity of a participant as

\[ \mathrm{EFE} = \mathrm{AMC}_e + \mathrm{AI}_e - \big(\mathrm{AMC}_n + \mathrm{AI}_n\big), \tag{4} \]

where AMC_e and AMC_n are the amounts of movement change during the disgust and neutral facial expressions, respectively, and AI_e and AI_n are the intensities of the displayed AUs during the disgust and neutral facial expressions, respectively.

It should be recalled that only the considered 30 s windows are used for all these estimations. The quantities AMC and AI capture movement frequency and intensity, respectively; their detailed formulation is given in the following paragraphs. In order to assess the effectiveness of the proposed formulation, we also compared it to the following definition of facial expressivity, in which the neutral-emotion baseline is not considered:

\[ \mathrm{FE} = \mathrm{AMC}_e + \mathrm{AI}_e. \tag{5} \]

(1) Intensity of Displayed AUs. It has been shown in [40] that the output margin of learned SVM classifiers contains information about expression intensity. Later, Savran et al. [41] estimated AU intensity levels using logistic regression on SVM scores. In this work we propose using Platt's probability, given by (3), as the action unit intensity at frame t: I_t(AU_i) = P_t^{AU_i}, with P_t^{AU_i} = P(c = AU_i | z_t) estimated using (3). The I_t(AU_i) time series is then smoothed using a Gaussian filter; we denote by Ī_t(AU_i) the smoothed value. Figure 7 plots, for one participant, the smoothed intensity of facial AU7, also illustrating its different temporal segments: neutral, onset, apex, and offset.
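The smoothing step is a one-liner with scipy; the sigma of 2 frames below is an illustrative choice, not a value taken from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def smoothed_intensity(platt_probs, sigma=2.0):
    # platt_probs: per-frame Platt probabilities P(c = AU_i | z_t) for one AU;
    # returns the smoothed series, denoted \bar{I}_t(AU_i) in the text
    return gaussian_filter1d(np.asarray(platt_probs, dtype=float), sigma)
```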

Having defined the AU intensity, the intensity of displayed AUs during a voluntary facial expression (disgust or neutral) is given by

\[ \mathrm{AI} = \sum_{i \in \mathrm{DAUs}} \frac{1}{N'_i} \sum_{t \in T_i} \bar{I}_t(\mathrm{AU}_i), \tag{6} \]

where DAUs is the set of displayed (recognized) facial action units during the considered 30 s window, T_i is the set of frames where AU_i is active, and N'_i is the cardinality of T_i, that is, the number of frames where AU_i is active.

Figure 7: Example of AU intensity: the smoothed intensity of AU7 (lid tightener), between 0 and 1, plotted over frames.

(2) Amount of Movement Change. The amount of movement change during a voluntary facial expression (disgust or neutral) is given by

\[ \mathrm{AMC} = \sum_{i \in \mathrm{DAUs}} \frac{1}{N'_i} \sum_{t \in T_i} \big| \bar{I}_t(\mathrm{AU}_i) - \bar{I}_{t+1}(\mathrm{AU}_i) \big|, \tag{7} \]

with DAUs, N'_i, and T_i as defined above.

3. Results and Discussion

3.1. Manipulation Check. Participants' emotions were successfully elicited using the disgust movie clips. For both disgust1 and disgust2, most participants (14/15) self-reported the target emotion as their dominant emotion. For the other video clips, the numbers of participants reporting the target emotion were as follows: amusement1 (9/15), amusement2 (12/15), surprise1 (11/15), surprise2 (2/15), anger1 (6/15), anger2 (9/15), fear1 (10/15), fear2 (11/15), neutral1 (13/15), and neutral2 (13/15). Annotation of the video records using the Anvil annotation tool [42] further confirmed that the disgust movie clips induced the most reliable emotional data. Therefore, for further analysis, we decided to use only the data recorded while the participants watched the disgust movie clips.

Wilcoxon rank sum tests were applied to the 14 disgust self-reports. Results showed no significant difference in self-reported emotional ratings between disgust1 (M = 6.50, SD = .76) and disgust2 (M = 6.64, SD = .63), or between the PD group (M = 6.51, SD = .65) and the control group (M = 6.64, SD = .74). This is what we expected, since Vicente et al. [43] also reported that PD patients at different stages of the disease did not significantly differ from controls in their self-reported emotional experience of presented movie clips.

3.2. Univariate Analysis of Group Effects. The EMG data from 3 participants (1 control and 2 PD) and the ECG data from 10 participants (4 control and 6 PD) were discarded because they were not properly recorded, owing to sensor malfunctions. Results of the repeated-measures ANOVAs on single physiological variables showed no significant main effect of group on the LLS activity (F(1, 12) = 3.38, P = .09), the OO activity (F(1, 12) = 1.92, P = .19), HR (F(1, 3) = .93, P = .41), or HRV (F(1, 3) = .53, P = .83). Table 4 shows the descriptive data for the control and PD groups during exposure to the three movie clips (disgust1, disgust2, and neutral2), together with tests of significance comparing the groups using Wilcoxon rank sum tests. The Wilcoxon rank sum tests revealed the following:

(1) comparable levels of baseline activity were found in the control and PD groups over both the LLS and the OO;

(2) although disgust1 elicited more muscle activity over the LLS and the OO for the control group than for the PD group, this difference did not reach statistical significance;

(3) disgust2 elicited significantly more LLS activity for the control group than for the PD group.

These results indicate that PD patients displayed less muscle activity over the LLS when expressing disgust than controls. In addition, disgust2 induced more muscle activity over the LLS and the OO than disgust1, which is consistent with the self-reports and may be due to the fact that disgust2 is slightly more disgusting than disgust1.

3.3. Univariate Analysis of Stimuli Effects. For the LLS, we found an effect of the stimuli on muscle activity (F(2, 24) = 9.47, P = .001). Post hoc tests indicated significant differences between disgust1 and neutral2 (P = .01) and between disgust2 and neutral2 (P = .01). Both disgust1 (M = 282, SD = 192) and disgust2 (M = 537, SD = 543) elicited more muscle activity than neutral2 (M = 110, SD = 38). This main effect was qualified by a stimuli × group interaction (F(2, 24) = 4.17, P = .028), which was consistent with the results of the Wilcoxon rank sum tests comparing the physiological responses between groups.

For the OO, we also found an effect of the stimuli on muscle activity (F(2, 24) = 5.45, P = .012). Post hoc tests indicated a significant difference only between disgust1 and neutral2 (P = .002): disgust1 (M = 232, SD = 157) elicited more muscle activity than neutral2 (M = 135, SD = 111). No significant stimuli × group interaction effect (F(2, 24) = 1.77, P = .192) was found.

We expected the disgust clips to elicit significantly more muscle activity over the LLS than the neutral clip, because the LLS is normally involved in producing the disgust facial expression [44]. The fact that the OO also differed significantly was probably due to the following:

(1) crosstalk [45]: the LLS and the OO lie in the vicinity of each other;

(2) disgust1 elicited not only disgust but also a bit of amusement, through the funny motion of the character and the background music, so that the OO was also involved in producing the facial expression of amusement.


Table 4: Statistical summary of physiological measures.

Variable   Stimuli    Control (Mean / SD)   PD (Mean / SD)   Sig.
LLS        Neutral2   104 / 34              116 / 43         .71
           Disgust1   336 / 204             227 / 177        .26
           Disgust2   804 / 649             271 / 225        .04*
OO         Neutral2   135 / 111             94 / 54          .81
           Disgust1   282 / 177             181 / 128        .32
           Disgust2   520 / 571             207 / 148        .13
HR         Neutral2   1.72 / 2.77           2.76 / —         —
           Disgust1   −.53 / 2.64           5.12 / —         —
           Disgust2   2.22 / 5.53           5.65 / —         —
HRV        Neutral2   1.53 / 12.96          −7.65 / —        —
           Disgust1   8.01 / 11.45          26.34 / —        —
           Disgust2   14.86 / 35.64         14.92 / —        —

*The two groups are significantly different, that is, P < .05.
"—": We did not compare the cardiac responses (i.e., HR and HRV) between groups because, due to technical failures, we lost the ECG data for 10 participants; thus only 5 participants (4 controls and 1 PD) completed the ECG recording.

Table 5: Facial expressivity assessment based on different methods.

Var    |      C*        |      LP*       |      IP*       |      MP*
       | D1*  D2*  N2*  | D1   D2   N2   | D1   D2   N2   | D1   D2   N2
TFA    | 8    8    4    | 5    5    4    | 5    6    4    | 4    —    5
AMC◊   | 139  277  108  | 67   65   43   | 82   62   69   | 39   —    82
AI     | 54   51   18   | 28   28   22   | 38   39   31   | 22   —    33
FE◊    | 722  751  246  | 209  187  142  | 444  356  262  | 135  —    322

*D1, D2, N2, C, LP, IP, and MP denote disgust1, disgust2, neutral2, the control, and the least, intermediate, and most severely affected Parkinson's patients, respectively.
◊Presented as percentages.
"—": The face of the MP patient while watching D2 was blocked by his hand.

Note that disgust2 (M = 364, SD = 432) elicited more muscle activity over the OO (because of crosstalk) than neutral2 (M = 114, SD = 87); however, unlike for disgust1, the difference did not reach statistical significance, which can likewise be explained by the fact that disgust2 did not elicit any amusement.

Moreover, no main effect of stimuli on the cardiac parameters was found, for either HR (F(2, 6) = .37, P = .70) or HRV (F(2, 6) = 0.84, P = .48). This is not consistent with what we expected, namely unchanged [46, 47] or increased [48, 49] HR and increased HRV [21, 47], and may be due to the fact that we did not have enough recorded ECG data for a statistical analysis.

3.4. Qualitative Analysis of Facial Expressivity. To qualitatively analyze facial expressivity, we used the total facial activity (TFA) measure of [2, 50], being the total number of displayed AUs in response to the stimuli. Compared to the control (C), the PD groups (LP, IP, and MP) showed attenuated facial activity (variable TFA) while watching the disgusting movie clips (see Table 5). However, comparable TFA was found while watching the neutral movie clip.

Visual inspection of the displayed AUs, as illustrated in Figure 8, shows that AU1, AU2, AU6, AU9, and AU45 occurred more frequently for C, except that IP produced AU6 more frequently than C. A possible explanation is that IP cannot deactivate AU6 even during the neutral state. As shown in Figure 9, during the neutral state PD patients produced more continuously active action units, such as AU4 and AU25 (for all PD patients), AU6 (for IP), and AU9 (for MP). Moreover, certain AU combinations were considered likely to signify "disgust" on the basis of Ekman and Friesen's description of emotions [7]. "Disgust" AUs were considered as the combination of brow lowerer (AU4), cheek raiser (AU6), and nose wrinkler (AU9). Because the cheek raiser is very difficult to produce on demand without including other AUs, especially the lid tightener (AU7) [51], the lid tightener was also taken into consideration; that is, the expected AU pattern of "disgust" was the combination of AU4, AU6, AU7, and AU9. Consistent with what we expected, C displayed 98 and 87 "disgust" frames while watching disgust1 and disgust2, respectively. The PD group (LP, IP, and MP) displayed almost no "disgust" frames; only IP had 4 "disgust" frames, while watching disgust1. Furthermore, IP displayed very few frames with combinations of cheek raiser and lid tightener during the neutral state.


Figure 8: The facial activities while watching disgust1, for C (top left), LP (top right), IP (bottom left), and MP (bottom right); each panel plots the active AUs (AU1-AU45) against the frame number (0-700).

Figure 9: The facial activities while watching neutral2, for C (top left), LP (top right), IP (bottom left), and MP (bottom right); each panel plots the active AUs against the frame number.


Figure 10: The quantified facial expressivity (FE, equation (5)) of C, LP, IP, and MP during disgust1 and disgust2.

Instead, the cheek raiser alone was mostly displayed (see Figure 9), which indicates that IP patients have control problems with facial muscle combinations.
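For concreteness, the "disgust" frame counting described above amounts to intersecting per-frame activation masks; in the sketch below, `active` is assumed to map AU numbers to boolean per-frame arrays produced by the recognition system.

```python
import numpy as np

def disgust_frame_count(active):
    # a frame counts as "disgust" when AU4, AU6, AU7, and AU9 are all active
    mask = active[4] & active[6] & active[7] & active[9]
    return int(np.count_nonzero(mask))
```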

Analyzing the defined quantities AMC, AI, and FE over the segments with active AUs, as can be seen from Table 5, the control C had higher facial expressivity than LP, IP, and MP during the disgust state, while MP showed the lowest facial expressivity. More specifically, compared to C, the PD patients (LP, IP, and MP) had smaller values of the variables AMC, AI, and FE while watching the disgusting movie clips, especially for the brow lowerer (AU4), nose wrinkler (AU9), and blink (AU45). C tended to show similar intensities of the brow lowerer (variable AI) with bigger variance (variables AMC and FE). In addition, C showed higher intensities of the nose wrinkler with larger variance.

3.5. Quantitative Analysis of Facial Expressivity. Figure 10 depicts the values of the facial expressivity equation (5) for the participants. We did not assess the facial expressivity of the MP patient during disgust2, as he had his hand in front of his face during the experiments. As can be seen, a significant difference between C and the PD patients is present: C obtained the highest score, while the lowest score was obtained by the MP patient. However, the score of the IP patient was slightly higher than that of the LP patient, which is due to the fact that the LP and IP patients expressed "facial masking" in different ways: the LP patient showed attenuated intensities of facial movements, whereas the IP patient produced high intensities of facial movements not only in his emotional state but also during the neutral state; that is, he cannot relax the muscles well. In order to take both kinds of "facial masking" into account, we computed the facial expressivity using (4). The results are shown in Figure 11. As can be seen, the proposed facial expressivity measure allows distinguishing between control participants and PD patients. Moreover, facial expressivity decreases as PD severity increases.

Figure 11: The quantified facial expressivity based on the improved function (EFE, equation (4)) of C, LP, IP, and MP during disgust1 and disgust2.

4. Conclusions

This study investigated the phenomenon of facial masking in Parkinson's patients. We designed an automated and objective method to assess the facial expressivity of PD patients. The proposed approach follows the methodology of the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9]. Based on the Facial Action Coding System (FACS), we proposed a methodology that (i) automatically detects faces in a video stream, (ii) codes each frame with respect to 11 action units, and (iii) estimates facial expressivity as a function of frequency (how often AUs occur), duration (how long AUs last), and intensity (the strength of AUs).

Although the proposed facial expressivity assessment approach has been evaluated on only a limited number of subjects, it captures differences in facial expressivity between control participants and Parkinson's patients. Moreover, facial expressivity differences between PD patients at different stages of Parkinson's disease have been assessed. The proposed method can capture these differences and give a more accurate assessment of facial expressivity for Parkinson's patients than traditional observer-based ratings allow. This suggests that our approach can be used for the clinical assessment of facial expressivity in PD. Indeed, nonverbal signals contribute significantly to interpersonal communication, and facial expressivity, a major source of nonverbal information, is compromised in Parkinson's disease. The resulting disconnect between subjective feeling and objective facial affect can lead people to form negative and inaccurate impressions of people with PD with respect to their personality, feelings, and intelligence. Objectively assessing the facial expressivity limitations of PD would allow the development of personalized therapy to benefit facial expressivity in PD. Indeed, Parkinson's is a progressive disease, which means that the symptoms will get worse as time goes on. Using the proposed assessment approach would allow regular facial expressivity assessment by therapists and clinicians to explore treatment options.

Future work will (i) improve the AU recognition system and extend it to more AUs and (ii) consider clinical usage of the proposed approach in a statistically significant population of PD patients.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no. 2011629010) and the VUB interdisciplinary research project "Objectifying Assessment of Human Emotional Processing and Expression for Clinical and Psychotherapy Applications (EMO-App)". The authors gratefully acknowledge the Flemish Parkinson League for supporting the current study and the recruitment of the PD patients. They also acknowledge Floriane Verbraeck for the setup of the experiments and the data acquisition.

References

[1] M. Katsikitis and I. Pilowsky, "A study of facial expression in Parkinson's disease using a novel microcomputer-based method," Journal of Neurology, Neurosurgery and Psychiatry, vol. 51, no. 3, pp. 362-366, 1988.
[2] G. Simons, H. Ellgring, and M. C. Smith Pasqualini, "Disturbance of spontaneous and posed facial expressions in Parkinson's disease," Cognition and Emotion, vol. 17, no. 5, pp. 759-778, 2003.
[3] D. H. Jacobs, J. Shuren, D. Bowers, and K. M. Heilman, "Emotional facial imagery, perception, and expression in Parkinson's disease," Neurology, vol. 45, no. 9, pp. 1696-1702, 1995.
[4] L. Tickle-Degnen and K. D. Lyons, "Practitioners' impressions of patients with Parkinson's disease: the social ecology of the expressive mask," Social Science & Medicine, vol. 58, no. 3, pp. 603-614, 2004.
[5] J. J. van Hilten, A. D. van der Zwan, A. H. Zwinderman, and R. A. C. Roos, "Rating impairment and disability in Parkinson's disease: evaluation of the Unified Parkinson's Disease Rating Scale," Movement Disorders, vol. 9, no. 1, pp. 84-88, 1994.
[6] D. Bowers, K. Miller, W. Bosch, et al., "Faces of emotion in Parkinson's disease: micro-expressivity and bradykinesia during voluntary facial expressions," Journal of the International Neuropsychological Society, vol. 12, no. 6, pp. 765-773, 2006.
[7] P. Ekman and W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, Palo Alto, Calif, USA, 1978.
[8] P. Wu, D. Jiang, and H. Sahli, "Physiological signal processing for emotional feature extraction," in Proceedings of the International Conference on Physiological Computing Systems, pp. 40-47, 2014.
[9] L. Tickle-Degnen, The Interpersonal Communication Rating Protocol: A Manual for Measuring Individual Expressive Behavior, 2010.
[10] G. Sandbach, S. Zafeiriou, M. Pantic, and L. Yin, "Static and dynamic 3D facial expression recognition: a comprehensive survey," Image and Vision Computing, vol. 30, no. 10, pp. 683-697, 2012.
[11] I. Gonzalez, H. Sahli, V. Enescu, and W. Verhelst, "Context-independent facial action unit recognition using shape and Gabor phase information," in Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction (ACII '11), vol. 1, pp. 548-557, Springer, Memphis, Tenn, USA, October 2011.
[12] J. C. Platt, "Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods," in Advances in Large Margin Classifiers, A. J. Smola, P. Bartlett, B. Schoelkopf, and D. Schuurmans, Eds., pp. 61-74, The MIT Press, 1999.
[13] J. J. Gross and R. W. Levenson, "Emotion elicitation using films," Cognition and Emotion, vol. 9, no. 1, pp. 87-108, 1995.
[14] D. Hagemann, E. Naumann, S. Maier, G. Becker, A. Lurken, and D. Bartussek, "The assessment of affective reactivity using films: validity, reliability and sex differences," Personality and Individual Differences, vol. 26, no. 4, pp. 627-639, 1999.
[15] C. L. Lisetti and F. Nasoz, "Using noninvasive wearable computers to recognize human emotions from physiological signals," EURASIP Journal on Advances in Signal Processing, vol. 2004, Article ID 929414, pp. 1672-1687, 2004.
[16] J. Hewig, D. Hagemann, J. Seifert, M. Gollwitzer, E. Naumann, and D. Bartussek, "A revised film set for the induction of basic emotions," Cognition and Emotion, vol. 19, no. 7, pp. 1095-1109, 2005.
[17] J. H. Westerink, E. L. van den Broek, M. H. Schut, J. van Herk, and K. Tuinenbreijer, "Probing experience," in Probing Experience: From Assessment of User Emotions and Behaviour to Development of Products, J. H. D. M. Westerink, M. Ouwerkerk, T. J. M. Overbeek, W. F. Pasveer, and B. de Ruyter, Eds., Springer, New York, NY, USA, 2008.
[18] A. Schaefer, F. Nils, P. Philippot, and X. Sanchez, "Assessing the effectiveness of a large database of emotion-eliciting films: a new tool for emotion researchers," Cognition and Emotion, vol. 24, no. 7, pp. 1153-1172, 2010.
[19] F. Verbraeck, Objectifying human facial expressions for clinical applications, M.S. thesis, Vrije Universiteit Brussel, Belgium, 2012.
[20] O. Alzoubi, Automatic affect detection from physiological signals: practical issues, Ph.D. thesis, The University of Sydney, New South Wales, Australia, 2012.
[21] S. D. Kreibig, "Autonomic nervous system activity in emotion: a review," Biological Psychology, vol. 84, no. 3, pp. 394-421, 2010.
[22] F. D. Farfan, J. C. Politti, and C. J. Felice, "Evaluation of EMG processing techniques using information theory," BioMedical Engineering OnLine, vol. 9, article 72, 2010.
[23] N. Ambady, F. J. Bernieri, and J. A. Richeson, "Toward a histology of social behavior: judgmental accuracy from thin slices of the behavioral stream," Advances in Experimental Social Psychology, vol. 32, pp. 201-271, 2000.
[24] N. Ambady and R. Rosenthal, "Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis," Psychological Bulletin, vol. 111, no. 2, pp. 256-274, 1992.
[25] K. D. Lyons and L. Tickle-Degnen, "Reliability and validity of a videotape method to describe expressive behavior in persons with Parkinson's disease," The American Journal of Occupational Therapy, vol. 59, no. 1, pp. 41-49, 2005.
[26] N. A. Murphy, "Using thin slices for behavioral coding," Journal of Nonverbal Behavior, vol. 29, no. 4, pp. 235-246, 2005.
[27] A. O. Andrade, S. Nasuto, P. Kyberd, C. M. Sweeney-Reed, and F. R. van Kanijn, "EMG signal filtering based on empirical mode decomposition," Biomedical Signal Processing and Control, vol. 1, no. 1, pp. 44-55, 2006.
[28] M. Blanco-Velasco, B. Weng, and K. E. Barner, "ECG signal denoising and baseline wander correction based on the empirical mode decomposition," Computers in Biology and Medicine, vol. 38, no. 1, pp. 1-13, 2008.
[29] A. O. Boudraa, J. C. Cexus, and Z. Saidi, "EMD-based signal noise reduction," Signal Processing, vol. 1, pp. 33-37, 2005.
[30] T. Jing-tian, Z. Qing, T. Yan, L. Bin, and Z. Xiao-kai, "Hilbert-Huang transform for ECG de-noising," in Proceedings of the 1st International Conference on Bioinformatics and Biomedical Engineering, pp. 664-667, July 2007.
[31] A. Karagiannis and P. Constantinou, "Noise components identification in biomedical signals based on empirical mode decomposition," in Proceedings of the 9th International Conference on Information Technology and Applications in Biomedicine (ITAB '09), pp. 1-4, November 2009.
[32] Y. Kopsinis and S. McLaughlin, "Development of EMD-based denoising methods inspired by wavelet thresholding," IEEE Transactions on Signal Processing, vol. 57, no. 4, pp. 1351-1362, 2009.
[33] F. Agrafioti, D. Hatzinakos, and A. K. Anderson, "ECG pattern analysis for emotion detection," IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 102-115, 2012.
[34] D. L. Donoho, "De-noising by soft-thresholding," IEEE Transactions on Information Theory, vol. 41, no. 3, pp. 613-627, 1995.
[35] B.-U. Kohler, C. Hennig, and R. Orglmeister, "The principles of software QRS detection," IEEE Engineering in Medicine and Biology Magazine, vol. 21, no. 1, pp. 42-57, 2002.
[36] V. Guralnik and J. Srivastava, "Event detection from time series data," in Knowledge Discovery and Data Mining, pp. 33-42, 1999.
[37] M. Hamedi, S.-H. Salleh, and T. T. Swee, "Surface electromyography-based facial expression recognition in bi-polar configuration," Journal of Computer Science, vol. 7, no. 9, pp. 1407-1415, 2011.
[38] Y. Hou, H. Sahli, R. Ilse, Y. Zhang, and R. Zhao, "Robust shape-based head tracking," in Advanced Concepts for Intelligent Vision Systems, vol. 4678 of Lecture Notes in Computer Science, pp. 340-351, Springer, 2007.
[39] T. Kanade, J. F. Cohn, and Y. Tian, "Comprehensive database for facial expression analysis," in Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, pp. 46-53, Grenoble, France, 2000.
[40] M. S. Bartlett, G. C. Littlewort, M. G. Frank, C. Lainscsek, I. R. Fasel, and J. R. Movellan, "Automatic recognition of facial actions in spontaneous expressions," Journal of Multimedia, vol. 1, no. 6, pp. 22-35, 2006.
[41] A. Savran, B. Sankur, and M. Taha Bilge, "Regression-based intensity estimation of facial action units," Image and Vision Computing, vol. 30, no. 10, pp. 774-784, 2012.
[42] Anvil: the video annotation tool, http://www.anvil-software.org/.
[43] S. Vicente, J. Peron, I. Biseul, et al., "Subjective emotional experience at different stages of Parkinson's disease," Journal of the Neurological Sciences, vol. 310, no. 1-2, pp. 241-247, 2011.
[44] A. van Boxtel, "Facial EMG as a tool for inferring affective states," in Proceedings of Measuring Behavior 2010, A. J. Spink, F. Grieco, O. Krips, L. Loijens, L. Noldus, and P. Zimmerman, Eds., pp. 104-108, Noldus Information Technology, Wageningen, The Netherlands, 2010.
[45] C.-N. Huang, C.-H. Chen, and H.-Y. Chung, "The review of applications and measurements in facial electromyography," Journal of Medical and Biological Engineering, vol. 25, no. 1, pp. 15-20, 2005.
[46] R. W. Levenson, P. Ekman, K. Heider, and W. V. Friesen, "Emotion and autonomic nervous system activity in the Minangkabau of West Sumatra," Journal of Personality and Social Psychology, vol. 62, no. 6, pp. 972-988, 1992.
[47] S. Rohrmann and H. Hopp, "Cardiovascular indicators of disgust," International Journal of Psychophysiology, vol. 68, no. 3, pp. 201-208, 2008.
[48] O. Alaoui-Ismaïli, O. Robin, H. Rada, A. Dittmar, and E. Vernet-Maury, "Basic emotions evoked by odorants: comparison between autonomic responses and self-evaluation," Physiology and Behavior, vol. 62, no. 4, pp. 713-720, 1997.
[49] J. Gruber, S. L. Johnson, C. Oveis, and D. Keltner, "Risk for mania and positive emotional responding: too much of a good thing?" Emotion, vol. 8, no. 1, pp. 23-33, 2008.
[50] W. Gaebel and W. Wolwer, "Facial expressivity in the course of schizophrenia and depression," European Archives of Psychiatry and Clinical Neuroscience, vol. 254, no. 5, pp. 335-342, 2004.
[51] P. Ekman, W. Friesen, and J. Hager, Facial Action Coding System: The Manual, 2002.

Submit your manuscripts athttpwwwhindawicom

Stem CellsInternational

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MEDIATORSINFLAMMATION

of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Behavioural Neurology

EndocrinologyInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Disease Markers

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

BioMed Research International

OncologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Oxidative Medicine and Cellular Longevity

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

PPAR Research

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Immunology ResearchHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

ObesityJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational and Mathematical Methods in Medicine

OphthalmologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Diabetes ResearchJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Research and TreatmentAIDS

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Gastroenterology Research and Practice

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Parkinsonrsquos Disease

Evidence-Based Complementary and Alternative Medicine

Volume 2014Hindawi Publishing Corporationhttpwwwhindawicom

4 Computational and Mathematical Methods in Medicine

Table 2 FACS AUs and related muscles

AU FACS name Facial muscle Videotape EMGAU1 Inner brow raiser Frontalis (pars medialis) X mdashAU2 Outer brow raiser Frontalis (pars lateralis) X mdashAU4 Brow lowerer Depressor glabellae depressor supercilii and CS X mdashAU6 Cheek raiser OO (pars orbitalis) X XAU7 Lid tightener OO (pars palpebralis) X XAU9 Nose wrinkler LLSAN X mdashAU10 Upper lip raiser LLS caput infraorbitalis mdash XAU12 Lip corner puller Zygomaticus Major X mdashAU20 Lip stretcher Risorius X mdashAU23 Lip tightener Orbicularis Oris X mdashAU25 Lips part Depressor labii inferioris X mdashAU27 Mouth stretch Pterygoids digastric X mdashAU45 Blink Contraction OO mdash XAU46 Wink OO mdash X

0 500 1000 1500 2000 2500

0100200300

The raw ECG signal

minus100

0 500 1000 1500 2000 2500

0100200

The denoised ECG signal

minus100

Figure 4 Example of physiological signal (ECG) denoising usingthe method proposed in [8]

methods [27ndash33] our approach estimates the noise level ofeach intrinsic mode functions (IMFs) rather than estimatingthe noise level of all IMFs using Donohorsquos strategy [34] priorto the reconstruction of the signal using the thresholdedIMFs Please refer to [8] for more details Figure 4 illustratesthe denoising results

261 Electrocardiogram (ECG) Heart rate (HR) and heartrate variability (HRV) are the cardiovascular response fea-tures most often reported as indicators of emotion [21]HR is computed using the time difference between twoconsecutive detected R peaks of the QRS complexes (ieRR interval) and is expressed in beats per minute HRVis the variation of beat-to-beat HR Thus the first step inextracting HR and HRV starts from the exact detection ofR peaks in the QRS complex Thus the detection of QRScomplex in particular R peak detection is the basis forECG processing and analysis Many approaches for R peaksdetection have been proposed [35] However most of themare off-line and targeting the noiseless signal which do notmeet the requirements of many real-time applications To

0 500 1000 1500 2000 2500

0

50

100

150

200

250

300

350

Samples

Noi

sy E

CG

minus100

minus50

Figure 5 Change points detection

overcome this problem in [8] we proposed an approachbased on a change point detection (CPD) algorithm for eventdetection in time series [36] that minimizes the error infitting a predefined function using maximum likelihood Inour current implementation polynomial fitting functions ofdegree 1 have been selected empirically An example of resultsis illustrated in Figure 5 which detects all R picks (cross) andsome irrelevant change points (circle) which can be filteredout using a predetermined threshold Once the R peaks weredetectedHRandHRV features are estimated as the differencebetween the HR (HRV) estimated from the considered 30 swindow (of the stimuli) and the one estimated from theneutral 30 s

262 Electromyogram (EMG) The absolute mean value(AMV) is the most commonly used feature for identifying

Computational and Mathematical Methods in Medicine 5

the strength of muscular contraction [22 37] defined overthe length119873 of the signal 119909(119905) as follows

AMV = 1119873

119873

sum

119905=1

|119909 (119905)| (1)

The AMV value during the considered 30 s window wasexpressed as a percentage of the mean amplitude during the30 s neutral baseline This percentage score was computed tostandardize the widely different EMG amplitudes of individ-uals and thus to enable comparison between individuals andgroups

27 Design and Statistical Analysis Preliminary analysis wereperformed to check the validity of the acquired data

(1) Manipulation Check We calculated the descriptive statis-tics (the mean and standard deviation) of self-reportedemotional ratings to check if the participantsrsquo emotionalexperience was successfully manipulated Then Wilcoxonrank sum tests were performed to compare the self-reportbetween groups (PD and control) and between the two clipsof the same emotion

(2) Analysis of Physiological Parameters Physiological vari-ables were tested for univariate significant differences viarepeated-measures analysis of variance (ANOVA) betweengroups (PD versus control) and between the disgust1disgust2 and neutral2 stimulants Group and stimuli arethe between-subjects and within-subject factors respectivelyResults were considered statistically significant at 119875 lt 05

28 Facial Action Units Recognition In this study we makeuse of an automatic facial action units recognition systemdeveloped at our department [11]This system allows context-independent recognition of the following action units AU1AU2 AU4 AU6 AU7 AU9 AU12 AU20 AU23 AU25 andAU27The overall recognition scheme is depicted in Figure 6

The head and facial features were tracked using theconstrained shape tracking approach of [38] This approachallows an automatic detection of the head and the trackingof a shape model composed of 83 landmarks After thefacial components have been tracked in each frame bothgeometry-based features and appearance-based features arecombined and fed to the AdaBoost (adaptive boosting)algorithm for feature selection Finally for each action unitwe used a binary support vector machine (SVM) for context-independent classificationOur systemwas trained and testedon the Kanade et al DFAT-504 dataset [39] The databaseconsists of 486 sequences of facial displays that are producedby 98 university students from 18 to 30 years old of which65 is female All sequences are annotated by certified FACScoders start with a neutral face and end with the apex ofthe expression SVM [11] has been proven to be powerful androbust tools for AU classification

A test sample (a new image) z can be classified accordingto the following decision function

119863 (z) = sign (ℎ (z)) (2)

Table 3 Facial expressivity items defined in ICRP-IEB [9] and usedAUs

Item Gestalt degree Related AUs(1) Active expressivityin face Intensity 11 AUs

(2) Eyebrows raising Intensity + frequency AU1 and AU2(3) Eyebrows pullingtogether Intensity + frequency AU4

(4) Blinking Frequency AU45(5) Cheek raising Intensity + frequency AU6(6) Lip corner puller Intensity + frequency AU12

The output ℎ(z) of a SVM is a distancemeasure between a testpattern and the separating hyperplane defined by the supportvectors The test sample is classified to the positive class (theAU to which it was trained) if 119863 (z) = +1 and is classified tothe negative class if119863(z) = minus1

Unfortunately we cannot use directly the output of aSVM as a probability There is no clear relationship with theposterior class probability 119875(119910 = +1 | z) that the pattern zbelongs to the class119910 = +1 Platt [12] proposed an estimate forthis probability by fitting the SVMoutput ℎ(z)with a sigmoidfunction as follows

119875 (119910 = +1 | z) = 1

1 + exp (119860ℎ (z) + 119861) (3)

Theparameters119860 and119861 are foundusingmaximum likelihoodestimation from the training set The above equation has amapping range in [0 1]

29 Facial Expressivity In this work we follow the rec-ommendation of the Interpersonal Communication RatingProtocol-Parkinsonrsquos Disease Version (ICRP-IEB) [9] wherethe expressivity based on the FACS has been defined interms of (i) frequency that is how often a behavior ormovement occurs (ii) duration that is how long a behavioror movement lasts and (iii) intensity or degree being thestrength force or levelamount of emotion or movement InICRP-IEB facial expressivity is coded according to the gestaltdegree of intensitydurationfrequency of 7 types of facialexpressive behavior (items) depicted in recorded videos ofPDs using a 5-point Likert type scale 1 = low (with low tono movement or change or infrequent) 2 = fairly low 3 =medium 4 = fairly high and 5 = high (very frequent veryactive) Table 3 lists 6 of the ICRP-IEB facial expressivityitems and the corresponding AUs used in this study forestimating them The last item is related to active mouthclosure during speech which was not used in our modelas the participants were not asked to speak during theexperiments

As our current AU recognition system considers only 11AUs (AU1 AU2 AU4 AU6 AU7 AU9 AU12 AU20 AU23AU25 and AU27) we did not consider blinking (AU45) inour facial expression formulation

6 Computational and Mathematical Methods in Medicine

Original video

I(0) I(1) I(T)

Shape model tracking

Appearance-based

P(0) P(1) P(T)

Geometry-based

Feature extraction

AU12 AU27 AU6 AU9 AU1 AU4

Lower facial features

Middle facial features

Upper facial features

Featureselection

AdaBoost

AU12 AU27 AU6 AU9 AU1 AU4Binary SVMclassification

Output decision values

AU1 AU2 AU4 AU6

AU7 AU9 AU12 AU20

AU23 AU25 AU27

Inner brower raiser Outer brower raiser Brow lowerer Cheek raiser

Lid tighten Nose wrinkle Lip corner puller Lip stretch

Lip tighten Lips part Mouth stretch

middot middot middot

middot middot middot

middot middot middot

middot middot middot

middot middot middot

middot middot middot

middot middot middot

middot middot middot middot middot middot middot middot middot

middot middot middot

Figure 6 The AU recognition system

Following the criteria of the ICRP-IEB we defined thefacial expressivity of a participant as follows

EFE = AMC119890+ AI119890minus (AMC

119899+ AI119899) (4)

where AMC119890and AMC

119899are the amount of movement

changes during the disgust and neutral facial expressionrespectively AI

119890and AI

119899are the intensity of the displayed

AUs during the disgust and neutral facial expression respec-tively

It has to be recalled that for all these estimations onlythe considered 30 s windows are used The quantities AMCand AI refer to frequency and intensity respectively Theirdetailed formulation is given in the following sections Inorder to assess the effectiveness of the proposed formulationwe also compared it to the following definition of facialexpression where the baseline of neutral emotion was notconsidered Consider the following

FE = AMC119890+ AI119890 (5)

(1) Intensity of Displayed AUs It has been shown in [40] thatthe output margin of the learned SVM classifiers contained

information about expression intensity Later Savran et al[41] estimated the AU intensity levels using logistic regres-sion on SVM scores In this work we propose using Plattrsquosprobability given by (3) as action unit intensity at frame 119905119868119905(AU119894) = 119875119905AU119894 with 119875

119905

AU119894 = 119875(119888 = AU119894 | z119905) is estimated

using (3) The 119868119905(AU119894) time series is then smoothed using a

Gaussian filter We denote by 119868119905(AU119894) the smoothed value

Figure 7 plots for a participant the smoothed intensity of thefacial AU7 also illustrating its different temporal segmentsneutral onset apex and offset

Having defined the AU intensity the intensity of dis-played AUs during a voluntary facial expression (disgust orneutral) is given by

AI = sum119894isinDAUs

1

1198731015840

119894

sum

119905isin119879119894

119868119905 (AU119894) (6)

where DAUs is the set of displayed (recognized) facial actionunits during the considered 30 s window 119879

119894is the set of

Computational and Mathematical Methods in Medicine 7

1

09

08

07

06

05

04

03

02

01

00 50 100 150 200 250 300 350

Frames

Inte

nsity

of A

U7

(lid

tight

ener

)

Figure 7 Example of AU intensity

frames where AU119894 is active and1198731015840119894is the cardinal of 119879

119894 being

the number of frames where AU119894 is active

(2) Amount of Movement Change The amount of movementchange during a voluntary facial expression (disgust orneutral) is given by

AMC = sum119894isinDAUs

1

1198731015840

119894

sum

119905isin119879119894

10038161003816100381610038161003816119868119905 (AU119894) minus 119868119905+1 (AU119894)

10038161003816100381610038161003816 (7)

with DAUs1198731015840119894 and 119879

119894as defined above

3 Results and Discussion

31 Manipulation Check Participantsrsquo emotion was suc-cessfully elicited using the movie clips of disgust Forboth disgust1 and disgust2 most participants (1415) self-reported the target emotion as their dominated emotionFor the other video clips the reported emotions are as fol-lows amusement1 (915) amusement2 (1215) surprise1(1115) surprise2 (215) anger1 (615) anger2 (915) fear1(1015) fear2 (1115) neutral1 (1315) and neutral2 (1315)Annotation of the video records using the Anvil annotationtool [42] further confirmed that the movie clips of disgustinduced the most reliable emotional data Therefore forfurther analysis we decided to use only the data recordedduring watching the disgust movie clips

The Wilcoxon rank sum tests were implemented on the14 disgusting self-reports Results showed that there wasno significant difference in self-reported emotional ratingsbetween disgust1 (119872 = 650 SD = 76) and disgust2(119872 = 664 SD = 63) and between PD group (119872 =651 SD = 65) and control group (119872 = 664 SD =74) This is what we expected since Vicente et al [43] alsoreported that PD patients at different stages of the disease didnot significantly differ from the controls in the self-reportedemotional experience to presented movie clips

32 Univariate Analysis of Group Effects TheEMGdata from3 participants (1 control and 2 PD) and the ECG data from 10participants (4 control and 6 PD) were discarded because ofnot being well recorded due to sensor malfunctions Results

of the repeated-measures ANOVAs on single physiologicalvariables showed that significant main effect of group on theLLS activity (119865(1 12) = 338 119875 = 09) the OO activity(119865(1 12) = 192 119875 = 19) HR (119865(1 3) = 93 119875 = 41)and HRV (119865(1 3) = 53 119875 = 83) was not found Table 4shows the descriptive data for control and PD groups duringexposure to the three movie clips disgust1 disgust2 andneutral2 and the tests of significance comparison betweengroups using Wilcoxon rank sum tests The Wilcoxon ranksum tests revealed that

(1) comparable levels of baseline activity in control andPD groups over both the LLS and the OOwere found

(2) although disgust1 elicited more muscle activity overthe LLS and the OO for control group than forPD group this difference did not reach statisticalsignificance

(3) disgust2 elicited significantly more LLS activity forcontrol than for PD

These results indicated that PD displayed less muscleactivity over the LLS when expressing disgust than controlIn addition disgust2 induced more muscle activity over theLLS and the OO than disgust1 which is consistent with theself-report andmay be due to the fact that disgust2 is slightlymore disgusting than disgust1

33 Univariate Analysis of Stimuli Effects For LLS we foundan effect of the stimuli on muscle activity (119865(2 24) = 947119875 = 001) Post-hoc tests indicated significant differencesbetween disgust1 and neutral2 (119875 = 01) and betweendisgust2 and neutral2 (119875 = 01) Both disgust1 (119872 = 282SD = 192) and disgust2 (119872 = 537 SD = 543) elicitedmore muscle activity than neutral2 (119872 = 110 SD = 38)Thismain effect was qualified by a stimulitimes group interaction(119865(2 24) = 417 119875 = 028) which was consistent with theresults of the Wilcoxon rank sum tests which compared thephysiological responses between groups

For OO we also found an effect of the stimuli on muscleactivity (119865(2 24) = 545 119875 = 012) Post-hoc tests indicatedsignificant difference only between disgust1 and neutral2(119875 = 002)Thedisgust1 (119872 = 232 SD = 157) elicitedmoremuscle activity than neutral2 (119872 = 135 SD = 111) Nosignificant stimuli times group interaction effect (119865(2 24) = 177119875 = 192) was found

We expected that the disgust clips elicited significantlymore muscle activity over LLS than the neutral clip becausenormally LLS is involved in producing the disgust facialexpression [44] The fact that OO was also significantlydifferent was probably due to the following

(1) crosstalk [45] the LLS and the OO lie in the vicinityof each other

(2) the disgust1 elicited is not only a disgust but alsoa bit of amusement by the funny motion of thecharacter and the background music so that the OOwas also involved in producing the facial expressionof amusement

8 Computational and Mathematical Methods in Medicine

Table 4 Statistical summary of physiological measures

Variable Stimuli Control PD SigMean SD Mean SD

LLSNeutral2 104 34 116 43 71Disgust1 336 204 227 177 26Disgust2 804 649 271 225 04

lowast

OONeutral2 135 111 94 54 81Disgust1 282 177 181 128 32Disgust2 520 571 207 148 13

HRNeutral2 172 277 276 mdash mdashDisgust1 minus53 264 512 mdash mdashDisgust2 222 553 565 mdash mdash

HRVNeutral2 153 1296 minus765 mdash mdashDisgust1 801 1145 2634 mdash mdashDisgust2 1486 3564 1492 mdash mdash

lowastThe two groups are significantly different that is 119875 lt 05ldquomdashrdquoWe did not compare the cardiac responses (ie HR and HRV) between groups because due to technical failures we lost the ECG data for 10 participantsand thus only 5 participants (4 controls and 1 PD) completed the ECG recording

Table 5 Facial expressivity assessment based on different methods

Var Clowast LPlowast IPlowast MPlowast

1198631lowast 1198632lowast 1198732lowast 1198631lowast 1198632lowast 1198732lowast 1198631lowast 1198632lowast 1198732lowast 1198631lowast 1198632lowast 1198732lowast

TFA 8 8 4 5 5 4 5 6 4 4 mdash 5AMCloz 139 277 108 67 65 43 82 62 69 39 mdash 82AI 54 51 18 28 28 22 38 39 31 22 mdash 33FEloz 722 751 246 209 187 142 444 356 262 135 mdash 322lowast119863111986321198732 C LP IP and MP denote disgust1 disgust2 neutral2 the control the least intermediate and most severely form of Parkinsonrsquos patients

respectivelylozpresented in percentagesldquomdashrdquoThe face of the MP while watching1198632 was blocked by his hand

Note that disgust2 (119872 = 364 SD = 432) elicitedmore muscle activity over OO (because of crosstalk) thanneutral2 (119872 = 114 SD = 87) however unlike disgust1the difference did not reach statistical significance (becausedisgust2 did not elicit amusement at all) which can also beinterpreted

Moreover the main effect of stimuli on cardiac parame-ters for both HR (119865(2 6) = 37 119875 = 70) and HRV (119865(2 6) =084 119875 = 48) was not found which is not consistent withwhat we expected unchanged [46 47] or increased [48 49]HR and increased HRV [21 47] This may be due to thefact that we did not have enough recorded ECG data for astatistical analysis

34 Qualitative Analysis of Facial Expressivity To qualita-tively analyze facial expressivity we used the total facialactivity TFA measure of [2 50] being the total numberof displayed AUs in response to the stimuli Compared tothe control (C) the PD groups (LP IP and MP) showedthe attenuation of their facial activities (variable TFA) whilewatching disgusting movie clips (see Table 5) Howevercomparable TFA was found while watching neutral movieclips

Visual inspection of the of the displayed AUs as illus-trated in Figure 8 shows that AU1 AU2 AU6 AU9 andAU45 occurred more frequently for C except for IP whoproduced more frequently AU6 than C A possible expla-nation is that the IP cannot deactivate AU6 even duringneutral state As shown in Figure 9 during neutral state PDpatients produced more continuous active action units suchas AU4 and AU25 (for all PD) AU6 (for IP) and AU9 (forMP) Moreover certain AU combinations were consideredlikely to signify ldquodisgustrdquo on the basis of Ekman and Friesenrsquosdescription of emotions [7] ldquoDisgustrdquo AUs were consideredas the combination of eyebrows lowerer (AU4) cheek raiser(AU6) and nose wrinkler (AU9) Because cheek raiser isvery difficult to produce on demand without including otherAUs especially the lid tightener (AU7) [51] lid tightenerwas also taken into consideration that is the expected AUpattern of ldquodisgustrdquo was the combination of AU4 AU6 AU7and AU9 Consistent with what we expected C displayed98 and 87 ldquodisgustrdquo frames while watching disgust1 anddisgust2 respectively The PD goup (LP IP and MP) didnot display any ldquodisgustrdquo frames only for IP who had4 ldquodisgustrdquo frames while watching disgust1 Furthermorethe IP displayed quite few frames with combinations ofcheek raiser and lid tightener during neutral state Instead

Computational and Mathematical Methods in Medicine 9

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame0 100 200 300 400 500 600 700

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 1 2 4 6 7 9

122023252745

AU

Disgust1 (C) Disgust1 (LP)

Disgust1 (IP) Disgust1 (MP)

Figure 8 The facial activities while watching disgust1

N(C) N(LP)

N(IP)

Frame

Frame Frame

Frame

N(MP)

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

Figure 9 The facial activities while watching neutral2

10 Computational and Mathematical Methods in Medicine

0

5

10

15

20

25

30

35

fe

C

LP

IP

MP

Disgust1 Disgust2

Figure 10 The quantified facial expressivity

the cheek raiser alone was displayed mostly (see Figure 9)which indicates that IP patients have control problems forfacial muscles combinations

Analyzing the defined quantities AMC AI and FE usingthe segments with active AUs as it can be seen from Table 5the control C had higher facial expressivity than LP IP andMP during the disgusting state while MP performed thelowest facial expressivity More specifically compared to CPD (LP IP andMP) had smaller values of variables AMC AIand FE while watching disgusting movie clips especially forbrow lower (AU4) nose wrinkler (AU9) and blink (AU45)C tended to show similar intensities of brow lower (variable119890) with bigger variance (variables AMC and FE) In additionC showed higher intensities of nose wrinkler with largervariance

35 Quantitative Analysis of Facial Expressivity Figure 10depicts the values of the facial expressivity equation (5) forthe participants We did not assess the facial expressivityof the MP patient during disgust2 as he had his hand onfront of his face during the experiments As it can be seen asignificant difference between C and PD patients is presentC got the highest score while the lowest score was obtainedby the MP patient However the score of the IP patient wasslightly higher than the LP one which is due to the fact thatthe LP and IP expressed ldquofacial maskingrdquo in different waysthe LP patient showed the attenuation of the intensities offacial movements On the contrary the IP patient producedhigh intensities of facial movements not only in his emotionalstate but also during neutral state that is he cannot relax themuscles well In order to take both kinds of ldquofacial maskingrdquointo account we computed the facial expressivity using (4)The results are shown in Figure 11 As it can be seen theproposed facial expressivity allows distinguishing betweencontrol and PD patients Moreover the facial expressivitydecreases along with the increase of PD severity

Figure 11: The quantified facial expressivity based on the improved function (fe scores for C, LP, IP, and MP during disgust1 and disgust2).

4. Conclusions

This study investigated the phenomenon of facial masking in Parkinson's patients. We designed an automated and objective method to assess the facial expressivity of PD patients. The proposed approach follows the methodology of the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9]. In this study, based on the Facial Action Coding System (FACS), we proposed a methodology that (i) automatically detects faces in a video stream, (ii) codes each frame with respect to 11 action units, and (iii) estimates facial expressivity as a function of frequency, that is, how often AUs occur, duration, that is, how long AUs last, and intensity, being the strength of the AUs.

Although the proposed facial expressivity assessment approach has been evaluated on a limited number of subjects, it captures differences in facial expressivity between control participants and Parkinson's patients. Moreover, facial expressivity differences between PD patients with different progression of Parkinson's disease have been assessed. The proposed method can capture these differences and give a more accurate assessment of facial expressivity for Parkinson's patients than traditional observer-based ratings allow for. This confirms that our approach can be used for clinical assessment of facial expressivity in PD. Indeed, nonverbal signals contribute significantly to interpersonal communication. Facial expressivity, a major source of nonverbal information, is compromised in Parkinson's disease. The resulting disconnect between subjective feeling and objective facial affect can lead people to form negative and inaccurate impressions of people with PD with respect to their personality, feelings, and intelligence. Assessing the facial expressivity limitations of PD in an objective way would allow developing personalized therapy to benefit facial expressivity in PD. Indeed, Parkinson's is a progressive disease, which means that the symptoms will get worse as time goes on. Using the proposed assessment approach would allow regular facial expressivity assessment by therapists and clinicians to explore treatment options.

Future work will (i) improve the AU recognition system and extend it to more AUs and (ii) consider clinical usage of the proposed approach in a statistically significant population of PD patients.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no. 2011629010) and the VUB interdisciplinary research project "Objectifying Assessment of Human Emotional Processing and Expression for Clinical and Psychotherapy Applications (EMO-App)". The authors gratefully acknowledge the Flemish Parkinson League for supporting the current study and the recruitment of the PD patients. They also acknowledge Floriane Verbraeck for the setup of the experiments and data acquisition.

References

[1] M. Katsikitis and I. Pilowsky, "A study of facial expression in Parkinson's disease using a novel microcomputer-based method," Journal of Neurology, Neurosurgery and Psychiatry, vol. 51, no. 3, pp. 362–366, 1988.

[2] G. Simons, H. Ellgring, and M. C. Smith Pasqualini, "Disturbance of spontaneous and posed facial expressions in Parkinson's disease," Cognition and Emotion, vol. 17, no. 5, pp. 759–778, 2003.

[3] D. H. Jacobs, J. Shuren, D. Bowers, and K. M. Heilman, "Emotional facial imagery, perception, and expression in Parkinson's disease," Neurology, vol. 45, no. 9, pp. 1696–1702, 1995.

[4] L. Tickle-Degnen and K. D. Lyons, "Practitioners' impressions of patients with Parkinson's disease: the social ecology of the expressive mask," Social Science & Medicine, vol. 58, no. 3, pp. 603–614, 2004.

[5] J. J. van Hilten, A. D. van der Zwan, A. H. Zwinderman, and R. A. C. Roos, "Rating impairment and disability in Parkinson's disease: evaluation of the Unified Parkinson's Disease Rating Scale," Movement Disorders, vol. 9, no. 1, pp. 84–88, 1994.

[6] D. Bowers, K. Miller, W. Bosch, et al., "Faces of emotion in Parkinson's disease: micro-expressivity and bradykinesia during voluntary facial expressions," Journal of the International Neuropsychological Society, vol. 12, no. 6, pp. 765–773, 2006.

[7] P. Ekman and W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, Palo Alto, Calif, USA, 1978.

[8] P. Wu, D. Jiang, and H. Sahli, "Physiological signal processing for emotional feature extraction," in Proceedings of the International Conference on Physiological Computing Systems, pp. 40–47, 2014.

[9] L. Tickle-Degnen, The Interpersonal Communication Rating Protocol: A Manual for Measuring Individual Expressive Behavior, 2010.

[10] G. Sandbach, S. Zafeiriou, M. Pantic, and L. Yin, "Static and dynamic 3D facial expression recognition: a comprehensive survey," Image and Vision Computing, vol. 30, no. 10, pp. 683–697, 2012.

[11] I. Gonzalez, H. Sahli, V. Enescu, and W. Verhelst, "Context-independent facial action unit recognition using shape and Gabor phase information," in Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction (ACII '11), vol. 1, pp. 548–557, Springer, Memphis, Tenn, USA, October 2011.

[12] J. C. Platt, "Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods," in Advances in Large Margin Classifiers, A. J. Smola, P. Bartlett, B. Schoelkopf, and D. Schuurmans, Eds., pp. 61–74, The MIT Press, 1999.

[13] J. J. Gross and R. W. Levenson, "Emotion elicitation using films," Cognition and Emotion, vol. 9, no. 1, pp. 87–108, 1995.

[14] D. Hagemann, E. Naumann, S. Maier, G. Becker, A. Lurken, and D. Bartussek, "The assessment of affective reactivity using films: validity, reliability and sex differences," Personality and Individual Differences, vol. 26, no. 4, pp. 627–639, 1999.

[15] C. L. Lisetti and F. Nasoz, "Using noninvasive wearable computers to recognize human emotions from physiological signals," EURASIP Journal on Advances in Signal Processing, vol. 2004, Article ID 929414, pp. 1672–1687, 2004.

[16] J. Hewig, D. Hagemann, J. Seifert, M. Gollwitzer, E. Naumann, and D. Bartussek, "A revised film set for the induction of basic emotions," Cognition and Emotion, vol. 19, no. 7, pp. 1095–1109, 2005.

[17] J. H. Westerink, E. L. van den Broek, M. H. Schut, J. van Herk, and K. Tuinenbreijer, "Probing experience," in Assessment of User Emotions and Behaviour to Development of Products, T. J. O. W. F. P. Joyce, H. D. M. Westerink, M. Ouwerkerk, and B. de Ruyter, Eds., Springer, New York, NY, USA, 2008.

[18] A. Schaefer, F. Nils, P. Philippot, and X. Sanchez, "Assessing the effectiveness of a large database of emotion-eliciting films: a new tool for emotion researchers," Cognition and Emotion, vol. 24, no. 7, pp. 1153–1172, 2010.

[19] F. Verbraeck, Objectifying human facial expressions for clinical applications [M.S. thesis], Vrije Universiteit Brussel, Belgium, 2012.

[20] O. Alzoubi, Automatic affect detection from physiological signals: practical issues [Ph.D. thesis], The University of Sydney, New South Wales, Australia, 2012.

[21] S. D. Kreibig, "Autonomic nervous system activity in emotion: a review," Biological Psychology, vol. 84, no. 3, pp. 394–421, 2010.

[22] F. D. Farfan, J. C. Politti, and C. J. Felice, "Evaluation of EMG processing techniques using information theory," BioMedical Engineering Online, vol. 9, article 72, 2010.

[23] N. Ambady, F. J. Bernieri, and J. A. Richeson, "Toward a histology of social behavior: judgmental accuracy from thin slices of the behavioral stream," Advances in Experimental Social Psychology, vol. 32, pp. 201–271, 2000.

[24] N. Ambady and R. Rosenthal, "Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis," Psychological Bulletin, vol. 111, no. 2, pp. 256–274, 1992.

[25] K. D. Lyons and L. Tickle-Degnen, "Reliability and validity of a videotape method to describe expressive behavior in persons with Parkinson's disease," The American Journal of Occupational Therapy, vol. 59, no. 1, pp. 41–49, 2005.

[26] N. A. Murphy, "Using thin slices for behavioral coding," Journal of Nonverbal Behavior, vol. 29, no. 4, pp. 235–246, 2005.

[27] A. O. Andrade, S. Nasuto, P. Kyberd, C. M. Sweeney-Reed, and F. R. van Kanijn, "EMG signal filtering based on empirical mode decomposition," Biomedical Signal Processing and Control, vol. 1, no. 1, pp. 44–55, 2006.

[28] M. Blanco-Velasco, B. Weng, and K. E. Barner, "ECG signal denoising and baseline wander correction based on the empirical mode decomposition," Computers in Biology and Medicine, vol. 38, no. 1, pp. 1–13, 2008.

[29] A. O. Boudraa, J. C. Cexus, and Z. Saidi, "EMD-based signal noise reduction," Signal Processing, vol. 1, pp. 33–37, 2005.

[30] T. Jing-tian, Z. Qing, T. Yan, L. Bin, and Z. Xiao-kai, "Hilbert-Huang transform for ECG de-noising," in Proceedings of the 1st International Conference on Bioinformatics and Biomedical Engineering, pp. 664–667, July 2007.

[31] A. Karagiannis and P. Constantinou, "Noise components identification in biomedical signals based on empirical mode decomposition," in Proceedings of the 9th International Conference on Information Technology and Applications in Biomedicine (ITAB '09), pp. 1–4, November 2009.

[32] Y. Kopsinis and S. McLaughlin, "Development of EMD-based denoising methods inspired by wavelet thresholding," IEEE Transactions on Signal Processing, vol. 57, no. 4, pp. 1351–1362, 2009.

[33] F. Agrafioti, D. Hatzinakos, and A. K. Anderson, "ECG pattern analysis for emotion detection," IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 102–115, 2012.

[34] D. L. Donoho, "De-noising by soft-thresholding," IEEE Transactions on Information Theory, vol. 41, no. 3, pp. 613–627, 1995.

[35] B.-U. Kohler, C. Hennig, and R. Orglmeister, "The principles of software QRS detection," IEEE Engineering in Medicine and Biology Magazine, vol. 21, no. 1, pp. 42–57, 2002.

[36] V. Guralnik and J. Srivastava, "Event detection from time series data," in Knowledge Discovery and Data Mining, pp. 33–42, 1999.

[37] M. Hamedi, S.-H. Salleh, and T. T. Swee, "Surface electromyography-based facial expression recognition in bi-polar configuration," Journal of Computer Science, vol. 7, no. 9, pp. 1407–1415, 2011.

[38] Y. Hou, H. Sahli, R. Ilse, Y. Zhang, and R. Zhao, "Robust shape-based head tracking," in Advanced Concepts for Intelligent Vision Systems, vol. 4678 of Lecture Notes in Computer Science, pp. 340–351, Springer, 2007.

[39] T. Kanade, J. F. Cohn, and Y. Tian, "Comprehensive database for facial expression analysis," in Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, pp. 46–53, Grenoble, France, 2000.

[40] M. S. Bartlett, G. C. Littlewort, M. G. Frank, C. Lainscsek, I. R. Fasel, and J. R. Movellan, "Automatic recognition of facial actions in spontaneous expressions," Journal of Multimedia, vol. 1, no. 6, pp. 22–35, 2006.

[41] A. Savran, B. Sankur, and M. Taha Bilge, "Regression-based intensity estimation of facial action units," Image and Vision Computing, vol. 30, no. 10, pp. 774–784, 2012.

[42] Anvil: the video annotation tool, http://www.anvil-software.org.

[43] S. Vicente, J. Peron, I. Biseul, et al., "Subjective emotional experience at different stages of Parkinson's disease," Journal of the Neurological Sciences, vol. 310, no. 1-2, pp. 241–247, 2011.

[44] A. van Boxtel, "Facial EMG as a tool for inferring affective states," in Proceedings of Measuring Behavior 2010, A. J. Spink, F. Grieco, O. Krips, L. Loijens, L. Noldus, and P. Zimmeran, Eds., pp. 104–108, Noldus Information Technology, Wageningen, The Netherlands, 2010.

[45] C.-N. Huang, C.-H. Chen, and H.-Y. Chung, "The review of applications and measurements in facial electromyography," Journal of Medical and Biological Engineering, vol. 25, no. 1, pp. 15–20, 2005.

[46] R. W. Levenson, P. Ekman, K. Heider, and W. V. Friesen, "Emotion and autonomic nervous system activity in the Minangkabau of West Sumatra," Journal of Personality and Social Psychology, vol. 62, no. 6, pp. 972–988, 1992.

[47] S. Rohrmann and H. Hopp, "Cardiovascular indicators of disgust," International Journal of Psychophysiology, vol. 68, no. 3, pp. 201–208, 2008.

[48] O. Alaoui-Ismaili, O. Robin, H. Rada, A. Dittmar, and E. Vernet-Maury, "Basic emotions evoked by odorants: comparison between autonomic responses and self-evaluation," Physiology and Behavior, vol. 62, no. 4, pp. 713–720, 1997.

[49] J. Gruber, S. L. Johnson, C. Oveis, and D. Keltner, "Risk for mania and positive emotional responding: too much of a good thing?" Emotion, vol. 8, no. 1, pp. 23–33, 2008.

[50] W. Gaebel and W. Wolwer, "Facial expressivity in the course of schizophrenia and depression," European Archives of Psychiatry and Clinical Neuroscience, vol. 254, no. 5, pp. 335–342, 2004.

[51] P. Ekman, W. Friesen, and J. Hager, Facial Action Coding System: The Manual, 2002.



Visual inspection of the of the displayed AUs as illus-trated in Figure 8 shows that AU1 AU2 AU6 AU9 andAU45 occurred more frequently for C except for IP whoproduced more frequently AU6 than C A possible expla-nation is that the IP cannot deactivate AU6 even duringneutral state As shown in Figure 9 during neutral state PDpatients produced more continuous active action units suchas AU4 and AU25 (for all PD) AU6 (for IP) and AU9 (forMP) Moreover certain AU combinations were consideredlikely to signify ldquodisgustrdquo on the basis of Ekman and Friesenrsquosdescription of emotions [7] ldquoDisgustrdquo AUs were consideredas the combination of eyebrows lowerer (AU4) cheek raiser(AU6) and nose wrinkler (AU9) Because cheek raiser isvery difficult to produce on demand without including otherAUs especially the lid tightener (AU7) [51] lid tightenerwas also taken into consideration that is the expected AUpattern of ldquodisgustrdquo was the combination of AU4 AU6 AU7and AU9 Consistent with what we expected C displayed98 and 87 ldquodisgustrdquo frames while watching disgust1 anddisgust2 respectively The PD goup (LP IP and MP) didnot display any ldquodisgustrdquo frames only for IP who had4 ldquodisgustrdquo frames while watching disgust1 Furthermorethe IP displayed quite few frames with combinations ofcheek raiser and lid tightener during neutral state Instead

Computational and Mathematical Methods in Medicine 9

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame0 100 200 300 400 500 600 700

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 1 2 4 6 7 9

122023252745

AU

Disgust1 (C) Disgust1 (LP)

Disgust1 (IP) Disgust1 (MP)

Figure 8 The facial activities while watching disgust1

N(C) N(LP)

N(IP)

Frame

Frame Frame

Frame

N(MP)

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

Figure 9 The facial activities while watching neutral2

10 Computational and Mathematical Methods in Medicine

0

5

10

15

20

25

30

35

fe

C

LP

IP

MP

Disgust1 Disgust2

Figure 10 The quantified facial expressivity

the cheek raiser alone was displayed mostly (see Figure 9)which indicates that IP patients have control problems forfacial muscles combinations

Analyzing the defined quantities AMC AI and FE usingthe segments with active AUs as it can be seen from Table 5the control C had higher facial expressivity than LP IP andMP during the disgusting state while MP performed thelowest facial expressivity More specifically compared to CPD (LP IP andMP) had smaller values of variables AMC AIand FE while watching disgusting movie clips especially forbrow lower (AU4) nose wrinkler (AU9) and blink (AU45)C tended to show similar intensities of brow lower (variable119890) with bigger variance (variables AMC and FE) In additionC showed higher intensities of nose wrinkler with largervariance

35 Quantitative Analysis of Facial Expressivity Figure 10depicts the values of the facial expressivity equation (5) forthe participants We did not assess the facial expressivityof the MP patient during disgust2 as he had his hand onfront of his face during the experiments As it can be seen asignificant difference between C and PD patients is presentC got the highest score while the lowest score was obtainedby the MP patient However the score of the IP patient wasslightly higher than the LP one which is due to the fact thatthe LP and IP expressed ldquofacial maskingrdquo in different waysthe LP patient showed the attenuation of the intensities offacial movements On the contrary the IP patient producedhigh intensities of facial movements not only in his emotionalstate but also during neutral state that is he cannot relax themuscles well In order to take both kinds of ldquofacial maskingrdquointo account we computed the facial expressivity using (4)The results are shown in Figure 11 As it can be seen theproposed facial expressivity allows distinguishing betweencontrol and PD patients Moreover the facial expressivitydecreases along with the increase of PD severity

0

5

10

15

20

25

fe

CLP

IP

MP

Disgust1 Disgust2minus5

Figure 11 The quantified facial expressivity based on the improvedfunction

4 Conclusions

This study investigated the phenomenon of facial masking inParkinsonrsquos patientsWedesigned an automated and objectivemethod to assess the facial expressivity of PD patients Theproposed approach follows the methodology of the Inter-personal Communication Rating Protocol-Parkinsonrsquos DiseaseVersion (ICRP-IEB) [9] In this study based on the FacialAction Coding System (FACS) we proposed a methodologythat (i) automatically detects faces in a video stream (ii) codeseach framewith respect to 11 action units and (iii) estimates afacial expressivity as function of frequency that is how oftenAUs occur duration that is how long AUs last and intensitybeing the strength of AUs

Although the proposed facial expressivity assessmentapproach has been evaluated in a limited number of subjectsit allows capturing differences in facial expressivity betweencontrol participants andParkinsonrsquos patientsMoreover facialexpressivity differences between PD patients with differ-ent progression of Parkinsonrsquos disease have been assessedThe proposed method can capture these differences andgive a more accurate assessment of facial expressivity forParkinsonrsquos patients than what traditional observer basedratings allow for This confirms that our approach can beused for clinical assessment of facial expressivity in PDIndeed nonverbal signals contribute significantly to inter-personal communication Facial expressivity a major sourceof nonverbal information is compromised in Parkinsonrsquosdisease The resulting disconnect between subjective feelingand objective facial affect can lead people to form negativeand inaccurate impressions of people with PD with respectto their personality feelings and intelligence Assessing inan objective way the facial expressivity limitation of PDwould allow developing personalized therapy to benefit facialexpressivity in PD IndeedParkinsonrsquos is a progressive diseasewhich means that the symptoms will get worse as time

Computational and Mathematical Methods in Medicine 11

goes on Using the proposed assessment approach wouldallow regular facial expressivity assessment by therapist andclinicians to explore treatment options

Future work will (i) improve the AU recognition systemand extend it to more AUs and (ii) consider clinical usageof the proposed approach in a statistically significant PDpatients population

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no2011629010) and the VUB-interdisciplinary research projectldquoObjectifying Assessment of Human Emotional Processingand Expression for Clinical and Psychotherapy Applications(EMO-App)rdquo The authors gratefully acknowledge the Flem-ish Parkinson League for supporting the current study andthe recruitment of the PD patients Also they acknowledgeFloriane Verbraeck for the setup of the experiments and dataacquisition

References

[1] M Katsikitis and I Pilowsky ldquoA study of facial expressionin Parkinsonrsquos disease using a novel microcomputer-basedmethodrdquo Journal of Neurology Neurosurgery and Psychiatry vol51 no 3 pp 362ndash366 1988

[2] G Simons H Ellgring and M C Smith Pasqualini ldquoDistur-bance of spontaneous and posed facial expressions in Parkin-sonrsquos diseaserdquoCognition and Emotion vol 17 no 5 pp 759ndash7782003

[3] D H Jacobs J Shuren D Bowers and K M Heilman ldquoEmo-tional facial imagery perception and expression in Parkinsonsdiseaserdquo Neurology vol 45 no 9 pp 1696ndash1702 1995

[4] L Tickle-Degnen and K D Lyons ldquoPractitionersrsquo impressionsof patients with Parkinsonrsquos disease the social ecology of theexpressive maskrdquo Social Science amp Medicine vol 58 no 3 pp603ndash614 2004

[5] J J van Hilten A D van der Zwan A H Zwinderman and RA C Roos ldquoRating impairment and disability in Parkinsonsdisease evaluation of the unified Parkinsons disease ratingscalerdquoMovement Disorders vol 9 no 1 pp 84ndash88 1994

[6] D Bowers K Miller W Bosch et al ldquoFaces of emotionin Parkinsons disease micro-expressivity and bradykinesiaduring voluntary facial expressionsrdquo Journal of the InternationalNeuropsychological Society vol 12 no 6 pp 765ndash773 2006

[7] P Ekman and W Friesen Facial Action Coding System ATechnique for the Measurement of Facial Movement ConsultingPsychologists Press Palo Alto Calif USA 1978

[8] P Wu D Jiang and H Sahli ldquoPhysiological signal processingfor emotional feature extractionrdquo in Proceedings of the Interna-tional Conference on Physiological Computing Systems pp 40ndash47 2014

[9] L Tickle-Degnen The Interpersonal Communication RatingProtocol A Manual for Measuring Individual Expressive Behav-ior 2010

[10] G Sandbach S Zafeiriou M Pantic and L Yin ldquoStatic anddynamic 3D facial expression recognition a comprehensivesurveyrdquo Image and Vision Computing vol 30 no 10 pp 683ndash697 2012

[11] I Gonzalez H Sahli V Enescu and W Verhelst ldquoContext-independent facial action unit recognition using shape andgabor phase informationrdquo in Proceedings of the 4th InternationalConference on Affective Computing and Intelligent Interaction(ACII rsquo11) vol 1 pp 548ndash557 Springer Memphis Tenn USAOctober 2011

[12] J C Platt ldquoProbabilistic outputs for support vector machinesand comparisons to regularized likelihood methodsrdquo inAdvances in Large Margin Classifiers A J Smola P Bartlett BSchoelkopf andD Schuurmans Eds pp 61ndash74TheMITPress1999

[13] J J Gross and RW Levenson ldquoEmotion elicitation using filmsrdquoCognition and Emotion vol 9 no 1 pp 87ndash108 1995

[14] D Hagemann E Naumann S Maier G Becker A Lurkenand D Bartussek ldquoThe assessment of affective reactivity usingfilms validity reliability and sex differencesrdquo Personality andIndividual Differences vol 26 no 4 pp 627ndash639 1999

[15] C L Lisetti and F Nasoz ldquoUsing noninvasive wearable comput-ers to recognize human emotions from physiological signalsrdquoEURASIP Journal on Advances in Signal Processing vol 2004Article ID 929414 pp 1672ndash1687 2004

[16] J Hewig D Hagemann J Seifert M Gollwitzer E Naumannand D Bartussek ldquoA revised film set for the induction of basicemotionsrdquo Cognition and Emotion vol 19 no 7 pp 1095ndash11092005

[17] J H Westerink E L van den Broek M H Schut J van Herkand K Tuinenbreijer ldquoProbing experiencerdquo in Assessment ofUser Emotions and Behaviour to Development of Products T JO W F P Joyce H D M Westerink M Ouwerkerk and B deRuyter Eds Springer New York NY USA 2008

[18] A Schaefer F Nils P Philippot and X Sanchez ldquoAssessing theeffectiveness of a large database of emotion-eliciting films a newtool for emotion researchersrdquo Cognition and Emotion vol 24no 7 pp 1153ndash1172 2010

[19] F Verbraeck Objectifying human facial expressions for clinicalapplications [MS thesis] Vrije Universiteit Brussel Belgium2012

[20] O AlzoubiAutomatic affect detection from physiological signalspractical issues [PhD thesis] The University of Sydney NewSouth Wales Australia 2012

[21] S D Kreibig ldquoAutonomic nervous system activity in emotiona reviewrdquo Biological Psychology vol 84 no 3 pp 394ndash421 2010

[22] F D Farfan J C Politti and C J Felice ldquoEvaluation of emgprocessing techniques using information theoryrdquo BioMedicalEngineering Online vol 9 article 72 2010

[23] N Ambady F J Bernieri and J A Richeson ldquoToward ahistology of social behavior judgmental accuracy from thinslices of the behavioral streamrdquoAdvances in Experimental SocialPsychology vol 32 pp 201ndash271 2000

[24] N Ambady and R Rosenthal ldquoThin slices of expressivebehavior as predictors of interpersonal consequences a meta-analysisrdquo Psychological Bulletin vol 111 no 2 pp 256ndash274 1992

[25] K D Lyons and L Tickle-Degnen ldquoReliability and validity ofa videotape method to describe expressive behavior in personswith Parkinsonrsquos diseaserdquoTheAmerican Journal of OccupationalTherapy vol 59 no 1 pp 41ndash49 2005

[26] N AMurphy ldquoUsing thin slices for behavioral codingrdquo Journalof Nonverbal Behavior vol 29 no 4 pp 235ndash246 2005

12 Computational and Mathematical Methods in Medicine

[27] A O Andrade S Nasuto P Kyberd C M Sweeney-Reed andF R vanKanijn ldquoEMG signal filtering based on empiricalmodedecompositionrdquo Biomedical Signal Processing and Control vol1 no 1 pp 44ndash55 2006

[28] M Blanco-Velasco B Weng and K E Barner ldquoECG signaldenoising and baseline wander correction based on the empir-ical mode decompositionrdquo Computers in Biology and Medicinevol 38 no 1 pp 1ndash13 2008

[29] A O Boudraa J C Cexus and Z Saidi ldquoEmd-based signalnoise reductionrdquo Signal Processing vol 1 pp 33ndash37 2005

[30] T Jing-tian Z Qing T Yan L Bin and Z Xiao-kai ldquoHilbert-Huang transform for ECG de-noisingrdquo in Proceedings of the1st International Conference on Bioinformatics and BiomedicalEngineering pp 664ndash667 July 2007

[31] A Karagiannis and P Constantinou ldquoNoise components identi-fication in biomedical signals based on empirical mode decom-positionrdquo in Proceedings of the 9th International Conference onInformation Technology and Applications in Biomedicine (ITABrsquo09) pp 1ndash4 November 2009

[32] Y Kopsinis and S McLaughlin ldquoDevelopment of EMD-baseddenoising methods inspired by wavelet thresholdingrdquo IEEETransactions on Signal Processing vol 57 no 4 pp 1351ndash13622009

[33] F Agrafioti D Hatzinakos and A K Anderson ldquoECG patternanalysis for emotion detectionrdquo IEEE Transactions on AffectiveComputing vol 3 no 1 pp 102ndash115 2012

[34] D L Donoho ldquoDe-noising by soft-thresholdingrdquo IEEE Trans-actions on Information Theory vol 41 no 3 pp 613ndash627 1995

[35] B-U Kohler C Hennig and R Orglmeister ldquoThe principlesof software QRS detectionrdquo IEEE Engineering in Medicine andBiology Magazine vol 21 no 1 pp 42ndash57 2002

[36] V Guralnik and J Srivastava ldquoEvent detection from time seriesdatardquo inKnowledge Discovery andDataMining pp 33ndash42 1999

[37] M Hamedi S-H Salleh and T T Swee ldquoSurfaceelectromyography-based facial expression recognition inBi-polar configurationrdquo Journal of Computer Science vol 7 no9 pp 1407ndash1415 2011

[38] Y Hou H Sahli R Ilse Y Zhang and R Zhao ldquoRobust shape-based head trackingrdquo inAdvanced Concepts for Intelligent VisionSystems vol 4678 ofLectureNotes inComputer Science pp 340ndash351 Springer 2007

[39] T Kanade J F Cohn and Y Tian ldquoComprehensive databasefor facial expression analysisrdquo in Proceedings of the IEEE Inter-national Conference on Automatic Face and Gesture Recognitionpp 46ndash53 Grenoble France 2000

[40] M S Bartlett G C Littlewort M G Frank C Lainscsek IR Fasel and J R Movellan ldquoAutomatic recognition of facialactions in spontaneous expressionsrdquo Journal of Multimedia vol1 no 6 pp 22ndash35 2006

[41] A Savran B Sankur and M Taha Bilge ldquoRegression-basedintensity estimation of facial action unitsrdquo Image and VisionComputing vol 30 no 10 pp 774ndash784 2012

[42] Anvil the video annotation tool httpwwwanvil-soft-wareorg

[43] S Vicente J Peron I Biseul et al ldquoSubjective emotionalexperience at different stages of Parkinsonrsquos diseaserdquo Journal ofthe Neurological Sciences vol 310 no 1-2 pp 241ndash247 2011

[44] A van Boxtel ldquoFacial emg as a tool for inferring affectivestatesrdquo in Proceedings of Measuring Behavior 2010 A J Spink FGrieco O Krips L Loijens L Noldus and P Zimmeran Edspp 104ndash108 Noldus Information TechnologyWageningenTheNetherlands 2010

[45] C-N Huang C-H Chen and H-Y Chung ldquoThe review ofapplications and measurements in facial electromyographyrdquoJournal of Medical and Biological Engineering vol 25 no 1 pp15ndash20 2005

[46] R W Levenson P Ekman K Heider and W V FriesenldquoEmotion and autonomic nervous system activity in theminangkabau of west sumatrardquo Journal of Personality and SocialPsychology vol 62 no 6 pp 972ndash988 1992

[47] S Rohrmann and H Hopp ldquoCardiovascular indicators ofdisgustrdquo International Journal of Psychophysiology vol 68 no3 pp 201ndash208 2008

[48] OAlaoui-Ismaıli O RobinH Rada ADittmar andEVernet-Maury ldquoBasic emotions evoked by odorants comparisonbetween autonomic responses and self-evaluationrdquo Physiologyand Behavior vol 62 no 4 pp 713ndash720 1997

[49] J Gruber S L Johnson C Oveis and D Keltner ldquoRisk formania and positive emotional responding too much of a goodthingrdquo Emotion vol 8 no 1 pp 23ndash33 2008

[50] W Gaebel and W Wolwer ldquoFacial expressivity in the course ofschizophrenia and depressionrdquo European Archives of Psychiatryand Clinical Neuroscience vol 254 no 5 pp 335ndash342 2004

[51] P EkmanW Friesen and J Hager Facial Action Coding SystemThe Manual 2002

Submit your manuscripts athttpwwwhindawicom

Stem CellsInternational

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MEDIATORSINFLAMMATION

of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Behavioural Neurology

EndocrinologyInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Disease Markers

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

BioMed Research International

OncologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Oxidative Medicine and Cellular Longevity

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

PPAR Research

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Immunology ResearchHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

ObesityJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational and Mathematical Methods in Medicine

OphthalmologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Diabetes ResearchJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Research and TreatmentAIDS

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Gastroenterology Research and Practice

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Parkinsonrsquos Disease

Evidence-Based Complementary and Alternative Medicine

Volume 2014Hindawi Publishing Corporationhttpwwwhindawicom

6 Computational and Mathematical Methods in Medicine

Original video

I(0) I(1) I(T)

Shape model tracking

Appearance-based

P(0) P(1) P(T)

Geometry-based

Feature extraction

AU12 AU27 AU6 AU9 AU1 AU4

Lower facial features

Middle facial features

Upper facial features

Featureselection

AdaBoost

AU12 AU27 AU6 AU9 AU1 AU4Binary SVMclassification

Output decision values

AU1 AU2 AU4 AU6

AU7 AU9 AU12 AU20

AU23 AU25 AU27

Inner brower raiser Outer brower raiser Brow lowerer Cheek raiser

Lid tighten Nose wrinkle Lip corner puller Lip stretch

Lip tighten Lips part Mouth stretch

middot middot middot

middot middot middot

middot middot middot

middot middot middot

middot middot middot

middot middot middot

middot middot middot

middot middot middot middot middot middot middot middot middot

middot middot middot

Figure 6 The AU recognition system

Following the criteria of the ICRP-IEB we defined thefacial expressivity of a participant as follows

EFE = AMC119890+ AI119890minus (AMC

119899+ AI119899) (4)

where AMC119890and AMC

119899are the amount of movement

changes during the disgust and neutral facial expressionrespectively AI

119890and AI

119899are the intensity of the displayed

AUs during the disgust and neutral facial expression respec-tively

It has to be recalled that for all these estimations onlythe considered 30 s windows are used The quantities AMCand AI refer to frequency and intensity respectively Theirdetailed formulation is given in the following sections Inorder to assess the effectiveness of the proposed formulationwe also compared it to the following definition of facialexpression where the baseline of neutral emotion was notconsidered Consider the following

FE = AMC119890+ AI119890 (5)

(1) Intensity of Displayed AUs It has been shown in [40] thatthe output margin of the learned SVM classifiers contained

information about expression intensity Later Savran et al[41] estimated the AU intensity levels using logistic regres-sion on SVM scores In this work we propose using Plattrsquosprobability given by (3) as action unit intensity at frame 119905119868119905(AU119894) = 119875119905AU119894 with 119875

119905

AU119894 = 119875(119888 = AU119894 | z119905) is estimated

using (3) The 119868119905(AU119894) time series is then smoothed using a

Gaussian filter We denote by 119868119905(AU119894) the smoothed value

Figure 7 plots for a participant the smoothed intensity of thefacial AU7 also illustrating its different temporal segmentsneutral onset apex and offset

Having defined the AU intensity the intensity of dis-played AUs during a voluntary facial expression (disgust orneutral) is given by

AI = sum119894isinDAUs

1

1198731015840

119894

sum

119905isin119879119894

119868119905 (AU119894) (6)

where DAUs is the set of displayed (recognized) facial actionunits during the considered 30 s window 119879

119894is the set of

Computational and Mathematical Methods in Medicine 7

1

09

08

07

06

05

04

03

02

01

00 50 100 150 200 250 300 350

Frames

Inte

nsity

of A

U7

(lid

tight

ener

)

Figure 7 Example of AU intensity

frames where AU119894 is active and1198731015840119894is the cardinal of 119879

119894 being

the number of frames where AU119894 is active

(2) Amount of Movement Change The amount of movementchange during a voluntary facial expression (disgust orneutral) is given by

AMC = sum119894isinDAUs

1

1198731015840

119894

sum

119905isin119879119894

10038161003816100381610038161003816119868119905 (AU119894) minus 119868119905+1 (AU119894)

10038161003816100381610038161003816 (7)

with DAUs1198731015840119894 and 119879

119894as defined above

3 Results and Discussion

31 Manipulation Check Participantsrsquo emotion was suc-cessfully elicited using the movie clips of disgust Forboth disgust1 and disgust2 most participants (1415) self-reported the target emotion as their dominated emotionFor the other video clips the reported emotions are as fol-lows amusement1 (915) amusement2 (1215) surprise1(1115) surprise2 (215) anger1 (615) anger2 (915) fear1(1015) fear2 (1115) neutral1 (1315) and neutral2 (1315)Annotation of the video records using the Anvil annotationtool [42] further confirmed that the movie clips of disgustinduced the most reliable emotional data Therefore forfurther analysis we decided to use only the data recordedduring watching the disgust movie clips

The Wilcoxon rank sum tests were implemented on the14 disgusting self-reports Results showed that there wasno significant difference in self-reported emotional ratingsbetween disgust1 (119872 = 650 SD = 76) and disgust2(119872 = 664 SD = 63) and between PD group (119872 =651 SD = 65) and control group (119872 = 664 SD =74) This is what we expected since Vicente et al [43] alsoreported that PD patients at different stages of the disease didnot significantly differ from the controls in the self-reportedemotional experience to presented movie clips

32 Univariate Analysis of Group Effects TheEMGdata from3 participants (1 control and 2 PD) and the ECG data from 10participants (4 control and 6 PD) were discarded because ofnot being well recorded due to sensor malfunctions Results

of the repeated-measures ANOVAs on single physiologicalvariables showed that significant main effect of group on theLLS activity (119865(1 12) = 338 119875 = 09) the OO activity(119865(1 12) = 192 119875 = 19) HR (119865(1 3) = 93 119875 = 41)and HRV (119865(1 3) = 53 119875 = 83) was not found Table 4shows the descriptive data for control and PD groups duringexposure to the three movie clips disgust1 disgust2 andneutral2 and the tests of significance comparison betweengroups using Wilcoxon rank sum tests The Wilcoxon ranksum tests revealed that

(1) comparable levels of baseline activity in control andPD groups over both the LLS and the OOwere found

(2) although disgust1 elicited more muscle activity overthe LLS and the OO for control group than forPD group this difference did not reach statisticalsignificance

(3) disgust2 elicited significantly more LLS activity forcontrol than for PD

These results indicated that PD displayed less muscleactivity over the LLS when expressing disgust than controlIn addition disgust2 induced more muscle activity over theLLS and the OO than disgust1 which is consistent with theself-report andmay be due to the fact that disgust2 is slightlymore disgusting than disgust1

33 Univariate Analysis of Stimuli Effects For LLS we foundan effect of the stimuli on muscle activity (119865(2 24) = 947119875 = 001) Post-hoc tests indicated significant differencesbetween disgust1 and neutral2 (119875 = 01) and betweendisgust2 and neutral2 (119875 = 01) Both disgust1 (119872 = 282SD = 192) and disgust2 (119872 = 537 SD = 543) elicitedmore muscle activity than neutral2 (119872 = 110 SD = 38)Thismain effect was qualified by a stimulitimes group interaction(119865(2 24) = 417 119875 = 028) which was consistent with theresults of the Wilcoxon rank sum tests which compared thephysiological responses between groups

For OO we also found an effect of the stimuli on muscleactivity (119865(2 24) = 545 119875 = 012) Post-hoc tests indicatedsignificant difference only between disgust1 and neutral2(119875 = 002)Thedisgust1 (119872 = 232 SD = 157) elicitedmoremuscle activity than neutral2 (119872 = 135 SD = 111) Nosignificant stimuli times group interaction effect (119865(2 24) = 177119875 = 192) was found

We expected the disgust clips to elicit significantly more muscle activity over the LLS than the neutral clip, because the LLS is normally involved in producing the disgust facial expression [44]. That the OO also differed significantly was probably due to the following:

(1) crosstalk [45]: the LLS and the OO lie in the vicinity of each other;

(2) disgust1 elicited not only disgust but also a bit of amusement, through the funny motion of the character and the background music, so that the OO was also involved in producing the facial expression of amusement.


Table 4: Statistical summary of physiological measures.

Variable   Stimuli     Control            PD                 Sig.
                       Mean      SD       Mean      SD
LLS        Neutral2    1.04      0.34     1.16      0.43     .71
           Disgust1    3.36      2.04     2.27      1.77     .26
           Disgust2    8.04      6.49     2.71      2.25     .04*
OO         Neutral2    1.35      1.11     0.94      0.54     .81
           Disgust1    2.82      1.77     1.81      1.28     .32
           Disgust2    5.20      5.71     2.07      1.48     .13
HR         Neutral2    1.72      2.77     2.76      —        —
           Disgust1    −0.53     2.64     5.12      —        —
           Disgust2    2.22      5.53     5.65      —        —
HRV        Neutral2    1.53      12.96    −7.65     —        —
           Disgust1    8.01      11.45    26.34     —        —
           Disgust2    14.86     35.64    14.92     —        —

* The two groups are significantly different, that is, P < .05.
"—": We did not compare the cardiac responses (i.e., HR and HRV) between groups because, due to technical failures, we lost the ECG data of 10 participants; thus only 5 participants (4 controls and 1 PD) completed the ECG recording.

Table 5: Facial expressivity assessment based on different methods.

Var      C                    LP                   IP                   MP
         D1     D2     N2     D1     D2     N2     D1     D2     N2     D1     D2     N2
TFA      8      8      4      5      5      4      5      6      4      4      —      5
AMC◊     139    277    108    67     65     43     82     62     69     39     —      82
AI       54     51     18     28     28     22     38     39     31     22     —      33
FE◊      722    751    246    209    187    142    444    356    262    135    —      322

D1, D2, and N2 denote disgust1, disgust2, and neutral2; C, LP, IP, and MP denote the control and the least, intermediate, and most severely affected Parkinson's patients, respectively.
◊ Presented as percentages.
"—": The face of the MP while watching D2 was blocked by his hand.

Note that disgust2 (M = 3.64, SD = 4.32) elicited more muscle activity over the OO (because of crosstalk) than neutral2 (M = 1.14, SD = 0.87); however, unlike for disgust1, the difference did not reach statistical significance, which can also be interpreted: disgust2 did not elicit amusement at all.

Moreover, no main effect of stimuli on the cardiac parameters was found, for either HR (F(2, 6) = .37, P = .70) or HRV (F(2, 6) = 0.84, P = .48). This is not consistent with what we expected, namely unchanged [46, 47] or increased [48, 49] HR and increased HRV [21, 47], and may be due to the fact that we did not have enough recorded ECG data for a statistical analysis.

3.4. Qualitative Analysis of Facial Expressivity. To qualitatively analyze facial expressivity, we used the total facial activity (TFA) measure of [2, 50], that is, the total number of displayed AUs in response to the stimuli. Compared to the control (C), the PD patients (LP, IP, and MP) showed attenuated facial activity (variable TFA) while watching the disgust movie clips (see Table 5). However, comparable TFA was found while watching the neutral movie clips.
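Given a binary frames × AUs activation matrix from the AU recognition step, TFA reduces to counting the AUs that fire at least once during a clip. A minimal sketch follows; the matrix is random placeholder output, and the 11-AU layout is an assumption.

```python
import numpy as np

# Binary activation matrix: rows = video frames, columns = the 11 tracked AUs.
# (Random placeholder; in practice this comes from the AU recognition step.)
rng = np.random.default_rng(1)
activations = rng.random((700, 11)) < 0.05

# Total facial activity: number of distinct AUs displayed at least once.
tfa = int(activations.any(axis=0).sum())
print("TFA =", tfa)
```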

Visual inspection of the displayed AUs, as illustrated in Figure 8, shows that AU1, AU2, AU6, AU9, and AU45 occurred more frequently for C, except that the IP produced AU6 more frequently than C. A possible explanation is that the IP cannot deactivate AU6, even during the neutral state. As shown in Figure 9, during the neutral state the PD patients produced more continuously active action units, such as AU4 and AU25 (for all PD patients), AU6 (for the IP), and AU9 (for the MP). Moreover, certain AU combinations were considered likely to signify "disgust" on the basis of Ekman and Friesen's description of emotions [7]. "Disgust" AUs were taken to be the combination of brow lowerer (AU4), cheek raiser (AU6), and nose wrinkler (AU9). Because the cheek raiser is very difficult to produce on demand without involving other AUs, especially the lid tightener (AU7) [51], the lid tightener was also taken into consideration; that is, the expected AU pattern of "disgust" was the combination of AU4, AU6, AU7, and AU9. Consistent with our expectation, C displayed 98 and 87 "disgust" frames while watching disgust1 and disgust2, respectively. The PD group (LP, IP, and MP) displayed almost no "disgust" frames; only the IP had 4 "disgust" frames while watching disgust1. Furthermore, the IP displayed very few frames with combinations of cheek raiser and lid tightener during the neutral state.
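Counting the "disgust" frames then amounts to checking, frame by frame, whether AU4, AU6, AU7, and AU9 are simultaneously active. A sketch under the same binary-matrix assumption as above; the AU-to-column layout is illustrative.

```python
import numpy as np

AU_COLUMNS = [1, 2, 4, 6, 7, 9, 12, 20, 23, 25, 45]  # assumed AU index per column

rng = np.random.default_rng(2)
activations = rng.random((700, len(AU_COLUMNS))) < 0.2  # placeholder detector output

# A frame counts as "disgust" when AU4, AU6, AU7, and AU9 are all active.
disgust_cols = [AU_COLUMNS.index(au) for au in (4, 6, 7, 9)]
disgust_frames = activations[:, disgust_cols].all(axis=1)
print("Number of 'disgust' frames:", int(disgust_frames.sum()))
```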

[Figure 8: The facial activities (active AUs over frames) while watching disgust1, shown for C, LP, IP, and MP.]

[Figure 9: The facial activities (active AUs over frames) while watching neutral2, shown for C, LP, IP, and MP.]

[Figure 10: The quantified facial expressivity (fe) during disgust1 and disgust2, for C, LP, IP, and MP.]

Instead, the cheek raiser alone was mostly displayed (see Figure 9), which indicates that the IP patient has problems controlling combinations of facial muscles.

Analyzing the defined quantities AMC, AI, and FE over the segments with active AUs (see Table 5), the control C had higher facial expressivity than LP, IP, and MP during the disgust state, while the MP showed the lowest facial expressivity. More specifically, compared to C, the PD patients (LP, IP, and MP) had smaller values of the variables AMC, AI, and FE while watching the disgust movie clips, especially for the brow lowerer (AU4), nose wrinkler (AU9), and blink (AU45). C tended to show similar intensities of the brow lowerer (variable e) but with bigger variance (variables AMC and FE). In addition, C showed higher intensities of the nose wrinkler, with larger variance.
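A sketch of how AMC and AI can be computed from per-frame AU intensity traces is given below. AMC follows the paper's definition (the mean absolute frame-to-frame intensity change over each displayed AU's active frames, summed over the displayed AUs); treating AI as the mean intensity over active frames is an assumption made for this sketch, and the input arrays are placeholders.

```python
import numpy as np

def amc_and_ai(intensity, active):
    """Amount of movement change (AMC) and average intensity (AI).

    intensity: (frames, aus) float array of AU intensities in [0, 1].
    active:    (frames, aus) boolean array, True where the AU is detected.
    """
    amc, ai_values = 0.0, []
    for j in range(intensity.shape[1]):
        frames = np.flatnonzero(active[:, j])
        if frames.size == 0:              # AU never displayed: skip it
            continue
        # Mean absolute intensity change over active frames (needs a t+1 frame).
        frames = frames[frames < intensity.shape[0] - 1]
        if frames.size:
            amc += np.abs(intensity[frames + 1, j] - intensity[frames, j]).mean()
        # Mean intensity over the AU's active frames (assumed definition of AI).
        ai_values.append(intensity[active[:, j], j].mean())
    return amc, float(np.mean(ai_values))

# Placeholder traces standing in for the AU intensity estimation output.
rng = np.random.default_rng(3)
intensity = np.clip(rng.normal(0.3, 0.2, size=(700, 11)), 0.0, 1.0)
active = intensity > 0.35
print(amc_and_ai(intensity, active))
```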

3.5. Quantitative Analysis of Facial Expressivity. Figure 10 depicts the values of the facial expressivity of equation (5) for the participants. We did not assess the facial expressivity of the MP patient during disgust2, as he had his hand in front of his face during the experiments. As can be seen, there is a clear difference between C and the PD patients: C obtained the highest score, while the lowest score was obtained by the MP patient. However, the score of the IP patient was slightly higher than that of the LP patient, because the LP and the IP expressed "facial masking" in different ways: the LP patient showed attenuated intensities of facial movements, whereas the IP patient produced high-intensity facial movements not only in the emotional state but also during the neutral state; that is, he cannot relax the facial muscles well. To take both kinds of "facial masking" into account, we computed the facial expressivity using equation (4). The results are shown in Figure 11. As can be seen, the proposed facial expressivity measure distinguishes between the control and the PD patients. Moreover, the facial expressivity decreases as PD severity increases.

[Figure 11: The quantified facial expressivity (fe) based on the improved function, during disgust1 and disgust2, for C, LP, IP, and MP.]

4. Conclusions

This study investigated the phenomenon of facial masking in Parkinson's patients. We designed an automated and objective method to assess the facial expressivity of PD patients. The proposed approach follows the methodology of the Interpersonal Communication Rating Protocol-Parkinson's Disease Version (ICRP-IEB) [9]. Based on the Facial Action Coding System (FACS), we proposed a methodology that (i) automatically detects faces in a video stream, (ii) codes each frame with respect to 11 action units, and (iii) estimates facial expressivity as a function of frequency (how often AUs occur), duration (how long AUs last), and intensity (the strength of AUs).

Although the proposed facial expressivity assessment approach has been evaluated on a limited number of subjects, it captures differences in facial expressivity between control participants and Parkinson's patients. Moreover, facial expressivity differences between PD patients at different stages of Parkinson's disease have been assessed. The proposed method can capture these differences and give a more fine-grained assessment of facial expressivity for Parkinson's patients than traditional observer-based ratings allow. This suggests that our approach can be used for clinical assessment of facial expressivity in PD. Indeed, nonverbal signals contribute significantly to interpersonal communication, and facial expressivity, a major source of nonverbal information, is compromised in Parkinson's disease. The resulting disconnect between subjective feeling and objective facial affect can lead people to form negative and inaccurate impressions of people with PD with respect to their personality, feelings, and intelligence. Objectively assessing the facial expressivity limitations of PD would support the development of personalized therapy to improve facial expressivity in PD. Parkinson's is a progressive disease, which means that the symptoms worsen over time; the proposed assessment approach would allow therapists and clinicians to assess facial expressivity regularly and explore treatment options.

Future work will (i) improve the AU recognition system and extend it to more AUs and (ii) consider clinical usage of the proposed approach in a statistically significant population of PD patients.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no. 2011629010) and the VUB interdisciplinary research project "Objectifying Assessment of Human Emotional Processing and Expression for Clinical and Psychotherapy Applications (EMO-App)". The authors gratefully acknowledge the Flemish Parkinson League for supporting the current study and the recruitment of the PD patients. They also acknowledge Floriane Verbraeck for the setup of the experiments and the data acquisition.

References

[1] M. Katsikitis and I. Pilowsky, "A study of facial expression in Parkinson's disease using a novel microcomputer-based method," Journal of Neurology, Neurosurgery and Psychiatry, vol. 51, no. 3, pp. 362–366, 1988.
[2] G. Simons, H. Ellgring, and M. C. Smith Pasqualini, "Disturbance of spontaneous and posed facial expressions in Parkinson's disease," Cognition and Emotion, vol. 17, no. 5, pp. 759–778, 2003.
[3] D. H. Jacobs, J. Shuren, D. Bowers, and K. M. Heilman, "Emotional facial imagery, perception, and expression in Parkinson's disease," Neurology, vol. 45, no. 9, pp. 1696–1702, 1995.
[4] L. Tickle-Degnen and K. D. Lyons, "Practitioners' impressions of patients with Parkinson's disease: the social ecology of the expressive mask," Social Science & Medicine, vol. 58, no. 3, pp. 603–614, 2004.
[5] J. J. van Hilten, A. D. van der Zwan, A. H. Zwinderman, and R. A. C. Roos, "Rating impairment and disability in Parkinson's disease: evaluation of the Unified Parkinson's Disease Rating Scale," Movement Disorders, vol. 9, no. 1, pp. 84–88, 1994.
[6] D. Bowers, K. Miller, W. Bosch et al., "Faces of emotion in Parkinson's disease: micro-expressivity and bradykinesia during voluntary facial expressions," Journal of the International Neuropsychological Society, vol. 12, no. 6, pp. 765–773, 2006.
[7] P. Ekman and W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, Palo Alto, Calif, USA, 1978.
[8] P. Wu, D. Jiang, and H. Sahli, "Physiological signal processing for emotional feature extraction," in Proceedings of the International Conference on Physiological Computing Systems, pp. 40–47, 2014.
[9] L. Tickle-Degnen, The Interpersonal Communication Rating Protocol: A Manual for Measuring Individual Expressive Behavior, 2010.
[10] G. Sandbach, S. Zafeiriou, M. Pantic, and L. Yin, "Static and dynamic 3D facial expression recognition: a comprehensive survey," Image and Vision Computing, vol. 30, no. 10, pp. 683–697, 2012.
[11] I. Gonzalez, H. Sahli, V. Enescu, and W. Verhelst, "Context-independent facial action unit recognition using shape and Gabor phase information," in Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction (ACII '11), vol. 1, pp. 548–557, Springer, Memphis, Tenn, USA, October 2011.
[12] J. C. Platt, "Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods," in Advances in Large Margin Classifiers, A. J. Smola, P. Bartlett, B. Schoelkopf, and D. Schuurmans, Eds., pp. 61–74, The MIT Press, 1999.
[13] J. J. Gross and R. W. Levenson, "Emotion elicitation using films," Cognition and Emotion, vol. 9, no. 1, pp. 87–108, 1995.
[14] D. Hagemann, E. Naumann, S. Maier, G. Becker, A. Lurken, and D. Bartussek, "The assessment of affective reactivity using films: validity, reliability and sex differences," Personality and Individual Differences, vol. 26, no. 4, pp. 627–639, 1999.
[15] C. L. Lisetti and F. Nasoz, "Using noninvasive wearable computers to recognize human emotions from physiological signals," EURASIP Journal on Advances in Signal Processing, vol. 2004, Article ID 929414, pp. 1672–1687, 2004.
[16] J. Hewig, D. Hagemann, J. Seifert, M. Gollwitzer, E. Naumann, and D. Bartussek, "A revised film set for the induction of basic emotions," Cognition and Emotion, vol. 19, no. 7, pp. 1095–1109, 2005.
[17] J. H. Westerink, E. L. van den Broek, M. H. Schut, J. van Herk, and K. Tuinenbreijer, "Probing experience," in Assessment of User Emotions and Behaviour to Development of Products, T. J. O. W. F. P. Joyce, H. D. M. Westerink, M. Ouwerkerk, and B. de Ruyter, Eds., Springer, New York, NY, USA, 2008.
[18] A. Schaefer, F. Nils, P. Philippot, and X. Sanchez, "Assessing the effectiveness of a large database of emotion-eliciting films: a new tool for emotion researchers," Cognition and Emotion, vol. 24, no. 7, pp. 1153–1172, 2010.
[19] F. Verbraeck, Objectifying human facial expressions for clinical applications [M.S. thesis], Vrije Universiteit Brussel, Belgium, 2012.
[20] O. Alzoubi, Automatic affect detection from physiological signals: practical issues [Ph.D. thesis], The University of Sydney, New South Wales, Australia, 2012.
[21] S. D. Kreibig, "Autonomic nervous system activity in emotion: a review," Biological Psychology, vol. 84, no. 3, pp. 394–421, 2010.
[22] F. D. Farfan, J. C. Politti, and C. J. Felice, "Evaluation of EMG processing techniques using information theory," BioMedical Engineering Online, vol. 9, article 72, 2010.
[23] N. Ambady, F. J. Bernieri, and J. A. Richeson, "Toward a histology of social behavior: judgmental accuracy from thin slices of the behavioral stream," Advances in Experimental Social Psychology, vol. 32, pp. 201–271, 2000.
[24] N. Ambady and R. Rosenthal, "Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis," Psychological Bulletin, vol. 111, no. 2, pp. 256–274, 1992.
[25] K. D. Lyons and L. Tickle-Degnen, "Reliability and validity of a videotape method to describe expressive behavior in persons with Parkinson's disease," The American Journal of Occupational Therapy, vol. 59, no. 1, pp. 41–49, 2005.
[26] N. A. Murphy, "Using thin slices for behavioral coding," Journal of Nonverbal Behavior, vol. 29, no. 4, pp. 235–246, 2005.
[27] A. O. Andrade, S. Nasuto, P. Kyberd, C. M. Sweeney-Reed, and F. R. van Kanijn, "EMG signal filtering based on empirical mode decomposition," Biomedical Signal Processing and Control, vol. 1, no. 1, pp. 44–55, 2006.
[28] M. Blanco-Velasco, B. Weng, and K. E. Barner, "ECG signal denoising and baseline wander correction based on the empirical mode decomposition," Computers in Biology and Medicine, vol. 38, no. 1, pp. 1–13, 2008.
[29] A. O. Boudraa, J. C. Cexus, and Z. Saidi, "EMD-based signal noise reduction," Signal Processing, vol. 1, pp. 33–37, 2005.
[30] T. Jing-tian, Z. Qing, T. Yan, L. Bin, and Z. Xiao-kai, "Hilbert-Huang transform for ECG de-noising," in Proceedings of the 1st International Conference on Bioinformatics and Biomedical Engineering, pp. 664–667, July 2007.
[31] A. Karagiannis and P. Constantinou, "Noise components identification in biomedical signals based on empirical mode decomposition," in Proceedings of the 9th International Conference on Information Technology and Applications in Biomedicine (ITAB '09), pp. 1–4, November 2009.
[32] Y. Kopsinis and S. McLaughlin, "Development of EMD-based denoising methods inspired by wavelet thresholding," IEEE Transactions on Signal Processing, vol. 57, no. 4, pp. 1351–1362, 2009.
[33] F. Agrafioti, D. Hatzinakos, and A. K. Anderson, "ECG pattern analysis for emotion detection," IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 102–115, 2012.
[34] D. L. Donoho, "De-noising by soft-thresholding," IEEE Transactions on Information Theory, vol. 41, no. 3, pp. 613–627, 1995.
[35] B.-U. Kohler, C. Hennig, and R. Orglmeister, "The principles of software QRS detection," IEEE Engineering in Medicine and Biology Magazine, vol. 21, no. 1, pp. 42–57, 2002.
[36] V. Guralnik and J. Srivastava, "Event detection from time series data," in Knowledge Discovery and Data Mining, pp. 33–42, 1999.
[37] M. Hamedi, S.-H. Salleh, and T. T. Swee, "Surface electromyography-based facial expression recognition in bi-polar configuration," Journal of Computer Science, vol. 7, no. 9, pp. 1407–1415, 2011.
[38] Y. Hou, H. Sahli, R. Ilse, Y. Zhang, and R. Zhao, "Robust shape-based head tracking," in Advanced Concepts for Intelligent Vision Systems, vol. 4678 of Lecture Notes in Computer Science, pp. 340–351, Springer, 2007.
[39] T. Kanade, J. F. Cohn, and Y. Tian, "Comprehensive database for facial expression analysis," in Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, pp. 46–53, Grenoble, France, 2000.
[40] M. S. Bartlett, G. C. Littlewort, M. G. Frank, C. Lainscsek, I. R. Fasel, and J. R. Movellan, "Automatic recognition of facial actions in spontaneous expressions," Journal of Multimedia, vol. 1, no. 6, pp. 22–35, 2006.
[41] A. Savran, B. Sankur, and M. Taha Bilge, "Regression-based intensity estimation of facial action units," Image and Vision Computing, vol. 30, no. 10, pp. 774–784, 2012.
[42] Anvil: the video annotation tool, http://www.anvil-software.org/.
[43] S. Vicente, J. Peron, I. Biseul et al., "Subjective emotional experience at different stages of Parkinson's disease," Journal of the Neurological Sciences, vol. 310, no. 1-2, pp. 241–247, 2011.
[44] A. van Boxtel, "Facial EMG as a tool for inferring affective states," in Proceedings of Measuring Behavior 2010, A. J. Spink, F. Grieco, O. Krips, L. Loijens, L. Noldus, and P. Zimmeran, Eds., pp. 104–108, Noldus Information Technology, Wageningen, The Netherlands, 2010.
[45] C.-N. Huang, C.-H. Chen, and H.-Y. Chung, "The review of applications and measurements in facial electromyography," Journal of Medical and Biological Engineering, vol. 25, no. 1, pp. 15–20, 2005.
[46] R. W. Levenson, P. Ekman, K. Heider, and W. V. Friesen, "Emotion and autonomic nervous system activity in the Minangkabau of West Sumatra," Journal of Personality and Social Psychology, vol. 62, no. 6, pp. 972–988, 1992.
[47] S. Rohrmann and H. Hopp, "Cardiovascular indicators of disgust," International Journal of Psychophysiology, vol. 68, no. 3, pp. 201–208, 2008.
[48] O. Alaoui-Ismaïli, O. Robin, H. Rada, A. Dittmar, and E. Vernet-Maury, "Basic emotions evoked by odorants: comparison between autonomic responses and self-evaluation," Physiology and Behavior, vol. 62, no. 4, pp. 713–720, 1997.
[49] J. Gruber, S. L. Johnson, C. Oveis, and D. Keltner, "Risk for mania and positive emotional responding: too much of a good thing?" Emotion, vol. 8, no. 1, pp. 23–33, 2008.
[50] W. Gaebel and W. Wolwer, "Facial expressivity in the course of schizophrenia and depression," European Archives of Psychiatry and Clinical Neuroscience, vol. 254, no. 5, pp. 335–342, 2004.
[51] P. Ekman, W. Friesen, and J. Hager, Facial Action Coding System: The Manual, 2002.



3 Results and Discussion

31 Manipulation Check Participantsrsquo emotion was suc-cessfully elicited using the movie clips of disgust Forboth disgust1 and disgust2 most participants (1415) self-reported the target emotion as their dominated emotionFor the other video clips the reported emotions are as fol-lows amusement1 (915) amusement2 (1215) surprise1(1115) surprise2 (215) anger1 (615) anger2 (915) fear1(1015) fear2 (1115) neutral1 (1315) and neutral2 (1315)Annotation of the video records using the Anvil annotationtool [42] further confirmed that the movie clips of disgustinduced the most reliable emotional data Therefore forfurther analysis we decided to use only the data recordedduring watching the disgust movie clips

The Wilcoxon rank sum tests were implemented on the14 disgusting self-reports Results showed that there wasno significant difference in self-reported emotional ratingsbetween disgust1 (119872 = 650 SD = 76) and disgust2(119872 = 664 SD = 63) and between PD group (119872 =651 SD = 65) and control group (119872 = 664 SD =74) This is what we expected since Vicente et al [43] alsoreported that PD patients at different stages of the disease didnot significantly differ from the controls in the self-reportedemotional experience to presented movie clips

32 Univariate Analysis of Group Effects TheEMGdata from3 participants (1 control and 2 PD) and the ECG data from 10participants (4 control and 6 PD) were discarded because ofnot being well recorded due to sensor malfunctions Results

of the repeated-measures ANOVAs on single physiologicalvariables showed that significant main effect of group on theLLS activity (119865(1 12) = 338 119875 = 09) the OO activity(119865(1 12) = 192 119875 = 19) HR (119865(1 3) = 93 119875 = 41)and HRV (119865(1 3) = 53 119875 = 83) was not found Table 4shows the descriptive data for control and PD groups duringexposure to the three movie clips disgust1 disgust2 andneutral2 and the tests of significance comparison betweengroups using Wilcoxon rank sum tests The Wilcoxon ranksum tests revealed that

(1) comparable levels of baseline activity in control andPD groups over both the LLS and the OOwere found

(2) although disgust1 elicited more muscle activity overthe LLS and the OO for control group than forPD group this difference did not reach statisticalsignificance

(3) disgust2 elicited significantly more LLS activity forcontrol than for PD

These results indicated that PD displayed less muscleactivity over the LLS when expressing disgust than controlIn addition disgust2 induced more muscle activity over theLLS and the OO than disgust1 which is consistent with theself-report andmay be due to the fact that disgust2 is slightlymore disgusting than disgust1

33 Univariate Analysis of Stimuli Effects For LLS we foundan effect of the stimuli on muscle activity (119865(2 24) = 947119875 = 001) Post-hoc tests indicated significant differencesbetween disgust1 and neutral2 (119875 = 01) and betweendisgust2 and neutral2 (119875 = 01) Both disgust1 (119872 = 282SD = 192) and disgust2 (119872 = 537 SD = 543) elicitedmore muscle activity than neutral2 (119872 = 110 SD = 38)Thismain effect was qualified by a stimulitimes group interaction(119865(2 24) = 417 119875 = 028) which was consistent with theresults of the Wilcoxon rank sum tests which compared thephysiological responses between groups

For OO we also found an effect of the stimuli on muscleactivity (119865(2 24) = 545 119875 = 012) Post-hoc tests indicatedsignificant difference only between disgust1 and neutral2(119875 = 002)Thedisgust1 (119872 = 232 SD = 157) elicitedmoremuscle activity than neutral2 (119872 = 135 SD = 111) Nosignificant stimuli times group interaction effect (119865(2 24) = 177119875 = 192) was found

We expected that the disgust clips elicited significantlymore muscle activity over LLS than the neutral clip becausenormally LLS is involved in producing the disgust facialexpression [44] The fact that OO was also significantlydifferent was probably due to the following

(1) crosstalk [45] the LLS and the OO lie in the vicinityof each other

(2) the disgust1 elicited is not only a disgust but alsoa bit of amusement by the funny motion of thecharacter and the background music so that the OOwas also involved in producing the facial expressionof amusement

8 Computational and Mathematical Methods in Medicine

Table 4 Statistical summary of physiological measures

Variable Stimuli Control PD SigMean SD Mean SD

LLSNeutral2 104 34 116 43 71Disgust1 336 204 227 177 26Disgust2 804 649 271 225 04

lowast

OONeutral2 135 111 94 54 81Disgust1 282 177 181 128 32Disgust2 520 571 207 148 13

HRNeutral2 172 277 276 mdash mdashDisgust1 minus53 264 512 mdash mdashDisgust2 222 553 565 mdash mdash

HRVNeutral2 153 1296 minus765 mdash mdashDisgust1 801 1145 2634 mdash mdashDisgust2 1486 3564 1492 mdash mdash

lowastThe two groups are significantly different that is 119875 lt 05ldquomdashrdquoWe did not compare the cardiac responses (ie HR and HRV) between groups because due to technical failures we lost the ECG data for 10 participantsand thus only 5 participants (4 controls and 1 PD) completed the ECG recording

Table 5 Facial expressivity assessment based on different methods

Var Clowast LPlowast IPlowast MPlowast

1198631lowast 1198632lowast 1198732lowast 1198631lowast 1198632lowast 1198732lowast 1198631lowast 1198632lowast 1198732lowast 1198631lowast 1198632lowast 1198732lowast

TFA 8 8 4 5 5 4 5 6 4 4 mdash 5AMCloz 139 277 108 67 65 43 82 62 69 39 mdash 82AI 54 51 18 28 28 22 38 39 31 22 mdash 33FEloz 722 751 246 209 187 142 444 356 262 135 mdash 322lowast119863111986321198732 C LP IP and MP denote disgust1 disgust2 neutral2 the control the least intermediate and most severely form of Parkinsonrsquos patients

respectivelylozpresented in percentagesldquomdashrdquoThe face of the MP while watching1198632 was blocked by his hand

Note that disgust2 (119872 = 364 SD = 432) elicitedmore muscle activity over OO (because of crosstalk) thanneutral2 (119872 = 114 SD = 87) however unlike disgust1the difference did not reach statistical significance (becausedisgust2 did not elicit amusement at all) which can also beinterpreted

Moreover the main effect of stimuli on cardiac parame-ters for both HR (119865(2 6) = 37 119875 = 70) and HRV (119865(2 6) =084 119875 = 48) was not found which is not consistent withwhat we expected unchanged [46 47] or increased [48 49]HR and increased HRV [21 47] This may be due to thefact that we did not have enough recorded ECG data for astatistical analysis

34 Qualitative Analysis of Facial Expressivity To qualita-tively analyze facial expressivity we used the total facialactivity TFA measure of [2 50] being the total numberof displayed AUs in response to the stimuli Compared tothe control (C) the PD groups (LP IP and MP) showedthe attenuation of their facial activities (variable TFA) whilewatching disgusting movie clips (see Table 5) Howevercomparable TFA was found while watching neutral movieclips

Visual inspection of the of the displayed AUs as illus-trated in Figure 8 shows that AU1 AU2 AU6 AU9 andAU45 occurred more frequently for C except for IP whoproduced more frequently AU6 than C A possible expla-nation is that the IP cannot deactivate AU6 even duringneutral state As shown in Figure 9 during neutral state PDpatients produced more continuous active action units suchas AU4 and AU25 (for all PD) AU6 (for IP) and AU9 (forMP) Moreover certain AU combinations were consideredlikely to signify ldquodisgustrdquo on the basis of Ekman and Friesenrsquosdescription of emotions [7] ldquoDisgustrdquo AUs were consideredas the combination of eyebrows lowerer (AU4) cheek raiser(AU6) and nose wrinkler (AU9) Because cheek raiser isvery difficult to produce on demand without including otherAUs especially the lid tightener (AU7) [51] lid tightenerwas also taken into consideration that is the expected AUpattern of ldquodisgustrdquo was the combination of AU4 AU6 AU7and AU9 Consistent with what we expected C displayed98 and 87 ldquodisgustrdquo frames while watching disgust1 anddisgust2 respectively The PD goup (LP IP and MP) didnot display any ldquodisgustrdquo frames only for IP who had4 ldquodisgustrdquo frames while watching disgust1 Furthermorethe IP displayed quite few frames with combinations ofcheek raiser and lid tightener during neutral state Instead

Computational and Mathematical Methods in Medicine 9

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame0 100 200 300 400 500 600 700

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 1 2 4 6 7 9

122023252745

AU

Disgust1 (C) Disgust1 (LP)

Disgust1 (IP) Disgust1 (MP)

Figure 8 The facial activities while watching disgust1

N(C) N(LP)

N(IP)

Frame

Frame Frame

Frame

N(MP)

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

Figure 9 The facial activities while watching neutral2

10 Computational and Mathematical Methods in Medicine

0

5

10

15

20

25

30

35

fe

C

LP

IP

MP

Disgust1 Disgust2

Figure 10 The quantified facial expressivity

the cheek raiser alone was displayed mostly (see Figure 9)which indicates that IP patients have control problems forfacial muscles combinations

Analyzing the defined quantities AMC AI and FE usingthe segments with active AUs as it can be seen from Table 5the control C had higher facial expressivity than LP IP andMP during the disgusting state while MP performed thelowest facial expressivity More specifically compared to CPD (LP IP andMP) had smaller values of variables AMC AIand FE while watching disgusting movie clips especially forbrow lower (AU4) nose wrinkler (AU9) and blink (AU45)C tended to show similar intensities of brow lower (variable119890) with bigger variance (variables AMC and FE) In additionC showed higher intensities of nose wrinkler with largervariance

35 Quantitative Analysis of Facial Expressivity Figure 10depicts the values of the facial expressivity equation (5) forthe participants We did not assess the facial expressivityof the MP patient during disgust2 as he had his hand onfront of his face during the experiments As it can be seen asignificant difference between C and PD patients is presentC got the highest score while the lowest score was obtainedby the MP patient However the score of the IP patient wasslightly higher than the LP one which is due to the fact thatthe LP and IP expressed ldquofacial maskingrdquo in different waysthe LP patient showed the attenuation of the intensities offacial movements On the contrary the IP patient producedhigh intensities of facial movements not only in his emotionalstate but also during neutral state that is he cannot relax themuscles well In order to take both kinds of ldquofacial maskingrdquointo account we computed the facial expressivity using (4)The results are shown in Figure 11 As it can be seen theproposed facial expressivity allows distinguishing betweencontrol and PD patients Moreover the facial expressivitydecreases along with the increase of PD severity

0

5

10

15

20

25

fe

CLP

IP

MP

Disgust1 Disgust2minus5

Figure 11 The quantified facial expressivity based on the improvedfunction

4 Conclusions

This study investigated the phenomenon of facial masking inParkinsonrsquos patientsWedesigned an automated and objectivemethod to assess the facial expressivity of PD patients Theproposed approach follows the methodology of the Inter-personal Communication Rating Protocol-Parkinsonrsquos DiseaseVersion (ICRP-IEB) [9] In this study based on the FacialAction Coding System (FACS) we proposed a methodologythat (i) automatically detects faces in a video stream (ii) codeseach framewith respect to 11 action units and (iii) estimates afacial expressivity as function of frequency that is how oftenAUs occur duration that is how long AUs last and intensitybeing the strength of AUs

Although the proposed facial expressivity assessmentapproach has been evaluated in a limited number of subjectsit allows capturing differences in facial expressivity betweencontrol participants andParkinsonrsquos patientsMoreover facialexpressivity differences between PD patients with differ-ent progression of Parkinsonrsquos disease have been assessedThe proposed method can capture these differences andgive a more accurate assessment of facial expressivity forParkinsonrsquos patients than what traditional observer basedratings allow for This confirms that our approach can beused for clinical assessment of facial expressivity in PDIndeed nonverbal signals contribute significantly to inter-personal communication Facial expressivity a major sourceof nonverbal information is compromised in Parkinsonrsquosdisease The resulting disconnect between subjective feelingand objective facial affect can lead people to form negativeand inaccurate impressions of people with PD with respectto their personality feelings and intelligence Assessing inan objective way the facial expressivity limitation of PDwould allow developing personalized therapy to benefit facialexpressivity in PD IndeedParkinsonrsquos is a progressive diseasewhich means that the symptoms will get worse as time

Computational and Mathematical Methods in Medicine 11

goes on Using the proposed assessment approach wouldallow regular facial expressivity assessment by therapist andclinicians to explore treatment options

Future work will (i) improve the AU recognition systemand extend it to more AUs and (ii) consider clinical usageof the proposed approach in a statistically significant PDpatients population

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no2011629010) and the VUB-interdisciplinary research projectldquoObjectifying Assessment of Human Emotional Processingand Expression for Clinical and Psychotherapy Applications(EMO-App)rdquo The authors gratefully acknowledge the Flem-ish Parkinson League for supporting the current study andthe recruitment of the PD patients Also they acknowledgeFloriane Verbraeck for the setup of the experiments and dataacquisition

References

[1] M Katsikitis and I Pilowsky ldquoA study of facial expressionin Parkinsonrsquos disease using a novel microcomputer-basedmethodrdquo Journal of Neurology Neurosurgery and Psychiatry vol51 no 3 pp 362ndash366 1988

[2] G Simons H Ellgring and M C Smith Pasqualini ldquoDistur-bance of spontaneous and posed facial expressions in Parkin-sonrsquos diseaserdquoCognition and Emotion vol 17 no 5 pp 759ndash7782003

[3] D H Jacobs J Shuren D Bowers and K M Heilman ldquoEmo-tional facial imagery perception and expression in Parkinsonsdiseaserdquo Neurology vol 45 no 9 pp 1696ndash1702 1995

[4] L Tickle-Degnen and K D Lyons ldquoPractitionersrsquo impressionsof patients with Parkinsonrsquos disease the social ecology of theexpressive maskrdquo Social Science amp Medicine vol 58 no 3 pp603ndash614 2004

[5] J J van Hilten A D van der Zwan A H Zwinderman and RA C Roos ldquoRating impairment and disability in Parkinsonsdisease evaluation of the unified Parkinsons disease ratingscalerdquoMovement Disorders vol 9 no 1 pp 84ndash88 1994

[6] D Bowers K Miller W Bosch et al ldquoFaces of emotionin Parkinsons disease micro-expressivity and bradykinesiaduring voluntary facial expressionsrdquo Journal of the InternationalNeuropsychological Society vol 12 no 6 pp 765ndash773 2006

[7] P Ekman and W Friesen Facial Action Coding System ATechnique for the Measurement of Facial Movement ConsultingPsychologists Press Palo Alto Calif USA 1978

[8] P Wu D Jiang and H Sahli ldquoPhysiological signal processingfor emotional feature extractionrdquo in Proceedings of the Interna-tional Conference on Physiological Computing Systems pp 40ndash47 2014

[9] L Tickle-Degnen The Interpersonal Communication RatingProtocol A Manual for Measuring Individual Expressive Behav-ior 2010

[10] G Sandbach S Zafeiriou M Pantic and L Yin ldquoStatic anddynamic 3D facial expression recognition a comprehensivesurveyrdquo Image and Vision Computing vol 30 no 10 pp 683ndash697 2012

[11] I Gonzalez H Sahli V Enescu and W Verhelst ldquoContext-independent facial action unit recognition using shape andgabor phase informationrdquo in Proceedings of the 4th InternationalConference on Affective Computing and Intelligent Interaction(ACII rsquo11) vol 1 pp 548ndash557 Springer Memphis Tenn USAOctober 2011

[12] J C Platt ldquoProbabilistic outputs for support vector machinesand comparisons to regularized likelihood methodsrdquo inAdvances in Large Margin Classifiers A J Smola P Bartlett BSchoelkopf andD Schuurmans Eds pp 61ndash74TheMITPress1999

[13] J J Gross and RW Levenson ldquoEmotion elicitation using filmsrdquoCognition and Emotion vol 9 no 1 pp 87ndash108 1995

[14] D Hagemann E Naumann S Maier G Becker A Lurkenand D Bartussek ldquoThe assessment of affective reactivity usingfilms validity reliability and sex differencesrdquo Personality andIndividual Differences vol 26 no 4 pp 627ndash639 1999

[15] C L Lisetti and F Nasoz ldquoUsing noninvasive wearable comput-ers to recognize human emotions from physiological signalsrdquoEURASIP Journal on Advances in Signal Processing vol 2004Article ID 929414 pp 1672ndash1687 2004

[16] J Hewig D Hagemann J Seifert M Gollwitzer E Naumannand D Bartussek ldquoA revised film set for the induction of basicemotionsrdquo Cognition and Emotion vol 19 no 7 pp 1095ndash11092005

[17] J H Westerink E L van den Broek M H Schut J van Herkand K Tuinenbreijer ldquoProbing experiencerdquo in Assessment ofUser Emotions and Behaviour to Development of Products T JO W F P Joyce H D M Westerink M Ouwerkerk and B deRuyter Eds Springer New York NY USA 2008

[18] A Schaefer F Nils P Philippot and X Sanchez ldquoAssessing theeffectiveness of a large database of emotion-eliciting films a newtool for emotion researchersrdquo Cognition and Emotion vol 24no 7 pp 1153ndash1172 2010

[19] F Verbraeck Objectifying human facial expressions for clinicalapplications [MS thesis] Vrije Universiteit Brussel Belgium2012

[20] O AlzoubiAutomatic affect detection from physiological signalspractical issues [PhD thesis] The University of Sydney NewSouth Wales Australia 2012

[21] S D Kreibig ldquoAutonomic nervous system activity in emotiona reviewrdquo Biological Psychology vol 84 no 3 pp 394ndash421 2010

[22] F D Farfan J C Politti and C J Felice ldquoEvaluation of emgprocessing techniques using information theoryrdquo BioMedicalEngineering Online vol 9 article 72 2010

[23] N Ambady F J Bernieri and J A Richeson ldquoToward ahistology of social behavior judgmental accuracy from thinslices of the behavioral streamrdquoAdvances in Experimental SocialPsychology vol 32 pp 201ndash271 2000

[24] N Ambady and R Rosenthal ldquoThin slices of expressivebehavior as predictors of interpersonal consequences a meta-analysisrdquo Psychological Bulletin vol 111 no 2 pp 256ndash274 1992

[25] K D Lyons and L Tickle-Degnen ldquoReliability and validity ofa videotape method to describe expressive behavior in personswith Parkinsonrsquos diseaserdquoTheAmerican Journal of OccupationalTherapy vol 59 no 1 pp 41ndash49 2005

[26] N AMurphy ldquoUsing thin slices for behavioral codingrdquo Journalof Nonverbal Behavior vol 29 no 4 pp 235ndash246 2005

12 Computational and Mathematical Methods in Medicine

[27] A O Andrade S Nasuto P Kyberd C M Sweeney-Reed andF R vanKanijn ldquoEMG signal filtering based on empiricalmodedecompositionrdquo Biomedical Signal Processing and Control vol1 no 1 pp 44ndash55 2006

[28] M Blanco-Velasco B Weng and K E Barner ldquoECG signaldenoising and baseline wander correction based on the empir-ical mode decompositionrdquo Computers in Biology and Medicinevol 38 no 1 pp 1ndash13 2008

[29] A O Boudraa J C Cexus and Z Saidi ldquoEmd-based signalnoise reductionrdquo Signal Processing vol 1 pp 33ndash37 2005

[30] T Jing-tian Z Qing T Yan L Bin and Z Xiao-kai ldquoHilbert-Huang transform for ECG de-noisingrdquo in Proceedings of the1st International Conference on Bioinformatics and BiomedicalEngineering pp 664ndash667 July 2007

[31] A Karagiannis and P Constantinou ldquoNoise components identi-fication in biomedical signals based on empirical mode decom-positionrdquo in Proceedings of the 9th International Conference onInformation Technology and Applications in Biomedicine (ITABrsquo09) pp 1ndash4 November 2009

[32] Y Kopsinis and S McLaughlin ldquoDevelopment of EMD-baseddenoising methods inspired by wavelet thresholdingrdquo IEEETransactions on Signal Processing vol 57 no 4 pp 1351ndash13622009

[33] F Agrafioti D Hatzinakos and A K Anderson ldquoECG patternanalysis for emotion detectionrdquo IEEE Transactions on AffectiveComputing vol 3 no 1 pp 102ndash115 2012

[34] D L Donoho ldquoDe-noising by soft-thresholdingrdquo IEEE Trans-actions on Information Theory vol 41 no 3 pp 613ndash627 1995

[35] B-U Kohler C Hennig and R Orglmeister ldquoThe principlesof software QRS detectionrdquo IEEE Engineering in Medicine andBiology Magazine vol 21 no 1 pp 42ndash57 2002

[36] V Guralnik and J Srivastava ldquoEvent detection from time seriesdatardquo inKnowledge Discovery andDataMining pp 33ndash42 1999

[37] M Hamedi S-H Salleh and T T Swee ldquoSurfaceelectromyography-based facial expression recognition inBi-polar configurationrdquo Journal of Computer Science vol 7 no9 pp 1407ndash1415 2011

[38] Y Hou H Sahli R Ilse Y Zhang and R Zhao ldquoRobust shape-based head trackingrdquo inAdvanced Concepts for Intelligent VisionSystems vol 4678 ofLectureNotes inComputer Science pp 340ndash351 Springer 2007

[39] T Kanade J F Cohn and Y Tian ldquoComprehensive databasefor facial expression analysisrdquo in Proceedings of the IEEE Inter-national Conference on Automatic Face and Gesture Recognitionpp 46ndash53 Grenoble France 2000

[40] M S Bartlett G C Littlewort M G Frank C Lainscsek IR Fasel and J R Movellan ldquoAutomatic recognition of facialactions in spontaneous expressionsrdquo Journal of Multimedia vol1 no 6 pp 22ndash35 2006

[41] A Savran B Sankur and M Taha Bilge ldquoRegression-basedintensity estimation of facial action unitsrdquo Image and VisionComputing vol 30 no 10 pp 774ndash784 2012

[42] Anvil the video annotation tool httpwwwanvil-soft-wareorg

[43] S Vicente J Peron I Biseul et al ldquoSubjective emotionalexperience at different stages of Parkinsonrsquos diseaserdquo Journal ofthe Neurological Sciences vol 310 no 1-2 pp 241ndash247 2011

[44] A van Boxtel ldquoFacial emg as a tool for inferring affectivestatesrdquo in Proceedings of Measuring Behavior 2010 A J Spink FGrieco O Krips L Loijens L Noldus and P Zimmeran Edspp 104ndash108 Noldus Information TechnologyWageningenTheNetherlands 2010

[45] C-N Huang C-H Chen and H-Y Chung ldquoThe review ofapplications and measurements in facial electromyographyrdquoJournal of Medical and Biological Engineering vol 25 no 1 pp15ndash20 2005

[46] R W Levenson P Ekman K Heider and W V FriesenldquoEmotion and autonomic nervous system activity in theminangkabau of west sumatrardquo Journal of Personality and SocialPsychology vol 62 no 6 pp 972ndash988 1992

[47] S Rohrmann and H Hopp ldquoCardiovascular indicators ofdisgustrdquo International Journal of Psychophysiology vol 68 no3 pp 201ndash208 2008

[48] OAlaoui-Ismaıli O RobinH Rada ADittmar andEVernet-Maury ldquoBasic emotions evoked by odorants comparisonbetween autonomic responses and self-evaluationrdquo Physiologyand Behavior vol 62 no 4 pp 713ndash720 1997

[49] J Gruber S L Johnson C Oveis and D Keltner ldquoRisk formania and positive emotional responding too much of a goodthingrdquo Emotion vol 8 no 1 pp 23ndash33 2008

[50] W Gaebel and W Wolwer ldquoFacial expressivity in the course ofschizophrenia and depressionrdquo European Archives of Psychiatryand Clinical Neuroscience vol 254 no 5 pp 335ndash342 2004

[51] P EkmanW Friesen and J Hager Facial Action Coding SystemThe Manual 2002

Submit your manuscripts athttpwwwhindawicom

Stem CellsInternational

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MEDIATORSINFLAMMATION

of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Behavioural Neurology

EndocrinologyInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Disease Markers

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

BioMed Research International

OncologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Oxidative Medicine and Cellular Longevity

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

PPAR Research

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Immunology ResearchHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

ObesityJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational and Mathematical Methods in Medicine

OphthalmologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Diabetes ResearchJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Research and TreatmentAIDS

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Gastroenterology Research and Practice

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Parkinsonrsquos Disease

Evidence-Based Complementary and Alternative Medicine

Volume 2014Hindawi Publishing Corporationhttpwwwhindawicom

8 Computational and Mathematical Methods in Medicine

Table 4 Statistical summary of physiological measures

Variable Stimuli Control PD SigMean SD Mean SD

LLSNeutral2 104 34 116 43 71Disgust1 336 204 227 177 26Disgust2 804 649 271 225 04

lowast

OONeutral2 135 111 94 54 81Disgust1 282 177 181 128 32Disgust2 520 571 207 148 13

HRNeutral2 172 277 276 mdash mdashDisgust1 minus53 264 512 mdash mdashDisgust2 222 553 565 mdash mdash

HRVNeutral2 153 1296 minus765 mdash mdashDisgust1 801 1145 2634 mdash mdashDisgust2 1486 3564 1492 mdash mdash

lowastThe two groups are significantly different that is 119875 lt 05ldquomdashrdquoWe did not compare the cardiac responses (ie HR and HRV) between groups because due to technical failures we lost the ECG data for 10 participantsand thus only 5 participants (4 controls and 1 PD) completed the ECG recording

Table 5 Facial expressivity assessment based on different methods

Var Clowast LPlowast IPlowast MPlowast

1198631lowast 1198632lowast 1198732lowast 1198631lowast 1198632lowast 1198732lowast 1198631lowast 1198632lowast 1198732lowast 1198631lowast 1198632lowast 1198732lowast

TFA 8 8 4 5 5 4 5 6 4 4 mdash 5AMCloz 139 277 108 67 65 43 82 62 69 39 mdash 82AI 54 51 18 28 28 22 38 39 31 22 mdash 33FEloz 722 751 246 209 187 142 444 356 262 135 mdash 322lowast119863111986321198732 C LP IP and MP denote disgust1 disgust2 neutral2 the control the least intermediate and most severely form of Parkinsonrsquos patients

respectivelylozpresented in percentagesldquomdashrdquoThe face of the MP while watching1198632 was blocked by his hand

Note that disgust2 (119872 = 364 SD = 432) elicitedmore muscle activity over OO (because of crosstalk) thanneutral2 (119872 = 114 SD = 87) however unlike disgust1the difference did not reach statistical significance (becausedisgust2 did not elicit amusement at all) which can also beinterpreted

Moreover the main effect of stimuli on cardiac parame-ters for both HR (119865(2 6) = 37 119875 = 70) and HRV (119865(2 6) =084 119875 = 48) was not found which is not consistent withwhat we expected unchanged [46 47] or increased [48 49]HR and increased HRV [21 47] This may be due to thefact that we did not have enough recorded ECG data for astatistical analysis

34 Qualitative Analysis of Facial Expressivity To qualita-tively analyze facial expressivity we used the total facialactivity TFA measure of [2 50] being the total numberof displayed AUs in response to the stimuli Compared tothe control (C) the PD groups (LP IP and MP) showedthe attenuation of their facial activities (variable TFA) whilewatching disgusting movie clips (see Table 5) Howevercomparable TFA was found while watching neutral movieclips

Visual inspection of the of the displayed AUs as illus-trated in Figure 8 shows that AU1 AU2 AU6 AU9 andAU45 occurred more frequently for C except for IP whoproduced more frequently AU6 than C A possible expla-nation is that the IP cannot deactivate AU6 even duringneutral state As shown in Figure 9 during neutral state PDpatients produced more continuous active action units suchas AU4 and AU25 (for all PD) AU6 (for IP) and AU9 (forMP) Moreover certain AU combinations were consideredlikely to signify ldquodisgustrdquo on the basis of Ekman and Friesenrsquosdescription of emotions [7] ldquoDisgustrdquo AUs were consideredas the combination of eyebrows lowerer (AU4) cheek raiser(AU6) and nose wrinkler (AU9) Because cheek raiser isvery difficult to produce on demand without including otherAUs especially the lid tightener (AU7) [51] lid tightenerwas also taken into consideration that is the expected AUpattern of ldquodisgustrdquo was the combination of AU4 AU6 AU7and AU9 Consistent with what we expected C displayed98 and 87 ldquodisgustrdquo frames while watching disgust1 anddisgust2 respectively The PD goup (LP IP and MP) didnot display any ldquodisgustrdquo frames only for IP who had4 ldquodisgustrdquo frames while watching disgust1 Furthermorethe IP displayed quite few frames with combinations ofcheek raiser and lid tightener during neutral state Instead

Computational and Mathematical Methods in Medicine 9

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame0 100 200 300 400 500 600 700

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 1 2 4 6 7 9

122023252745

AU

Disgust1 (C) Disgust1 (LP)

Disgust1 (IP) Disgust1 (MP)

Figure 8 The facial activities while watching disgust1

N(C) N(LP)

N(IP)

Frame

Frame Frame

Frame

N(MP)

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

Figure 9 The facial activities while watching neutral2

10 Computational and Mathematical Methods in Medicine

0

5

10

15

20

25

30

35

fe

C

LP

IP

MP

Disgust1 Disgust2

Figure 10 The quantified facial expressivity

the cheek raiser alone was displayed mostly (see Figure 9)which indicates that IP patients have control problems forfacial muscles combinations

Analyzing the defined quantities AMC AI and FE usingthe segments with active AUs as it can be seen from Table 5the control C had higher facial expressivity than LP IP andMP during the disgusting state while MP performed thelowest facial expressivity More specifically compared to CPD (LP IP andMP) had smaller values of variables AMC AIand FE while watching disgusting movie clips especially forbrow lower (AU4) nose wrinkler (AU9) and blink (AU45)C tended to show similar intensities of brow lower (variable119890) with bigger variance (variables AMC and FE) In additionC showed higher intensities of nose wrinkler with largervariance

35 Quantitative Analysis of Facial Expressivity Figure 10depicts the values of the facial expressivity equation (5) forthe participants We did not assess the facial expressivityof the MP patient during disgust2 as he had his hand onfront of his face during the experiments As it can be seen asignificant difference between C and PD patients is presentC got the highest score while the lowest score was obtainedby the MP patient However the score of the IP patient wasslightly higher than the LP one which is due to the fact thatthe LP and IP expressed ldquofacial maskingrdquo in different waysthe LP patient showed the attenuation of the intensities offacial movements On the contrary the IP patient producedhigh intensities of facial movements not only in his emotionalstate but also during neutral state that is he cannot relax themuscles well In order to take both kinds of ldquofacial maskingrdquointo account we computed the facial expressivity using (4)The results are shown in Figure 11 As it can be seen theproposed facial expressivity allows distinguishing betweencontrol and PD patients Moreover the facial expressivitydecreases along with the increase of PD severity

0

5

10

15

20

25

fe

CLP

IP

MP

Disgust1 Disgust2minus5

Figure 11 The quantified facial expressivity based on the improvedfunction

4 Conclusions

This study investigated the phenomenon of facial masking inParkinsonrsquos patientsWedesigned an automated and objectivemethod to assess the facial expressivity of PD patients Theproposed approach follows the methodology of the Inter-personal Communication Rating Protocol-Parkinsonrsquos DiseaseVersion (ICRP-IEB) [9] In this study based on the FacialAction Coding System (FACS) we proposed a methodologythat (i) automatically detects faces in a video stream (ii) codeseach framewith respect to 11 action units and (iii) estimates afacial expressivity as function of frequency that is how oftenAUs occur duration that is how long AUs last and intensitybeing the strength of AUs

Although the proposed facial expressivity assessment approach has been evaluated on a limited number of subjects, it allows capturing differences in facial expressivity between control participants and Parkinson's patients. Moreover, facial expressivity differences between PD patients with different progression of Parkinson's disease have been assessed. The proposed method can capture these differences and give a more accurate assessment of facial expressivity for Parkinson's patients than traditional observer-based ratings allow for. This confirms that our approach can be used for clinical assessment of facial expressivity in PD. Indeed, nonverbal signals contribute significantly to interpersonal communication. Facial expressivity, a major source of nonverbal information, is compromised in Parkinson's disease. The resulting disconnect between subjective feeling and objective facial affect can lead people to form negative and inaccurate impressions of people with PD with respect to their personality, feelings, and intelligence. Assessing the facial expressivity limitation of PD in an objective way would allow developing personalized therapy to benefit facial expressivity in PD. Indeed, Parkinson's is a progressive disease, which means that the symptoms will get worse as time goes on.


Using the proposed assessment approach would allow regular facial expressivity assessment by therapists and clinicians to explore treatment options.

Future work will (i) improve the AU recognition system and extend it to more AUs and (ii) consider clinical usage of the proposed approach in a statistically significant PD patient population.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no. 2011629010) and the VUB interdisciplinary research project "Objectifying Assessment of Human Emotional Processing and Expression for Clinical and Psychotherapy Applications (EMO-App)". The authors gratefully acknowledge the Flemish Parkinson League for supporting the current study and the recruitment of the PD patients. They also acknowledge Floriane Verbraeck for the setup of the experiments and data acquisition.

References

[1] M. Katsikitis and I. Pilowsky, "A study of facial expression in Parkinson's disease using a novel microcomputer-based method," Journal of Neurology, Neurosurgery and Psychiatry, vol. 51, no. 3, pp. 362–366, 1988.

[2] G. Simons, H. Ellgring, and M. C. Smith Pasqualini, "Disturbance of spontaneous and posed facial expressions in Parkinson's disease," Cognition and Emotion, vol. 17, no. 5, pp. 759–778, 2003.

[3] D. H. Jacobs, J. Shuren, D. Bowers, and K. M. Heilman, "Emotional facial imagery, perception, and expression in Parkinson's disease," Neurology, vol. 45, no. 9, pp. 1696–1702, 1995.

[4] L. Tickle-Degnen and K. D. Lyons, "Practitioners' impressions of patients with Parkinson's disease: the social ecology of the expressive mask," Social Science & Medicine, vol. 58, no. 3, pp. 603–614, 2004.

[5] J. J. van Hilten, A. D. van der Zwan, A. H. Zwinderman, and R. A. C. Roos, "Rating impairment and disability in Parkinson's disease: evaluation of the Unified Parkinson's Disease Rating Scale," Movement Disorders, vol. 9, no. 1, pp. 84–88, 1994.

[6] D. Bowers, K. Miller, W. Bosch et al., "Faces of emotion in Parkinson's disease: micro-expressivity and bradykinesia during voluntary facial expressions," Journal of the International Neuropsychological Society, vol. 12, no. 6, pp. 765–773, 2006.

[7] P. Ekman and W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, Palo Alto, Calif, USA, 1978.

[8] P. Wu, D. Jiang, and H. Sahli, "Physiological signal processing for emotional feature extraction," in Proceedings of the International Conference on Physiological Computing Systems, pp. 40–47, 2014.

[9] L. Tickle-Degnen, The Interpersonal Communication Rating Protocol: A Manual for Measuring Individual Expressive Behavior, 2010.

[10] G. Sandbach, S. Zafeiriou, M. Pantic, and L. Yin, "Static and dynamic 3D facial expression recognition: a comprehensive survey," Image and Vision Computing, vol. 30, no. 10, pp. 683–697, 2012.

[11] I. Gonzalez, H. Sahli, V. Enescu, and W. Verhelst, "Context-independent facial action unit recognition using shape and Gabor phase information," in Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction (ACII '11), vol. 1, pp. 548–557, Springer, Memphis, Tenn, USA, October 2011.

[12] J. C. Platt, "Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods," in Advances in Large Margin Classifiers, A. J. Smola, P. Bartlett, B. Schoelkopf, and D. Schuurmans, Eds., pp. 61–74, The MIT Press, 1999.

[13] J. J. Gross and R. W. Levenson, "Emotion elicitation using films," Cognition and Emotion, vol. 9, no. 1, pp. 87–108, 1995.

[14] D. Hagemann, E. Naumann, S. Maier, G. Becker, A. Lurken, and D. Bartussek, "The assessment of affective reactivity using films: validity, reliability and sex differences," Personality and Individual Differences, vol. 26, no. 4, pp. 627–639, 1999.

[15] C. L. Lisetti and F. Nasoz, "Using noninvasive wearable computers to recognize human emotions from physiological signals," EURASIP Journal on Advances in Signal Processing, vol. 2004, Article ID 929414, pp. 1672–1687, 2004.

[16] J. Hewig, D. Hagemann, J. Seifert, M. Gollwitzer, E. Naumann, and D. Bartussek, "A revised film set for the induction of basic emotions," Cognition and Emotion, vol. 19, no. 7, pp. 1095–1109, 2005.

[17] J. H. Westerink, E. L. van den Broek, M. H. Schut, J. van Herk, and K. Tuinenbreijer, "Probing experience," in Assessment of User Emotions and Behaviour to Development of Products, T. J. O. W. F. P. Joyce, H. D. M. Westerink, M. Ouwerkerk, and B. de Ruyter, Eds., Springer, New York, NY, USA, 2008.

[18] A. Schaefer, F. Nils, P. Philippot, and X. Sanchez, "Assessing the effectiveness of a large database of emotion-eliciting films: a new tool for emotion researchers," Cognition and Emotion, vol. 24, no. 7, pp. 1153–1172, 2010.

[19] F. Verbraeck, Objectifying human facial expressions for clinical applications [M.S. thesis], Vrije Universiteit Brussel, Belgium, 2012.

[20] O. Alzoubi, Automatic affect detection from physiological signals: practical issues [Ph.D. thesis], The University of Sydney, New South Wales, Australia, 2012.

[21] S. D. Kreibig, "Autonomic nervous system activity in emotion: a review," Biological Psychology, vol. 84, no. 3, pp. 394–421, 2010.

[22] F. D. Farfan, J. C. Politti, and C. J. Felice, "Evaluation of EMG processing techniques using information theory," BioMedical Engineering Online, vol. 9, article 72, 2010.

[23] N. Ambady, F. J. Bernieri, and J. A. Richeson, "Toward a histology of social behavior: judgmental accuracy from thin slices of the behavioral stream," Advances in Experimental Social Psychology, vol. 32, pp. 201–271, 2000.

[24] N. Ambady and R. Rosenthal, "Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis," Psychological Bulletin, vol. 111, no. 2, pp. 256–274, 1992.

[25] K. D. Lyons and L. Tickle-Degnen, "Reliability and validity of a videotape method to describe expressive behavior in persons with Parkinson's disease," The American Journal of Occupational Therapy, vol. 59, no. 1, pp. 41–49, 2005.

[26] N. A. Murphy, "Using thin slices for behavioral coding," Journal of Nonverbal Behavior, vol. 29, no. 4, pp. 235–246, 2005.

[27] A. O. Andrade, S. Nasuto, P. Kyberd, C. M. Sweeney-Reed, and F. R. van Kanijn, "EMG signal filtering based on empirical mode decomposition," Biomedical Signal Processing and Control, vol. 1, no. 1, pp. 44–55, 2006.

[28] M. Blanco-Velasco, B. Weng, and K. E. Barner, "ECG signal denoising and baseline wander correction based on the empirical mode decomposition," Computers in Biology and Medicine, vol. 38, no. 1, pp. 1–13, 2008.

[29] A. O. Boudraa, J. C. Cexus, and Z. Saidi, "EMD-based signal noise reduction," Signal Processing, vol. 1, pp. 33–37, 2005.

[30] T. Jing-tian, Z. Qing, T. Yan, L. Bin, and Z. Xiao-kai, "Hilbert-Huang transform for ECG de-noising," in Proceedings of the 1st International Conference on Bioinformatics and Biomedical Engineering, pp. 664–667, July 2007.

[31] A. Karagiannis and P. Constantinou, "Noise components identification in biomedical signals based on empirical mode decomposition," in Proceedings of the 9th International Conference on Information Technology and Applications in Biomedicine (ITAB '09), pp. 1–4, November 2009.

[32] Y. Kopsinis and S. McLaughlin, "Development of EMD-based denoising methods inspired by wavelet thresholding," IEEE Transactions on Signal Processing, vol. 57, no. 4, pp. 1351–1362, 2009.

[33] F. Agrafioti, D. Hatzinakos, and A. K. Anderson, "ECG pattern analysis for emotion detection," IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 102–115, 2012.

[34] D. L. Donoho, "De-noising by soft-thresholding," IEEE Transactions on Information Theory, vol. 41, no. 3, pp. 613–627, 1995.

[35] B.-U. Kohler, C. Hennig, and R. Orglmeister, "The principles of software QRS detection," IEEE Engineering in Medicine and Biology Magazine, vol. 21, no. 1, pp. 42–57, 2002.

[36] V. Guralnik and J. Srivastava, "Event detection from time series data," in Knowledge Discovery and Data Mining, pp. 33–42, 1999.

[37] M. Hamedi, S.-H. Salleh, and T. T. Swee, "Surface electromyography-based facial expression recognition in bi-polar configuration," Journal of Computer Science, vol. 7, no. 9, pp. 1407–1415, 2011.

[38] Y. Hou, H. Sahli, R. Ilse, Y. Zhang, and R. Zhao, "Robust shape-based head tracking," in Advanced Concepts for Intelligent Vision Systems, vol. 4678 of Lecture Notes in Computer Science, pp. 340–351, Springer, 2007.

[39] T. Kanade, J. F. Cohn, and Y. Tian, "Comprehensive database for facial expression analysis," in Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, pp. 46–53, Grenoble, France, 2000.

[40] M. S. Bartlett, G. C. Littlewort, M. G. Frank, C. Lainscsek, I. R. Fasel, and J. R. Movellan, "Automatic recognition of facial actions in spontaneous expressions," Journal of Multimedia, vol. 1, no. 6, pp. 22–35, 2006.

[41] A. Savran, B. Sankur, and M. Taha Bilge, "Regression-based intensity estimation of facial action units," Image and Vision Computing, vol. 30, no. 10, pp. 774–784, 2012.

[42] Anvil: the video annotation tool, http://www.anvil-software.org.

[43] S. Vicente, J. Peron, I. Biseul et al., "Subjective emotional experience at different stages of Parkinson's disease," Journal of the Neurological Sciences, vol. 310, no. 1-2, pp. 241–247, 2011.

[44] A. van Boxtel, "Facial EMG as a tool for inferring affective states," in Proceedings of Measuring Behavior 2010, A. J. Spink, F. Grieco, O. Krips, L. Loijens, L. Noldus, and P. Zimmeran, Eds., pp. 104–108, Noldus Information Technology, Wageningen, The Netherlands, 2010.

[45] C.-N. Huang, C.-H. Chen, and H.-Y. Chung, "The review of applications and measurements in facial electromyography," Journal of Medical and Biological Engineering, vol. 25, no. 1, pp. 15–20, 2005.

[46] R. W. Levenson, P. Ekman, K. Heider, and W. V. Friesen, "Emotion and autonomic nervous system activity in the Minangkabau of West Sumatra," Journal of Personality and Social Psychology, vol. 62, no. 6, pp. 972–988, 1992.

[47] S. Rohrmann and H. Hopp, "Cardiovascular indicators of disgust," International Journal of Psychophysiology, vol. 68, no. 3, pp. 201–208, 2008.

[48] O. Alaoui-Ismaïli, O. Robin, H. Rada, A. Dittmar, and E. Vernet-Maury, "Basic emotions evoked by odorants: comparison between autonomic responses and self-evaluation," Physiology and Behavior, vol. 62, no. 4, pp. 713–720, 1997.

[49] J. Gruber, S. L. Johnson, C. Oveis, and D. Keltner, "Risk for mania and positive emotional responding: too much of a good thing?" Emotion, vol. 8, no. 1, pp. 23–33, 2008.

[50] W. Gaebel and W. Wolwer, "Facial expressivity in the course of schizophrenia and depression," European Archives of Psychiatry and Clinical Neuroscience, vol. 254, no. 5, pp. 335–342, 2004.

[51] P. Ekman, W. Friesen, and J. Hager, Facial Action Coding System: The Manual, 2002.


Computational and Mathematical Methods in Medicine 9

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame0 100 200 300 400 500 600 700

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

Frame

AU

0 1 2 4 6 7 9

122023252745

AU

Disgust1 (C) Disgust1 (LP)

Disgust1 (IP) Disgust1 (MP)

Figure 8 The facial activities while watching disgust1

N(C) N(LP)

N(IP)

Frame

Frame Frame

Frame

N(MP)

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

0 100 200 300 400 500 600 7000 1 2 4 6 7 9

122023252745

AU

Figure 9 The facial activities while watching neutral2

10 Computational and Mathematical Methods in Medicine

0

5

10

15

20

25

30

35

fe

C

LP

IP

MP

Disgust1 Disgust2

Figure 10 The quantified facial expressivity

the cheek raiser alone was displayed mostly (see Figure 9)which indicates that IP patients have control problems forfacial muscles combinations

Analyzing the defined quantities AMC AI and FE usingthe segments with active AUs as it can be seen from Table 5the control C had higher facial expressivity than LP IP andMP during the disgusting state while MP performed thelowest facial expressivity More specifically compared to CPD (LP IP andMP) had smaller values of variables AMC AIand FE while watching disgusting movie clips especially forbrow lower (AU4) nose wrinkler (AU9) and blink (AU45)C tended to show similar intensities of brow lower (variable119890) with bigger variance (variables AMC and FE) In additionC showed higher intensities of nose wrinkler with largervariance

35 Quantitative Analysis of Facial Expressivity Figure 10depicts the values of the facial expressivity equation (5) forthe participants We did not assess the facial expressivityof the MP patient during disgust2 as he had his hand onfront of his face during the experiments As it can be seen asignificant difference between C and PD patients is presentC got the highest score while the lowest score was obtainedby the MP patient However the score of the IP patient wasslightly higher than the LP one which is due to the fact thatthe LP and IP expressed ldquofacial maskingrdquo in different waysthe LP patient showed the attenuation of the intensities offacial movements On the contrary the IP patient producedhigh intensities of facial movements not only in his emotionalstate but also during neutral state that is he cannot relax themuscles well In order to take both kinds of ldquofacial maskingrdquointo account we computed the facial expressivity using (4)The results are shown in Figure 11 As it can be seen theproposed facial expressivity allows distinguishing betweencontrol and PD patients Moreover the facial expressivitydecreases along with the increase of PD severity

0

5

10

15

20

25

fe

CLP

IP

MP

Disgust1 Disgust2minus5

Figure 11 The quantified facial expressivity based on the improvedfunction

4 Conclusions

This study investigated the phenomenon of facial masking inParkinsonrsquos patientsWedesigned an automated and objectivemethod to assess the facial expressivity of PD patients Theproposed approach follows the methodology of the Inter-personal Communication Rating Protocol-Parkinsonrsquos DiseaseVersion (ICRP-IEB) [9] In this study based on the FacialAction Coding System (FACS) we proposed a methodologythat (i) automatically detects faces in a video stream (ii) codeseach framewith respect to 11 action units and (iii) estimates afacial expressivity as function of frequency that is how oftenAUs occur duration that is how long AUs last and intensitybeing the strength of AUs

Although the proposed facial expressivity assessmentapproach has been evaluated in a limited number of subjectsit allows capturing differences in facial expressivity betweencontrol participants andParkinsonrsquos patientsMoreover facialexpressivity differences between PD patients with differ-ent progression of Parkinsonrsquos disease have been assessedThe proposed method can capture these differences andgive a more accurate assessment of facial expressivity forParkinsonrsquos patients than what traditional observer basedratings allow for This confirms that our approach can beused for clinical assessment of facial expressivity in PDIndeed nonverbal signals contribute significantly to inter-personal communication Facial expressivity a major sourceof nonverbal information is compromised in Parkinsonrsquosdisease The resulting disconnect between subjective feelingand objective facial affect can lead people to form negativeand inaccurate impressions of people with PD with respectto their personality feelings and intelligence Assessing inan objective way the facial expressivity limitation of PDwould allow developing personalized therapy to benefit facialexpressivity in PD IndeedParkinsonrsquos is a progressive diseasewhich means that the symptoms will get worse as time

Computational and Mathematical Methods in Medicine 11

goes on Using the proposed assessment approach wouldallow regular facial expressivity assessment by therapist andclinicians to explore treatment options

Future work will (i) improve the AU recognition systemand extend it to more AUs and (ii) consider clinical usageof the proposed approach in a statistically significant PDpatients population

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no2011629010) and the VUB-interdisciplinary research projectldquoObjectifying Assessment of Human Emotional Processingand Expression for Clinical and Psychotherapy Applications(EMO-App)rdquo The authors gratefully acknowledge the Flem-ish Parkinson League for supporting the current study andthe recruitment of the PD patients Also they acknowledgeFloriane Verbraeck for the setup of the experiments and dataacquisition

References

[1] M Katsikitis and I Pilowsky ldquoA study of facial expressionin Parkinsonrsquos disease using a novel microcomputer-basedmethodrdquo Journal of Neurology Neurosurgery and Psychiatry vol51 no 3 pp 362ndash366 1988

[2] G Simons H Ellgring and M C Smith Pasqualini ldquoDistur-bance of spontaneous and posed facial expressions in Parkin-sonrsquos diseaserdquoCognition and Emotion vol 17 no 5 pp 759ndash7782003

[3] D H Jacobs J Shuren D Bowers and K M Heilman ldquoEmo-tional facial imagery perception and expression in Parkinsonsdiseaserdquo Neurology vol 45 no 9 pp 1696ndash1702 1995

[4] L Tickle-Degnen and K D Lyons ldquoPractitionersrsquo impressionsof patients with Parkinsonrsquos disease the social ecology of theexpressive maskrdquo Social Science amp Medicine vol 58 no 3 pp603ndash614 2004

[5] J J van Hilten A D van der Zwan A H Zwinderman and RA C Roos ldquoRating impairment and disability in Parkinsonsdisease evaluation of the unified Parkinsons disease ratingscalerdquoMovement Disorders vol 9 no 1 pp 84ndash88 1994

[6] D Bowers K Miller W Bosch et al ldquoFaces of emotionin Parkinsons disease micro-expressivity and bradykinesiaduring voluntary facial expressionsrdquo Journal of the InternationalNeuropsychological Society vol 12 no 6 pp 765ndash773 2006

[7] P Ekman and W Friesen Facial Action Coding System ATechnique for the Measurement of Facial Movement ConsultingPsychologists Press Palo Alto Calif USA 1978

[8] P Wu D Jiang and H Sahli ldquoPhysiological signal processingfor emotional feature extractionrdquo in Proceedings of the Interna-tional Conference on Physiological Computing Systems pp 40ndash47 2014

[9] L Tickle-Degnen The Interpersonal Communication RatingProtocol A Manual for Measuring Individual Expressive Behav-ior 2010

[10] G Sandbach S Zafeiriou M Pantic and L Yin ldquoStatic anddynamic 3D facial expression recognition a comprehensivesurveyrdquo Image and Vision Computing vol 30 no 10 pp 683ndash697 2012

[11] I Gonzalez H Sahli V Enescu and W Verhelst ldquoContext-independent facial action unit recognition using shape andgabor phase informationrdquo in Proceedings of the 4th InternationalConference on Affective Computing and Intelligent Interaction(ACII rsquo11) vol 1 pp 548ndash557 Springer Memphis Tenn USAOctober 2011

[12] J C Platt ldquoProbabilistic outputs for support vector machinesand comparisons to regularized likelihood methodsrdquo inAdvances in Large Margin Classifiers A J Smola P Bartlett BSchoelkopf andD Schuurmans Eds pp 61ndash74TheMITPress1999

[13] J J Gross and RW Levenson ldquoEmotion elicitation using filmsrdquoCognition and Emotion vol 9 no 1 pp 87ndash108 1995

[14] D Hagemann E Naumann S Maier G Becker A Lurkenand D Bartussek ldquoThe assessment of affective reactivity usingfilms validity reliability and sex differencesrdquo Personality andIndividual Differences vol 26 no 4 pp 627ndash639 1999

[15] C L Lisetti and F Nasoz ldquoUsing noninvasive wearable comput-ers to recognize human emotions from physiological signalsrdquoEURASIP Journal on Advances in Signal Processing vol 2004Article ID 929414 pp 1672ndash1687 2004

[16] J Hewig D Hagemann J Seifert M Gollwitzer E Naumannand D Bartussek ldquoA revised film set for the induction of basicemotionsrdquo Cognition and Emotion vol 19 no 7 pp 1095ndash11092005

[17] J H Westerink E L van den Broek M H Schut J van Herkand K Tuinenbreijer ldquoProbing experiencerdquo in Assessment ofUser Emotions and Behaviour to Development of Products T JO W F P Joyce H D M Westerink M Ouwerkerk and B deRuyter Eds Springer New York NY USA 2008

[18] A Schaefer F Nils P Philippot and X Sanchez ldquoAssessing theeffectiveness of a large database of emotion-eliciting films a newtool for emotion researchersrdquo Cognition and Emotion vol 24no 7 pp 1153ndash1172 2010

[19] F Verbraeck Objectifying human facial expressions for clinicalapplications [MS thesis] Vrije Universiteit Brussel Belgium2012

[20] O AlzoubiAutomatic affect detection from physiological signalspractical issues [PhD thesis] The University of Sydney NewSouth Wales Australia 2012

[21] S D Kreibig ldquoAutonomic nervous system activity in emotiona reviewrdquo Biological Psychology vol 84 no 3 pp 394ndash421 2010

[22] F D Farfan J C Politti and C J Felice ldquoEvaluation of emgprocessing techniques using information theoryrdquo BioMedicalEngineering Online vol 9 article 72 2010

[23] N Ambady F J Bernieri and J A Richeson ldquoToward ahistology of social behavior judgmental accuracy from thinslices of the behavioral streamrdquoAdvances in Experimental SocialPsychology vol 32 pp 201ndash271 2000

[24] N Ambady and R Rosenthal ldquoThin slices of expressivebehavior as predictors of interpersonal consequences a meta-analysisrdquo Psychological Bulletin vol 111 no 2 pp 256ndash274 1992

[25] K D Lyons and L Tickle-Degnen ldquoReliability and validity ofa videotape method to describe expressive behavior in personswith Parkinsonrsquos diseaserdquoTheAmerican Journal of OccupationalTherapy vol 59 no 1 pp 41ndash49 2005

[26] N AMurphy ldquoUsing thin slices for behavioral codingrdquo Journalof Nonverbal Behavior vol 29 no 4 pp 235ndash246 2005

12 Computational and Mathematical Methods in Medicine

[27] A O Andrade S Nasuto P Kyberd C M Sweeney-Reed andF R vanKanijn ldquoEMG signal filtering based on empiricalmodedecompositionrdquo Biomedical Signal Processing and Control vol1 no 1 pp 44ndash55 2006

[28] M Blanco-Velasco B Weng and K E Barner ldquoECG signaldenoising and baseline wander correction based on the empir-ical mode decompositionrdquo Computers in Biology and Medicinevol 38 no 1 pp 1ndash13 2008

[29] A O Boudraa J C Cexus and Z Saidi ldquoEmd-based signalnoise reductionrdquo Signal Processing vol 1 pp 33ndash37 2005

[30] T Jing-tian Z Qing T Yan L Bin and Z Xiao-kai ldquoHilbert-Huang transform for ECG de-noisingrdquo in Proceedings of the1st International Conference on Bioinformatics and BiomedicalEngineering pp 664ndash667 July 2007

[31] A Karagiannis and P Constantinou ldquoNoise components identi-fication in biomedical signals based on empirical mode decom-positionrdquo in Proceedings of the 9th International Conference onInformation Technology and Applications in Biomedicine (ITABrsquo09) pp 1ndash4 November 2009

[32] Y Kopsinis and S McLaughlin ldquoDevelopment of EMD-baseddenoising methods inspired by wavelet thresholdingrdquo IEEETransactions on Signal Processing vol 57 no 4 pp 1351ndash13622009

[33] F Agrafioti D Hatzinakos and A K Anderson ldquoECG patternanalysis for emotion detectionrdquo IEEE Transactions on AffectiveComputing vol 3 no 1 pp 102ndash115 2012

[34] D L Donoho ldquoDe-noising by soft-thresholdingrdquo IEEE Trans-actions on Information Theory vol 41 no 3 pp 613ndash627 1995

[35] B-U Kohler C Hennig and R Orglmeister ldquoThe principlesof software QRS detectionrdquo IEEE Engineering in Medicine andBiology Magazine vol 21 no 1 pp 42ndash57 2002

[36] V Guralnik and J Srivastava ldquoEvent detection from time seriesdatardquo inKnowledge Discovery andDataMining pp 33ndash42 1999

[37] M Hamedi S-H Salleh and T T Swee ldquoSurfaceelectromyography-based facial expression recognition inBi-polar configurationrdquo Journal of Computer Science vol 7 no9 pp 1407ndash1415 2011

[38] Y Hou H Sahli R Ilse Y Zhang and R Zhao ldquoRobust shape-based head trackingrdquo inAdvanced Concepts for Intelligent VisionSystems vol 4678 ofLectureNotes inComputer Science pp 340ndash351 Springer 2007

[39] T Kanade J F Cohn and Y Tian ldquoComprehensive databasefor facial expression analysisrdquo in Proceedings of the IEEE Inter-national Conference on Automatic Face and Gesture Recognitionpp 46ndash53 Grenoble France 2000

[40] M S Bartlett G C Littlewort M G Frank C Lainscsek IR Fasel and J R Movellan ldquoAutomatic recognition of facialactions in spontaneous expressionsrdquo Journal of Multimedia vol1 no 6 pp 22ndash35 2006

[41] A Savran B Sankur and M Taha Bilge ldquoRegression-basedintensity estimation of facial action unitsrdquo Image and VisionComputing vol 30 no 10 pp 774ndash784 2012

[42] Anvil the video annotation tool httpwwwanvil-soft-wareorg

[43] S Vicente J Peron I Biseul et al ldquoSubjective emotionalexperience at different stages of Parkinsonrsquos diseaserdquo Journal ofthe Neurological Sciences vol 310 no 1-2 pp 241ndash247 2011

[44] A van Boxtel ldquoFacial emg as a tool for inferring affectivestatesrdquo in Proceedings of Measuring Behavior 2010 A J Spink FGrieco O Krips L Loijens L Noldus and P Zimmeran Edspp 104ndash108 Noldus Information TechnologyWageningenTheNetherlands 2010

[45] C-N Huang C-H Chen and H-Y Chung ldquoThe review ofapplications and measurements in facial electromyographyrdquoJournal of Medical and Biological Engineering vol 25 no 1 pp15ndash20 2005

[46] R W Levenson P Ekman K Heider and W V FriesenldquoEmotion and autonomic nervous system activity in theminangkabau of west sumatrardquo Journal of Personality and SocialPsychology vol 62 no 6 pp 972ndash988 1992

[47] S Rohrmann and H Hopp ldquoCardiovascular indicators ofdisgustrdquo International Journal of Psychophysiology vol 68 no3 pp 201ndash208 2008

[48] OAlaoui-Ismaıli O RobinH Rada ADittmar andEVernet-Maury ldquoBasic emotions evoked by odorants comparisonbetween autonomic responses and self-evaluationrdquo Physiologyand Behavior vol 62 no 4 pp 713ndash720 1997

[49] J Gruber S L Johnson C Oveis and D Keltner ldquoRisk formania and positive emotional responding too much of a goodthingrdquo Emotion vol 8 no 1 pp 23ndash33 2008

[50] W Gaebel and W Wolwer ldquoFacial expressivity in the course ofschizophrenia and depressionrdquo European Archives of Psychiatryand Clinical Neuroscience vol 254 no 5 pp 335ndash342 2004

[51] P EkmanW Friesen and J Hager Facial Action Coding SystemThe Manual 2002

Submit your manuscripts athttpwwwhindawicom

Stem CellsInternational

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MEDIATORSINFLAMMATION

of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Behavioural Neurology

EndocrinologyInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Disease Markers

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

BioMed Research International

OncologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Oxidative Medicine and Cellular Longevity

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

PPAR Research

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Immunology ResearchHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

ObesityJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational and Mathematical Methods in Medicine

OphthalmologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Diabetes ResearchJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Research and TreatmentAIDS

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Gastroenterology Research and Practice

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Parkinsonrsquos Disease

Evidence-Based Complementary and Alternative Medicine

Volume 2014Hindawi Publishing Corporationhttpwwwhindawicom

10 Computational and Mathematical Methods in Medicine

0

5

10

15

20

25

30

35

fe

C

LP

IP

MP

Disgust1 Disgust2

Figure 10 The quantified facial expressivity

the cheek raiser alone was displayed mostly (see Figure 9)which indicates that IP patients have control problems forfacial muscles combinations

Analyzing the defined quantities AMC AI and FE usingthe segments with active AUs as it can be seen from Table 5the control C had higher facial expressivity than LP IP andMP during the disgusting state while MP performed thelowest facial expressivity More specifically compared to CPD (LP IP andMP) had smaller values of variables AMC AIand FE while watching disgusting movie clips especially forbrow lower (AU4) nose wrinkler (AU9) and blink (AU45)C tended to show similar intensities of brow lower (variable119890) with bigger variance (variables AMC and FE) In additionC showed higher intensities of nose wrinkler with largervariance

35 Quantitative Analysis of Facial Expressivity Figure 10depicts the values of the facial expressivity equation (5) forthe participants We did not assess the facial expressivityof the MP patient during disgust2 as he had his hand onfront of his face during the experiments As it can be seen asignificant difference between C and PD patients is presentC got the highest score while the lowest score was obtainedby the MP patient However the score of the IP patient wasslightly higher than the LP one which is due to the fact thatthe LP and IP expressed ldquofacial maskingrdquo in different waysthe LP patient showed the attenuation of the intensities offacial movements On the contrary the IP patient producedhigh intensities of facial movements not only in his emotionalstate but also during neutral state that is he cannot relax themuscles well In order to take both kinds of ldquofacial maskingrdquointo account we computed the facial expressivity using (4)The results are shown in Figure 11 As it can be seen theproposed facial expressivity allows distinguishing betweencontrol and PD patients Moreover the facial expressivitydecreases along with the increase of PD severity

0

5

10

15

20

25

fe

CLP

IP

MP

Disgust1 Disgust2minus5

Figure 11 The quantified facial expressivity based on the improvedfunction

4 Conclusions

This study investigated the phenomenon of facial masking inParkinsonrsquos patientsWedesigned an automated and objectivemethod to assess the facial expressivity of PD patients Theproposed approach follows the methodology of the Inter-personal Communication Rating Protocol-Parkinsonrsquos DiseaseVersion (ICRP-IEB) [9] In this study based on the FacialAction Coding System (FACS) we proposed a methodologythat (i) automatically detects faces in a video stream (ii) codeseach framewith respect to 11 action units and (iii) estimates afacial expressivity as function of frequency that is how oftenAUs occur duration that is how long AUs last and intensitybeing the strength of AUs

Although the proposed facial expressivity assessmentapproach has been evaluated in a limited number of subjectsit allows capturing differences in facial expressivity betweencontrol participants andParkinsonrsquos patientsMoreover facialexpressivity differences between PD patients with differ-ent progression of Parkinsonrsquos disease have been assessedThe proposed method can capture these differences andgive a more accurate assessment of facial expressivity forParkinsonrsquos patients than what traditional observer basedratings allow for This confirms that our approach can beused for clinical assessment of facial expressivity in PDIndeed nonverbal signals contribute significantly to inter-personal communication Facial expressivity a major sourceof nonverbal information is compromised in Parkinsonrsquosdisease The resulting disconnect between subjective feelingand objective facial affect can lead people to form negativeand inaccurate impressions of people with PD with respectto their personality feelings and intelligence Assessing inan objective way the facial expressivity limitation of PDwould allow developing personalized therapy to benefit facialexpressivity in PD IndeedParkinsonrsquos is a progressive diseasewhich means that the symptoms will get worse as time

Computational and Mathematical Methods in Medicine 11

goes on Using the proposed assessment approach wouldallow regular facial expressivity assessment by therapist andclinicians to explore treatment options

Future work will (i) improve the AU recognition systemand extend it to more AUs and (ii) consider clinical usageof the proposed approach in a statistically significant PDpatients population

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no2011629010) and the VUB-interdisciplinary research projectldquoObjectifying Assessment of Human Emotional Processingand Expression for Clinical and Psychotherapy Applications(EMO-App)rdquo The authors gratefully acknowledge the Flem-ish Parkinson League for supporting the current study andthe recruitment of the PD patients Also they acknowledgeFloriane Verbraeck for the setup of the experiments and dataacquisition

References

[1] M Katsikitis and I Pilowsky ldquoA study of facial expressionin Parkinsonrsquos disease using a novel microcomputer-basedmethodrdquo Journal of Neurology Neurosurgery and Psychiatry vol51 no 3 pp 362ndash366 1988

[2] G Simons H Ellgring and M C Smith Pasqualini ldquoDistur-bance of spontaneous and posed facial expressions in Parkin-sonrsquos diseaserdquoCognition and Emotion vol 17 no 5 pp 759ndash7782003

[3] D H Jacobs J Shuren D Bowers and K M Heilman ldquoEmo-tional facial imagery perception and expression in Parkinsonsdiseaserdquo Neurology vol 45 no 9 pp 1696ndash1702 1995

[4] L Tickle-Degnen and K D Lyons ldquoPractitionersrsquo impressionsof patients with Parkinsonrsquos disease the social ecology of theexpressive maskrdquo Social Science amp Medicine vol 58 no 3 pp603ndash614 2004

[5] J J van Hilten A D van der Zwan A H Zwinderman and RA C Roos ldquoRating impairment and disability in Parkinsonsdisease evaluation of the unified Parkinsons disease ratingscalerdquoMovement Disorders vol 9 no 1 pp 84ndash88 1994

[6] D Bowers K Miller W Bosch et al ldquoFaces of emotionin Parkinsons disease micro-expressivity and bradykinesiaduring voluntary facial expressionsrdquo Journal of the InternationalNeuropsychological Society vol 12 no 6 pp 765ndash773 2006

[7] P Ekman and W Friesen Facial Action Coding System ATechnique for the Measurement of Facial Movement ConsultingPsychologists Press Palo Alto Calif USA 1978

[8] P Wu D Jiang and H Sahli ldquoPhysiological signal processingfor emotional feature extractionrdquo in Proceedings of the Interna-tional Conference on Physiological Computing Systems pp 40ndash47 2014

[9] L Tickle-Degnen The Interpersonal Communication RatingProtocol A Manual for Measuring Individual Expressive Behav-ior 2010

[10] G Sandbach S Zafeiriou M Pantic and L Yin ldquoStatic anddynamic 3D facial expression recognition a comprehensivesurveyrdquo Image and Vision Computing vol 30 no 10 pp 683ndash697 2012

[11] I Gonzalez H Sahli V Enescu and W Verhelst ldquoContext-independent facial action unit recognition using shape andgabor phase informationrdquo in Proceedings of the 4th InternationalConference on Affective Computing and Intelligent Interaction(ACII rsquo11) vol 1 pp 548ndash557 Springer Memphis Tenn USAOctober 2011

[12] J C Platt ldquoProbabilistic outputs for support vector machinesand comparisons to regularized likelihood methodsrdquo inAdvances in Large Margin Classifiers A J Smola P Bartlett BSchoelkopf andD Schuurmans Eds pp 61ndash74TheMITPress1999

[13] J J Gross and RW Levenson ldquoEmotion elicitation using filmsrdquoCognition and Emotion vol 9 no 1 pp 87ndash108 1995

[14] D Hagemann E Naumann S Maier G Becker A Lurkenand D Bartussek ldquoThe assessment of affective reactivity usingfilms validity reliability and sex differencesrdquo Personality andIndividual Differences vol 26 no 4 pp 627ndash639 1999

[15] C L Lisetti and F Nasoz ldquoUsing noninvasive wearable comput-ers to recognize human emotions from physiological signalsrdquoEURASIP Journal on Advances in Signal Processing vol 2004Article ID 929414 pp 1672ndash1687 2004

[16] J Hewig D Hagemann J Seifert M Gollwitzer E Naumannand D Bartussek ldquoA revised film set for the induction of basicemotionsrdquo Cognition and Emotion vol 19 no 7 pp 1095ndash11092005

[17] J H Westerink E L van den Broek M H Schut J van Herkand K Tuinenbreijer ldquoProbing experiencerdquo in Assessment ofUser Emotions and Behaviour to Development of Products T JO W F P Joyce H D M Westerink M Ouwerkerk and B deRuyter Eds Springer New York NY USA 2008

[18] A Schaefer F Nils P Philippot and X Sanchez ldquoAssessing theeffectiveness of a large database of emotion-eliciting films a newtool for emotion researchersrdquo Cognition and Emotion vol 24no 7 pp 1153ndash1172 2010

[19] F Verbraeck Objectifying human facial expressions for clinicalapplications [MS thesis] Vrije Universiteit Brussel Belgium2012

[20] O AlzoubiAutomatic affect detection from physiological signalspractical issues [PhD thesis] The University of Sydney NewSouth Wales Australia 2012

[21] S D Kreibig ldquoAutonomic nervous system activity in emotiona reviewrdquo Biological Psychology vol 84 no 3 pp 394ndash421 2010

[22] F D Farfan J C Politti and C J Felice ldquoEvaluation of emgprocessing techniques using information theoryrdquo BioMedicalEngineering Online vol 9 article 72 2010

[23] N Ambady F J Bernieri and J A Richeson ldquoToward ahistology of social behavior judgmental accuracy from thinslices of the behavioral streamrdquoAdvances in Experimental SocialPsychology vol 32 pp 201ndash271 2000

[24] N Ambady and R Rosenthal ldquoThin slices of expressivebehavior as predictors of interpersonal consequences a meta-analysisrdquo Psychological Bulletin vol 111 no 2 pp 256ndash274 1992

[25] K D Lyons and L Tickle-Degnen ldquoReliability and validity ofa videotape method to describe expressive behavior in personswith Parkinsonrsquos diseaserdquoTheAmerican Journal of OccupationalTherapy vol 59 no 1 pp 41ndash49 2005

[26] N AMurphy ldquoUsing thin slices for behavioral codingrdquo Journalof Nonverbal Behavior vol 29 no 4 pp 235ndash246 2005

12 Computational and Mathematical Methods in Medicine

[27] A O Andrade S Nasuto P Kyberd C M Sweeney-Reed andF R vanKanijn ldquoEMG signal filtering based on empiricalmodedecompositionrdquo Biomedical Signal Processing and Control vol1 no 1 pp 44ndash55 2006

[28] M Blanco-Velasco B Weng and K E Barner ldquoECG signaldenoising and baseline wander correction based on the empir-ical mode decompositionrdquo Computers in Biology and Medicinevol 38 no 1 pp 1ndash13 2008

[29] A O Boudraa J C Cexus and Z Saidi ldquoEmd-based signalnoise reductionrdquo Signal Processing vol 1 pp 33ndash37 2005

[30] T Jing-tian Z Qing T Yan L Bin and Z Xiao-kai ldquoHilbert-Huang transform for ECG de-noisingrdquo in Proceedings of the1st International Conference on Bioinformatics and BiomedicalEngineering pp 664ndash667 July 2007

[31] A Karagiannis and P Constantinou ldquoNoise components identi-fication in biomedical signals based on empirical mode decom-positionrdquo in Proceedings of the 9th International Conference onInformation Technology and Applications in Biomedicine (ITABrsquo09) pp 1ndash4 November 2009

[32] Y Kopsinis and S McLaughlin ldquoDevelopment of EMD-baseddenoising methods inspired by wavelet thresholdingrdquo IEEETransactions on Signal Processing vol 57 no 4 pp 1351ndash13622009

[33] F Agrafioti D Hatzinakos and A K Anderson ldquoECG patternanalysis for emotion detectionrdquo IEEE Transactions on AffectiveComputing vol 3 no 1 pp 102ndash115 2012

[34] D L Donoho ldquoDe-noising by soft-thresholdingrdquo IEEE Trans-actions on Information Theory vol 41 no 3 pp 613ndash627 1995

[35] B-U Kohler C Hennig and R Orglmeister ldquoThe principlesof software QRS detectionrdquo IEEE Engineering in Medicine andBiology Magazine vol 21 no 1 pp 42ndash57 2002

[36] V Guralnik and J Srivastava ldquoEvent detection from time seriesdatardquo inKnowledge Discovery andDataMining pp 33ndash42 1999

[37] M Hamedi S-H Salleh and T T Swee ldquoSurfaceelectromyography-based facial expression recognition inBi-polar configurationrdquo Journal of Computer Science vol 7 no9 pp 1407ndash1415 2011

[38] Y Hou H Sahli R Ilse Y Zhang and R Zhao ldquoRobust shape-based head trackingrdquo inAdvanced Concepts for Intelligent VisionSystems vol 4678 ofLectureNotes inComputer Science pp 340ndash351 Springer 2007

[39] T Kanade J F Cohn and Y Tian ldquoComprehensive databasefor facial expression analysisrdquo in Proceedings of the IEEE Inter-national Conference on Automatic Face and Gesture Recognitionpp 46ndash53 Grenoble France 2000

[40] M S Bartlett G C Littlewort M G Frank C Lainscsek IR Fasel and J R Movellan ldquoAutomatic recognition of facialactions in spontaneous expressionsrdquo Journal of Multimedia vol1 no 6 pp 22ndash35 2006

[41] A Savran B Sankur and M Taha Bilge ldquoRegression-basedintensity estimation of facial action unitsrdquo Image and VisionComputing vol 30 no 10 pp 774ndash784 2012

[42] Anvil the video annotation tool httpwwwanvil-soft-wareorg

[43] S Vicente J Peron I Biseul et al ldquoSubjective emotionalexperience at different stages of Parkinsonrsquos diseaserdquo Journal ofthe Neurological Sciences vol 310 no 1-2 pp 241ndash247 2011

[44] A van Boxtel ldquoFacial emg as a tool for inferring affectivestatesrdquo in Proceedings of Measuring Behavior 2010 A J Spink FGrieco O Krips L Loijens L Noldus and P Zimmeran Edspp 104ndash108 Noldus Information TechnologyWageningenTheNetherlands 2010

[45] C-N Huang C-H Chen and H-Y Chung ldquoThe review ofapplications and measurements in facial electromyographyrdquoJournal of Medical and Biological Engineering vol 25 no 1 pp15ndash20 2005

[46] R W Levenson P Ekman K Heider and W V FriesenldquoEmotion and autonomic nervous system activity in theminangkabau of west sumatrardquo Journal of Personality and SocialPsychology vol 62 no 6 pp 972ndash988 1992

[47] S Rohrmann and H Hopp ldquoCardiovascular indicators ofdisgustrdquo International Journal of Psychophysiology vol 68 no3 pp 201ndash208 2008

[48] OAlaoui-Ismaıli O RobinH Rada ADittmar andEVernet-Maury ldquoBasic emotions evoked by odorants comparisonbetween autonomic responses and self-evaluationrdquo Physiologyand Behavior vol 62 no 4 pp 713ndash720 1997

[49] J Gruber S L Johnson C Oveis and D Keltner ldquoRisk formania and positive emotional responding too much of a goodthingrdquo Emotion vol 8 no 1 pp 23ndash33 2008

[50] W Gaebel and W Wolwer ldquoFacial expressivity in the course ofschizophrenia and depressionrdquo European Archives of Psychiatryand Clinical Neuroscience vol 254 no 5 pp 335ndash342 2004

[51] P EkmanW Friesen and J Hager Facial Action Coding SystemThe Manual 2002

Submit your manuscripts athttpwwwhindawicom

Stem CellsInternational

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MEDIATORSINFLAMMATION

of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Behavioural Neurology

EndocrinologyInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Disease Markers

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

BioMed Research International

OncologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Oxidative Medicine and Cellular Longevity

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

PPAR Research

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Immunology ResearchHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

ObesityJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational and Mathematical Methods in Medicine

OphthalmologyJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Diabetes ResearchJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Research and TreatmentAIDS

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Gastroenterology Research and Practice

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Parkinsonrsquos Disease

Evidence-Based Complementary and Alternative Medicine

Volume 2014Hindawi Publishing Corporationhttpwwwhindawicom

Computational and Mathematical Methods in Medicine 11

goes on Using the proposed assessment approach wouldallow regular facial expressivity assessment by therapist andclinicians to explore treatment options

Future work will (i) improve the AU recognition systemand extend it to more AUs and (ii) consider clinical usageof the proposed approach in a statistically significant PDpatients population

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work is supported by a CSC-VUB scholarship (Grant no2011629010) and the VUB-interdisciplinary research projectldquoObjectifying Assessment of Human Emotional Processingand Expression for Clinical and Psychotherapy Applications(EMO-App)rdquo The authors gratefully acknowledge the Flem-ish Parkinson League for supporting the current study andthe recruitment of the PD patients Also they acknowledgeFloriane Verbraeck for the setup of the experiments and dataacquisition

References

[1] M Katsikitis and I Pilowsky ldquoA study of facial expressionin Parkinsonrsquos disease using a novel microcomputer-basedmethodrdquo Journal of Neurology Neurosurgery and Psychiatry vol51 no 3 pp 362ndash366 1988

[2] G Simons H Ellgring and M C Smith Pasqualini ldquoDistur-bance of spontaneous and posed facial expressions in Parkin-sonrsquos diseaserdquoCognition and Emotion vol 17 no 5 pp 759ndash7782003

[3] D H Jacobs J Shuren D Bowers and K M Heilman ldquoEmo-tional facial imagery perception and expression in Parkinsonsdiseaserdquo Neurology vol 45 no 9 pp 1696ndash1702 1995

[4] L Tickle-Degnen and K D Lyons ldquoPractitionersrsquo impressionsof patients with Parkinsonrsquos disease the social ecology of theexpressive maskrdquo Social Science amp Medicine vol 58 no 3 pp603ndash614 2004

[5] J J van Hilten A D van der Zwan A H Zwinderman and RA C Roos ldquoRating impairment and disability in Parkinsonsdisease evaluation of the unified Parkinsons disease ratingscalerdquoMovement Disorders vol 9 no 1 pp 84ndash88 1994

[6] D Bowers K Miller W Bosch et al ldquoFaces of emotionin Parkinsons disease micro-expressivity and bradykinesiaduring voluntary facial expressionsrdquo Journal of the InternationalNeuropsychological Society vol 12 no 6 pp 765ndash773 2006

[7] P Ekman and W Friesen Facial Action Coding System ATechnique for the Measurement of Facial Movement ConsultingPsychologists Press Palo Alto Calif USA 1978

[8] P Wu D Jiang and H Sahli ldquoPhysiological signal processingfor emotional feature extractionrdquo in Proceedings of the Interna-tional Conference on Physiological Computing Systems pp 40ndash47 2014

[9] L Tickle-Degnen The Interpersonal Communication RatingProtocol A Manual for Measuring Individual Expressive Behav-ior 2010

[10] G Sandbach S Zafeiriou M Pantic and L Yin ldquoStatic anddynamic 3D facial expression recognition a comprehensivesurveyrdquo Image and Vision Computing vol 30 no 10 pp 683ndash697 2012

[11] I Gonzalez H Sahli V Enescu and W Verhelst ldquoContext-independent facial action unit recognition using shape andgabor phase informationrdquo in Proceedings of the 4th InternationalConference on Affective Computing and Intelligent Interaction(ACII rsquo11) vol 1 pp 548ndash557 Springer Memphis Tenn USAOctober 2011

[12] J C Platt ldquoProbabilistic outputs for support vector machinesand comparisons to regularized likelihood methodsrdquo inAdvances in Large Margin Classifiers A J Smola P Bartlett BSchoelkopf andD Schuurmans Eds pp 61ndash74TheMITPress1999

[13] J J Gross and RW Levenson ldquoEmotion elicitation using filmsrdquoCognition and Emotion vol 9 no 1 pp 87ndash108 1995

[14] D Hagemann E Naumann S Maier G Becker A Lurkenand D Bartussek ldquoThe assessment of affective reactivity usingfilms validity reliability and sex differencesrdquo Personality andIndividual Differences vol 26 no 4 pp 627ndash639 1999

[15] C L Lisetti and F Nasoz ldquoUsing noninvasive wearable comput-ers to recognize human emotions from physiological signalsrdquoEURASIP Journal on Advances in Signal Processing vol 2004Article ID 929414 pp 1672ndash1687 2004

[16] J Hewig D Hagemann J Seifert M Gollwitzer E Naumannand D Bartussek ldquoA revised film set for the induction of basicemotionsrdquo Cognition and Emotion vol 19 no 7 pp 1095ndash11092005

[17] J H Westerink E L van den Broek M H Schut J van Herkand K Tuinenbreijer ldquoProbing experiencerdquo in Assessment ofUser Emotions and Behaviour to Development of Products T JO W F P Joyce H D M Westerink M Ouwerkerk and B deRuyter Eds Springer New York NY USA 2008

[18] A Schaefer F Nils P Philippot and X Sanchez ldquoAssessing theeffectiveness of a large database of emotion-eliciting films a newtool for emotion researchersrdquo Cognition and Emotion vol 24no 7 pp 1153ndash1172 2010

[19] F Verbraeck Objectifying human facial expressions for clinicalapplications [MS thesis] Vrije Universiteit Brussel Belgium2012

[20] O AlzoubiAutomatic affect detection from physiological signalspractical issues [PhD thesis] The University of Sydney NewSouth Wales Australia 2012

[21] S D Kreibig ldquoAutonomic nervous system activity in emotiona reviewrdquo Biological Psychology vol 84 no 3 pp 394ndash421 2010

[22] F D Farfan J C Politti and C J Felice ldquoEvaluation of emgprocessing techniques using information theoryrdquo BioMedicalEngineering Online vol 9 article 72 2010

[23] N Ambady F J Bernieri and J A Richeson ldquoToward ahistology of social behavior judgmental accuracy from thinslices of the behavioral streamrdquoAdvances in Experimental SocialPsychology vol 32 pp 201ndash271 2000

[24] N Ambady and R Rosenthal ldquoThin slices of expressivebehavior as predictors of interpersonal consequences a meta-analysisrdquo Psychological Bulletin vol 111 no 2 pp 256ndash274 1992

[25] K D Lyons and L Tickle-Degnen ldquoReliability and validity ofa videotape method to describe expressive behavior in personswith Parkinsonrsquos diseaserdquoTheAmerican Journal of OccupationalTherapy vol 59 no 1 pp 41ndash49 2005

[26] N AMurphy ldquoUsing thin slices for behavioral codingrdquo Journalof Nonverbal Behavior vol 29 no 4 pp 235ndash246 2005

12 Computational and Mathematical Methods in Medicine

[27] A. O. Andrade, S. Nasuto, P. Kyberd, C. M. Sweeney-Reed, and F. R. van Kanijn, "EMG signal filtering based on empirical mode decomposition," Biomedical Signal Processing and Control, vol. 1, no. 1, pp. 44–55, 2006.

[28] M. Blanco-Velasco, B. Weng, and K. E. Barner, "ECG signal denoising and baseline wander correction based on the empirical mode decomposition," Computers in Biology and Medicine, vol. 38, no. 1, pp. 1–13, 2008.

[29] A. O. Boudraa, J. C. Cexus, and Z. Saidi, "EMD-based signal noise reduction," Signal Processing, vol. 1, pp. 33–37, 2005.

[30] T. Jing-tian, Z. Qing, T. Yan, L. Bin, and Z. Xiao-kai, "Hilbert-Huang transform for ECG de-noising," in Proceedings of the 1st International Conference on Bioinformatics and Biomedical Engineering, pp. 664–667, July 2007.

[31] A. Karagiannis and P. Constantinou, "Noise components identification in biomedical signals based on empirical mode decomposition," in Proceedings of the 9th International Conference on Information Technology and Applications in Biomedicine (ITAB '09), pp. 1–4, November 2009.

[32] Y. Kopsinis and S. McLaughlin, "Development of EMD-based denoising methods inspired by wavelet thresholding," IEEE Transactions on Signal Processing, vol. 57, no. 4, pp. 1351–1362, 2009.

[33] F. Agrafioti, D. Hatzinakos, and A. K. Anderson, "ECG pattern analysis for emotion detection," IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 102–115, 2012.

[34] D. L. Donoho, "De-noising by soft-thresholding," IEEE Transactions on Information Theory, vol. 41, no. 3, pp. 613–627, 1995.

[35] B.-U. Kohler, C. Hennig, and R. Orglmeister, "The principles of software QRS detection," IEEE Engineering in Medicine and Biology Magazine, vol. 21, no. 1, pp. 42–57, 2002.

[36] V. Guralnik and J. Srivastava, "Event detection from time series data," in Knowledge Discovery and Data Mining, pp. 33–42, 1999.

[37] M. Hamedi, S.-H. Salleh, and T. T. Swee, "Surface electromyography-based facial expression recognition in bipolar configuration," Journal of Computer Science, vol. 7, no. 9, pp. 1407–1415, 2011.

[38] Y. Hou, H. Sahli, R. Ilse, Y. Zhang, and R. Zhao, "Robust shape-based head tracking," in Advanced Concepts for Intelligent Vision Systems, vol. 4678 of Lecture Notes in Computer Science, pp. 340–351, Springer, 2007.

[39] T. Kanade, J. F. Cohn, and Y. Tian, "Comprehensive database for facial expression analysis," in Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, pp. 46–53, Grenoble, France, 2000.

[40] M. S. Bartlett, G. C. Littlewort, M. G. Frank, C. Lainscsek, I. R. Fasel, and J. R. Movellan, "Automatic recognition of facial actions in spontaneous expressions," Journal of Multimedia, vol. 1, no. 6, pp. 22–35, 2006.

[41] A. Savran, B. Sankur, and M. Taha Bilge, "Regression-based intensity estimation of facial action units," Image and Vision Computing, vol. 30, no. 10, pp. 774–784, 2012.

[42] Anvil: the video annotation tool, http://www.anvil-software.org/.

[43] S. Vicente, J. Peron, I. Biseul et al., "Subjective emotional experience at different stages of Parkinson's disease," Journal of the Neurological Sciences, vol. 310, no. 1-2, pp. 241–247, 2011.

[44] A. van Boxtel, "Facial EMG as a tool for inferring affective states," in Proceedings of Measuring Behavior 2010, A. J. Spink, F. Grieco, O. Krips, L. Loijens, L. Noldus, and P. Zimmerman, Eds., pp. 104–108, Noldus Information Technology, Wageningen, The Netherlands, 2010.

[45] C.-N. Huang, C.-H. Chen, and H.-Y. Chung, "The review of applications and measurements in facial electromyography," Journal of Medical and Biological Engineering, vol. 25, no. 1, pp. 15–20, 2005.

[46] R. W. Levenson, P. Ekman, K. Heider, and W. V. Friesen, "Emotion and autonomic nervous system activity in the Minangkabau of West Sumatra," Journal of Personality and Social Psychology, vol. 62, no. 6, pp. 972–988, 1992.

[47] S. Rohrmann and H. Hopp, "Cardiovascular indicators of disgust," International Journal of Psychophysiology, vol. 68, no. 3, pp. 201–208, 2008.

[48] O. Alaoui-Ismaïli, O. Robin, H. Rada, A. Dittmar, and E. Vernet-Maury, "Basic emotions evoked by odorants: comparison between autonomic responses and self-evaluation," Physiology and Behavior, vol. 62, no. 4, pp. 713–720, 1997.

[49] J. Gruber, S. L. Johnson, C. Oveis, and D. Keltner, "Risk for mania and positive emotional responding: too much of a good thing?" Emotion, vol. 8, no. 1, pp. 23–33, 2008.

[50] W. Gaebel and W. Wolwer, "Facial expressivity in the course of schizophrenia and depression," European Archives of Psychiatry and Clinical Neuroscience, vol. 254, no. 5, pp. 335–342, 2004.

[51] P. Ekman, W. Friesen, and J. Hager, Facial Action Coding System: The Manual, 2002.
