Fear and happiness in the eyes: An intra-cerebral event-related potential study from the human amygdala

Please cite this article in press as: Meletti, S., et al. Fear and happiness in the eyes: An intra-cerebral event-related potential study from the human amygdala. Neuropsychologia (2011), doi:10.1016/j.neuropsychologia.2011.10.020

Stefano Meletti a,*, Gaetano Cantalupo c, Francesca Benuzzi a, Roberto Mai b, Laura Tassi b, Elisa Gasparini a, Carlo Alberto Tassinari c, Paolo Nichelli a

a Dept. Neuroscience, University of Modena and Reggio Emilia, Nuovo Ospedale Civile Sant'Agostino Estense, Via Giardini, 41100 Modena, Italy
b Epilepsy Surgery Centre "C. Munari", Ospedale Niguarda, Viale Ca' Granda, Milano, Italy
c Neuroscience Department, University of Parma, Italy
* Corresponding author. E-mail address: [email protected] (S. Meletti).

Article history: Received 11 March 2011; received in revised form 15 October 2011; accepted 20 October 2011.

Keywords: Amygdala; Eyes; Facial expression; Fear; Theta frequency

Abstract

We present the response pattern of intracranial event-related potentials (ERPs) recorded from depth electrodes in the human amygdala (four patients) to faces or face parts encoding fearful, happy or neutral expressions. The amygdala showed increased-amplitude ERPs (from 200 to 400 ms post-stimulus) in response to the eye region of the face compared to whole faces and to the mouth region. In particular, a strong emotional valence effect was observed, both at group and at single-subject level, with a preferential response to fearful eyes compared to every other stimulus category from 200 to 400 ms after stimulus presentation.
A preferential response to smiling eyes compared to happy faces and smiling mouths was also observed at group level from 300 to 400 ms post-stimulus. A complementary time–frequency analysis showed that an increase in the theta frequency band (4–7 Hz) accounted for the main event-related band power (ERBP) change during the 200–500 ms post-stimulus interval. The analysis of ERBP changes according to emotional valence showed a strong increase in theta ERBP to fearful eyes, higher than to any other facial stimulus. Moreover, the theta ERBP increase to "smiling eyes" was larger than that evoked by smiling mouths and whole happy faces. Minimal post-stimulus ERBP changes were evoked by neutral stimuli. These data are consistent with a special role of the amygdala in processing facial signals, both with negative and positive valence, conveyed by the eye region of the face.

© 2011 Elsevier Ltd. All rights reserved.

1. Introduction

In the last decade a number of studies have highlighted the role of the amygdala in processing emotional signals, and different lines of evidence indicate that the amygdala is critically involved in processing facial expressions. In particular, it has been shown that human amygdala lesions produce recognition deficits both for 'basic' facial emotions, such as sadness and fear (Adolphs & Tranel, 2004; Adolphs, Tranel, Damasio, & Damasio, 1994; Young, Hellawell, Van De Wal, & Johnson, 1996), and for more complex facial expressions that signal social or cognitive emotions (Adolphs, Baron-Cohen, & Tranel, 2002; Shaw et al., 2005).
Consistent with these data, functional neuroimaging experiments in healthy subjects and intracranial event-related potentials (ERPs) recorded in epileptic patients demonstrated that the view of faces depicting a wide range of facial emotions, and especially frightened ones, elicits neural activity in the amygdala (Breiter et al., 1996; Krolak-Salmon, Henaff, Vighetto, Bertrand, & Mauguiere, 2004; Morris et al., 1996). However, the relative contribution of the amygdala to facial expression processing is not completely clear. Lesion studies and data from high-functioning autistic subjects suggest that the human amygdala is especially involved in decoding features of the eye region (Adolphs et al., 2005; Baron-Cohen et al., 1999). Accordingly, recent PET and fMRI studies showed that the amygdala responds to direct gaze (Kawashima et al., 1999) and to the "wide-eyed" expression that signals fear (Morris, deBonis, & Dolan, 2002; Whalen et al., 2004). Furthermore, Adolphs et al. (2005) proposed that an impairment in fear recognition can arise from the inability to process information from the eye region that is normally essential for recognizing this emotion: the failure could be related to the lack of spontaneous fixation on the eye region, and implies that amygdala damage would impair the use of visual information from the eye region for decoding different facial expressions. In other words, the amygdala might be involved in detecting salient facial features and in reflexively triggering fixation changes toward them, rather than being involved in emotion discrimination per se (Adolphs & Spezio, 2006; Spezio, Adolphs, Hurley, & Piven, 2007; Spezio, Huang, Castelli, & Adolphs, 2007). This supports the idea that



the amygdala's role is much broader than the older view suggested, and is unlikely to be restricted to processing only stimuli related to threat or danger. Rather, the amygdala is important for detecting the salience and biological relevance of environmental stimuli and for resolving ambiguity. Along these lines, the amygdala could process information about the eye region of faces to obtain cues for rapid decoding of facial expressions.

To our knowledge, direct electrophysiological evidence of a differential or preferential response of the human amygdala to the eye region, compared to the whole face or to the lower part of the face, is lacking. In this study, we tested the response pattern of intracranial event-related potentials (ERPs) recorded from the human amygdala to static faces and face parts (eye and mouth regions) encoding fearful, happy, and neutral expressions. We hypothesized that, if the amygdala is especially sensitive to the expressive features of the eye region of conspecifics, brain potentials directly recorded from it would show a different response to the eye and to the mouth region of the face. In particular, we expected that viewing isolated fearful eyes should be sufficient to evoke the ERP response observed to fearful faces (a broad negative potential between 200 and 800 ms post-stimulus) (Krolak-Salmon et al., 2004). Moreover, we expected that even if the "smiling mouth" is the facial feature we use to decode happiness in explicit recognition tasks (Adolphs et al., 2005; Smith, Cottrell, Gosselin, & Schyns, 2005), the amygdala should respond preferentially to the "smiling eyes" rather than to the "smiling mouth". In contrast with fear, joy is mostly characterized by the inferior part of the face, especially by the mouth (smile), which is the diagnostic element enabling the recognition of this emotion (Schyns, Bonnar, & Gosselin, 2002). However, a fake smile is betrayed by the absence of expressive cues in the eye region, such as the wrinkles around the eye corners that mark the so-called "Duchenne smile" (Ekman, Davidson, & Friesen, 1990).

To test these hypotheses, intracranial ERPs were recorded in four drug-resistant epileptic patients implanted with depth electrodes according to stereo-EEG (SEEG) methodology as part of their pre-surgical evaluation. Data were first analyzed in the time domain (ERPs), and then a time–frequency analysis (event-related band power, ERBP) was carried out to investigate the role of different frequency changes over time after stimulus presentation. This last analysis aimed to explore the relative contributions of theta and gamma oscillations to the processing of emotional signals, and in particular to the processing of facial signals of threat and joy. Recently, the contribution of gamma frequency (around 40 Hz) has been demonstrated within the human amygdala in response to complex emotional visual scenes (Oya, Kawasaki, Howard, & Adolphs, 2002) and to fearful facial expressions (Luo, Holroyd, Jones, Hendler, & Blair, 2007; Sato et al., 2011). However, other studies have underlined the importance of oscillations in the theta range during emotional processing (Knyazev, 2007; Paré, Collins, & Pelletier, 2002; Maratos, Mogg, Bradley, Rippon, & Senior, 2009). In particular, it has been proposed that theta band activity plays an important role in integrating and synchronizing neural responses to emotional stimuli across sub-cortical (amygdala) and cortical structures, such as frontal and visual cortices (Lewis, 2005).

2. Material and methods

2.1. Patients

Four patients suffering from drug-resistant focal epilepsy were stereotactically implanted with intracerebral electrodes as part of their pre-surgical evaluation. The structures to be explored were defined on the basis of ictal manifestations, electroencephalography (EEG), and neuroimaging studies. Event-related potential (ERP) recordings were performed as part of the cortical functional mapping at the end of the stereotactic-EEG (SEEG) monitoring, once relevant seizures had been recorded. No seizures were recorded on the day of the ERP experiment, nor during the 24 h prior to it. In accordance with a direct individual benefit, patients were fully informed of the electrode implantation, SEEG, and ERP recordings, and they gave their consent. The subjects' consent was obtained according to the Declaration of Helsinki and the local Ethical Committee approved the study. Table 1 (online supplementary material) reports the clinical and neuroradiological features and the implantation sites of each subject. All subjects were right-handed as determined by the Edinburgh Inventory (Oldfield, 1971). Patients 1 and 2 were diagnosed with left occipito-temporal lobe epilepsy. Patient 3 was diagnosed with right occipito-temporal lobe epilepsy. Patient 4 was diagnosed with right frontal-lobe epilepsy. A total of four amygdalae were implanted, two in the right hemisphere (Pts. 3 and 4) and two in the left hemisphere (Pts. 1 and 2).

2.2. Stereotactic implantation

Cerebral angiography was first performed in stereotactic conditions. In order to reach the clinically relevant targets, the stereotactic coordinates of each electrode were calculated preoperatively on the individual cerebral MRI, previously enlarged to the angiography scale. The electrodes were implanted perpendicularly to the mid-sagittal plane using Talairach's stereotactic method (Talairach & Bancaud, 1973). Depth electrodes (DIXI, Besançon, France) were 0.8 mm in diameter and had 5, 10, or 15 recording contacts. Contacts were 2.0 mm long, and successive contacts were separated by 1.5 mm. Since the voltage field related to amygdala activity has the properties of a "closed" field, particular attention was paid to ensure that the recording was from within the amygdala structures (Hudry, Ryvlin, Royet, & Mauguiere, 2001; Lorente de Nó, 1947). Therefore, we first identified the cerebral structures explored by each stereo-EEG electrode through their entry and target points as defined in the three-dimensional proportional grid system devised by Talairach and Tournoux (1988). Talairach coordinates of the electrode contacts of interest for this study were calculated by superimposing the X-ray of the patient's skull, with the electrodes implanted, on the Talairach proportional grid system with a grid scale of 1 mm. Then, to visualize the exact locations of the electrode contacts with higher accuracy, we reconstructed the trajectories of the stereo-EEG electrodes on the post-implantation MRI images of each patient. A probabilistic cytoarchitectonic map of the amygdala (Eickhoff et al., 2005) was also referenced to validate contact positions.
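The contact geometry described above (2.0 mm contacts separated by 1.5 mm, i.e. a 3.5 mm center-to-center pitch) can be sketched as a straight-line calculation of contact positions along a trajectory. This is an illustrative sketch only: the function name, the entry/target coordinates, and the tip-to-first-contact offset are assumptions, not values from the paper.

```python
import numpy as np

def contact_centers(entry, target, n_contacts, first_offset=1.0,
                    length=2.0, gap=1.5):
    """Approximate contact-center coordinates (mm) along a straight
    electrode trajectory, walking from the target (tip) back toward
    the entry point. Contacts are `length` mm long and separated by
    `gap` mm, so the center-to-center pitch is length + gap = 3.5 mm
    (the DIXI geometry described in the text). `first_offset` is a
    hypothetical distance from the tip to the first contact center."""
    entry, target = np.asarray(entry, float), np.asarray(target, float)
    direction = entry - target
    direction /= np.linalg.norm(direction)      # unit vector tip -> entry
    pitch = length + gap
    offsets = first_offset + pitch * np.arange(n_contacts)
    return target + np.outer(offsets, direction)

# Example: a hypothetical left-amygdala trajectory, orthogonal to the
# mid-sagittal plane as in Talairach's method (coordinates invented).
centers = contact_centers(entry=(-60.0, -3.0, -12.0),
                          target=(-20.0, -3.0, -12.0), n_contacts=10)
```

With an implantation orthogonal to the mid-sagittal plane, successive contacts differ only in the x coordinate, stepping 3.5 mm laterally per contact.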

2.3. Facial emotion recognition (FER) assessment

Since previous research demonstrated that drug-resistant medial temporal lobe epilepsy can impair explicit recognition of facial emotions (Benuzzi et al., 2004; Meletti et al., 2003), patients' facial emotion recognition ability was tested before performing the ERP paradigm, according to a previously published protocol (Meletti et al., 2009). Pictures of facial affect, taken from the Ekman and Friesen series (1976), were used to prepare a task requiring subjects to match a facial expression with the appropriate verbal label, choosing among the following five basic emotions: happiness, sadness, fear, disgust, and anger. Ten pictures (facial stimuli) were used for each emotion, for a total of 50 trials. Normative data (for the pictures of facial affect series) report the following mean percentages of correct recognition for the selected items: happiness = 99.2%; sadness = 95.6%; fear = 88.4%; disgust = 95.6%; anger = 94.4%.

Testing procedure: pictures (10 cm × 13 cm) were presented, one by one, on a sheet of paper. The verbal labels for the five facial expressions were printed under each picture and the subjects were asked to select the word that best described the emotion shown in each photograph. The participants were instructed to consider all five alternatives carefully before responding. There was no time limit and the patients were given no feedback on their performance. All the subjects completed the test without difficulty in a single session that typically lasted from 10 to 20 min.

Fifty right-handed (Oldfield, 1971) healthy volunteers with no history of neu-rological or psychiatric illness participated as controls.

2.4. Stimuli and ERP paradigm

To test the effects of the different facial emotions and the role of the upperand lower face part on amygdala ERPs, subjects were presented with the followingstimulus categories: whole face, eyes, and mouth, showing either fearful, happy orneutral expressions.

Whole-face stimuli were 33 static gray-scale images of emotionally expressive faces (six women and five men, each depicting three different emotional expressions, i.e., fear, happiness, and neutral) taken from Ekman's set of pictures of facial affect (Ekman & Friesen, 1976). The pictures used are referred to as EM, JJ, MF, MO, PE, PF, SW, WF, NR, GS and C in Ekman's data set. An elliptic mask was fitted to solely reveal the face itself while hiding hair and ears. Stimuli showing only the eye or the mouth region were created from the whole-face stimuli (Fig. 1). We thereby obtained 99 stimuli: (a) 33 fear-encoding stimuli (11 fearful faces; 11 fearful eyes; 11 fearful mouths); (b) 33 happiness-encoding stimuli (11 happy faces; 11 happy eyes; 11 happy mouths); (c) 33 neutral stimuli (11 neutral faces; 11 neutral eyes; 11 neutral mouths). A luminance meter was used to adjust the images: average luminance was 30–35 cd/m². A uniform figure/background was ensured by using the same mid-gray background. Luminance and contrast were equated for whole-face stimuli, eye stimuli and mouth stimuli.
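The luminance/contrast equation step can be sketched as a simple rescaling of each grayscale image to a shared mean luminance and RMS contrast. The function name and the 8-bit target values are illustrative assumptions; the study worked with photometer-calibrated cd/m² values, not pixel units.

```python
import numpy as np

def equate_luminance(img, target_mean=128.0, target_std=30.0):
    """Rescale a grayscale image (2-D array, 0-255) so that its mean
    luminance and RMS contrast (pixel standard deviation) match shared
    target values, then clip back to the valid range. Targets are
    hypothetical pixel-unit stand-ins for the cd/m^2 calibration
    described in the text."""
    img = np.asarray(img, float)
    z = (img - img.mean()) / (img.std() + 1e-12)   # zero mean, unit std
    return np.clip(z * target_std + target_mean, 0, 255)

# Example: a synthetic random "face" image equated to the targets.
rng = np.random.default_rng(0)
face = rng.uniform(0, 255, size=(64, 64))
eq = equate_luminance(face)
```

Applying the same targets to whole-face, eye, and mouth images removes low-level luminance/contrast differences between the stimulus categories.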

Each trial started with a fixation cross (2 s) followed by the presentation of a stimulus (500 ms). Size-, brightness-, and contrast-adjusted images were presented on a computer screen centered at the position of the fixation cross. After stimulus offset, a blank screen followed (2–3 s). Subjects were asked to pay attention to the emotion conveyed by the stimuli. Five blocks of stimuli were delivered to each patient (99 stimuli per block) with 20-min intervals between blocks. Stimulus presentation order was randomized. Therefore, 495 trials (99 × 5) were presented to each patient during the entire experiment.

Fig. 1. Examples of the nine stimulus categories used in the ERPs protocol.
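The block design above (9 categories × 11 images = 99 stimuli per block, 5 blocks, randomized within block) can be sketched as follows. The stimulus identifiers and the within-block shuffling scheme are assumptions for illustration; the paper states only that presentation order was randomized.

```python
import random

def build_blocks(n_blocks=5, seed=0):
    """Build a randomized stimulus order: 3 emotions x 3 face parts
    x 11 images = 99 stimuli per block, shuffled independently within
    each of the 5 blocks, giving 495 trials in total (a sketch of the
    design described in the text; tuples are invented identifiers)."""
    emotions = ["fear", "happy", "neutral"]
    parts = ["face", "eyes", "mouth"]
    stimuli = [(e, p, i) for e in emotions for p in parts for i in range(11)]
    rng = random.Random(seed)
    blocks = []
    for _ in range(n_blocks):
        order = stimuli[:]      # fresh copy of the 99 stimuli
        rng.shuffle(order)      # random presentation order within the block
        blocks.append(order)
    return blocks

blocks = build_blocks()
```

Each block contains every stimulus exactly once, so category frequencies are balanced across the 495 trials.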

2.5. Recordings and signal averaging

Recordings were obtained in a dimly lit, quiet room. Continuous SEEG was recorded with a 64-channel EEG device (Neuroscan), with the nose used as reference. A bipolar electro-oculogram (EOG) was recorded from the supra-orbital ridge and outer canthus of the right eye. The ground was located at the mid-frontal scalp (Fz site). SEEG was recorded at a 1024 Hz sampling rate through a band-pass of 0.1–200 Hz. ERPs were time-locked to images displayed on a computer screen placed 60 cm from the patient's face. Whole-face stimuli subtended a visual angle of 12.3° × 8.7°.

Fig. 2. (A) Post-implantation coronal MR image showing electrode contacts within the amygdala in Pt. 1. Contact C4′ is the one with the largest ERPs. The red cross indicates the position of this contact on the pre-implantation MRI of the patient (Talairach coordinates: x = −25; y = −3; z = −12). (B)–(D) Coronal and sagittal images showing the location (on pre-implantation MRI) of the electrode contacts with the highest-amplitude ERPs in Pt. 2 (x = −21; y = −4; z = −10), Pt. 3 (x = 25; y = −5; z = −15), and Pt. 4 (x = 22; y = −5; z = −16). Field potentials from these contacts were used for the statistical analyses in the time- and frequency-domain analyses. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of the article.)


A 200 ms pre-stimulus baseline correction was performed. Epochs with eye-blink artifacts greater than 100 μV on the EOG, or with epileptic spikes or sharp waves greater than 150 μV, were rejected. The artifact rejection and averaging processes were carried out on an analysis time-window of 1400 ms (200 ms pre-stimulus to 1200 ms post-stimulus).
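The baseline-correction and rejection step can be sketched in numpy as below. The function name is invented, and the thresholds are interpreted here as peak-to-peak amplitudes, which is an assumption (the paper does not state peak-to-peak vs. absolute).

```python
import numpy as np

def preprocess_epoch(epoch, fs=1024, pre_ms=200,
                     eog=None, eog_thresh=100.0, seeg_thresh=150.0):
    """Baseline-correct one epoch (channels x samples, starting 200 ms
    before the stimulus) by subtracting the mean of the pre-stimulus
    window, and flag it for rejection if the SEEG exceeds 150 uV or
    the EOG exceeds 100 uV (thresholds applied peak-to-peak here,
    an illustrative assumption)."""
    epoch = np.asarray(epoch, float)
    n_pre = int(round(pre_ms / 1000 * fs))            # pre-stimulus samples
    baseline = epoch[:, :n_pre].mean(axis=1, keepdims=True)
    corrected = epoch - baseline
    reject = bool(np.ptp(corrected, axis=1).max() > seeg_thresh)
    if eog is not None:
        reject = reject or bool(np.ptp(np.asarray(eog, float)) > eog_thresh)
    return corrected, reject

# A flat 1400 ms epoch at 1024 Hz (~1434 samples) survives; a 200 uV
# transient triggers rejection.
quiet = np.full((2, 1434), 5.0)
corrected, reject = preprocess_epoch(quiet)
spiky = np.zeros((1, 1434))
spiky[0, 700] = 200.0
_, reject_spiky = preprocess_epoch(spiky)
```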

Overall, 103 trials (5.2%) out of 1980 were rejected due to epileptic transients, eye blinks or electrical interference. At the single-subject level the numbers of rejected trials were: 21 (4.2%) in Pt. 1, 25 (5.0%) in Pt. 2, 19 (3.8%) in Pt. 3, and 38 (7.6%) in Pt. 4. There was no association between rejection rate and stimulus category (χ² = 2.3; p > 0.1).

Correlation analyses on the raw waveforms also showed no significant relationship between the amygdala contacts selected for analysis and either the vertical or horizontal EOG (mean ± SD: r = −0.05 ± 0.08 and r = −0.03 ± 0.06 for amygdala-vertical EOG and amygdala-horizontal EOG, respectively; one-sample t-test for differences from zero after Fisher r-to-z transformation, p > 0.1). These results correspond to those from previous studies confirming that intracranial data are largely immune to eye-movement contamination (Lachaux, Rudrauf, & Khane, 2003; Sato et al., 2011).
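The Fisher r-to-z control analysis above can be sketched as follows: transform the per-trial correlation coefficients with arctanh, then run a one-sample t-test against zero. The function name and the sample of r values are illustrative.

```python
import numpy as np
from scipy import stats

def fisher_z_ttest(rs):
    """One-sample t-test on Fisher z-transformed correlation
    coefficients (H0: mean correlation = 0), as used in the text to
    check that amygdala signals were uncorrelated with the EOG."""
    z = np.arctanh(np.asarray(rs, float))   # Fisher r-to-z transform
    return stats.ttest_1samp(z, 0.0)

# Hypothetical small correlations, on the scale reported in the text
rng = np.random.default_rng(1)
res = fisher_z_ttest(rng.normal(-0.05, 0.08, size=36))

# Perfectly symmetric r values give a mean z of zero, hence t = 0
balanced = fisher_z_ttest([0.1, -0.1, 0.2, -0.2])
```

The arctanh transform makes the sampling distribution of r approximately normal, which justifies the parametric t-test.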

2.6. Group and single-subject ERPs statistics

After visual inspection of each subject's averaged potentials, we first analyzed statistical differences at the group level between responses to the different stimulus categories. Measurements from the contacts having the largest ERPs were entered into the statistical analysis. In the amygdala, for each patient and for each stimulus category, the mean amplitudes of single-trial potentials were calculated over contiguous 100 ms epochs (Krolak-Salmon et al., 2004) from −100 ms before to 700 ms after stimulus presentation, thus obtaining eight time-windows. Trials were entered into the analysis separately, rather than each participant's average value. Before statistical analysis, these single-trial mean amplitudes were screened for normality (Kolmogorov–Smirnov test) and then for homogeneity of variance (Levene test). Because the data met the assumptions required for the analysis of variance, they were entered as the dependent variable in an analysis of variance (ANOVA), with stimulus category as the factor. When a significant effect of the stimulus category factor was observed for a given time epoch, additional post hoc two-tailed paired comparisons with Bonferroni correction were performed to determine which categories differed from each other (α = 0.05/number of possible comparisons).
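The windowed-amplitude step can be sketched as below: compute per-trial mean amplitudes in the eight 100-ms windows, then compare categories within a window. `scipy.stats.f_oneway` stands in for the ANOVA described in the text; the function name and the synthetic data are assumptions.

```python
import numpy as np
from scipy import stats

def window_means(trials, fs=1024, start_ms=-100, stop_ms=700, win_ms=100,
                 stim_at_ms=200):
    """Mean single-trial amplitude in contiguous 100-ms windows from
    -100 to +700 ms around stimulus onset (eight windows). `trials`
    is (n_trials x n_samples), with epochs starting 200 ms before the
    stimulus as in the text."""
    trials = np.asarray(trials, float)
    edges_ms = np.arange(start_ms, stop_ms + win_ms, win_ms)
    cols = []
    for lo, hi in zip(edges_ms[:-1], edges_ms[1:]):
        a = int(round((lo + stim_at_ms) / 1000 * fs))   # window start sample
        b = int(round((hi + stim_at_ms) / 1000 * fs))   # window end sample
        cols.append(trials[:, a:b].mean(axis=1))
    return np.column_stack(cols)                        # n_trials x 8

# One-way ANOVA across three hypothetical stimulus categories in one
# window, with a Bonferroni-corrected alpha for the post hoc tests.
rng = np.random.default_rng(2)
cats = [rng.normal(mu, 1.0, 200) for mu in (0.0, 0.0, 0.8)]
F, p = stats.f_oneway(*cats)
alpha_bonf = 0.05 / 3          # three possible pairwise comparisons
```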

To evaluate whether the results obtained at group level were reproducible across different subjects, the analyses were repeated at the single-subject level (trial number = 495 per subject).


2.7. Group time–frequency analyses (event-related band power, ERBP)

The acquired neurophysiological data were also submitted to a frequency-domain analysis in order to obtain complementary information with respect to



Fig. 3. (A) Face part effect. On the left, group amygdala ERPs obtained by averaging eyes (continuous line), whole faces (dotted line) and mouths (dashed line) independently of their expressive features. On the right, bar graph showing mean ERP amplitudes (n = 660) ± 1 SEM across trials between 300 and 400 ms post-stimulus onset. (B) Emotional valence effect. On the left, group amygdala ERPs obtained by averaging stimuli according to their expressive features: the red curve represents ERPs obtained by averaging fearful faces, eyes, and mouths; the blue and black curves represent happy and neutral stimuli. On the right, bar graph showing mean ERP amplitudes (n = 660) ± 1 SEM across trials calculated between 300 and 400 ms post-stimulus onset. Error bars represent 1 SEM (n = 660). *Pairwise post hoc comparison p < 0.001.



time-domain analyses (ERPs). The time–frequency analysis was performed for the same electrode contacts used for the ERP computations. To evaluate the time–frequency characteristics of the local field potential, trial signals were digitally convolved with a complex Morlet wavelet (as implemented in Neuroscan 4.6): center frequencies ranged from 2 to 50 Hz in 2 Hz intervals. This transform computes the power of event-related EEG activity in a centered frequency band as a function of time. Complex demodulation was performed on single-trial raw epochs. Averages and variances were calculated across single trials on the complex time-series, and the resulting event-related band power (ERBP) is based on these. Power computations were based on magnitude squared (μV² units). For each frequency, the evoked event-related band power envelope was computed on the time-locked average, in which the value at a given time point is the average of all of the voltages at that point. This means that the calculated ERBP represents the evoked variety of event-related band power computation that is phase-locked (or time-locked) with the stimulus.
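A minimal numpy sketch of this Morlet-wavelet time–frequency transform follows. The study used Neuroscan's implementation; the `n_cycles` value, normalization, and demo signal here are illustrative assumptions.

```python
import numpy as np

def morlet_power(signal, fs, freqs, n_cycles=7):
    """Time-frequency power (freqs x samples) via convolution of a
    1-D signal with complex Morlet wavelets; power is the magnitude
    squared of the convolved analytic signal, as in the text."""
    signal = np.asarray(signal, float)
    power = np.empty((len(freqs), signal.size))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)        # wavelet width (s)
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.abs(wavelet).sum()            # simple amplitude norm
        analytic = np.convolve(signal, wavelet, mode="same")
        power[i] = np.abs(analytic) ** 2            # magnitude squared
    return power

# Demo: a 10 Hz oscillation analyzed at 2-50 Hz in 2-Hz steps (the
# frequency grid described in the text); 5 s of signal at 1024 Hz so
# even the longest (2 Hz) wavelet fits inside the signal.
fs = 1024
t = np.arange(0, 5.0, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t)
freqs = np.arange(2, 51, 2)
tf = morlet_power(sig, fs, freqs)
```

The power map should peak at the frequency band containing the oscillation, here 10 Hz.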

To assess the statistical significance of the ERBP envelopes, we analyzed ERBP changes over time in four frequency bands centered on the following frequencies (half bandwidth of 2 Hz; rolloff 48 dB/oct): 5 Hz (theta), 10 Hz (alpha), 20 Hz (beta), and 40 Hz (gamma). To quantify the event-related change in the power envelope that occurs after presentation of the stimulus, we first calculated the mean power envelope values from a reference period, Ref(f0), sampled from 500 ms before the onset of stimuli. The value of the ERBP change at time t with respect to the reference period was given by the following equation: ERBP(t, f0) = 10 · log10[E(t, f0)/Ref(f0)] (dB), where f0 is the center frequency and E(t, f0) is the power envelope at time t.
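The ERBP equation above translates directly into code. A minimal sketch, assuming the power envelope is given as a freqs × samples array with the stimulus onset at the end of a 500 ms baseline (the function name is invented):

```python
import numpy as np

def erbp_db(power_env, fs, baseline_ms=500):
    """Event-related band power in dB relative to the mean power of
    the pre-stimulus reference period:
    ERBP(t, f0) = 10 * log10(E(t, f0) / Ref(f0))."""
    power_env = np.asarray(power_env, float)
    n_ref = int(round(baseline_ms / 1000 * fs))       # reference samples
    ref = power_env[:, :n_ref].mean(axis=1, keepdims=True)
    return 10 * np.log10(power_env / ref)

# A doubling of power relative to baseline corresponds to about +3 dB.
env = np.ones((1, 1024))
env[:, 512:] = 2.0
db = erbp_db(env, fs=1024, baseline_ms=500)
```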

The time–frequency plane was divided into 32 blocks, namely four frequency bands (theta, alpha, beta, gamma) and eight time windows (1, pre-stimulus, −100 ms to stimulus onset; 2, post-stimulus, 0–100 ms; 3, 100–200 ms; 4, 200–300 ms; 5, 300–400 ms; 6, 400–500 ms; 7, 500–600 ms; 8, 600–700 ms). Mean values of the ERBP data in these 32 time–frequency windows were calculated for each trial and used for statistical analysis. Before conducting parametric statistical tests, the normality of the cumulative distributions of these values was assessed with Kolmogorov–Smirnov goodness-of-fit tests.
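The blocking step can be sketched as follows: average single-trial ERBP time courses in the eight 100-ms windows for one band (four bands give the 4 × 8 = 32 blocks), then screen the values with a Kolmogorov–Smirnov test before parametric statistics. The function name and the assumption that epochs start 200 ms pre-stimulus (sample 205 at 1024 Hz marking stimulus onset) are illustrative.

```python
import numpy as np
from scipy import stats

def tf_block_means(band_erbp, fs=1024, stim_sample=205):
    """Mean ERBP in the eight 100-ms windows (-100 to +700 ms around
    stimulus onset) for one frequency band, given single-trial ERBP
    time courses (n_trials x n_samples)."""
    band_erbp = np.asarray(band_erbp, float)
    step = int(round(0.1 * fs))                # 100 ms in samples
    start = stim_sample - step                 # window 1 begins at -100 ms
    cols = [band_erbp[:, start + k * step: start + (k + 1) * step].mean(axis=1)
            for k in range(8)]
    return np.column_stack(cols)               # n_trials x 8

# KS goodness-of-fit check on one block's standardized values
rng = np.random.default_rng(3)
vals = tf_block_means(rng.normal(0.0, 1.0, (50, 1434)))
col = (vals[:, 0] - vals[:, 0].mean()) / vals[:, 0].std()
ks = stats.kstest(col, "norm")

flat = tf_block_means(np.ones((3, 1434)))      # constant input sanity check
```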


3. Results

3.1. Facial emotion recognition

All patients showed emotion recognition performance above 90% correct across all emotions. As for the recognition of happiness and fear, the two emotions tested in the ERP protocol, three out of four subjects recognized happy faces with 100% accuracy and one subject with 90% accuracy. Accuracy in fear recognition was 100% in one patient, 90% in two patients and 80% in one. Overall, no recognition measure fell more than one standard deviation below the controls' mean.

3.2. Amygdala event related potentials

Clear event-related responses were evident in each of the four patients in at least one amygdala contact. The typical amygdala ERP morphology was characterized by a broad negative component (N1) between 300 and 500 ms post-stimulus onset, followed by a positive component (P1) between 500 and 800 ms (Fig. 2a).

Data for statistical analysis were obtained from the contactslocated in the amygdala that showed the averaged potentials withthe highest amplitude (Fig. 2b–d).

3.2.1. Face part and emotional valence effects


First, a two-way ANOVA with a 3 (emotional valence) × 3 (face part) design was performed in order to assess the main effects of emotion (fear, happiness, neutral) and face part (faces, eyes, mouths),


Fig. 4. Group amygdala ERPs to eyes, faces and mouths encoding fearful (A), happy (B), and neutral (C) expressions. Each ERP curve is the average of approximately 220 single trials. Continuous lines represent ERPs in response to eye stimuli; dashed lines represent ERPs evoked by whole faces; dotted lines represent ERPs evoked by mouths. The horizontal black bracket indicates the post-stimulus time windows showing post hoc significant differences between stimulus categories. Bar graphs (mean ± 1 SEM across trials) of ERP amplitudes for the corresponding time windows are shown on the right. In the time interval between 200 and 400 ms, amygdala evoked potentials to fearful eyes (A) showed a higher mean amplitude than potentials evoked by every other stimulus category, including fearful faces and mouths (**p < 0.001). In the time interval between 300 and 400 ms, amygdala evoked potentials to happy eyes (B) showed a higher mean amplitude than potentials evoked by happy faces (*p = 0.005) and mouths (*p = 0.05). No difference was observed comparing eyes, faces, and mouths with neutral expression (C).

aptspota

wrarh

earful eyes (A) showed an higher mean amplitude respect to potentials evoked by enterval between 300 and 400 ms amygdala evoked potentials to happy eyes (B) shnd mouths (*p = 0.05). No difference was observed comparing eyes, faces, mouths

nd their interaction in each 100 ms time-window from stimulusresentation to 700 ms post-stimulus (the ̨ level was set at 0.007o correct for multiple comparisons in different time epochs). Aignificant main effect of the factor “face part” (F2, 1874 = 7.467,

= 0.004) and “emotional valence” (F2, 1874 = 7.870, p = 0.001) wasbserved in the time-window between 300 and 400 ms. No “emo-ion” by “face-part” interaction reached statistical significance inny time-window (F4, 1869 = 5.426; p = 0.01 in the 300–400 epoch).

Post hoc paired tests in the 300–400 ms time-window betweenhole face and face-parts showed that the response to the eye

Please cite this article in press as: Meletti, S., et al. Fear and happiness inhuman amygdala. Neuropsychologia (2011), doi:10.1016/j.neuropsychologi

egion of the face (collapsing trials for emotional content) had greater amplitude N1 component respect to that obtained inesponse to mouths and whole faces in the (p < 0.001, Fig. 3a). Postoc paired tests between emotions (collapsing trials for ‘face parts’)

ther stimulus category, including fearful faces and mouths (**p < 0.001). In the time a higher mean amplitude respect to potentials evoked by happy faces (*p = 0.005)eutral expression (C).

showed that responses to fear encoding stimuli evoked a largeramplitude N1 component respect to happy and neutral stimuli(p < 0.001, Fig. 3b).
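The window-by-window testing procedure just described can be sketched like this (synthetic data; for brevity a one-way comparison across face parts stands in for the full 3 × 3 design, and the group sizes, noise level, and effect sizes are invented):

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
t_ms = np.arange(0, 700)                       # 0-700 ms, one sample per ms
n_per_group = 220                              # illustrative trial count

def simulate(gain):
    """Trials with an N1-like deflection peaking at ~350 ms, plus noise."""
    n1 = -gain * np.exp(-0.5 * ((t_ms - 350) / 50) ** 2)
    return n1 + rng.normal(0, 8, (n_per_group, t_ms.size))

groups = {"eyes": simulate(10), "faces": simulate(5), "mouths": simulate(4)}

# One ANOVA per 100 ms window, run on each trial's window-mean amplitude,
# with a Bonferroni-adjusted alpha = 0.05 / 7 windows ~= 0.007 as in the text.
alpha = 0.05 / 7
results = {}
for start in range(0, 700, 100):
    window_means = [g[:, start:start + 100].mean(axis=1) for g in groups.values()]
    F, p = f_oneway(*window_means)
    results[start] = (F, p, p < alpha)
```

Averaging within each window before testing reduces each trial to one number per window, so the correction only has to cover the seven windows rather than every sample.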

3.2.2. The effect of eyes encoding fear and joy

Since the main-effect findings reported above were obtained from different discrete stimuli collapsed together, the averaged evoked responses to each stimulus category (n = 9) were analyzed independently, in order to better explore the relationship between the emotional valence encoded by the stimuli and the part of the face presented to the subject. Bonferroni-corrected ANOVAs (α level at 0.007) showed a significant main effect of stimulus category during the time window from 200 to 300 ms (F8, 1868 = 7.632, p < 0.001) and during the time window from 300 to 400 ms post-stimulus (F8, 1868 = 4.743, p < 0.001). First, we performed post hoc paired comparisons between whole-face stimuli with different emotional valence, since previous studies demonstrated that fearful faces evoke higher amplitude ERPs than happy and neutral faces. As expected, fearful faces evoked a higher N1 component than both happy and neutral faces (p < 0.001 for both comparisons). However, the main finding was that group ERPs evoked by isolated fearful eyes showed the highest N1 component relative to any other stimulus category. We set our α level at 0.005 to correct for the inflation of type I errors attributable to multiple comparisons. Statistical analysis with post hoc paired tests confirmed that responses to "fearful eyes" evoked a higher amplitude N1 component in the 200–300 and 300–400 ms time windows than fearful faces and mouths (Fig. 4a), and also than any other facial stimulus (p < 0.001 for all comparisons). Moreover, the N1 component evoked by fearful eyes was significantly higher than the N1 response to happy and neutral eyes (p < 0.001 for both comparisons) (Fig. 5).

Fig. 5. Bar graphs (mean ± SEM across trials) of group ERP amplitudes in the 200–300 ms post-stimulus interval evoked by fearful (red), happy (blue), and neutral (gray) eyes. Error bars represent 1 SEM (n = 220). Pairwise post hoc comparison p < 0.001. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of the article.)

The analysis of the response to "smiling eyes" compared to that evoked by whole happy faces and by "smiling mouths" revealed a larger amplitude of the N1 component in the 300–400 ms time window (p = 0.005) (Fig. 4b). Moreover, the N1 component in response to "smiling eyes" was also larger than that to neutral faces and face parts (p < 0.001). No significant difference was observed in ERPs to faces, eyes, or mouths with neutral expression (Fig. 4c).

3.2.3. Single subject results

We also explored whether this pattern of evoked responses was confirmed at the single-subject level. Therefore, the ERPs obtained in each patient were analyzed (Fig. 6). On visual inspection, ERPs in response to "fearful eyes" had the highest amplitude N1 component in each subject. Statistical analysis (one-way repeated measures ANOVA) showed a significant main effect of stimulus category in the time window between 300 and 500 ms for Patient 1 (F8, 465 = 4.586, p < 0.01), and in the time window between 300 and 400 ms for Patient 2 (F8, 461 = 3.282, p < 0.05), Patient 3 (F8, 468 = 2.965, p < 0.05) and Patient 4 (F8, 449 = 2.660, p < 0.05). In Patients 1–4, post hoc paired tests showed that responses to "fearful eyes" evoked a higher amplitude N1 component than every other stimulus category in the 200–400 ms time window (p < 0.001) (Fig. 6). No other significant differences were observed for any other paired comparison at the single-subject level.

3.2.4. Laterality effects

ERPs obtained by grouping together single trials from right (n = 2) and left (n = 2) amygdalae were analyzed. No laterality effects were observed when comparing ERPs evoked by different face parts (2 × 3 two-way ANOVA, with factors of side and 'face part') or as a function of the emotion displayed (2 × 3 two-way ANOVA, with factors of side and 'emotion') in the 100–700 ms post-stimulus interval.

3.3. Time–frequency analyses (event related band power – ERBP)

As shown in Fig. 7a, ERBP changes (increases in frequency band power) were mainly due to a marked increase in theta (5 Hz) frequency around 300–600 ms post-stimulus presentation. To assess the statistical significance of these post-stimulus ERBP changes, a two-way (8 × 4) ANOVA was calculated with time window and frequency band as factors and ERBP values as the dependent measure (data were collapsed across all trials in this analysis). There was a main effect of time (F7, 128 = 5.167, p < 0.001), a strong main effect of frequency (F3, 128 = 52.734, p < 0.001), and also a frequency × time interaction (F21, 128 = 4.564, p < 0.001). Post hoc Bonferroni-corrected t tests (α < 0.006) indicated that overall ERBP changes differed significantly between the pre-stimulus period (−100 to 0 ms) and each of the following post-stimulus periods: 200–300 ms (p < 0.001), 300–400 ms (p < 0.001), and 400–500 ms (p < 0.001), with a maximal response in the 300–400 ms window. These results were confirmed by additional orthogonal one-way ANOVAs with a between-subject factor of frequency (theta, alpha, beta, gamma) performed separately on the ERBP data obtained for each of the seven post-stimulus time windows. For these ANOVAs, we set our α level at 0.007 to correct for multiple comparisons. ERBP differed significantly among frequency bands during the time windows of 200–300, 300–400 and 400–500 ms after stimulus onset (F3, 16 = 18.081, p < 0.001; F3, 16 = 8.712, p = 0.003; F3, 16 = 14.45, p < 0.001, respectively). Post hoc tests confirmed that theta ERBP changes differed significantly from alpha, beta, and gamma ERBP changes (p < 0.001 for all comparisons in the 200–300, 300–400, and 400–500 ms windows), with the highest theta-band increase in the 300–400 ms post-stimulus interval (Fig. 7a).
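A common way to obtain an ERBP measure of this evoked (phase-locked) type is band-pass filtering plus a Hilbert envelope computed on the trial average, with power expressed in dB relative to the pre-stimulus baseline (a sketch on synthetic data under assumed parameters; the authors' exact time–frequency method may differ):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

rng = np.random.default_rng(2)
fs = 1000                                        # Hz (assumed sampling rate)
t = np.arange(-0.2, 0.8, 1 / fs)                 # epochs from -200 to 800 ms

# Synthetic trials: a phase-locked 5 Hz (theta) burst at 300-600 ms on noise.
burst = np.where((t > 0.3) & (t < 0.6), 6.0 * np.sin(2 * np.pi * 5 * t), 0.0)
trials = burst + rng.normal(0, 2, (100, t.size))

def band_power(x, lo, hi):
    """Instantaneous power in [lo, hi] Hz: band-pass + squared Hilbert envelope."""
    sos = butter(3, [lo, hi], btype="band", fs=fs, output="sos")
    return np.abs(hilbert(sosfiltfilt(sos, x))) ** 2

# Evoked-type ERBP: average FIRST (keeping only phase-locked activity),
# then measure theta power relative to the -100..0 ms baseline, in dB.
power = band_power(trials.mean(axis=0), 4, 7)
baseline = power[(t >= -0.1) & (t < 0)].mean()
erbp_db = 10 * np.log10(power / baseline)

theta_peak = erbp_db[(t >= 0.3) & (t <= 0.6)].mean()
```

Because the simulated burst has the same phase on every trial, it survives the averaging and produces a large theta ERBP increase in the post-stimulus window.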

Given these results, subsequent analyses were restricted to assessing the effect of the emotional valence of the stimuli on the theta ERBP changes. In particular, as for the time-domain analyses, we investigated the interactions between the emotional valence encoded by the stimulus and the part of the face presented to the subject. Therefore, the averaged theta ERBP changes to each stimulus category (n = 9) were analyzed. Repeated one-way ANOVAs for each time window (Bonferroni corrected, α level at 0.007) showed a significant main effect of stimulus category during the time windows from 200 to 300 ms (F8, 36 = 4.768, p < 0.005) and from 300 to 400 ms post-stimulus (F8, 36 = 7.373, p < 0.001). Fig. 7b reports the group theta ERBP changes over time for each stimulus category. Isolated fearful eyes evoked the highest post-stimulus ERBP increment, as observed in the time-domain analyses. Post hoc paired tests confirmed that responses to "fearful eyes" evoked a higher theta ERBP increment in the 200–300 and 300–400 ms time windows than fearful faces and mouths, as well as any other facial stimulus (p < 0.001 for all comparisons). For display purposes, Fig. 7c reports the time–frequency plots in response to fearful eyes, faces, and mouths in Pt. 3.

Finally, considering the effect of stimuli with positive valence, "smiling eyes" evoked a higher theta ERBP increment than whole happy faces and "smiling mouths" in the 300–400 ms time window (p < 0.005) (Fig. 7b). The response to "smiling eyes" was also larger than that to neutral faces and face parts (p < 0.001). No significant difference was observed for faces, eyes, or mouths with neutral expression.

Fig. 6. Single-subject ERPs to fear-encoding stimuli. On the left, axial MRI images showing the electrode contact locations. In the middle, amygdala ERPs evoked by eyes (continuous line), whole faces (dotted line) and mouths (dashed line); each ERP is the average of approximately 55 single trials. On the right, bar graphs showing mean ERP amplitudes ± 1 SEM across trials between 300 and 400 ms post-stimulus onset. Error bars represent 1 SEM (n = 55). *Pairwise post hoc comparison p < 0.001.

4. Discussion

We explored the response properties of the human amygdala to faces and face parts encoding fear, happiness, and neutrality in four patients implanted with SEEG electrodes. Our data suggest the following main conclusions:

(1) Intracranial field potentials to facial signals of danger evoked an ERP response twice as large in amplitude as the ERPs evoked by neutral and happiness-encoding stimuli. This fear effect was mainly due to ERPs recorded in response to fearful eyes in isolation during the 200–400 ms post-stimulus interval, and was also confirmed by single-subject ERP analysis.

(2) Amygdala ERPs also showed a preferential response to "smiling eyes" compared to smiling mouths and whole happy faces. This effect was observed at later latencies, from 300 to 400 ms post-stimulus.

(3) Time–frequency analyses disclosed that the increase in the theta frequency band accounted for the main event-related band power changes in the 200–500 ms time window after stimulus presentation. Group analyses in the frequency domain also confirmed that the theta-band increase was sensitive both to the emotional valence of the stimulus and to the part of the face presented to subjects, with the largest event-related band power changes evoked by isolated fearful eyes.

Overall, these data are consistent with a special role of the amygdala in processing facial signals conveyed by the eye region of the face.

Fig. 7. Group time–frequency analyses. (A) Mean ERBP changes across time for theta, alpha, beta and lower gamma frequency bands (collapsing across all stimulus categories). The main power changes were due to a marked increase of theta band frequency. Error bars represent 1 SEM (n = 1980). (B) Time course of the theta frequency band for each stimulus used in the experiment. In the 200–500 ms post-stimulus interval the maximum increase of theta band is evident for fearful eyes (error bars represent 1 SEM). (C) Time–frequency plots in response to fear-encoding stimuli in Pt. 3.

4.1. Methodological considerations on ERP recordings

Although the subjects suffered from chronic drug-resistant epilepsy, we were reasonably confident that we recorded electrophysiological data from "healthy" amygdala tissue. This assumption rests on the following considerations. First, on the basis of the SEEG monitoring, the amygdala was not included in the surgical procedure performed to remove the epileptogenic tissue, and all patients were seizure-free at follow-up (>2 years for each patient). Second, no subject had an impairment in explicit emotion recognition, as established by a protocol used by our group in previous clinical research studies in patients with chronic temporal lobe epilepsy (Meletti et al., 2003, 2009). Third, during the experimental ERP protocol, slow waves and spike potentials were observed in less than 5% of single trials in each patient.

As for the location of the amygdala contacts submitted to statistical analysis, we are confident that all sites were within the amygdala, as confirmed by a probabilistic cytoarchitectonic map of the amygdala (Eickhoff et al., 2005) (see Fig. 2). The fact that ERP amplitudes rose and fell rapidly when moving across a few electrode contacts is in accordance with the closed-field electric configuration of the amygdala's nuclear structure, and supports the view that the recorded potentials reflect neuronal activity within the amygdala.

4.2. Response properties of ERPs recorded from the amygdala

ERPs recorded in the amygdala showed a broad negative component (N1) between 200 and 500 ms post-stimulus onset, followed by a positive component (P1) between 500 and 800 ms. Importantly, amygdala ERPs were observed for all the stimulus categories used in the experiment (i.e., amygdala neurons responded to faces and face parts showing fearful, happy, and neutral expressions). However, we demonstrated that the amygdala response profile was modulated both by the part of the face presented and by the emotional valence of the expression. In particular, the amygdala was especially sensitive to the eye region of the face displaying fear and joy. This result supports the hypothesis that the human amygdala is especially sensitive to the eye region of conspecifics. A special role of the amygdala in decoding features of the eye region of the face has been proposed on the basis of lesion studies and data from high-functioning autistic subjects (Adolphs, 2008; Adolphs et al., 2005; Baron-Cohen et al., 1999; Shaw et al., 2005). Accordingly, recent PET evidence indicates that the human amygdala responds to direct gaze (Kawashima et al., 1999). It should be noted that in the present study we did not explore the effect of different gaze directions on amygdala ERPs; all stimuli were directed toward the observer (Hoffman, Gothard, Schmid, & Logothetis, 2007). Nevertheless, the demonstration of the amygdala's preferential response to emotional eyes could sustain a role for this structure in reflexively triggering attention for further detailed analysis of the eye region of the face (Adolphs & Spezio, 2006; Benuzzi et al., 2007; Spezio, Adolphs, et al., 2007; Spezio, Huang, et al., 2007).

As expected from previous ERP (Krolak-Salmon et al., 2004) and fMRI studies (Breiter et al., 1996; Morris et al., 1996), fear-encoding stimuli elicited the largest ERP responses. This pattern of response confirms the role of the amygdala in the evaluation of threatening stimuli and its preferential response to stimuli that signal potential danger to the observer. However, a deeper analysis of the response properties of the amygdala, based on the ERPs elicited by each stimulus category used in the experiment, clearly revealed that the observed 'fear effect' was due to the response to fearful eyes. Indeed, fearful eyes in isolation elicited an amygdala response twice as large as the responses to whole fearful faces and to mouths. Importantly, this result, first obtained at the group level, was replicated at the single-subject level. This finding is in good accordance with previous fMRI evidence indicating that the "open wide eye" expression is the key facial stimulus in evoking the amygdala response to threatening stimuli (Morris et al., 2002; Whalen et al., 2004). However, it remains to be elucidated why the ERP response to fearful faces is weaker than that to fearful eyes, given that both stimuli contain the same information from the eye region: the response is stronger to isolated fearful eyes than to the same eyes embedded in a whole face, where only the surrounding context differs. One hypothesis is that isolated fearful eyes are a simpler stimulus than whole faces, and the amygdala response is therefore facilitated. It should be noted that the pattern of ERP amplitudes obtained in response to the eye region of the face cannot be explained simply by the increase in white sclera size that characterizes the fearful-eyes expression. Indeed, if this were true, we would have observed larger amplitude ERPs to neutral eyes than to happy eyes. On the contrary, the opposite was observed (i.e., happy eyes evoked larger amplitude ERPs than neutral eyes). We did not perform an analysis of the sclera size of the images used in the experiment; however, one can confidently assume that the sclera size of happy eyes is smaller, or at least not larger, than that of neutral eyes.

Finally, and not less important, the analysis of amygdala ERPs to facial stimuli of joy revealed that the view of smiling eyes in isolation evoked larger amplitude responses than happy faces and smiling mouths. This pattern of response could suggest that the amygdala presents a preferential response to the eye region of the face also for facial stimuli with positive valence. However, a cautionary note in the interpretation of this finding is that it comes from the group-level analysis; single-subject analysis could not confirm it at the statistical level. Genuine facial expressions of happiness involve two groups of facial muscles: the first is associated with upward movement of the mouth, the second with the muscles surrounding the eyes. Duchenne first observed, in 1862, that non-genuine smiles lack contraction of the muscle groups around the eyes that produce crows' feet, or what have been called 'Duchenne markers'. Previous studies suggested that Duchenne markers serve as a form of true 'enjoyment signal' during interpersonal communication (Frank, Ekman, & Friesen, 1993). Have we evolved a perceptual mechanism that ensures attention to such an adaptive social cue? Our data suggest that the amygdala might be a 'hardwired' subcortical region ensuring attention to the eye region of the face in order to decode different facial expressions, not only fear signals but also positive facial displays.


4.3. Temporal dynamics of amygdala ERPs

The temporal dynamics of the ERPs obtained in response to fearful eyes disclosed that a 'fear effect' relative to other stimulus categories was evident at 200 ms after stimulus display and was maintained until 400 ms post-stimulus. The preferential response to 'smiling eyes' compared to other happiness-encoding stimuli and to neutral expressions was evident from 300 to 400 ms after stimulus display. This finding could suggest that eyes signaling danger and possible threat to the observer are processed faster than signals with positive valence, and it supports the idea that the amygdala favors the detection of potential threats in the environment, setting the appropriate behavioral responses. The present data showing a 'fear effect' at 200–300 ms post-stimulus are consistent with other studies in monkeys (Gothard, Battaglia, Erickson, Spitler, & Amaral, 2007; Kuraoka & Nakamura, 2007) as well as in humans (Krolak-Salmon et al., 2004; Oya et al., 2002) that have found responses well in excess of 100 ms, as slow as or slower than those observed in temporal visual cortex (Adolphs, 2008). This electrophysiological evidence contrasts with the hypothesis of a direct and fast subcortical route of sensory stimulus processing in the amygdala. In particular, our results are in agreement with previous intracranial ERP evidence showing that the human amygdala differentiates fearful from neutral faces at 200–300 ms after stimulus display (Krolak-Salmon et al., 2004). Moreover, this latency range for the amygdala 'fear effect' relative to appeasing faces has recently been shown also in a single-neuron recording study in monkeys (Gothard et al., 2007).

4.4. Time–frequency findings

Our data demonstrate that the change in the ERBP envelope after stimulus presentation was substantially due to an increase in the power of theta oscillations. An increase in event-related theta frequency oscillations within the amygdala has been reported during animal fear conditioning (Paré & Dawn, 2000) and in human studies investigating different neuropsychological functions, such as emotional face processing (Maratos et al., 2009), error monitoring (Pourtois et al., 2010), and emotional memory encoding (Rutishauser, Ross, Mamelak, & Schuman, 2010). On an anatomical basis, these data are supported by the fact that the lateral amygdala nuclei receive strong sensory synaptic inputs from the rhinal cortices and the hippocampal formation, where rhythmic neuronal activity in the theta range has repeatedly been observed (Buzsáki, Leung, & Vanderwolf, 1983; Collins, Lang, & Paré, 1999; Mitchell & Ranck, 1980). The observed event-related band power increase therefore presumably reflects sensory processing driven by the stimulus features. Moreover, it has recently been proposed that theta-band activity (Lewis, 2005) plays an important role in integrating and synchronizing neuronal activities within the medial temporal lobe network during emotional processing (Knyazev, 2007; Paré et al., 2002). In this line, it has recently been suggested that theta-band activity is modulated by the affective valence of pictorial stimuli (Aftanas, Varlamov, Pavlov, Makhnev, & Reva, 2001). Our findings demonstrate for the first time that viewing faces and face parts elicits an increase in theta-band oscillatory activity. More importantly, we observed that the emotional valence of the stimuli, and specifically emotional eyes, evoked an increase in ERBP that was significantly higher compared to other facial stimuli.
These results are complementary to our findings in the time-domain analysis and suggest that the ERPs observed in the present study could be driven by such modulation of theta oscillatory activities within the amygdala.

It must be noted that two recent studies investigating emotional face processing with magneto-electroencephalography (Luo et al., 2007) and intracranial recordings (Sato et al., 2011) revealed

an increase in gamma-band frequencies after the presentation of fearful faces, while a previous study by Oya et al. (2002) revealed a gamma-band increase after the presentation of aversive pictorial stimuli. Importantly, these studies underlined that an early change in high-frequency oscillations at 100–150 ms post-stimulus presentation supports the concept of fast and rapid processing of threatening stimuli by the amygdala. The failure to disclose gamma-band power changes in our study could be attributed to different causes. First, the band power changes we analyzed were of the evoked type, meaning that the observed ERBP changes reflect oscillatory activities that are phase-locked and time-locked to stimulus presentation. This could be important, since Oya et al. underlined that the gamma-band changes in their study were of the induced type, which are not phase-locked to stimulus presentation. This means that gamma-band changes probably result from a temporally dispersed evaluation of the emotional meaning of the stimulus, whereas our results reflect ERBP changes that are temporally synchronized with stimulus presentation. In this line, the repetition of a very large number of stereotyped stimuli in our experimental paradigm could have washed out changes in neural activities that were not time-locked to the stimulus. Of course, the finding that theta-band activity was larger than gamma-band activity does not indicate that emotion or face part had no effect on gamma-band activity. We believe that the results obtained in this study are not in contradiction with the above-mentioned studies investigating the role of gamma frequencies in emotion perception. Rather, the role of theta frequency should be viewed as complementary, expanding our knowledge of the role of different brain rhythms in emotional processing. It could be hypothesized that fast emotional processing is subserved by gamma oscillations, while slower processing is driven by theta-band rhythms.

5. Conclusions

The eye region of the face represents a special area due to the extensive amount of information that can be extracted from it (Itier & Batty, 2009). The eyes, more than other facial features, are central to all aspects of social communication, such as emotion, direction of attention, and identity. In particular, several lines of evidence indicate that processing the eye region of the face depends on the amygdala (Adolphs, 2008; Adolphs et al., 2005). Our data are consistent with a special role of the amygdala in processing facial signals conveyed by the eye region of the face, and especially by emotional eyes encoding fear and joy. Moreover, we demonstrated that an increase in the theta frequency band was the principal oscillatory neuronal activity responsible for the observed amygdala responses. Finally, a cautionary note concerns the fact that our findings come from a small sample of four subjects; the generalization of the observed findings is therefore limited.

Funding

This work was supported by a grant from the Emilia-Romagna Region, Italy. Project title: Mechanism, diagnosis and treatment of drug-resistant epilepsy; Area 1a Ricerca Innovativa.

Appendix A. Supplementary data

Supplementary data associated with this article can be found, in the online version, at doi:10.1016/j.neuropsychologia.2011.10.020.


References

Adolphs, R. (2008). Fear, faces, and the human amygdala. Current Opinion in Neurobiology, 18, 166–172.


Adolphs, R. & Spezio, M. (2006). Role of the amygdala in processing visual social stimuli. Progress in Brain Research, 156, 363–378.

Adolphs, R. & Tranel, D. (2004). Impaired judgments of sadness but not happiness following bilateral amygdala damage. Journal of Cognitive Neuroscience, 16, 453–462.

Adolphs, R., Baron-Cohen, S. & Tranel, D. (2002). Impaired recognition of social emotions following amygdala damage. Journal of Cognitive Neuroscience, 14, 1264–1274.

Adolphs, R., Tranel, D., Damasio, H. & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P. & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433, 68–72.

Aftanas, L. I., Varlamov, A. A., Pavlov, S. V., Makhnev, V. P. & Reva, N. V. (2001). Affective picture processing: Event-related synchronization within individually defined human theta band is modulated by valence dimension. Neuroscience Letters, 303, 115–118.

Baron-Cohen, S., Ring, H. A., Wheelwright, S., Bullmore, E. T., Brammer, M. J., Simmons, A., et al. (1999). Social intelligence in the normal and autistic brain: An fMRI study. European Journal of Neuroscience, 11, 1891–1898.

Benuzzi, F., Meletti, S., Zamboni, G., Calandra-Buonaura, G., Serafini, M., Lui, F., et al. (2004). Impaired fear processing in right mesial temporal sclerosis: A fMRI study. Brain Research Bulletin, 63, 269–281.

Benuzzi, F., Pugnaghi, M., Meletti, S., Lui, F., Serafini, M., Baraldi, P., et al. (2007). Processing the socially relevant parts of faces. Brain Research Bulletin, 74, 344–356.

Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., et al. (1996). Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875–887.

Buzsáki, G., Leung, L. W. & Vanderwolf, C. H. (1983). Cellular bases of hippocampal EEG in the behaving rat. Brain Research Reviews, 6, 139–171.

Collins, D. R., Lang, E. J. & Paré, D. (1999). Spontaneous activity of perirhinal cortex in behaving cats. Neuroscience, 89, 1025–1039.

Eickhoff, S. B., Stephan, K. E., Mohlberg, H., Grefkes, C., Fink, G. R., Amunts, K., et al. (2005). A new SPM toolbox for combining probabilistic cytoarchitectonic maps and functional imaging data. Neuroimage, 25, 1325–1335.

Ekman, P. & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto: Consulting Psychologists Press.

Ekman, P., Davidson, R. J. & Friesen, W. V. (1990). The Duchenne smile: Emotional expression and brain physiology II. Journal of Personality and Social Psychology, 58, 342–353.

Frank, M. G., Ekman, P. & Friesen, W. V. (1993). Behavioral markers and recognizability of the smile of enjoyment. Journal of Personality and Social Psychology, 64, 83–93.

Gothard, K. M., Battaglia, F. P., Erickson, C. A., Spitler, K. M. & Amaral, D. G. (2007). Neural responses to facial expression and face identity in the monkey amygdala. Journal of Neurophysiology, 97, 1671–1683.

Hoffman, K. L., Gothard, K. M., Schmid, M. C. & Logothetis, N. K. (2007). Facial-expression and gaze-selective responses in the monkey amygdala. Current Biology, 17, 766–772.

Hudry, J., Ryvlin, P., Royet, J. P. & Mauguiere, F. (2001). Odorants elicit evoked potentials in the human amygdala. Cerebral Cortex, 11, 619–627.

Itier, R. J. & Batty, M. (2009). Neural bases of eye and gaze processing: The core of social cognition. Neuroscience and Biobehavioral Reviews, 33, 843–863.

Kawashima, R., Sugiura, M., Kato, T., Nakamura, A., Hatano, K., Ito, K., et al. (1999). The human amygdala plays an important role in gaze monitoring: A PET study. Brain, 122, 779–783.

Knyazev, G. (2007). Motivation, emotion, and their inhibitory control mirrored in brain oscillations. Neuroscience and Biobehavioral Reviews, 31, 377–395.

Krolak-Salmon, P., Henaff, M. A., Vighetto, A., Bertrand, O. & Mauguiere, F. (2004). Early amygdala reaction to fear spreading in occipital, temporal, and frontal cortex: A depth electrode ERP study in human. Neuron, 42, 665–676.

Kuraoka, K. & Nakamura, K. (2007). Responses of single neurons in monkey amygdala to facial and vocal emotions. Journal of Neurophysiology, 97, 1379–1387.

Lachaux, J. P., Rudrauf, D. & Kahane, P. (2003). Intracranial EEG and human brain mapping. Journal of Physiology, Paris, 97, 613–628.

Lewis, M. D. (2005). Bridging emotion theory and neurobiology through dynamic systems modeling. Behavioral and Brain Sciences, 28, 169–245.

Lorente de Nó, R. (1947). Analysis of the distribution of action currents of nerve in volume conductors. Studies from the Rockefeller Institute for Medical Research, 132, 384–477.

Luo, Q., Holroyd, T., Jones, M., Hendler, T. & Blair, J. (2007). Neural dynamics for facial threat processing as revealed by gamma band synchronization using MEG. Neuroimage, 34, 839–847.

Maratos, F. A., Mogg, K., Bradley, B. P., Rippon, G. & Senior, C. (2009). Coarse threat images reveal theta oscillations in the amygdala: A magnetoencephalography study. Cognitive, Affective, & Behavioral Neuroscience, 9, 133–143.

Meletti, S., Benuzzi, F., Rubboli, G., Cantalupo, G., Stanzani Maserati, M., Nichelli, P., et al. (2003). Impaired facial emotion recognition in early-onset right mesial temporal lobe epilepsy. Neurology, 60, 426–431.

Meletti, S., Benuzzi, F., Cantalupo, G., Rubboli, G., Tassinari, C. A. & Nichelli, P. (2009). Facial emotion recognition impairment in chronic temporal lobe epilepsy. Epilepsia, 50, 1547–1559.

Mitchell, S. & Ranck, J. B. (1980). Generation of theta rhythm in medial entorhinal cortex of freely moving rats. Brain Research, 189, 49–66.

Morris, J. S., deBonis, M. & Dolan, R. J. (2002). Human amygdala responses to fearful eyes. Neuroimage, 17, 214–222.

Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., et al. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812–815.

Oldfield, R. C. (1971). The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia, 9, 97–113.

Oya, H., Kawasaki, H., Howard, M. A., 3rd & Adolphs, R. (2002). Electrophysiological responses in the human amygdala discriminate emotion categories of complex visual stimuli. Journal of Neuroscience, 22, 9502–9512.

Paré, D. & Dawn, R. C. (2000). Neuronal correlates of fear in the lateral amygdala: Multiple extracellular recordings in conscious cats. Journal of Neuroscience, 20, 2701–2710.

Paré, D., Collins, D. R. & Pelletier, J. G. (2002). Amygdala oscillations and the consolidation of emotional memories. Trends in Cognitive Science, 6, 306–314.

Pourtois, G., Vocat, R., N'Diaye, K., Spinelli, L., Seeck, M. & Vuilleumier, P. (2010). Errors recruit both cognitive and emotional monitoring systems: Simultaneous intracranial recordings in the dorsal anterior cingulate gyrus and amygdala combined with fMRI. Neuropsychologia, 48, 1144–1159.

Rutishauser, U., Ross, I. B., Mamelak, A. N. & Schuman, E. M. (2010). Human memory strength is predicted by theta-frequency phase-locking of single neurons. Nature, 464, 903–907.

Sato, W., Kochiyama, T., Uono, S., Matsuda, K., Usui, K., Inoue, Y., et al. (2011). Rapid amygdala gamma oscillations in response to fearful facial expressions. Neuropsychologia, 49, 612–617.

Schyns, P. G., Bonnar, L. & Gosselin, F. (2002). Show me the features! Understanding recognition from the use of visual information. Psychological Science, 13, 402–409.

Shaw, P., Bramham, J., Lawrence, E. J., Morris, R., Baron-Cohen, S. & David, A. S. (2005). Differential effects of lesions of the amygdala and prefrontal cortex on recognizing facial expressions of complex emotions. Journal of Cognitive Neuroscience, 17, 1410–1419.

Smith, M. L., Cottrell, G. W., Gosselin, F. & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16, 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. & Piven, J. (2007). Abnormal use of facial information in high-functioning autism. Journal of Autism and Developmental Disorders, 37, 929–939.

Spezio, M. L., Huang, P. Y., Castelli, F. & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. Journal of Neuroscience, 27, 3994–3997.

Talairach, J. & Bancaud, J. (1973). Stereotactic approach to epilepsy. Methodology of anatomo-functional stereotactic investigations. Progress in Neurological Surgery, 5, 297–354.

Talairach, J. & Tournoux, P. (1988). Co-planar stereotaxic atlas of the human brain. 3-D proportioned system: An approach to cerebral imaging. New York: Thieme-Verlag.

Whalen, P. J., Kagan, J., Cook, R. G., Davis, F. C., Kim, H., Polis, S., et al. (2004). Human amygdala responsivity to masked fearful eye whites. Science, 306, 2061.

Young, A. W., Hellawell, D. J., Van De Wal, C. & Johnson, M. (1996). Facial expression processing after amygdalotomy. Neuropsychologia, 34, 31–39.