NeuroImage 14, 465–473 (2001)
doi:10.1006/nimg.2001.0811, available online at http://www.idealibrary.com

Explicit and Incidental Facial Expression Processing: An fMRI Study

Maria Luisa Gorno-Tempini,*,† Samanta Pradelli,* Marco Serafini,‡ Giuseppe Pagnoni,§ Patrizia Baraldi,§ Carlo Porro,¶ Roberto Nicoletti,‖ Carlo Umiltà,** and Paolo Nichelli*

*Dipartimento di Patologia Neuropsicosensoriale, Università di Modena e Reggio Emilia, Modena, Italy; †Wellcome Department of Cognitive Neurology, Institute of Neurology, London, United Kingdom; ‡ASL Modena, Modena, Italy; §Dipartimento di Scienze Biomediche, Università di Modena e Reggio Emilia, Modena, Italy; ¶Dipartimento di Scienze e Tecnologie Biomediche, Università di Udine, Udine, Italy; ‖Dipartimento di Psicologia dello Sviluppo e della Socializzazione, Padova, Italy; and **Dipartimento di Psicologia Generale, Università di Padova, Padova, Italy

Received September 22, 2000

Considerable evidence indicates that processing facial expression involves both subcortical (amygdala and basal ganglia) and cortical (occipito-temporal, orbitofrontal, and prefrontal cortex) structures. However, the specificity of these regions for single types of emotions and for the cognitive demands of expression processing is still unclear. This functional magnetic resonance imaging (fMRI) study investigated the neural correlates of incidental and explicit processing of the emotional content of faces expressing either disgust or happiness. Subjects were examined while they were viewing neutral, disgusted, or happy faces. The incidental task required subjects to decide about face gender, the explicit task to decide about face expression. In the control task subjects were requested to detect a white square in a greyscale mosaic stimulus. Results showed that the left inferior frontal cortex and the bilateral occipito-temporal junction responded equally to all face conditions. Several cortical and subcortical regions were modulated by task type and by facial expression. Right neostriatum and left amygdala were activated when subjects made explicit judgements of disgust, bilateral orbitofrontal cortex when they made judgements of happiness, and right frontal and insular cortex when they made judgements about any emotion. © 2001 Academic Press

Key Words: fMRI; disgust; happiness; facial expression; emotion; explicit processing.

INTRODUCTION

Converging evidence indicates that both subcortical structures, such as the amygdala and the basal ganglia, and cortical regions, such as prefrontal and occipito-temporal areas, are involved in processing facial expressions (Adolphs et al., 1999; Blair et al., 1999, 2000; Hornak et al., 1996; Morris et al., 1996, 1998a; Nakamura et al., 1999; Phillips et al., 1997; Sprengelmeyer et al., 1996). However, the contribution of these regions to recognition of different facial expressions and their specific role in emotional processing are still unresolved issues. For instance, a number of clinical (Broks et al., 1998) and neuroimaging studies have shown that the amygdala is involved in recognition of fearful expressions. Yet, its role with emotions different from fear is under discussion. Patient data suggest that this structure could also be involved in processing other expressions denoting negative emotions, such as anger (Adolphs et al., 1999; Calder et al., 1996; Scott et al., 1997), disgust (Adolphs et al., 1999), and sadness (Sprengelmeyer et al., 1999). Functional imaging data have not provided conclusive evidence in this matter. In fact, amygdalar activity was enhanced by increasing intensity of sad but not angry (Blair et al., 1999) or disgusted expressions (Phillips et al., 1997). On the other hand, the amygdala was modulated even by unconscious exposure to conditioned angry expressions (Morris et al., 1998b). Furthermore, one functional imaging study has suggested its involvement also in processing happy expressions (Breiter et al., 1996).

The role of the basal ganglia is also a matter of controversy. Huntington disease patients (Gray et al., 1997; Sprengelmeyer et al., 1996) are especially impaired in recognising the facial expression of disgust, suggesting a role of the caudate nucleus in processing this emotion. More recently, Calder et al. (2000) reported a patient who showed a selective deficit in recognising disgust signals from multiple modalities after a left hemisphere infarct involving insula, putamen, globus pallidus, and the head of the caudate nucleus. However, the sole functional neuroimaging study that has investigated this question (Phillips et al., 1997) found that the insula, but not the caudate, was activated in response to increasingly disgusted faces.

Furthermore, functional neuroimaging has shown that multiple regions within the prefrontal cortex are activated by perceiving facial expressions. Yet, their specific role in facial expression recognition and, more generally, in processing emotions remains controversial. While there is considerable evidence suggesting that the orbitofrontal cortex might play a role in emotional and social behaviour (Blair et al., 1999, 2000; Petit and Haxby, 1999; Rolls, 2000), the specificity of the more lateral prefrontal regions is less clear. These lateral areas have been activated in studies of facial expression processing (George et al., 1993; Nakamura et al., 1999). However, they have also been repeatedly involved in more general memory functions (Courtney et al., 1998; Fletcher et al., 1998; Lepage et al., 2000; Haxby et al., 2000; Henson et al., 1999), suggesting that their activation might not be specific to emotional processing.

In summary, while the network of brain regions sustaining facial expression processing is well established, the relative contribution of its components is still unclear. Inconsistency across neuroimaging studies could be due to differences in the cognitive task utilised. Incidental tasks, such as gender decision, may be appropriate in revealing responses to emotions that denote great survival value, such as fear (Morris et al., 1996), but may be less effective for other types of expressions. Furthermore, cortical responses in regions such as the occipital and prefrontal lobe are likely to be greatly influenced by demands that different cognitive tasks place on perceptual and memory retrieval processes.

We investigated the effect of task modulation on brain responses to facial expression with functional magnetic resonance imaging (fMRI). Ten subjects were requested to process faces expressing disgust or happiness while performing either an incidental task (gender decision) or an explicit emotional judgement (disgust or happiness). We hypothesized that structures showing a modulation due to both task and type of emotion are specifically involved in emotional processing. On the other hand, areas modulated by task demands, but not by emotion type, could be involved in more general cognitive functions.

MATERIALS AND METHODS

Subjects

Ten right-handed subjects, five males and five females, between 25 and 30 years of age, participated in the experiment. Handedness was assessed by means of the Edinburgh questionnaire (Oldfield, 1971). Exclusion criteria included a history of past or present neurological or psychiatric illness. All subjects gave informed written consent after the nature of the experiment was explained. The Ethics Committee of the University of Modena and Reggio Emilia approved the study.

Scanning Parameters

A General Electric Signa Horizon High Speed system at 1.5 Tesla was used to acquire both structural T1 and gradient echoplanar T2* BOLD-contrast images. Echoplanar images were collected using a single shot, blipped, gradient echo echoplanar pulse sequence developed by Peter Jezzard at the National Institutes of Health (Bethesda, MD). To maximize field homogeneity, fine manual prescan and localized shimming were performed at the beginning of the first session. Each BOLD-echoplanar volume scan consisted of 16 transverse slices (in-plane matrix 64 × 64; voxel size 3.75 × 3.75 × 5 mm; TE = 40 ms; TR = 3380 ms). Sixty-three volumes were collected in each scanning session and each subject underwent four sessions, for a total of 252 volumes. A blocked design was used (see following section) and nine volumes were acquired in each block. A T1-weighted high-resolution MRI of each subject was acquired to facilitate anatomical localization.

Stimuli and Experimental Design

Pictures of 10 individuals (6 males, 4 females) were used in this study. Each individual showed disgusted, happy, or neutral expressions in different pictures. Faces were black and white photographs taken from a standard set of expressions of emotion (Ekman and Friesen, 1976). According to the data from Ekman and Friesen, the mean percentage of emotion recognition was 99.10 (SD = 2.51) for happy faces and 93.10 (SD = 5.2) for disgusted faces. Control stimuli were prepared by applying an Adobe Photoshop mosaic filter of 512 pixels to the face pictures, thus obtaining greyscale images formed of 8 × 11 squares, no longer recognisable as faces.

The experiment used a blocked paradigm and was designed as a 2 × 2 factorial design with a control condition. One factor was the type of emotion: either disgust (D) or happiness (H). The second factor was the task: subjects were asked to perform either explicit (E) expression recognition (i.e., disgust/neutral or happiness/neutral discrimination) or gender decision (incidental processing, I), i.e., male/female discrimination. Four experimental conditions were thus created: explicit recognition of disgusted or happy faces (ED and EH) in the emotion recognition task and incidental processing of the same facial expressions (ID and IH) in the gender decision task. In the control condition (C), subjects were asked to detect a white square in the center of the control stimuli. Stimuli were presented one at a time for 2500 ms with an interstimulus interval of 3800 ms. Six disgusted or happy and two neutral faces were presented in each block, which lasted 30 s. Faces of the same individual were not repeated within one block. However, each face was presented expressing each of the three emotions within the experiment. To maintain the same proportion of motor responses in the gender decision task, six faces belonged to one sex and two to the other. Before the beginning of each block, written instructions were presented to inform subjects of the task they had to perform. Blocks with the control condition were included between each experimental block, and each scanning session comprised seven blocks, e.g., ED, C, IH, C, EH, C, ID. Each subject underwent four scanning sessions. Presentation order of faces within a block was randomized, while conditions were counterbalanced within and between subjects. Responses were given using a two-position button. Accuracy and reaction time data were collected during the scanning sessions.
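As a rough consistency check on these numbers (a minimal sketch; it assumes the 3800-ms interstimulus interval is measured onset to onset, which the text does not state explicitly), the block and session durations can be reproduced as follows:

```python
# Timing sketch for one scanning session of the block design described above.
# Assumption (not stated in the text): the 3800-ms interstimulus interval is
# an onset-to-onset interval, so each of the 8 stimuli in a block occupies 3.8 s.
TR_S = 3.38              # repetition time per volume, in seconds
VOLS_PER_BLOCK = 9       # volumes acquired per 30-s block
BLOCKS_PER_SESSION = 7   # e.g., ED, C, IH, C, EH, C, ID
SESSIONS = 4
STIMULI_PER_BLOCK = 8    # six emotional + two neutral faces (or eight mosaics in C)
SOA_S = 3.8              # assumed onset-to-onset stimulus interval

block_scan_s = VOLS_PER_BLOCK * TR_S                      # ~30.4 s of acquisition per block
block_stim_s = STIMULI_PER_BLOCK * SOA_S                  # ~30.4 s of stimulation per block
vols_per_session = VOLS_PER_BLOCK * BLOCKS_PER_SESSION    # 63 volumes per session
total_vols = vols_per_session * SESSIONS                  # 252 volumes per subject

print(block_scan_s, block_stim_s, vols_per_session, total_vols)
```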

Image Processing and Statistical Analysis

Data were preprocessed and analyzed using SPM 97 (Wellcome Department of Cognitive Neurology, London, UK; http://www.fil.ion.ucl.ac.uk; Friston et al., 1995a). All functional volumes for each subject were realigned to the first volume acquired. Images were then spatially normalised (Friston et al., 1995b) to the Montreal Neurological Institute (MNI) standard brain (Cocosco et al., 1997) in the space of Talairach and Tournoux (1988) and resampled to obtain images with a voxel size of 4 × 4 × 4 mm. All volumes were then smoothed with an 8-mm full width at half maximum isotropic Gaussian kernel.

The statistical analysis of the group modelled each of the four sessions obtained in each subject separately, resulting in an analysis that comprised 40 sessions. The experimental conditions were modeled as boxcar functions convolved with the hemodynamic response function. Condition effects were estimated according to the general linear model and regionally specific effects were compared using linear contrasts. Each contrast produced a statistical parametric map of the t statistic for each voxel, which was subsequently transformed to the unit normal Z distribution.
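To make the regressor construction concrete, the sketch below builds one such boxcar regressor and convolves it with a canonical two-gamma haemodynamic response function. It is only illustrative: the analysis itself was run in SPM 97, and the HRF parameters and the block ordering used here are conventional placeholder choices, not values taken from that software.

```python
# Sketch of one GLM regressor: a block (boxcar) time course convolved with a
# canonical haemodynamic response function (HRF). Illustrative only; the HRF
# below uses the common two-gamma form (peak ~6 s, undershoot ~16 s).
import numpy as np
from scipy.stats import gamma

TR = 3.38          # seconds per volume
N_VOLS = 63        # volumes in one session
VOLS_PER_BLOCK = 9

def canonical_hrf(tr, duration=32.0):
    """Two-gamma HRF sampled at the TR."""
    t = np.arange(0.0, duration, tr)
    h = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
    return h / h.sum()

# Boxcar for one condition: 1 during its blocks, 0 elsewhere.
# Hypothetical block order for one session: ED, C, IH, C, EH, C, ID -> ED is block 0.
boxcar = np.zeros(N_VOLS)
for block_index in (0,):                      # blocks belonging to this condition
    start = block_index * VOLS_PER_BLOCK
    boxcar[start:start + VOLS_PER_BLOCK] = 1.0

regressor = np.convolve(boxcar, canonical_hrf(TR))[:N_VOLS]
print(regressor.round(3))
```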

We performed contrasts that enabled us to investigate which brain regions showed common or differential effects of the type of emotion, the task performed, or the interaction between emotion and task (see Results section). We report activations that reached a level of significance of P < 0.001, uncorrected for multiple comparisons, in areas of interest and of P < 0.05, corrected for the entire volume, in other regions. The choice of the areas of interest was based on previous findings from neuroimaging and neuropsychological studies as reviewed in the Introduction, and they comprised the amygdala, basal ganglia, fusiform, orbitofrontal, and insular cortex. Anatomical localisation of the areas of interest was obtained by comparing the group activation data, superimposed on the MNI standard brain, with the Duvernoy (1999) and Talairach and Tournoux (1988) atlases.

RESULTS

Behavioral Results

Accuracy of performance was greater than 98% for all conditions. Differences between reaction times in the four experimental conditions were investigated using a 2 × 2 factorial analysis of variance (ANOVA) with type of emotion (D and H) as one factor and task (E and I) as the other, testing for main effects and interactions (Fig. 1). The results showed only a significant effect of task (P < 0.005). A post hoc Scheffé's test showed that subjects were slower in the explicit than in the incidental task (P < 0.01).
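For illustration, a 2 × 2 within-subject ANOVA of this kind can be set up as below (a sketch with fabricated reaction times; the column names and the statsmodels routine are our own choices, not the package used by the authors):

```python
# Sketch of the 2 x 2 repeated-measures ANOVA on reaction times
# (factors: emotion, D vs H; task, E vs I). All data values are fabricated.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
rows = []
for subject in range(1, 11):                        # ten subjects
    for emotion in ("D", "H"):
        for task in ("E", "I"):
            base = 700.0 if task == "E" else 620.0  # slower explicit task (illustrative)
            rows.append({"subject": subject, "emotion": emotion, "task": task,
                         "rt": base + rng.normal(0.0, 30.0)})

data = pd.DataFrame(rows)
anova = AnovaRM(data, depvar="rt", subject="subject",
                within=["emotion", "task"]).fit()
print(anova)   # F and P values for the two main effects and their interaction
```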

Neuroimaging Results

1. Areas activated by all stimuli versus control (Table 1). To identify regions common to all experimental conditions versus control, the main effect of stimuli versus control was inclusively masked with each simple main effect (ED-C, ID-C, EH-C, and IH-C) at P < 0.001. Bilateral activation of the posterior middle temporal gyrus (BA 37/21) was found. In the left hemisphere this activation spread to the superior temporal gyrus (BA 22) and to the temporo-parietal junction (BA 39). In the right hemisphere the activation extended posteriorly and inferiorly into the middle occipital gyrus (BA 19).

In addition, a cluster of activation comprising the more posterior portion of the left amygdala, spreading to the anterior hippocampus, was present for all face stimuli compared to the controls, regardless of type of expression and task. We found this region activated in each of the four experimental conditions. However, we noted a trend toward greater activation in the explicit recognition of disgust (see below).

FIG. 1. Mean and standard error of the reaction times of each experimental condition.

2. Effects of type of emotion and interaction with task. (a) Faces expressing disgust (Table 2 and Fig. 2): A conjunction analysis was performed to identify brain regions that were more active when viewing disgusted compared to happy faces for both the explicit and the incidental tasks: (ED-EH) + (ID-IH). This analysis allows the identification of regions where there is a main effect of disgust versus happiness, in the absence of any significant interaction with task. No significant effect was found because all regions that showed a main effect of disgust were also qualified by an interaction, (ED-ID) - (EH-IH), and showed a greater effect in the explicit task (see Table 2).
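Written over the condition parameter estimates, the contrasts behind this analysis look as follows (a schematic sketch; the ordering of the conditions in the vector is our own convention, not taken from the analysis code):

```python
# Contrast weights over the condition effects [ED, ID, EH, IH, C] (hypothetical ordering).
import numpy as np

conditions = ["ED", "ID", "EH", "IH", "C"]

explicit_disgust_vs_happy   = np.array([1,  0, -1,  0, 0])   # ED - EH
incidental_disgust_vs_happy = np.array([0,  1,  0, -1, 0])   # ID - IH
task_by_emotion_interaction = np.array([1, -1, -1,  1, 0])   # (ED - ID) - (EH - IH)

# The conjunction (ED - EH) + (ID - IH) looks for a disgust > happiness effect
# present in both tasks; regions qualified by the interaction contrast above
# were reported as task-dependent rather than as a pure emotion effect.
print(explicit_disgust_vs_happy + incidental_disgust_vs_happy)   # [1, 1, -1, -1, 0]
```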

The right head of the caudate nucleus, the right thalamus, and the left amygdala showed a significantly greater activation for explicit recognition of disgust relative to happiness. Both a significant interaction and a simple main effect of ED versus EH were found (see Table 2 and Fig. 2). These effects did not reach a corrected level of significance but were nevertheless considered because they fell within a priori areas of interest.

TABLE 1
Areas Commonly Activated by All Four Experimental Conditions versus Controls

[Rows: L mid/sup temporal (37/21/22); L angular gyrus (39); L inf frontal (44/45); R mid occipito-temporal (19/37/21); L posterior amygdala/hippocampus. Columns: Main effect; ED vs control; ID vs control; EH vs control; IH vs control.]

Note. Coordinates and Z values are reported for the main effect and for each single simple main effect of the experimental conditions versus controls.

(b) Faces expressing happiness (Table 3 and Fig. 3): Conjunction and interaction analyses were performed as described above (see section 2a). No significant cluster of activation was found in the conjunction analysis of happy versus disgusted faces irrespective of task. However, the bilateral orbitofrontal cortex (BA 11/47) showed a marked effect of happiness versus disgust that was greater in the explicit task.

3. Effects of task. Conjunction analysis was performed to identify areas that were more involved in the explicit compared to the incidental recognition of emotional faces: (ED-ID) + (EH-IH) (Table 4 and Fig. 4). This analysis allowed us to identify areas involved in performing the explicit task irrespective of the type of emotion to be recognized. The most significant effect was found in the right precentral sulcus. Furthermore, we found an extensive activation of the right prefrontal cortex, comprising the middle (BA 46) and the inferior (BA 44/45) frontal gyri. The activation of the inferior frontal gyrus also spread inferiorly and medially to include the most anterior region of the insula. A small region in the right fusiform gyrus (medial and anterior to the occipital area common to all the face conditions) was also modulated by task.

TABLE 2
Disgust versus Happiness

[Rows: striatum: putamen; caudate; thalamus; amygdala. Columns: Interaction and simple main effect; ED vs control; ID vs control; EH vs control; IH vs control.]

Note. Coordinates and Z values are reported for the interaction (I) between type of emotion and task, for the simple main effect (SM) of ED vs EH, and for each single simple main effect of the experimental conditions versus control.

DISCUSSION

To investigate how differential task demands modulate brain activation related to perceiving facial expressions, we examined the neural substrates of processing disgusted and happy faces under two task conditions: gender decision and explicit emotional recognition. We obtained a widespread profile of activation of both cortical and subcortical regions previously implicated in processing faces and emotionally related stimuli. These regions include the occipito-temporal and prefrontal cortex, the amygdala, and the basal ganglia. A number of these areas showed a modulation due to type of emotion and task, whereas others were commonly activated by all the experimental conditions.

Most of the regions that showed a common effect for all stimuli compared to controls are likely to be involved in visuo-perceptual (right temporo-occipital junction) and semantic analysis (left temporal and left inferior frontal cortex) of faces. However, the activation of the left amygdala could be attributed to its role in facial expression processing. As discussed in the Introduction, the role of the amygdala in recognition of expressions other than fear is not yet established. One neuroimaging study elicited its activation for sad faces (Blair et al., 1999). Other human studies have shown that it is more activated for happy than for neutral faces (Breiter et al., 1996), that it is proportionally deactivated by increasingly happy expressions (Morris et al., 1996), and that it is modulated by conscious and unconscious exposure to conditioned angry faces (Morris et al., 1998b). On the other hand, animal studies demonstrated that a group of neurons in the primate amygdala responds primarily to faces (Rolls, 1981; Leonard et al., 1985). In our study we observed that the left amygdala was activated when viewing of facial expressions was compared to a low-level visual task baseline.

TABLE 3
Happiness versus Disgust

[Rows: R orbitofrontal (11/47); L orbitofrontal (11/47). Columns: Interaction and simple main effect; ED vs control; ID vs control; EH vs control; IH vs control.]

Note. Coordinates and Z values are reported for the interaction (I) between type of emotion and task, for the simple main effect (SM) of EH vs ED, and for each single simple main effect of the experimental conditions versus control.

This finding supports the notion that the amygdala plays a general role in extracting the emotional relevance of faces (Rolls, 1999). Yet, its greater activation for disgusted compared to happy faces (see below) argues in favour of its preferential response to negative expressions and emotions.

A number of brain regions exhibited a modulation due to task manipulation, with greater activation in the explicit recognition condition. This effect was found in structures that showed differential activation for disgusted or happy expressions, as well as in regions that were equally responsive to both types of emotions. Explicit recognition of the facial expression of disgust activated the right striatum, including the caudate nucleus. Furthermore, the left amygdala showed a greater response to explicit disgust than to any other condition, supporting the notion of its involvement in processing all negative expressions, rather than fear alone (Adolphs et al., 1999). The fact that the left, but not the right, amygdala was activated is consistent with the findings of Morris et al. (1998b), who demonstrated a left amygdala involvement in processing supraliminal stimuli and a right amygdala activation in processing subliminal stimuli. The caudate activation is consistent with defective recognition of disgusted faces by Huntington disease patients (Gray et al., 1997; Sprengelmeyer et al., 1996). Two features might explain why the amygdala and caudate were activated in the present but not in Phillips et al.'s (1997) imaging study. First, by comparing disgust with happiness, we chose the two emotions that were, respectively, most impaired and most spared in Huntington disease (Sprengelmeyer et al., 1996). Second, Phillips et al. (1997) used a gender decision task and therefore did not direct subjects to explicitly process the emotional content of faces. Indeed, we have shown that the amygdala and caudate activations were significantly greater in the explicit than in the incidental task.

Bilateral orbitofrontal cortex was the only region that responded to explicit recognition of happy faces more than to any other condition. This region has been involved in establishing stimulus-reinforcement associations, especially with reward expectation (Rolls, 2000). Its lesion can cause severe behavioral disturbances and sometimes a deficit in recognizing facial expressions (Hornak et al., 1996; Blair et al., 2000). Functional neuroimaging studies have shown its activation when a behavioural decision is based on the reward value of the response (Elliott et al., 2000, for review) and when pictures of pleasant stimuli are presented (Paradiso et al., 1999). Our results confirm the hypothesis of a specific role of the orbitofrontal cortex in emotional processing, possibly indicating pleasant facial expressions as important clues of social reward.

FIG. 2. Subcortical structures that showed greater activation for explicit recognition of disgusted faces. Activations are superimposed on axial [(a) z = 4; (b) z = 8] and coronal [(c) y = 0] sections of the standard Montreal Neurological Institute brain (Cocosco et al., 1997).

TABLE 4
Explicit versus Incidental Processing

[Rows: R precentral sulcus (6/8); R insula/inf frontal (44/45); R mid frontal (46); R fusiform (19). Columns: Conjunction; ED vs control; ID vs control; EH vs control; IH vs control.]

Note. Coordinates and Z values are reported for the conjunction and for each single simple main effect of the four experimental conditions versus control.

FIG. 3. Bilateral orbitofrontal regions that showed greater activation for explicit recognition of happy faces. Activations are superimposed on axial sections [(a) x = -24; (b) x = -16] of the standard Montreal Neurological Institute brain (Cocosco et al., 1997).

FIG. 4. Right frontal and fusiform activations related to the explicit task regardless of emotion type. Activations are superimposed on a 3-D rendering image of the standard Montreal Neurological Institute brain (Cocosco et al., 1997).

The precentral sulcus, the middle and inferior frontal gyri, the posterior fusiform gyrus, and the anterior insula of the right hemisphere showed a greater response for the explicit recognition task regardless of the type of facial expression. The strong right lateralisation of the activations during explicit emotional recognition is also in agreement with clinical findings that link conscious emotional processing with the right hemisphere (Adolphs et al., 2000; Borod et al., 1999; Bowers et al., 1985). We argue that these activations can be attributed to general cognitive processes, such as face perception, memory retrieval, and monitoring functions, rather than to specific emotion-related processes. For instance, the right fusiform gyrus has been repeatedly involved in perceptual processing of faces (Gorno-Tempini et al., 1998; Kanwisher et al., 1997; Sergent et al., 1992) and our results demonstrate that its function can be modulated by task demands. The greater fusiform activation in explicit expression recognition than in gender decision suggests that a deeper perceptual analysis is necessary to explicitly extract emotional information from a face. Similarly, the activation of the precentral sulcus, close to an area recently identified as the human frontal eye fields, could be explained by greater visual "scanning" in the explicit than in the implicit task (Luna et al., 1998; Petit and Haxby, 1999). The differential role of ventral and dorsal prefrontal regions is still a matter of debate (for a review see Owen et al., 1999). The currently prevailing view postulates that the right ventral prefrontal and anterior insular cortex are implicated in retrieving memory traces from long-term memory, while dorsal regions are implicated in response monitoring (Henson et al., 1999; Fletcher et al., 1998). Accordingly, greater activation of the right ventral and anterior insular cortices in the explicit task might indicate a memory retrieval effort to match facial features with previously experienced representations. The task dependency of the effect in the lateral prefrontal cortex can explain the inconsistency of this activation in previous studies of facial expression processing. In fact, a right ventral prefrontal activation was only found when an explicit emotional judgement was required, as when performing an emotional matching task (George et al., 1993) or a facial attractiveness judgement (Nakamura et al., 1999). There was no prefrontal activation when emotional processing was incidental (Morris et al., 1996).

Taken together, the results of the present study indicate that the network of regions involved in processing facial expression can be greatly modulated by task demands. Explicit recognition of facial expressions not only increases the demands on nonspecific cognitive processes such as perceptual analysis and memory retrieval, but also modulates the response of regions that respond differently to different emotions. Responses in the left amygdala and in the right neostriatum were enhanced by explicit recognition of disgusted faces, while the bilateral orbitofrontal cortex was typically associated with recognising happy faces.

ACKNOWLEDGMENTS

This study was funded by a special grant from the Azienda Policlinico di Modena. M.L.G.T. was funded by the Wellcome Trust. Travelling expenses were covered by a grant from the British Council-Ministero dell'Università e della Ricerca Scientifica e Tecnologica. We thank Ray Dolan, John Morris, and Cathy Price for their comments.

REFERENCES

Adolphs, R., Damasio, H., Tranel, D., Cooper, G., and Damasio, A. R. 2000. A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. J. Neurosci. 20: 2683–2690.
Adolphs, R., Tranel, D., Hamann, S., Young, A. W., Calder, A. J., Phelps, E. A., Anderson, A., Lee, G. P., and Damasio, A. R. 1999. Recognition of facial emotion in nine individuals with bilateral amygdala damage. Neuropsychologia 37: 1111–1117.
Blair, R. J., and Cipolotti, L. 2000. Impaired social response reversal. A case of "acquired sociopathy." Brain 123: 1122–1141.
Blair, R. J., Morris, J. S., Frith, C. D., Perrett, D. I., and Dolan, R. J. 1999. Dissociable neural responses to facial expressions of sadness and anger. Brain 122: 883–893.
Borod, J. C., Obler, L. K., Erhan, H. M., Grunwald, I. S., Cicero, B. A., Welkowitz, J., Santschi, R. M., and Whalen, J. R. 1998. Right hemisphere emotional perception: Evidence across multiple channels. Neuropsychology 12: 446–458.
Bowers, D., Bauer, R. M., Coslett, H. M., and Heilman, K. M. 1985. Processing of faces by patients with unilateral hemisphere lesions. Brain Cogn. 4: 258–272.
Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., and Rosen, B. R. 1996. Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17: 875–887.
Calder, A. J., Keane, J., Manes, F., Antoun, N., and Young, A. W. 2000. Impaired recognition and experience of disgust following brain injury. Nat. Neurosci. 3: 1077–1078.
Calder, A. J., Young, A. W., Rowland, D., Perrett, D. I., Hodges, J. R., and Etcoff, N. L. 1996. Facial emotion recognition after bilateral amygdala damage: Differentially severe impairment of fear. Cogn. Neuropsychol. 13: 699–745.
Cocosco, C. A., Kollokian, V., Kwan, R., and Evans, A. C. 1997. Brainweb: Online interface to a 3D MRI simulated brain database. Neuroimage 5: S425.
Courtney, S. M., Petit, L., Maisog, J. M., Ungerleider, L. G., and Haxby, J. V. 1998. An area specialized for spatial working memory in human frontal cortex. Science 279: 1347–1351.
Duvernoy, H. M. 1999. The Human Brain: Surface, Three-Dimensional Sectional Anatomy with MRI, and Blood Supply, 2nd ed. Springer-Verlag, Wien.
Ekman, P., and Friesen, W. V. 1976. Pictures of Facial Affect. Consulting Psychologists Press, Palo Alto.
Elliott, R., Dolan, R. J., and Frith, C. D. 2000. Dissociable functions in the medial and lateral orbitofrontal cortex: Evidence from human neuroimaging studies. Cereb. Cortex 10: 308–317.
Fletcher, P. C., Shallice, T., Frith, C. D., Frackowiak, R. S., and Dolan, R. J. 1998. The functional roles of prefrontal cortex in episodic memory. II. Retrieval. Brain 121: 1249–1256.
Friston, K. J., Ashburner, J., Frith, C. D., Poline, J.-B., Heather, J. D., and Frackowiak, R. S. J. 1995a. Spatial registration and normalization of images. Hum. Brain Mapp. 2: 165–189.
Friston, K. J., Holmes, A., Worsley, K. J., Poline, J.-B., Frith, C. D., and Frackowiak, R. S. J. 1995b. Statistical parametric maps in functional imaging: A general linear approach. Hum. Brain Mapp. 2: 189–210.
George, M. S., Ketter, T. A., Gill, D. S., Haxby, J. V., Ungerleider, L. G., Herscovitch, P., and Post, R. M. 1993. Brain regions involved in recognizing facial emotion or identity: An oxygen-15 PET study. J. Neuropsychiatry Clin. Neurosci. 5: 384–394.
Gorno-Tempini, M. L., Price, C. J., Josephs, O., Vandenberghe, R., Cappa, S. F., Kapur, N., Frackowiak, R. S., and Tempini, M. L. 1998. The neural systems sustaining face and proper-name processing [published errata appear in Brain 1998, 121(Pt 12): 2402, and Brain 2000, 123(Pt 2): 419]. Brain 121: 2103–2118.
Gray, J. M., Skolnick, B. E., and Gur, R. E. 1997. Impaired recognition of disgust in Huntington's disease gene carriers. Brain 120: 2029–2038.
Haxby, J. V., Petit, L., Ungerleider, L. G., and Courtney, S. M. 2000. Distinguishing the functional roles of multiple regions in distributed neural systems for visual working memory. Neuroimage 11: 145–156.
Henson, R. N., Shallice, T., and Dolan, R. J. 1999. Right prefrontal cortex and episodic memory retrieval: A functional MRI test of the monitoring hypothesis. Brain 122: 1367–1381.
Hornak, J., Rolls, E. T., and Wade, D. 1996. Face and voice expression identification in patients with emotional and behavioural changes following ventral frontal lobe damage. Neuropsychologia 34: 247–261.
Kanwisher, N., McDermott, J., and Chun, M. M. 1997. The fusiform face area: A module in human extrastriate cortex specialized for face perception. J. Neurosci. 17: 4302–4311.
Leonard, C. M., Rolls, E. T., Wilson, F. A. W., and Baylis, G. C. 1985. Neurons in the amygdala of the monkey with responses selective for faces. Behav. Brain Res. 15: 159–176.
Lepage, M., Ghaffar, O., Nyberg, L., and Tulving, E. 2000. Prefrontal cortex and episodic memory retrieval mode. Proc. Natl. Acad. Sci. USA 97: 506–511.
Luna, B., Thulborn, K. R., Strojwas, M. H., McCurtain, B. J., Berman, R. A., Genovese, C. R., and Sweeney, J. A. 1998. Dorsal cortical regions subserving visually guided saccades in humans: An fMRI study. Cereb. Cortex 8: 40–47.
Morris, J. S., Friston, K. J., Buchel, C., Frith, C. D., Young, A. W., Calder, A. J., and Dolan, R. J. 1998a. A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121: 47–57.


Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., and Dolan, R. J. 1996. A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 383: 812–815.
Morris, J. S., Ohman, A., and Dolan, R. J. 1998b. Conscious and unconscious emotional learning in the human amygdala. Nature 393: 467–470.
Nakamura, K., Kawashima, R., Ito, K., Sugiura, M., Kato, T., Nakamura, A., Hatano, K., Nagumo, S., Kubota, K., Fukuda, H., and Kojima, S. 1999. Activation of the right inferior frontal cortex during assessment of facial emotion. J. Neurophysiol. 82: 1610–1614.
Oldfield, R. C. 1971. The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 9: 97–113.
Owen, A. M., Herrod, N. J., Menon, D. K., Clark, J. C., Downey, S. P., Carpenter, T. A., Minhas, P. S., Turkheimer, F. E., Williams, E. J., Robbins, T. W., Sahakian, B. J., Petrides, M., and Pickard, J. D. 1999. Redefining the functional organization of working memory processes within human lateral prefrontal cortex. Eur. J. Neurosci. 11: 567–574.
Paradiso, S., Johnson, D. L., Andreasen, N. C., O'Leary, D. S., Watkins, G. L., Ponto, L. L., and Hichwa, R. D. 1999. Cerebral blood flow changes associated with attribution of emotional valence to pleasant, unpleasant, and neutral visual stimuli in a PET study of normal subjects. Am. J. Psychiatry 156: 1618–1629.
Petit, L., and Haxby, J. V. 1999. Functional anatomy of pursuit eye movements in humans as revealed by fMRI. J. Neurophysiol. 82: 463–471.
Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C., Gray, J. A., and David, A. S. 1997. A specific neural substrate for perceiving facial expressions of disgust. Nature 389: 495–498.
Rolls, E. T. 1981. Responses of the amygdaloid neurons in the primate. In The Amygdaloid Complex (Y. Ben-Ari, Ed.), pp. 383–393. Elsevier, Amsterdam.
Rolls, E. T. 1999. The Brain and Emotion. Oxford Univ. Press, Oxford.
Rolls, E. T. 2000. The orbitofrontal cortex and reward. Cereb. Cortex 10: 284–294.
Scott, S. K., Young, A. W., Calder, A. J., Hellawell, D. J., Aggleton, J. P., and Johnson, M. 1997. Impaired auditory recognition of fear and anger following bilateral amygdala lesions. Nature 385: 254–257.
Sergent, J., Ohta, S., and MacDonald, B. 1992. Functional neuroanatomy of face and object processing. A positron emission tomography study. Brain 115(Pt 1): 15–36.
Sprengelmeyer, R., Young, A. W., Calder, A. J., Karnat, A., Lange, H., Homberg, V., Perrett, D. I., and Rowland, D. 1996. Loss of disgust. Perception of faces and emotions in Huntington's disease. Brain 119: 1647–1665.
Sprengelmeyer, R., Young, A. W., Schroeder, U., Grossenbacher, P. G., Federlein, J., Buttner, T., and Przuntek, H. 1999. Knowing no fear. Proc. R. Soc. Lond. B 266: 2451–2456.
Talairach, J., and Tournoux, P. 1988. Co-planar Stereotaxic Atlas of the Human Brain. Thieme, New York.
Taylor, S. F., Liberzon, I., and Koeppe, R. A. 2000. The effect of graded aversive stimuli on limbic and visual activation. Neuropsychologia 38: 1415–1425.