Processing the socially relevant parts of faces

Brain Research Bulletin 74 (2007) 344–356

Research report

Francesca Benuzzi a,∗, Matteo Pugnaghi a, Stefano Meletti a, Fausta Lui b, Marco Serafini c, Patrizia Baraldi b, Paolo Nichelli a

a Dipartimento Integrato di Neuroscienze, Università di Modena e Reggio Emilia, Italy
b Dipartimento di Scienze Biomediche, Università di Modena e Reggio Emilia, Italy
c Unità Operativa di Fisica Sanitaria, A.S.L. Modena, Italy

∗ Corresponding author at: Dipartimento Integrato di Neuroscienze, Università di Modena e Reggio Emilia, Nuovo Ospedale Civile Sant'Agostino Estense, Via Giardini 1355, 41100 Baggiovara (Modena), Italy. Tel.: +39 059 3961679; fax: +39 059 3962409. E-mail address: [email protected] (F. Benuzzi).

Received 9 May 2007; received in revised form 3 July 2007; accepted 3 July 2007. Available online 27 July 2007.

0361-9230/$ – see front matter © 2007 Elsevier Inc. All rights reserved. doi:10.1016/j.brainresbull.2007.07.010

Abstract

Faces are processed by a distributed neural system in the visual as well as in the non-visual cortex [the "core" and the "extended" systems, J.V. Haxby, E.A. Hoffman, M.I. Gobbini, The distributed human neural system for face perception, Trends Cogn. Sci. 4 (2000) 223–233]. Yet, the functions of the different brain regions included in the face processing system are far from clear. On the basis of the case study of a patient unable to recognize fearful faces, Adolphs et al. [R. Adolphs, F. Gosselin, T.W. Buchanan, D. Tranel, P. Schyns, A.R. Damasio, A mechanism for impaired fear recognition after amygdala damage, Nature 433 (2005) 68–72] suggested that the amygdala might play a role in orienting attention towards the eyes, i.e. towards the region of the face conveying most information about fear. In a functional magnetic resonance imaging (fMRI) study comparing patterns of activation during observation of whole faces and parts of faces displaying neutral expressions, we evaluated the neural systems for face processing when only partial information is provided, as well as those involved in processing two socially relevant facial areas (the eyes and the mouth). Twenty-four subjects were asked to perform a gender decision task on pictures showing whole faces, upper faces (eyes and eyebrows), and lower faces (mouth). Our results showed that the amygdala was activated more in response to whole faces than to parts of faces, indicating that the amygdala is involved in orienting attention towards the eyes and mouth. Processing of parts of faces in isolation was found to activate other regions within both the "core" and the "extended" systems, as well as structures outside this network, thus suggesting that these structures are involved in building up the representation of the whole face from its parts.

© 2007 Elsevier Inc. All rights reserved.

Keywords: Face processing; Eyes; Mouth; Amygdala; fMRI

1. Introduction

Faces are processed in a number of areas of the visual cortex as well as in several regions of the non-visual cortex. Different parts of this distributed neural system are involved in different aspects of the processing. According to Haxby et al. [31], the perception of the structural aspects of faces is cognitively independent of and anatomically dissociated from the perception of facial movements. Recognition of individuals is based on the structural aspects of faces, which are processed by the "core" system, while facial movements, the perception of which plays a crucial role in social communication, are processed by the "core" and the "extended" systems.

The "core" system includes the inferior occipital gyrus (IOG), the lateral portion of the fusiform gyrus (FG), and the superior temporal sulcus (STS). The two ventral regions (IOG and FG) mediate the recognition of individuals, while the more dorsal region (STS) is involved in the perception of social signals such as direction of gaze, speech-related lip movements, and facial expressions. The "extended" system includes a number of regions with distinct functional specializations. The intraparietal sulcus and presumably the frontal eye field process gaze direction and head position, in order to guide spatial attention. The superior temporal gyrus is involved in the processing of speech-related lip movements for the extraction of phonemic information. The anterior temporal lobe is involved in retrieving the name and other information associated with the face. The amygdala and the insula are thought to



mediate perception of the emotional content of facial expressions. In particular, the amygdala has been described as a key structure in the processing of facial expressions of negative affects, but recent studies have shown that it plays a wider role in detection of emotional salience. Neuroimaging studies have found that the amygdala responds to facial expressions of sadness [74,82], happiness [13,82], and surprise [48], and to neutral expressions [37,81]. Other studies have shown that the amygdala is equally responsive across all expressions of affect [27,39,80,82], suggesting that it could be regarded as a "relevance detector" [69] involved in appraisal of the face for relevant events.

Adolphs et al. [6] described a patient (SM) who, after bilateral amygdala damage, failed to recognize fear from facial expressions. However, further investigation [4] revealed that SM failed to fixate the eye region normally in all facial expressions, her selective impairment in recognizing fear thus being explained by the fact that the eyes are the most important feature in identification of this emotion. This finding also suggests that the amygdala is involved in orienting gaze towards the parts of the face that are most important in the processing of facial expressions.

The regions around the eyes and the mouth convey the greatest amount of information useful for social interaction. Several neuroimaging studies have specifically investigated the role of these two regions. Three studies [72,75,76] investigated the mechanism of face perception using stimuli showing the eyes alone. In general, they found similar but weaker activation of the lateral portion of the FG (the fusiform face area, FFA) in response to presentation of the eye region compared to the whole face. Processing the emotional content of the eye region required recruitment of the "extended" system; for instance, fearful eyes evoked increased activity in the amygdala [53,78].

Other studies investigated the mouth region considering its different types of movement. Data in the literature demonstrated activation of two different portions of the STS in relation to linguistic (lip-reading [18,68]) and non-linguistic movements [12,23,66,67].

The aim of the present study was two-fold:

1. To look for brain activation in response to whole face stimuli as opposed to the eyes and mouth alone; it was anticipated that these comparisons may shed light on the brain network involved in orienting attention towards the eyes and mouth, which are the most socially relevant parts of faces.

2. To evaluate the different contributions of the distributed system for face perception when only partial information (either eyes or mouth) is provided. This was done by comparing both upper face and lower face stimuli with whole face stimuli.

Unlike previous studies, we collected fMRI data from all the structures of the "core" and "extended" systems [31] and, to avoid any emotional modulation of face processing, we used stimuli depicting neutral expressions. Following Adolphs et al.'s description [4] of patient SM's inability to look at the eyes, we anticipated that processing of whole faces, as opposed to just the eyes, would produce bilateral activation of the amygdala. Hypothesizing that the amygdala could play a broader processing role, also orienting attention to parts of the face, we also expected to find the same effect when comparing the processing of whole faces to processing of just the mouth.

We also anticipated that processing isolated parts of faces would require the recruitment of additional neural resources compared to those needed for processing whole faces. Our results, showing increased activity both within and outside the neural network specialized for face processing, demonstrated that extra resources are indeed recruited.

2. Methods

2.1. Subjects

Twenty-four right-handed subjects (12 males and 12 females, ranging in age from 21 to 27 years, mean age = 26, education 13–18 years), with no history of neurological or psychiatric illness, participated in the study. Handedness was assessed by means of the Edinburgh Inventory [58]. All subjects gave their written informed consent to take part in the study, which was approved by the ethics committee of the University of Modena and Reggio Emilia.

2.2. Stimuli and experimental design

We used pictures of nine individuals (four males, five females) displaying neutral facial expressions. The pictures were black and white photographs taken from the Ekman and Friesen series [24]. Upper face (eyes and eyebrows) and lower face (mouth) stimuli were obtained by masking the remaining part of the face with a mosaic filter. Control stimuli were prepared by applying an Adobe Photoshop mosaic filter to the pictures, thus obtaining grey-scale images formed by 8 × 11 squares, no longer recognizable as faces; we call these images "masks". In the three experimental conditions, i.e., the presentation of whole faces, upper faces, and lower faces, the subjects were required to decide the gender of the face. In the control condition, they were asked to pick out a white square in the centre of the masks.

The experiment comprised four sessions, each composed of 12 blocks. Specifically, two blocks were presented for each of the three experimental conditions in randomized order, interspersed with a total of six control blocks (masks). In each block, nine stimuli were presented for 2.8 s each. Before the beginning of each block, subjects were given written instructions explaining the task.

Responses were given at the end of each stimulus presentation by pressing one of two buttons. Accuracy and response time data were collected during the scanning sessions by means of custom-made software developed in Visual Basic 6 (http://web.tiscali.it/MarcoSerafini/stimoli video/). The same software was used to present stimuli via the IFIS (MRI Devices Corporation, WI, USA) remote display.
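The block and timing structure described above can be sanity-checked with a short script. This is an illustrative sketch, not the authors' actual Visual Basic 6 presentation software; in particular, the strict alternation of mask and task blocks below is an assumption (the text states only that six control blocks were interspersed among the six experimental blocks).

```python
import random

def build_session(seed=None):
    """Build one session's block sequence: 12 blocks = 2 blocks for each of
    the 3 experimental conditions (whole, upper, lower faces), interspersed
    with 6 mask (control) blocks. Strict alternation is an assumption."""
    rng = random.Random(seed)
    experimental = ["whole", "upper", "lower"] * 2  # 6 experimental blocks
    rng.shuffle(experimental)                       # randomized order
    session = []
    for block in experimental:
        session.extend(["mask", block])             # one mask before each task block
    return session

session = build_session(seed=0)
assert len(session) == 12 and session.count("mask") == 6

# Timing check: 9 stimuli x 2.8 s = 25.2 s of stimulation per block,
# acquired over 10 volumes x TR 3.0 s = 30.0 s of scanning per block
# (cf. the acquisition parameters in Section 2.3).
stim_time_per_block = 9 * 2.8    # 25.2 s
scan_time_per_block = 10 * 3.0   # 30.0 s
```

Note that the stimulation time per block (25.2 s) fits inside the 30 s of scanning per block, leaving time for the responses given at the end of each stimulus presentation.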

2.3. Image acquisition and data analysis

Images were acquired using a Philips Intera at 3 T. Each BOLD echo-planar volume consisted of 30 transverse slices (128 × 128 in-plane matrix; voxel size 1.8 × 1.8 × 4 mm; TE = 30 ms). Dummy scans lasting 12 s were acquired at the beginning of each session. One hundred and twenty volumes (10 vol./block) were collected in each scanning session (TR = 3000 ms) and each subject underwent four sessions, thereby generating a total of 480 vol. In addition, a high-resolution T1-weighted image of the brain was acquired for each subject to allow anatomical localization of activations. The volume consisted of 170 sagittal slices (TR = 9.9 ms; TE = 4.6 ms; in-plane matrix = 256 × 256; voxel size = 1 × 1 × 1 mm).

Image analysis was performed using the SPM2 software (Wellcome Department of Imaging Neuroscience, London, UK).

All functional volumes for each subject were realigned to the first volume acquired. Images were then spatially normalized to the Montreal Neurological


Institute (MNI) standard brain and resampled to obtain images with a voxel size of 2 mm × 2 mm × 4 mm. Volumes were then smoothed with an 8-mm FWHM isotropic Gaussian kernel. The three conditions were modelled with the haemodynamic response function. Condition effects were estimated according to the general linear model and region-specific effects were investigated using linear contrasts, comparing the three experimental conditions (whole, upper and lower faces) with the control condition (masks). In addition, specific contrasts were applied to estimate the differences between each pair of the three experimental conditions.

All analyses were performed at a random effect level. Each subject was first analyzed singly, then the statistical maps obtained for each contrast were used to perform a second-level group analysis (random effects analysis). The coordinates in Talairach space [71] were obtained by applying the Matthew Brett correction (mni2tal: http://www.mrc-cbu.cam.ac.uk/Imaging/mnispace.html) to the SPM–MNI coordinates.
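The mni2tal correction mentioned above is a small piecewise affine transform. A minimal Python version is sketched below, with coefficients taken from Matthew Brett's widely circulated mni2tal script; treat this as an illustration of the conversion, not a substitute for the original tool.

```python
import numpy as np

def mni2tal(xyz):
    """Approximate MNI -> Talairach conversion (Matthew Brett's piecewise
    transform: different shear/scaling above and below the AC-PC plane)."""
    x, y, z = map(float, xyz)
    x_t = 0.99 * x
    if z >= 0:  # above the AC-PC plane
        y_t = 0.9688 * y + 0.0460 * z
        z_t = -0.0485 * y + 0.9189 * z
    else:       # below the AC-PC plane
        y_t = 0.9688 * y + 0.0420 * z
        z_t = -0.0485 * y + 0.8390 * z
    return np.round([x_t, y_t, z_t], 1)

# A hypothetical MNI peak near the right amygdala, for illustration only.
print(mni2tal((18, -7, -20)))
```

The transform shrinks x slightly and, below the AC-PC plane, compresses z more strongly, reflecting the shape difference between the MNI template and the Talairach atlas brain.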

3. Results

3.1. Behavioural results

Response time and percentages of correct responses were calculated separately for whole faces, upper faces, lower faces, and masks. The mean response time was 814.07 ms for whole faces, 890.7 ms for upper faces, 1001.97 ms for lower faces, and 737.65 ms for masks.

Differences between the response times were analyzed using an ANOVA with one factor (stimulus) and four levels (whole faces, upper faces, lower faces, and masks). The results showed a significant difference between conditions (F = 107.4; p < 0.001); Newman–Keuls post-hoc tests showed significant differences between whole faces and masks (p < 0.001), whole faces and upper faces (p < 0.001), whole faces and lower faces (p < 0.001), between the two partial face conditions (p < 0.001), and between the two partial face conditions and masks (p < 0.001 for each comparison).
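Group-level tests of this kind can be reproduced with standard tools. The sketch below applies the non-parametric Friedman test (the test the authors used for the accuracy percentages) to invented per-subject scores; the numbers are purely illustrative, since the paper reports group statistics rather than raw single-subject data.

```python
from scipy.stats import friedmanchisquare

# Invented per-subject accuracy scores (%) for six subjects under the four
# conditions (whole, upper, lower faces, masks) -- illustrative data only.
whole = [99, 98, 100, 99, 98, 99]
upper = [97, 96, 95, 97, 96, 97]
lower = [91, 90, 92, 89, 91, 90]
masks = [99, 99, 100, 99, 98, 100]

# Friedman's test: a non-parametric repeated-measures ANOVA on ranks,
# suited to one group of subjects measured under k related conditions.
stat, p = friedmanchisquare(whole, upper, lower, masks)
print(f"chi2 = {stat:.2f}, d.f. = 3, p = {p:.4g}")
```

For the parametric response-time analysis, a repeated-measures ANOVA with post-hoc comparisons would be run analogously from a subjects × conditions matrix (for example with statsmodels' `AnovaRM`).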

Accuracy was 98.78% for whole faces, 96.35% for upper faces, 90.72% for lower faces, and 99.12% for masks.

Differences between correct response percentages were analyzed using a non-parametric test (Friedman ANOVA). The results showed a significant difference between conditions (χ2 = 36.49; p < 0.001; d.f. = 3).

3.2. Neuroimaging results

3.2.1. Regions of increased signal intensity for whole faces and parts of faces versus controls

Whole face processing, compared to the processing of masks, was associated with significant activation in several occipito-temporal regions as well as in mesial structures of both hemispheres (Table 1 and Fig. 1). In particular, an increase of activity was found in the FG and in the inferior and middle occipital gyri (IOG and MOG). The mesial temporal lobe activation included the bilateral parahippocampal gyrus and the amygdala. Activation was also observed in the right parietal lobe, including the intraparietal sulcus, in the lateral frontal cortex, particularly on the left side (BA 44, 45, 9 in the left hemisphere and BA 46 in the right hemisphere), and in the cerebellum bilaterally.

Table 1
Areas of increased signal intensity for whole faces compared to masks (control)

Region Side BA No. voxels Z-score x y z
Cerebellum, fusiform gyrus, inferior occipital gyrus, inferior temporal gyrus, middle occipital gyrus Right 18, 19, 37, 20 669 6.90 44 −78 −10
Hippocampus, thalamus, parahippocampal gyrus Right 27, 30 47 6.32 18 −31 −5
Cerebellum, fusiform gyrus, inferior occipital gyrus, middle occipital gyrus Left 18, 19, 37 295 6.30 −44 −67 −17
Inferior occipital gyrus, lingual gyrus, middle occipital gyrus, cuneus Right 17, 18 168 6.25 26 −101 2
Inferior frontal gyrus, middle frontal gyrus Left 44, 45, 9 49 6.05 −44 19 21
Parahippocampal gyrus, amygdala Right 28, 34 26 5.71 18 −9 −16
Inferior occipital gyrus, lingual gyrus, cuneus Left 17, 18 88 5.61 −16 −97 −2
Cerebellum, lingual gyrus Right 18 64 5.54 6 −82 −16
Inferior parietal lobule, superior parietal lobule Right 7 13 5.42 30 −52 43
Lingual gyrus Right–left 18 18 5.41 0 −72 4
Cerebellum Left 27 5.31 −14 −83 −19
Inferior frontal gyrus, middle frontal gyrus Right 46 15 5.23 44 30 17
Amygdala Left 8 5.06 −16 −4 −10
Middle occipital gyrus Left 18 3 4.98 −24 −84 −9

Coordinates of the peak voxel are shown for each cluster. All activations are significant at p < 0.005 (corrected); k ≥ 2 voxels.

The presentation of upper faces, compared to masks, activated the same occipito-temporal regions, as well as the right parietal lobe, including the intraparietal sulcus, and the bilateral cerebellum (Table 2 and Fig. 2). No activation was found in the mesial structures or in the lateral frontal cortex.

Regions of increased signal intensity for lower faces compared to masks included the same occipito-temporal network activated in whole face processing (Table 3 and Fig. 3). In addition, the parietal activation was found to be bilateral and more extensive. The frontal cortex activation extended to the insular cortex and anterior cingulate. A single area of activation was found in the left parahippocampal gyrus.

Fig. 1. Areas of significant activation for whole faces compared to masks. Coronal sections show activation in the inferior occipital gyrus (IOG), fusiform gyrus and intraparietal sulcus (FG–IPS), amygdala (AMY) and inferior frontal gyrus (IFG). Activated clusters are superimposed on a structural image obtained as the average of all the subjects' T1 volumes. p < 0.05 (corrected); k ≥ 2.

3.2.2. Regions of increased signal intensity for whole faces versus parts of faces

First we considered the regions of increased activation for whole faces compared to parts of faces (considering the results of the upper and lower faces together). The main finding was activation of the mesial temporal structures of both hemispheres (amygdala and parahippocampal gyrus on the right and hippocampus and parahippocampal gyrus on the left side). Significant activation was also found in the striate cortex (middle occipital gyrus, cuneus, lingual gyrus) and in the temporal lobes on both sides (superior, middle and inferior temporal gyrus; Table 4 and Fig. 4C).

Then, we considered the regions of increased signal for whole faces versus upper and lower faces separately; it was anticipated that these comparisons may shed light on the brain network involved in orienting attention towards the eyes and mouth.

The presentation of whole faces, compared to upper faces, evoked activation in the mesial-temporal structures of both hemispheres (amygdala and parahippocampal gyrus; Table 5 and Fig. 4A). Additional activation was found in the right fusiform gyrus (BA 37), and in the striate cortex on the right side (middle occipital gyrus, cuneus, lingual gyrus). A single area of activation was found in the right superior temporal gyrus (BA 38).

Table 2
Areas of increased signal intensity for upper faces compared to masks (control)

Region Side BA No. voxels Z-score x y z
Fusiform gyrus, inferior occipital gyrus, inferior temporal gyrus, lingual gyrus, middle occipital gyrus, cerebellum Right 17, 18, 19, 20, 37 1198 7.52 26 −101 2
Fusiform gyrus, inferior occipital gyrus, lingual gyrus, middle occipital gyrus, cerebellum Left 17, 18, 19, 37 629 6.79 −22 −101 −2
Cerebellum Right 12 5.51 2 −77 −23
Superior occipital gyrus Right 19 11 5.29 30 −78 26
Brainstem Right 3 5.14 8 −31 −5
Superior parietal lobule Right 7 5 5.05 30 −54 43

Coordinates of the peak voxel are shown for each cluster. All activations are significant at p < 0.005 (corrected); k ≥ 2 voxels.

Fig. 2. Areas of significant activation for upper faces compared to masks. Coronal sections show activation in the inferior occipital gyrus (IOG), fusiform gyrus and intraparietal sulcus (FG–IPS). Activated clusters are superimposed on a structural image obtained as the average of all the subjects' T1 volumes. p < 0.05 (corrected); k ≥ 2.

Regions of increased signal intensity for whole faces versus lower faces included the mesial-temporal structures (amygdala and parahippocampal gyrus; Table 6 and Fig. 4B) and the peristriate cortex of both hemispheres (lingual gyrus, middle occipital gyrus, and cuneus). Activation was also found in the temporal cortex on both sides (inferior, middle, and superior temporal gyrus), in the left posterior cingulate, in the medial frontal gyrus bilaterally (BA 10), and in the right insula.

Fig. 3. Areas of significant activation for lower faces compared to masks. Coronal sections show activation in the inferior occipital gyrus (IOG), fusiform gyrus and intraparietal sulcus (FG–IPS), middle frontal gyrus (MFG), and inferior frontal gyrus and insula (IFG–insula). Activated clusters are superimposed on a structural image obtained as the average of all the subjects' T1 volumes. p < 0.05 (corrected); k ≥ 2.

3.2.3. Regions of increased signal intensity for parts of faces versus whole faces

The presentation of upper faces, compared to whole faces, most notably evoked activation in a region located between the IOG and MOG of the right hemisphere (Table 7). A small area of increased signal intensity was also found in the MOG on the right side.

Regions of increased signal intensity for lower faces, compared to whole faces, included extrastriate visual areas on both sides, as well as a region of the FG located medially to the FFA (Table 8). Activation was also found in the parietal lobe, including the intraparietal sulcus, and was more extensive on the left side. In the right frontal lobe, increased signal was found in the superior (BA 8, 9), middle (BA 10, 46), and inferior frontal (BA 47) gyri, as well as in the cingulate and in the insular


Table 3
Areas of increased signal intensity for lower faces compared to masks (control)

Region Side BA No. voxels Z-score x y z
Fusiform gyrus, inferior occipital gyrus, inferior temporal gyrus, middle occipital gyrus, lingual gyrus, middle temporal gyrus, superior occipital gyrus, cuneus, inferior parietal lobule, precuneus, superior parietal lobule, cerebellum Right 7, 17, 18, 19, 20, 37, 40 2309 7.25 42 −57 −11
Fusiform gyrus, inferior occipital gyrus, inferior temporal gyrus, lingual gyrus, middle occipital gyrus, middle temporal gyrus, superior occipital gyrus, cuneus, cerebellum Left 17, 18, 19, 37, 20 1266 7.17 −36 −67 −20
Insula, inferior frontal gyrus Left 47, 13 56 6.09 −32 19 −4
Inferior parietal lobule, precuneus, superior parietal lobule Left 7, 19, 40 167 5.97 −26 −70 44
Inferior frontal gyrus, middle frontal gyrus Right 10, 46 36 5.91 44 53 8
Middle frontal gyrus Left 46 15 5.80 −51 28 24
Inferior frontal gyrus, middle frontal gyrus Right 46 115 5.74 48 30 21
Cerebellum Left 30 5.70 −2 −60 −31
Inferior frontal gyrus, middle frontal gyrus Right 9, 44, 46 50 5.63 51 15 32
Insula, inferior frontal gyrus Right 47 53 5.59 36 19 −8
Parahippocampal gyrus Right 27 4 5.58 18 −31 −2
Cerebellum, lingual gyrus Left–right 18 (left) 131 5.57 −10 −77 −23
Medial frontal gyrus Right 8, 9 23 5.34 8 29 39
Middle frontal gyrus Left 9 21 5.29 −42 13 25
Medial frontal gyrus, cingulate gyrus Left 8, 32 11 5.09 −2 24 43
Lingual gyrus Right 18 2 4.93 24 −70 −7

Coordinates of the peak voxel are shown for each cluster. All activations are significant at p < 0.005 (corrected); k ≥ 2 voxels.

Table 4
Areas of increased signal intensity for whole faces compared to parts of faces

Region Side BA No. voxels Z-score x y z
Lingual gyrus, middle occipital gyrus Bilateral 17, 18, 19 156 6.58 10 −100 12
Amygdala, parahippocampal gyrus Right 28, 34, 35 28 6.02 18 −7 −20
Superior temporal gyrus, middle temporal gyrus Right 21, 38 37 5.92 44 16 −34
Parahippocampal gyrus, hippocampus Left 35 19 5.37 −24 −7 −23
Middle and superior temporal gyrus Left 39 4 5.18 −55 −67 25
Inferior temporal gyrus Left 20, 21 6 5.18 −59 −11 −20
Superior temporal gyrus Right 38 5 5.16 34 7 −24
Lingual gyrus Right 18 4 5.10 4 −83 1
Lingual gyrus Left 18 8 5.05 −10 −84 −3
Middle temporal gyrus Right 39 2 4.91 48 −65 25

Coordinates of the peak voxel are shown for each cluster. All activations are significant at p < 0.005 (corrected); k ≥ 2 voxels.

Table 5
Areas of increased signal intensity for whole faces compared to upper faces

Region Side BA No. voxels Z-score x y z
Amygdala, parahippocampal gyrus Right 34 11 5.72 16 −7 −16
Middle occipital gyrus, cuneus Right 18 4 5.23 10 −100 12
Fusiform gyrus Right 37 4 5.08 44 −48 −18
Parahippocampal gyrus Left 28 5 5.06 −24 −12 −13
Amygdala, parahippocampal gyrus Left 34 2 5.05 −18 −5 −13
Superior temporal gyrus Right 38 2 5.02 42 12 −38
Fusiform gyrus Right 37 2 5.01 48 −52 −21

Coordinates of the peak voxel are shown for each cluster. All activations are significant at p < 0.005 (corrected); k ≥ 2 voxels.


Fig. 4. Areas of significant activation for whole faces compared to parts of faces. Coronal sections show activation in the amygdala (AMY) for whole face comparedto upper face (A) and lower face (B) stimuli. 3D renderings (C) show activations for whole faces versus upper and lower faces together. Activated clusters aresuperimposed on a structural image obtained as the average of all the subjects’ T1 volumes. p < 0.05 (corrected); k ≥ 2.

Table 6
Areas of increased signal intensity for whole faces compared to lower faces

Region Side BA No. voxels Z-score x y z
Lingual gyrus, middle occipital gyrus, cuneus Bilateral 17, 18, 19 458 6.50 10 −98 12
Inferior temporal gyrus, middle temporal gyrus Left 20, 21 25 5.69 −59 −11 −16
Parahippocampal gyrus, amygdala Right 34, 35, 28 20 5.62 18 −7 −20
Posterior cingulate Left 23, 30 14 5.59 −6 −52 14
Parahippocampal gyrus, amygdala Left 35 18 5.53 −22 −7 −23
Middle temporal gyrus, superior temporal gyrus Right 38 16 5.40 42 18 −31
Middle temporal gyrus, superior temporal gyrus Left 39 12 5.21 −55 −65 25
Parahippocampal gyrus Left 35, 36 7 5.18 −28 −15 −23
Middle temporal gyrus Right 21 9 5.14 46 6 −34
Insula Right 13 2 5.07 40 −19 12
Medial frontal gyrus Left 10 2 4.92 −8 58 1

Coordinates of the peak voxel are shown for each cluster. All activations are significant at p < 0.005 (corrected); k ≥ 2 voxels.


Table 7
Areas of increased signal intensity for upper faces compared to whole faces

Region Side BA No. voxels Z-score x y z
Inferior and middle occipital gyrus Right 18 12 5.28 26 −99 −2
Middle occipital gyrus Right 19 2 4.98 36 −85 8

Coordinates of the peak voxel are shown for each cluster. All activations are significant at p < 0.005 (corrected); k ≥ 2 voxels.

Table 8. Areas of increased signal intensity for lower faces compared to whole faces

Region | Side | BA | No. voxels | Z-score | x, y, z
Middle occipital gyrus, middle temporal gyrus, superior occipital gyrus, cuneus, precuneus | Right | 18, 19, 20 | 275 | 7.23 | 38, −85, 8
Fusiform gyrus, cerebellum | Right | 37 | 106 | 6.62 | 30, −55, −11
Inferior occipital gyrus, middle occipital gyrus, superior occipital gyrus, cuneus | Left | 18, 19 | 135 | 6.23 | −32, −85, 19
Fusiform gyrus, inferior temporal gyrus, middle occipital gyrus, cerebellum | Left | 19, 37 | 253 | 6.23 | −48, −66, −7
Inferior parietal lobule, precuneus, superior parietal lobule | Right | 7, 40 | 229 | 5.93 | 22, −60, 44
Middle frontal gyrus | Right | 10 | 21 | 5.77 | 46, 51, 9
Anterior cingulate, superior frontal gyrus | Right | 8, 9, 32 | 46 | 5.48 | 6, 36, 28
Fusiform gyrus, inferior temporal gyrus | Right | 20, 37 | 34 | 5.48 | 51, −59, −14
Insula, inferior frontal gyrus | Right | 47 | 37 | 5.34 | 32, 19, −1
Precuneus, superior parietal lobule | Left | 7 | 5 | 5.14 | −24, −66, 40
Middle frontal gyrus | Right | 9 | 2 | 5.03 | 55, 13, 32

Coordinates of the peak voxel are shown for each cluster. All activations are significant at p < 0.005 (corrected); k ≥ 2 voxels.

[…] cortex. A region of increased signal was also found in the left cerebellum.

4. Discussion

4.1. The processing of whole faces and parts of faces

Our results showed that the processing of faces, even in the presence of incomplete visual information, activates the structures of the “core” system (namely, the IOG and the lateral FG). Although viewing both whole faces and parts of faces evoked bilateral activation within these face-responsive regions, stronger responses were found in the right hemisphere (larger and more significant clusters of activation on the right side). The most significant activation included the right lateral FG (FFA), the region that in previous studies consistently showed greater activation during tasks involving perception of faces as opposed to nonsense control stimuli [22,32,70] or other objects [42,43,51,64]. In the present study this activation was also found in the presence of incomplete visual information (upper and lower faces compared to masks) and reflects the role of the lateral FG in processing faces.

We also found activation in regions of the “extended” system, namely in the intraparietal sulcus. Activation of the inferior frontal cortex and limbic structures was found only when the complete image of the face was shown. Activation of the inferior frontal gyrus has been found in numerous neuroimaging studies of emotional processing (particularly in association with activation of the amygdala: [25,29,38,45,47,57]). Recently, a similar pattern of activation was also found in the passive viewing of neutral faces [39], suggesting that this region could be involved in the perceptual processing of faces.

As regards the structures of the “core” system, we did not find activation of the STS. The STS is involved in the evaluation of the changeable aspects of a face. In non-human primates, face neurons have been found to be tuned to face views and gaze direction [59,60]. Functional neuroimaging studies in humans have shown that the STS is activated during the perception of biological motion, including facial movements, such as eye and mouth movements, and whole body and hand movements [19,23,65,67,79]. Although the most marked response of the STS is related to biological movements, its posterior portion is also activated by still pictures of faces [21,30,33,35,39,43]. This activation may reflect the processing of potential or implied movement, or the evaluation of the changeable aspects of a face.

In our study, the task required the subjects to identify gender. In this condition it is likely that processing of implied motion or evaluation of the changeable aspects of a face is not required, thus explaining the absence of STS activation.

4.2. Processing faces in the presence of incomplete information

Neuroimaging studies investigating face processing in the presence of incomplete information have focused on the response of the FFA to stimuli showing the eyes alone [72,75,76]. The results of these studies showed similar but weaker activation when the eye region alone, as opposed to the whole face, was presented. Our data suggest that the processing of parts of faces involves both increased activation of the neural network specialized for face processing and activation outside that network. The recruitment of the latter was associated with increased response times and decreased accuracy in the gender recognition task in response to stimuli showing parts of faces.

The region located between the IOG and the MOG showed a stronger response to upper face than to whole face stimuli. This region is part of the distributed neural network for face processing (the “core” system) and was found to be more activated in the processing of whole faces and parts of faces than in the processing of controls (masks).

Several different areas, in particular the medial FG, were especially activated when processing lower face stimuli. This region is part of the ventral visual pathway, which is more responsive to non-face objects than to faces. However, increased activity in the medial portion of the FG has also been associated with processing of inverted face stimuli [33].

We can speculate that the activation of the face-selective areas produced in response to incomplete information about the face (i.e., the lower face stimuli) was insufficient to form a representation of the face. Thus, additional processing resources had to be recruited in the ventral visual pathway, as well as in other regions of the “extended” system (the intraparietal sulcus, the insula, and the frontal cortex).

The activation of the inferior frontal gyrus during the processing of lower face stimuli (mouth) was located in the region that has been found to be part of the so-called “mirror system”, which is involved in understanding the meaning of actions. Indeed, mirror neurons responding to hand, foot or mouth stimuli (eyes were not specifically tested) were found in cell recording studies in monkeys [26], and similar findings have been reported in neuroimaging studies in humans [15,16]. Thus, it is possible that in our study the activation of the inferior frontal cortex in response to stimuli representing the lower part of the face reflects involvement of part of the system specialized in processing communicative gestures from the mouth region. Similar but weaker and less extensive patterns of activation were found in response to whole faces (where visual cues from the mouth region were still available), but not when only the upper part of the face was presented.

Finally, the activation of the more dorsal portion of the frontal cortex, involved in response monitoring [28,34], could indicate the greater effort required for stimuli showing lower faces, as suggested by the longer response times and less accurate behavioural responses.

4.3. Processing the relevant parts of faces

The processing of whole faces, as opposed to parts of faces, notably produced bilateral activation of the amygdala. This structure is part of the “extended” system, which mediates the perception of emotional facial expressions, especially fear and anger [13,55,62,63]. Neuroimaging data also indicate that fearful eyes alone are sufficient to evoke increased amygdala activity [53,78]. Studies of fear conditioning in animals and humans have demonstrated that the amygdala plays a central role in processing fear [49,50,56]. Data from patients with bilateral damage or unilateral right lesions have demonstrated an impairment in these subjects’ ability to recognize negative emotions such as fear and anger [6,11,17,52].

Recent neuroimaging studies have shown that the amygdala is equally responsive across all expressions of affect [82,27,80] and also responds to neutral expressions [37,81], suggesting that the amygdala could have a wider role in processing several emotional categories or in detecting general stimulus salience/relevance.

Indeed, it has been shown that the amygdala processes information critical for social cognition [2,14], as in judging someone’s state of mind on the basis of the perception of the eye region [10]. In addition, discrimination of gaze direction elicits a response in the amygdala [46].

Recently, Adolphs et al. [3] monitored eye movements during observation of fearful facial expressions, both in healthy subjects and in a patient with bilateral amygdala damage. The patient showed a highly abnormal fixation pattern and, in particular, failed to fixate the eye region. Adolphs et al. suggested that this patient’s impaired ability to recognize fear was due to her inability to use information from the eye region, which is normally essential for recognizing fear, and that this inability was due to her lack of spontaneous fixation on the eyes during free viewing of faces. Notably, her ability to recognize fearful faces normalized completely when she was explicitly instructed to look at the eyes [3].

These data suggest that the amygdala might also influence the visual information that our eyes seek in the first place. Animal studies have demonstrated extensive back-projections from the amygdala to the visual cortex [8,40,41], and recently these reciprocal connections have been identified in humans, too [20]. Together with the neuroimaging and neuropsychological results of others [9,54,73], our data suggest that the amygdala modulates visual processes relatively early. This mechanism could be part of the role played by the amygdala in the resolution of ambiguous facial expressions [36,44,61,77]. Thus, impaired recognition of fearful faces after amygdala damage might be due not so much to a basic visuo-perceptual inability to process information from the eyes, as to an inability to seek out, fixate, pay attention to, and make use of information crucial to the identification of emotions. In agreement with recent findings, these results show that the amygdala is involved in processing information from the eye region of faces [1,46,53]. This functional specialization might account for the role of the amygdala in processing emotions related to fear [17], threat and danger [5,7].

In our study, activation of the amygdala was found during processing of whole face stimuli, but not of incomplete face stimuli (eyes or mouth alone). In agreement with Adolphs’ suggestion [3], we can speculate that the amygdala may direct our visual system towards the region of the face that conveys information helpful in guiding our social behaviour. When we are presented with either the eye or the mouth region alone, the amygdala has no need to direct the visual system automatically towards them. On the contrary, this orienting mechanism appears to be necessary when processing whole faces.


5. Conclusions

Processing of faces in the presence of incomplete information (either upper or lower face) was found to activate the structures of the “core” and “extended” systems specialized for face processing, as well as areas outside these systems. In particular, the processing of lower faces activated the mirror system specialized for communicative mouth movements, even in the absence of actual movements in the stimuli.

The processing of whole faces, as opposed to parts of faces,ctivated mesial temporal lobe structures including the amyg-ala. Recently, it has been demonstrated that deficits in fearrocessing after bilateral amygdala damage can depend on aack of spontaneous fixation on the eyes during free viewingf faces [3]. This suggests that the amygdala might modulatearly visual processes by directing our visual system towardshe eyes, the facial region that conveys information able to guideur social behaviour. On the other hand, by directing our ownaze onto the eyes of others we orient attentional resourceso this source of salient social information. Our results pro-ide further evidence that this mechanism is a key functionaleature of the amygdala and suggest that the role of the amyg-ala includes the orienting of attention to both the eye and theouth, i.e., to the parts of the face that convey socially relevant

nformation.

Acknowledgement

This work was supported by Ministero dell’Università e della Ricerca Scientifica e Tecnologica (MIUR).

Conflict of interest statement: All listed authors concur with the submission of the manuscript; the final version has been approved by all authors. The authors have no financial or personal conflicts of interest.

References

[1] R.B. Adams Jr., H.L. Gordon, A.A. Baird, N. Ambady, R.E. Kleck, Effects of gaze on amygdala sensitivity to anger and fear faces, Science 300 (2003) 1536.

[2] R. Adolphs, Social cognition and the human brain, Trends Cogn. Sci. 3 (1999) 469–479.

[3] R. Adolphs, F. Gosselin, T.W. Buchanan, D. Tranel, P. Schyns, A.R. Damasio, A mechanism for impaired fear recognition after amygdala damage, Nature 433 (2005) 68–72.

[4] R. Adolphs, D. Tranel, T.W. Buchanan, Amygdala damage impairs emotional memory for gist but not details of complex stimuli, Nat. Neurosci. 8 (2005) 512–518.

[5] R. Adolphs, D. Tranel, A. Damasio, in: J.P. Aggleton (Ed.), The Amygdala. A Functional Analysis, Oxford University Press, New York, 2000, pp. 587–630.

[6] R. Adolphs, D. Tranel, H. Damasio, A. Damasio, Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala, Nature 372 (1994) 669–672.

[7] R. Adolphs, D. Tranel, S. Hamann, A.W. Young, A.J. Calder, E.A. Phelps, A. Anderson, G.P. Lee, A.R. Damasio, Recognition of facial emotion in nine individuals with bilateral amygdala damage, Neuropsychologia 37 (1999) 1111–1117.

[8] D.G. Amaral, J.L. Price, Amygdalo-cortical projections in the monkey (Macaca fascicularis), J. Comp. Neurol. 230 (1984) 465–496.

[9] A.K. Anderson, E.A. Phelps, Lesions of the human amygdala impair enhanced perception of emotionally salient events, Nature 411 (2001) 305–309.

[10] S. Baron-Cohen, H.A. Ring, S. Wheelwright, E.T. Bullmore, M.J. Brammer, A. Simmons, S.C. Williams, Social intelligence in the normal and autistic brain: an fMRI study, Eur. J. Neurosci. 11 (1999) 1891–1898.

[11] F. Benuzzi, S. Meletti, G. Zamboni, G. Calandra-Buonaura, M. Serafini, F. Lui, P. Baraldi, G. Rubboli, C.A. Tassinari, P. Nichelli, Impaired fear processing in right mesial temporal sclerosis: a fMRI study, Brain Res. Bull. 63 (2004) 269–281.

[12] E. Bonda, M. Petrides, D. Ostry, A. Evans, Specific involvement of human parietal systems and the amygdala in the perception of biological motion, J. Neurosci. 16 (1996) 3737–3744.

[13] H.C. Breiter, N.L. Etcoff, P.J. Whalen, W.A. Kennedy, S.L. Rauch, R.L. Buckner, M.M. Strauss, S.E. Hyman, B.R. Rosen, Response and habituation of the human amygdala during visual processing of facial expression, Neuron 17 (1996) 875–887.

[14] L. Brothers, The social brain: a project for integrating primate behavior and neurophysiology in a new domain, Concept Neurosci. 1 (1990) 27–51.

[15] G. Buccino, F. Binkofski, G.R. Fink, L. Fadiga, L. Fogassi, V. Gallese, R.J. Seitz, K. Zilles, G. Rizzolatti, H.J. Freund, Action observation activates premotor and parietal areas in a somatotopic manner: an fMRI study, Eur. J. Neurosci. 13 (2001) 400–404.

[16] G. Buccino, F. Lui, N. Canessa, I. Patteri, G. Lagravinese, F. Benuzzi, C.A. Porro, G. Rizzolatti, Neural circuits involved in the recognition of actions performed by nonconspecifics: an fMRI study, J. Cogn. Neurosci. 16 (2004) 114–126.

[17] A.J. Calder, A.W. Young, D. Rowland, D. Perrett, J. Hodges, N. Etcoff, Facial emotion recognition after bilateral amygdala damage: differentially severe impairment of fear, Cogn. Neuropsychol. 13 (1996) 699–745.

[18] G.A. Calvert, E.T. Bullmore, M.J. Brammer, R. Campbell, S.C. Williams, P.K. McGuire, P.W. Woodruff, S.D. Iversen, A.S. David, Activation of auditory cortex during silent lipreading, Science 276 (1997) 593–596.

[19] G.A. Calvert, R. Campbell, Reading speech from still and moving faces: the neural substrates of visible speech, J. Cogn. Neurosci. 15 (2003) 57–70.

[20] M. Catani, D.K. Jones, R. Donato, D.H. Ffytche, Occipito-temporal connections in the human brain, Brain 126 (2003) 2093–2107.

[21] L.L. Chao, J.V. Haxby, A. Martin, Attribute-based neural substrates in temporal cortex for perceiving and knowing about objects, Nat. Neurosci. 2 (1999) 913–919.

[22] V.P. Clark, K. Keil, J.M. Maisog, S. Courtney, L.G. Ungerleider, J.V. Haxby, Functional magnetic resonance imaging of human visual cortex during face matching: a comparison with positron emission tomography, Neuroimage 4 (1996) 1–15.

[23] J. Decety, J. Grezes, Neural mechanisms subserving the perception of human actions, Trends Cogn. Sci. 3 (1999) 172–178.

[24] P. Ekman, W.V. Friesen, Pictures of Facial Affect, Consulting Psychologist Press, Palo Alto, 1976.

[25] R. Elliott, K.J. Friston, R.J. Dolan, Dissociable neural responses in human reward systems, J. Neurosci. 20 (2000) 6159–6165.

[26] P.F. Ferrari, V. Gallese, G. Rizzolatti, L. Fogassi, Mirror neurons responding to the observation of ingestive and communicative mouth actions in the monkey ventral premotor cortex, Eur. J. Neurosci. 17 (2003) 1703–1714.

[27] D.A. Fitzgerald, M. Angstadt, L.M. Jelsone, P.J. Nathan, K.L. Phan, Beyond threat: amygdala reactivity across multiple expressions of facial affect, Neuroimage 30 (2006) 1441–1448.

[28] P.C. Fletcher, T. Shallice, C.D. Frith, R.S. Frackowiak, R.J. Dolan, The functional roles of prefrontal cortex in episodic memory. II. Retrieval, Brain 121 (1998) 1249–1256.

[29] M.L. Gorno-Tempini, S. Pradelli, M. Serafini, G. Pagnoni, P. Baraldi, C. Porro, R. Nicoletti, C. Umiltà, P. Nichelli, Explicit and incidental facial expression processing: an fMRI study, Neuroimage 14 (2001) 465–473.

[30] E. Halgren, A.M. Dale, M.I. Sereno, R.B. Tootell, K. Marinkovic, B.R. Rosen, Location of human face-selective cortex with respect to retinotopic areas, Hum. Brain Mapp. 7 (1999) 29–37.

[31] J.V. Haxby, E.A. Hoffman, M.I. Gobbini, The distributed human neural system for face perception, Trends Cogn. Sci. 4 (2000) 223–233.

[32] J.V. Haxby, B. Horwitz, L.G. Ungerleider, J.M. Maisog, P. Pietrini, C.L. Grady, The functional organization of human extrastriate cortex: a PET-rCBF study of selective attention to faces and locations, J. Neurosci. 14 (1994) 6336–6353.

[33] J.V. Haxby, L.G. Ungerleider, V.P. Clark, J.L. Schouten, E.A. Hoffman, A. Martin, The effect of face inversion on activity in human neural systems for face and object perception, Neuron 22 (1999) 189–199.

[34] R.N. Henson, T. Shallice, R.J. Dolan, Right prefrontal cortex and episodic memory retrieval: a functional MRI test of the monitoring hypothesis, Brain 122 (1999) 1367–1381.

[35] E.A. Hoffman, J.V. Haxby, Distinct representations of eye gaze and identity in the distributed human neural system for face perception, Nat. Neurosci. 3 (2000) 80–84.

[36] P.C. Holland, M. Gallagher, Amygdala circuitry in attentional and representational processes, Trends Cogn. Sci. 3 (1999) 65–73.

[37] T. Iidaka, T. Okada, T. Murata, M. Omori, H. Kosaka, N. Sadato, Y. Yonekura, Age-related differences in the medial temporal lobe responses to emotional faces as revealed by fMRI, Hippocampus 12 (2002) 352–362.

[38] T. Iidaka, M. Omori, T. Murata, H. Kosaka, Y. Yonekura, T. Okada, N. Sadato, Neural interaction of the amygdala with the prefrontal and temporal cortices in the processing of facial expressions as revealed by fMRI, J. Cogn. Neurosci. 13 (2001) 1035–1047.

[39] A. Ishai, C.F. Schmidt, P. Boesiger, Face perception is mediated by a distributed cortical network, Brain Res. Bull. 67 (2005) 87–93.

[40] E. Iwai, M. Yukie, Amygdalofugal and amygdalopetal connections with modality-specific visual cortical areas in macaques (Macaca fuscata, M. mulatta, and M. fascicularis), J. Comp. Neurol. 261 (1987) 362–387.

[41] E. Iwai, M. Yukie, H. Suyama, S. Shirakawa, Amygdalar connections with middle and inferior temporal gyri of the monkey, Neurosci. Lett. 83 (1987) 25–29.

[42] N. Kanwisher, M.M. Chun, J. McDermott, P.J. Ledden, Functional imaging of human visual recognition, Brain Res. Cogn. Brain Res. 5 (1996) 55–67.

[43] N. Kanwisher, J. McDermott, M.M. Chun, The fusiform face area: a module in human extrastriate cortex specialized for face perception, J. Neurosci. 17 (1997) 4302–4311.

[44] B.S. Kapp, P.J. Whalen, W.F. Supple, J.P. Pascoe, in: J. Aggleton (Ed.), The Amygdala: Neurobiological Aspects of Emotion, Memory, and Mental Dysfunction, Wiley-Liss, New York, 1992.

[45] H. Kawasaki, O. Kaufman, H. Damasio, A.R. Damasio, M. Granner, H. Bakken, T. Hori, M.A. Howard 3rd, R. Adolphs, Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex, Nat. Neurosci. 4 (2001) 15–16.

[46] R. Kawashima, M. Sugiura, T. Kato, A. Nakamura, K. Hatano, K. Ito, H. Fukuda, S. Kojima, K. Nakamura, The human amygdala plays an important role in gaze monitoring. A PET study, Brain 122 (1999) 779–783.

[47] M.L. Keightley, G. Winocur, S.J. Graham, H.S. Mayberg, S.J. Hevenor, C.L. Grady, An fMRI study investigating cognitive modulation of brain regions associated with emotional processing of visual stimuli, Neuropsychologia 41 (2003) 585–596.

[48] H. Kim, L.H. Somerville, T. Johnstone, A.L. Alexander, P.J. Whalen, Inverse amygdala and medial prefrontal cortex responses to surprised faces, Neuroreport 14 (2003) 2317–2322.

[49] K.S. LaBar, J.C. Gatenby, J.C. Gore, J.E. LeDoux, E.A. Phelps, Human amygdala activation during conditioned fear acquisition and extinction: a mixed-trial fMRI study, Neuron 20 (1998) 937–945.

[50] J.E. LeDoux, in: J.P. Aggleton (Ed.), The Amygdala: Neurobiological Aspects of Emotion, Memory, and Mental Dysfunction, Wiley-Liss, New York, 1992, pp. 339–351.

[51] G. McCarthy, A. Puce, J.C. Gore, T. Allison, Face-specific processing in the human fusiform gyrus, J. Cogn. Neurosci. 9 (1997) 605–610.

[52] S. Meletti, F. Benuzzi, G. Rubboli, G. Cantalupo, M. Stanzani Maserati, P. Nichelli, C.A. Tassinari, Impaired facial emotion recognition in early-onset right mesial temporal lobe epilepsy, Neurology 60 (2003) 426–431.

[53] J.S. Morris, M. deBonis, R.J. Dolan, Human amygdala responses to fearful eyes, Neuroimage 17 (2002) 214–222.

[54] J.S. Morris, K.J. Friston, C. Buchel, C.D. Frith, A.W. Young, A.J. Calder, R.J. Dolan, A neuromodulatory role for the human amygdala in processing emotional facial expressions, Brain 121 (1998) 47–57.

[55] J.S. Morris, C.D. Frith, D.I. Perrett, D. Rowland, A.W. Young, A.J. Calder, R.J. Dolan, A differential neural response in the human amygdala to fearful and happy facial expressions, Nature 383 (1996) 812–815.

[56] S.N. Moses, J.M. Houck, T. Martin, F.M. Hanlon, J.D. Ryan, R.J. Thoma, M.P. Weisend, E.M. Jackson, E. Pekkonen, C.D. Tesche, Dynamic neural activity recorded from human amygdala during fear conditioning using magnetoencephalography, Brain Res. Bull. 71 (2007) 452–460.

[57] K. Nakamura, R. Kawashima, K. Ito, M. Sugiura, T. Kato, A. Nakamura, K. Hatano, S. Nagumo, K. Kubota, H. Fukuda, S. Kojima, Activation of the right inferior frontal cortex during assessment of facial emotion, J. Neurophysiol. 82 (1999) 1610–1614.

[58] R.C. Oldfield, The assessment and analysis of handedness: the Edinburgh Inventory, Neuropsychologia 9 (1971) 97–113.

[59] D.I. Perrett, J.K. Hietanen, M.W. Oram, P.J. Benson, Organization and functions of cells responsive to faces in the temporal cortex, Proc. R. Soc. Lond. B Biol. Sci. 335 (1992) 23–30.

[60] D.I. Perrett, P.A. Smith, D.D. Potter, A.J. Mistlin, A.S. Head, A.D. Milner, M.A. Jeeves, Visual cells in the temporal cortex sensitive to face view and gaze direction, Proc. R. Soc. Lond. B Biol. Sci. 223 (1985) 293–317.

[61] L. Pessoa, M. McKenna, E. Gutierrez, L.G. Ungerleider, Neural processing of emotional faces requires attention, Proc. Natl. Acad. Sci. U.S.A. 99 (2002) 11458–11463.

[62] M.L. Phillips, A.W. Young, S.K. Scott, A.J. Calder, C. Andrew, V. Giampietro, S.C. Williams, E.T. Bullmore, M. Brammer, J.A. Gray, Neural responses to facial and vocal expressions of fear and disgust, Proc. Biol. Sci. 265 (1998) 1809–1817.

[63] M.L. Phillips, A.W. Young, C. Senior, M. Brammer, C. Andrew, A.J. Calder, E.T. Bullmore, D.I. Perrett, D. Rowland, S.C. Williams, J.A. Gray, A.S. David, A specific neural substrate for perceiving facial expressions of disgust, Nature 389 (1997) 495–498.

[64] A. Puce, T. Allison, M. Asgari, J.C. Gore, G. McCarthy, Differential sensitivity of human visual cortex to faces, letterstrings, and textures: a functional magnetic resonance imaging study, J. Neurosci. 16 (1996) 5205–5215.

[65] A. Puce, T. Allison, S. Bentin, J.C. Gore, G. McCarthy, Temporal cortex activation in humans viewing eye and mouth movements, J. Neurosci. 18 (1998) 2188–2199.

[66] A. Puce, T. Allison, G. McCarthy, Electrophysiological studies of human face perception. III: effects of top-down processing on face-specific potentials, Cereb. Cortex 9 (1999) 445–458.

[67] A. Puce, A. Syngeniotis, J.C. Thompson, D.F. Abbott, K.J. Wheaton, U. Castiello, The human temporal lobe integrates facial form and motion: evidence from fMRI and ERP studies, Neuroimage 19 (2003) 861–869.

[68] M. Sams, J.K. Hietanen, R. Hari, R.J. Ilmoniemi, O.V. Lounasmaa, Face-specific responses from the human inferior occipito-temporal cortex, Neuroscience 77 (1997) 49–55.

[69] D. Sander, J. Grafman, T. Zalla, The human amygdala: an evolved system for relevance detection, Rev. Neurosci. 14 (2003) 303–316.

[70] J. Sergent, S. Ohta, B. MacDonald, Functional neuroanatomy of face and object processing. A positron emission tomography study, Brain 115 (1992) 15–36.

[71] J. Talairach, P. Tournoux, Co-Planar Stereotaxic Atlas of the Human Brain, Thieme Medical Publishers, New York, 1988.

[72] F. Tong, K. Nakayama, M. Moscovitch, O. Weinrib, N. Kanwisher, Response properties of the human fusiform face area, Cogn. Neuropsychol. 17 (2000) 257–279.

[73] P. Vuilleumier, M.P. Richardson, J.L. Armony, J. Driver, R.J. Dolan, Distant influences of amygdala lesion on visual cortical activation during emotional face processing, Nat. Neurosci. 7 (2004) 1271–1278.

[74] L. Wang, G. McCarthy, A.W. Song, K.S. LaBar, Amygdala activation to sad pictures during high-field (4 Tesla) functional magnetic resonance imaging, Emotion 5 (2005) 12–22.

[75] S. Watanabe, R. Kakigi, S. Koyama, E. Kirino, It takes longer to recognize the eyes than the whole face in humans, Neuroreport 10 (1999) 2193–2198.

[76] S. Watanabe, R. Kakigi, S. Koyama, E. Kirino, Human face perception traced by magneto- and electro-encephalography, Brain Res. Cogn. Brain Res. 8 (1999) 125–142.

[77] P.J. Whalen, Fear, vigilance, and ambiguity: initial neuroimaging studies of the human amygdala, Curr. Dir. Psychol. Sci. 7 (1998) 177–188.

[78] P.J. Whalen, J. Kagan, R.G. Cook, F.C. Davis, H. Kim, S. Polis, D.G. McLaren, L.H. Somerville, A.A. McLean, J.S. Maxwell, T. Johnstone, Human amygdala responsivity to masked fearful eye whites, Science 306 (2004) 2061.

[79] K.J. Wheaton, J.C. Thompson, A. Syngeniotis, D.F. Abbott, A. Puce, Viewing the motion of human body parts activates different regions of premotor, temporal, and parietal cortex, Neuroimage 22 (2004) 277–288.

[80] J.S. Winston, J. O’Doherty, R.J. Dolan, Common and distinct neural responses during direct and incidental processing of multiple facial emotions, Neuroimage 20 (2003) 84–97.

[81] P. Wright, Y. Liu, Neutral faces activate the amygdala during identity matching, Neuroimage 29 (2006) 628–636.

[82] T.T. Yang, V. Menon, S. Eliez, C. Blasey, C.D. White, A.J. Reid, I.H. Gotlib, A.L. Reiss, Amygdalar activation associated with positive and negative facial expressions, Neuroreport 13 (2002) 1737–1741.