Unattended emotional faces elicit early lateralized amygdala–frontal and fusiform activations


Yuwen Hung a,c, Mary Lou Smith b,c, Dimitri J. Bayle a,d, Travis Mills a, Douglas Cheyne b, Margot J. Taylor a,b,⁎

a Diagnostic Imaging, The Hospital for Sick Children, 555 University Avenue, Toronto, Ontario, Canada M5G 1X8
b Research Institute, The Hospital for Sick Children, 555 University Avenue, Toronto, Ontario, Canada M5G 1X8
c Department of Psychology, University of Toronto Mississauga, Mississauga, Canada
d Brain Dynamics and Cognition, INSERM, U821, and Université Lyon 1, Lyon, France

⁎ Corresponding author. Fax: +1 416 813 7362. E-mail address: [email protected] (M.J. Taylor).

Article info

Article history:
Received 20 August 2009
Revised 15 December 2009
Accepted 22 December 2009
Available online 4 January 2010

NeuroImage 50 (2010) 727–733
doi:10.1016/j.neuroimage.2009.12.093

Abstract

Human adaptive behaviour to potential threats involves specialized brain responses allowing rapid and reflexive processing of the sensory input and a more directed processing for later evaluation of the nature of the threat. The amygdalae are known to play a key role in emotion processing. It is suggested that the amygdalae process threat-related information through a fast subcortical route and slower cortical feedback. Evidence from human data supporting this hypothesis is lacking. The present study investigated event-related neural responses during processing of facial emotions in the unattended hemifield using magnetoencephalography (MEG) and found activations of the amygdala and anterior cingulate cortex to fear as early as 100 ms. The right amygdala exhibited temporally dissociated activations to input from different visual fields, suggesting early subcortical versus later cortical processing of fear. We also observed asymmetrical fusiform activity related to lateralized feed-forward processing of the faces in the visual–ventral stream. Results demonstrate fast, automatic, and parallel processing of unattended emotional faces, providing important insights into the specific and dissociated neural pathways in emotion and face perception.

© 2010 Elsevier Inc. All rights reserved.

The ability to detect potential threats involves specialized neural systems that can facilitate fast responses to allow adaptive behaviour. The amygdalae are known to play a key role in emotion processing (LeDoux, 1996) and are responsive to stimuli such as fearful faces (Cornwell et al., 2008; Morris et al., 1996; LeDoux, 1996) or fearful eyes (Whalen et al., 2004; Hardee et al., 2008). It is proposed, based on animal research (LeDoux, 1996), that the amygdalae process threat-related information through two routes: a fast subcortical route (thalamus–amygdala) and a slower cortical route (thalamus–sensory cortex–amygdala). The subcortical pathway has an evolutionary value that allows quick and automatic responses to potential danger. The cortical pathway provides feedback processing from the sensory afferent input for cortical evaluation (LeDoux, 1996; Johnson, 2005). However, direct evidence with human data supporting these two levels of processing is lacking, as the timing of brain activity is crucial to understanding the nature of neural events underlying rapid processing of emotional stimuli.

Neurophysiological, event-related potential (ERP) research has reported an early emotion effect that modulates the amplitude of neural responses at the latency of 100 ms in posterior regions and 120 ms in fronto-central areas for fearful expressions (Eimer and Holmes, 2002; Batty and Taylor, 2003; Holmes et al., 2003). It is hypothesized that this early emotion effect reflects fast and spontaneous extraction of affective information from the faces and is generated from the amygdala and interconnected cortical regions such as the anterior cingulate cortex (ACC) (Eimer and Holmes, 2007; Vuilleumier and Pourtois, 2007). Recent MEG studies have reported increases in gamma-band synchronization in the amygdala related to processing emotional stimuli at early latencies between 20 and 170 ms, which suggested processing in the subcortical pathway (Luo et al., 2007, 2009). However, there is a paucity of data from humans that differentiate the cortical versus subcortical processing of emotions through the amygdala.

Studies have found that amygdala responses to fearful faces remain sensitive when the faces are of low visual spatial frequency (Vuilleumier et al., 2003; Johnson, 2005), presented subliminally (Luo et al., 2009), or presented in the unattended peripheral visual fields (Ewbank et al., 2009). A possible common mechanism across these studies is that such coarse, less detailed visual signals can activate fast and automatic neural responses through a subcortical pathway to the amygdala. Asymmetries in amygdala responses have also been reported in a number of fMRI and PET studies, with explicit processing of fearful faces, eyes or negative pictures activating the left amygdala (Funayama et al., 2001; Hardee et al., 2008; Morris et al., 1996), whereas implicit processing of these stimuli activated the right amygdala (Morris et al., 1998, 1999; Funayama et al., 2001; but see Whalen et al., 2004). Additional timing information of such activity would contribute to our understanding of this subcortical processing.

Converging observations have further suggested that emotional stimuli perceived in the left, rather than the right, visual field more strongly engage emotion-processing areas, including the amygdala, particularly in the right hemisphere (Gläscher and Adolphs, 2003; Noesselt et al., 2005). These findings may be related to a right-hemisphere specialization in emotion processing and its sensitivity to the afferent visual input from the contralateral visual field. Here we characterized the time course of emotion processing in the amygdala using emotional faces with hemifield presentation. With the spatiotemporal sensitivity of magnetoencephalography (MEG), we expected to see visual-field-sensitive processing in the amygdala related to cortical feedback of the fearful stimuli, differentiated in time from early subcortical processing.

The ACC in the frontal lobes is involved in recruiting frontal, or executive, resources related to attentional control for task-related responses, especially in the presence of distracting stimuli (Bush et al., 2000) such as task-irrelevant emotional information (Pessoa, 2005). fMRI studies have noted complex relations between the amygdala and the ACC during processing of emotions, with the two areas showing either co-activation (Petrovic et al., 2004; Mohanty et al., 2007; Banks et al., 2007; Haas et al., 2007) or reciprocal deactivation (Ochsner et al., 2002; Petrovic et al., 2004; Das et al., 2005; Felmingham et al., 2007). However, the limited temporal resolution of fMRI makes it difficult to determine the real-time relation between these regions.

Here we investigated the effect of emotional faces presented in the left or right hemifield on the time-locked neural activity, focusing on areas implicated in the emotional face-processing network. We calculated the time courses of the neural sources engaged in processing the facial emotions. MEG has not been applied to cognitive neuroscience research as extensively as other imaging techniques, yet it provides spatial localization superior to ERPs and approaching that of fMRI, together with excellent temporal resolution, allowing us to tease apart early and rapid neural events.

Materials and methods

Subjects

Fourteen healthy, right-handed subjects (mean age, 27 years; seven females) participated in the study. None had a history of neurological or psychiatric disorders and all had normal vision. All subjects provided informed written consent; the study was approved by the Hospital for Sick Children Research Ethics Board.

Task

We studied the effect of task-irrelevant emotional expressions and their location in the visual field (left or right hemifield). Stimuli were projected on a black-background screen at a viewing distance of 50 cm. Each trial contained a scrambled pattern and a face that were presented simultaneously and for the same duration, each on either the left or the right side of a central fixation cross (Fig. 1). Subjects were instructed to fixate the central cross while responding as quickly as possible to the scrambled pattern (the target) by pressing the left or right button corresponding to the target location with their index fingers. The stimuli in each trial were presented for 150 ms to avoid secondary elaboration or inhibition effects, with an ISI varying from 1100 to 1300 ms. The task contained 300 trials: 150 trials for each of the left-visual-field (LVF) and right-visual-field (RVF) face conditions, with 50 trials of each of the three emotions in each hemifield. Twenty-five gray-scale photographs of different faces were randomly presented for each expression (75 faces were used in total). The visual angle of the stimuli and the inter-stimulus distance was 4°.
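To make the trial structure concrete, the sketch below (not the authors' code) generates a 300-trial sequence with the counterbalancing and timing described above. Names such as N_PER_CELL and the 0–24 face-identity indices are illustrative assumptions.

```python
# Minimal sketch of the 3 emotions x 2 hemifields x 50 trials design.
import random

EMOTIONS = ["fearful", "happy", "neutral"]
FACE_FIELDS = ["LVF", "RVF"]      # face hemifield; the target goes to the other side
N_PER_CELL = 50                   # 3 x 2 x 50 = 300 trials
STIM_DUR_MS = 150
ISI_RANGE_MS = (1100, 1300)

def build_trials(seed=0):
    rng = random.Random(seed)
    trials = []
    for field in FACE_FIELDS:
        for emotion in EMOTIONS:
            for _ in range(N_PER_CELL):
                trials.append({
                    "face_field": field,
                    # target (scrambled pattern) appears in the opposite hemifield
                    "target_field": "RVF" if field == "LVF" else "LVF",
                    "emotion": emotion,
                    # 25 face identities per expression, drawn at random
                    "face_id": rng.randrange(25),
                    "stim_ms": STIM_DUR_MS,
                    # jittered inter-stimulus interval
                    "isi_ms": rng.randint(*ISI_RANGE_MS),
                })
    rng.shuffle(trials)
    return trials

if __name__ == "__main__":
    trials = build_trials()
    print(len(trials), trials[0])
```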

Neuroimaging

Neuromagnetic activity was recorded using a 151-channel CTF MEG system (VSM MedTech Ltd., Canada) in a magnetically shielded room at the Hospital for Sick Children in Toronto. During the experiment, participants lay comfortably with their head in the MEG helmet. Head position relative to the MEG sensors was determined by the use of three reference sensors attached at the nasion and the left and right pre-auricular points. Head movement over the recording was less than 5 mm for all subjects. The reference sensors were replaced by contrast markers to allow co-registration with each subject's anatomical MR image. A T1-weighted MRI of the brain was obtained (3D SPGR, 116 slices; TR/TE/FA = 9 ms/4.2 ms/15°; voxel size = 0.9375 × 0.9375 × 1.5 mm) using a 1.5-T GE Excite MR scanner (Signa Advantage System) and an 8-channel head coil (GE Medical Systems, Milwaukee, WI). A multiple-sphere model was fit to the inner skull surface derived from each subject's MRI using BrainSuite software (Shattuck and Leahy, 2002).

Fig. 1. The target-detection task. Each trial contains a scrambled pattern (target) randomly located in either left or right hemifield, accompanied by an emotional face in the other hemifield. Subjects fixated the central cross and responded to the target by pressing buttons corresponding to the side of the target. Stimuli were presented for 150 ms with an inter-stimulus interval varying between 1100 and 1300 ms.


Data

MEG was recorded at 625 samples/s with a bandpass of 0 to 100 Hz and filtered off-line with a bandpass of 1 to 50 Hz. MEG data were time-locked to trial onset and averaged by trial type. We analyzed the first two MEG components (M1 and M2) that occurred at 100 and 150 ms (Fig. 2), before behavioural responses were made. We localized sources at each latency for each subject and condition using an event-related minimum variance beamforming (ERB) method (Cheyne et al., 2006, 2007). Volumetric images of brain activity were spatially normalized into Talairach space using SPM2 (http://www.fil.ion.ucl.ac.uk/spm/software/spm2), using both linear and nonlinear warping parameters obtained from each individual's MRI, producing a 5-mm voxel grid of source power in standardized stereotaxic space (Singh et al., 2003; Chau et al., 2004). Normalized group-averaged images were superimposed on the 3D template brain using the MRIcro program (http://www.sph.sc.edu/comd/rorden/mricro.html). Results showed significant activations in the amygdala, ACC, and fusiform areas (regions of interest). We then calculated the time courses for the peak voxels within these regions using the rectified averaged time-series output of the beamformer spatial filter (Robinson and Vrba, 1999; Cheyne et al., 2006).

Fig. 2. The MEG waveforms averaged across subjects. Significant results were found from the early latency components at 100 and 150 ms.
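For readers who want to assemble a comparable pipeline, the sketch below uses the open-source MNE-Python package. It is not the authors' analysis code: the study used CTF acquisition software and the event-related beamformer (ERB) of Cheyne et al. (2006, 2007), whereas this sketch substitutes MNE's LCMV minimum-variance beamformer, and the file names, event codes, and forward-solution file are assumptions.

```python
import mne
from mne.beamformer import make_lcmv, apply_lcmv

# Read the 151-channel CTF recording and apply the off-line 1-50 Hz band-pass.
raw = mne.io.read_raw_ctf("subject01.ds", preload=True)   # hypothetical file name
raw.filter(1.0, 50.0)

# Epoch the data time-locked to trial onset; event codes are hypothetical.
events = mne.find_events(raw, stim_channel="UPPT001")
event_id = {"LVF/fearful": 1, "LVF/neutral": 2, "RVF/fearful": 3, "RVF/neutral": 4}
epochs = mne.Epochs(raw, events, event_id, tmin=-0.1, tmax=0.4,
                    baseline=(-0.1, 0.0), preload=True)

# Data covariance over a window spanning the M1/M2 components, plus a
# pre-stimulus (noise) covariance, as inputs to the spatial filter.
data_cov = mne.compute_covariance(epochs, tmin=0.05, tmax=0.20)
noise_cov = mne.compute_covariance(epochs, tmin=-0.1, tmax=0.0)

# Forward model from the subject's MRI (coregistration and head model
# computed elsewhere; this file is assumed to exist).
fwd = mne.read_forward_solution("subject01-fwd.fif")

# LCMV beamformer as a stand-in for the ERB method.
filters = make_lcmv(epochs.info, fwd, data_cov, reg=0.05, noise_cov=noise_cov,
                    pick_ori="max-power", weight_norm="unit-noise-gain")

# Source estimates per condition; peak-voxel time courses can then be
# extracted and contrasted (e.g., fearful minus neutral).
stc_fear = apply_lcmv(epochs["LVF/fearful"].average(), filters)
stc_neutral = apply_lcmv(epochs["LVF/neutral"].average(), filters)
```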

Statistical analysis

We focused on the first two MEG components occurring at 100 and 150 ms. To assess group-averaged activations for each trial type and component, the threshold was determined using a non-parametric omnibus permutation test (P < 0.01) on individual whole-head ERB images. This test corrects for multiple comparisons by using a threshold taken from a distribution of the global maximum brain values (Nichols and Holmes, 2002; Singh et al., 2003). To identify emotion-related activations to the fearful and happy faces relative to the neutral faces, we calculated the differences of the source power between the emotional and neutral faces for each latency component for each subject. To assess the group-averaged contrast activations, the non-parametric omnibus permutation test was applied on the contrasted images of mean activity power during the baseline period (between –100 and 0 ms prior to stimulus onset) to create a null distribution from which threshold pseudo-Z values were selected for each contrast corresponding to P < 0.001 (Chau et al., 2004; Cheyne et al., 2006). For the significant voxels of peak amplitude in regions of interest identified from the averaged contrast results, we examined the time courses and computed a sample-wise parametric permutation test (uncorrected) on each time point across subjects to identify differences in activity over time between two conditions. RTs were analyzed in a 3 × 2 ANOVA (three emotions by two hemifields) with SPSS software.
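The omnibus permutation logic can be illustrated with a short, generic sketch (not the authors' code): subject-wise contrast images are sign-flipped under the null hypothesis, and the distribution of the global maximum across voxels supplies a threshold that controls the family-wise error rate (Nichols and Holmes, 2002). The array name `contrasts` and the permutation count are illustrative assumptions.

```python
# Generic max-statistic sign-flip permutation test.
import numpy as np

def max_stat_threshold(contrasts, n_perm=2048, alpha=0.001, seed=0):
    """contrasts: hypothetical (n_subjects x n_voxels) array of
    fearful-minus-neutral pseudo-Z values."""
    rng = np.random.default_rng(seed)
    n_subj, _ = contrasts.shape
    observed = contrasts.mean(axis=0)          # group-mean contrast per voxel
    max_dist = np.empty(n_perm)
    for p in range(n_perm):
        # Under the null, each subject's contrast sign is exchangeable.
        signs = rng.choice([-1.0, 1.0], size=(n_subj, 1))
        max_dist[p] = np.abs((signs * contrasts).mean(axis=0)).max()
    # Threshold from the distribution of the global maximum, controlling
    # the family-wise error over all voxels.
    threshold = np.quantile(max_dist, 1.0 - alpha)
    return observed, threshold

# Usage: voxels with |observed| above `threshold` are significant at
# P < alpha, corrected for multiple comparisons across the whole volume.
```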

Results

Reaction times

The reaction times (RTs) showed a significant interaction between emotion and the location of the face in the visual field (P < 0.001, two-way ANOVA): RTs to the target (the scrambled pattern) were the longest when the target was paired with a LVF fearful face (mean value = 324 ms). These RTs were significantly longer than those when the target was presented with either a neutral (309 ms, P < 0.005) or a happy (317 ms, P < 0.05) LVF face.

ACC and amygdala activations

Contrasts between the fearful and neutral faces calculated from the first component at 100 ms (M1) across the visual fields showed significantly increased dorsal ACC and decreased ventral ACC activity bilaterally, but predominantly in the left hemisphere (P < 0.001; non-parametric permutation test; Fig. 3A). In the same fearful-neutral contrasts at M1, we also found a significant source in the right amygdala (P < 0.001; non-parametric permutation test; Fig. 3B), activating more in the fearful than in the neutral emotion condition across visual fields as shown in the time courses (P < 0.005, sample-wise permutation test; Fig. 3C). Separating LVF and RVF face presentations, the source activity of the right amygdala in the LVF fearful condition showed two peaks, significantly stronger than those in the LVF neutral condition, at 100 ms (P < 0.005; sample-wise permutation) and 165 ms (P < 0.01; Fig. 3D). In the RVF condition, the amygdala showed only a fearful-neutral difference at 100 ms (P < 0.005; Fig. 3E). We did not observe activation in the left amygdala in the fearful-neutral contrasts, and neutral-happy contrasts were not significant in either the ACC or the amygdala.

Fig. 3. (A) Reciprocal responses to fear within the ACC. Significant activity increases (red) in dorsal ACC and decreases (blue) in ventral ACC were found to the fearful in contrast to the neutral faces at 100 ms across the visual fields. Peak Talairach coordinates of the dorsal ACC: –10, 30, 22; the ventral ACC: –10, 39, 3. (B) Right amygdala activation. Brain image of the fearful versus neutral (F–N) difference at M1 showed the right amygdala source, averaged across the visual fields and all subjects. Peak Talairach coordinates: 30, –5, –12. (C) Averaged time series of the right amygdala peak source showed a higher peak amplitude in response to all fearful compared to all neutral conditions at 100 ms across visual fields. (D) With the LVF face presentation, the time course of the right amygdala showed a bifid activity with larger responses to the LVF fearful faces than to the LVF neutral faces at 100 and 165 ms. (E) With the RVF face presentation, the right amygdala only showed significant differences in activity, higher to fearful than to neutral faces, at 100 ms.

Fusiform activations

We found significant bilateral fusiform activations at the second MEG component at 150 ms across all conditions (P < 0.01; non-parametric permutation test), with a right hemispheric dominance (Fig. 4A). The time course of the right fusiform activity showed a larger amplitude to fearful than to neutral (P < 0.007; sample-wise permutation) and happy faces (P < 0.009) at 170 ms across the visual fields (Fig. 4B). Comparison of the time courses between the LVF and RVF face presentation across emotions showed dissociated fusiform activity. The right fusiform exhibited significantly higher activation to the LVF compared to the RVF face presentations during 150–170 ms (P < 0.005; Fig. 4C). In contrast, the left fusiform was activated more strongly with the RVF than LVF face presentation in the latency range of 120–150 ms (P < 0.005; Fig. 4D); at the earlier peak in the left fusiform (85 ms), there was no significant difference between the two visual field conditions.

Fig. 4. Bilateral fusiform activations and double-dissociated time series. (A) Significant bilateral fusiform responses were observed in the averaged MEG response across conditions at 150 ms. (B) The right fusiform (30, –64, –9) showed greater activation in response to all fearful compared to all neutral faces at 170 ms across visual fields. (C) The right fusiform responded more to the LVF than to the RVF faces from 150–170 ms across emotions. (D) The left fusiform (–30, –68, –9), in contrast, responded more to the RVF than to the LVF faces during the second peak, 120–150 ms, while at the first peak (85 ms), no significant difference was observed between the two conditions. Peak Talairach coordinates were used to calculate time courses.

Discussion

The present study demonstrates that the ACC–amygdala regions are involved in the rapid processing of unattended fearful facial expressions. We found significant activations within the ACC and right amygdala at 100 ms that were stronger in response to the task-irrelevant fearful, relative to the neutral, faces. The dorsal ACC showed increased activity in response to the fearful relative to the neutral faces, while the ventral ACC showed decreased activity in this comparison. Research has reported a heterogeneous division of function within the ACC (Bush et al., 2000) involving both cognitive and emotion processing. It has been posited that the dorsal ACC region is associated with top–down cognitive control of distracting stimuli, including emotional information, during attention-demanding tasks, whereas the ventral ACC region is associated with bottom–up, stimulus-driven processing of emotional events or during pathological or induced internal emotional states (Drevets and Raichle, 1998; Bush et al., 2000; Pessoa, 2005; Herd et al., 2006). Our data support a reciprocal suppression between these two functionally and anatomically connected areas (Drevets and Raichle, 1998; Margulies et al., 2007), with opposite and time-locked patterns of dorsal versus ventral ACC activity within a single task related to processing the task-irrelevant fearful faces. These opposing activations may be related to early attentional control of the ACC to suppress the distracting fearful emotion of the faces, as they capture attention automatically and interfere with the task at hand. The co-activation of the ACC and the amygdala suggests a fear-related network in the two areas (Banks et al., 2007) and supports the anatomical connections between these two regions reported in animal studies (Conde et al., 1995; Sesack et al., 1989). These findings may contribute to understanding emotional disorders and emotion regulation, where individuals have difficulties in suppressing irrelevant and negative emotional events that appear to occupy attention non-volitionally.

The amygdala responses to the task-irrelevant fearful faces add new data to the literature demonstrating the specialized and implicit function of the amygdalae to fearful objects (Breiter et al., 1996; Morris et al., 1996). Previous research has suggested that the right amygdala is more engaged in a fast, implicit, and reflexive processing of stimuli that signal potential threat, while the left amygdala is more engaged in a sustained, explicit, linguistic-related evaluation of negative emotional events (Morris et al., 1998; Funayama et al., 2001; Wright et al., 2001; Gläscher and Adolphs, 2003; Hardee et al., 2008). Evidence consistent with this specialization includes recent MEG findings of left-lateralized amygdala activation in tasks that explicitly compared and identified emotional faces (face-matching tasks) at time windows after 100 ms (Cornwell et al., 2008), along with the current data of right-lateralized amygdala activation at 100 ms indexing implicit processing of task-irrelevant emotional information of the faces. The timing of brain activation is crucial to allow specification of distinct neural events, which provide critical information to identify neurocognitive functions.

To date, studies have not been able to characterize the timing of early subcortical versus later cortical processing in fear perception. Here we provide evidence supporting the hypothesis regarding this two-level processing model. MEG allowed us to differentiate the temporal activity of the amygdala related to the visual-field sensitivity to the fearful emotion: the right amygdala showed a significant difference in fearful-neutral activity to both LVF and RVF face presentations at 100 ms, while at 165 ms this difference was present only for the LVF face presentation. Although a larger right amygdala response to fearful faces in the LVF has been previously observed (Noesselt et al., 2005), the mechanism of this visual-field superiority of the amygdala was unknown. Here the data showed that the right amygdala activation sensitive to the LVF fearful emotions occurred at a later stage of processing, differentiating early visual-field-independent activation at 100 ms from the LVF-dependent activation at 165 ms. This dissociation in visual-field sensitivity suggests that different neural inputs may be involved in the early versus late processing stages. Recent models have proposed that the amygdala receives visual input through both a fast, direct subcortical pathway (via retinal–collicular–pulvinar) (Linke et al., 1999; Morris et al., 1999; Williams et al., 2006) and a slow, indirect visual cortical pathway (via lateral geniculate nucleus–amygdala) (Lamme and Roelfsema, 2000; Williams et al., 2006). The early amygdala activation indicates a fast and automatic response, independent of the location of the fearful stimuli in the visual field. We speculate that this early and visual-field-independent response to fear is mediated by the fast subcortical pathway; the later, visual-field-dependent activation is mediated by the slower cortical pathway predominantly from the right hemisphere, which preferentially receives LVF visual input. The secondary amygdala response may also account for the behavioural findings in which the LVF fearful faces delayed the responses to the target in the current study. Future studies regarding activity in the ACC and the amygdala in processing negative events (Banks et al., 2007; Ochsner et al., 2002) should consider the anatomically heterogeneous and temporally dynamic nature of activity in these areas.

Fusiform activations at 150–200 ms (M170) during face processing have been well established (Barbeau et al., 2008; McCarthy et al., 1997; Puce et al., 1995, 1996), even when the faces were not directly attended (Cauquil et al., 2000; Furey et al., 2006), as we also observed. The current study showed fusiform sensitivity to facial emotions, as the right fusiform activated more to the fearful compared to other emotions, consistent with previous ERP results where fearful faces produced the largest and longest latency N170s (Batty and Taylor, 2003). The time course in the right fusiform showed higher activity to fearful faces at 170 ms, later than the peak response to all faces at 150 ms, suggesting that the emotional component may be differentially processed later in face perception. It has been argued that this delay in the M/N170 to fearful faces is due to incorporation of feedback from the rapid earlier processing for highly salient stimuli (Batty and Taylor, 2003).

In contrast to most previous studies, the faces in our task were presented in the hemifields, maximizing laterality effects in the visual ventral stream (Boles, 1983; Enns and Kingstone, 1997; Liu and Ioannides, 2006), as our subjects fixated the central cross and used peripheral vision to detect the targets. We observed that both the left and right fusiform regions showed higher amplitude activations with contralateral than ipsilateral faces between 120 and 170 ms (M170). These peripherally presented faces resulted in larger contralateral fusiform activations, dissociating the left and right fusiform responses. This contralateral visual-field superiority suggests that the fusiform receives visual input predominantly from visual areas in the same hemisphere, supporting the proposal of an occipital–fusiform feed-forward mode in face perception (Liu and Ioannides, 2006; Rossion et al., 2003).

Multiple stages of fusiform activation in response to faces have been reported (Barbeau et al., 2008; Liu et al., 2002), and it has recently been suggested that the left fusiform may account for an early stage of fusiform activation in response to faces (Cornwell et al., 2008; Rossion et al., 2003). The left fusiform in our data also showed two peaks of activation at 85 and 135 ms, both earlier than that of the right fusiform (150 ms), suggesting that the left fusiform may be responsible for an early phase while the right fusiform may be responsible for a later phase of face processing. In addition, the peak in the left fusiform at 85 ms, unlike the peak at 135 ms, did not show the contralateral visual-field superiority in processing faces, implying that different processing may be involved in the early stage of face perception in this region. This rapid face processing is likely related to subcortical processing of faces, which have lower spatial frequency when perceived peripherally (Johnson, 2005; Vuilleumier et al., 2003).

The slower behavioural responses in the presence of the fearful, compared with the neutral, faces indicate interference from the task-irrelevant fearful emotion. The LVF effect of the fearful expression in delaying task responses is in accordance with previous observations of an advantage of fearful stimuli in the LVF to impair task performance (Noesselt et al., 2005). The behavioural effect suggests that fear-related information may be processed more automatically by the right hemisphere, which receives input from the left visual field more directly. This model of hemispheric specialization from behavioural data was supported by our MEG results that showed the fear-sensitive activations in the right amygdala and right fusiform areas.

Conclusions

The present study provides novel timing information on early brain activations in the amygdala, ACC, and fusiform regions, adding to our knowledge of implicit processing of human facial emotions. The sensitivity to task-irrelevant fearful emotions suggests that the unattended information operates at a level where potential threat is automatically processed. The early timing of amygdala–ACC activations (at 100 ms) suggests a specialized frontal–limbic network that could facilitate fast reaction to potential threat. The dissociation of the amygdala fear processing into early, visual-field-independent versus later, left-visual-field-dependent stages can help scientists and clinicians understand complex cognitive–neurological deficits linked to the amygdala. The double dissociation in fusiform laterality related to the sensitivity to the contralateral faces can help future neuroimaging studies probing lateralized processing in this region and may serve to differentiate neural deficits due to lateralized brain lesions from possible functional compensation. Future studies may add more emotion types, such as angry expressions, to the task to determine the amygdala's specificity to negative emotions other than fear. Future directions may also include assessing network connectivity between the amygdalae, ACC, and fusiform regions. Finally, the present study demonstrates that MEG with sophisticated temporal source analyses can provide detailed measures of both the location and time course of neurocognitive events in deep brain structures.

Acknowledgments

This work is supported by the Canadian Institutes of Health Research grants to M.J.T. (MOP-81161) and Y.H. (CDG-87793) and the Ontario Student Opportunity Trust Fund–Hospital for Sick Children Foundation Student Scholarship Program to Y.H.

References

Banks, S.J., Eddy, K.T., Angstadt, M., Nathan, P.J., Phan, K.L., 2007. Amygdala–frontal connectivity during emotion regulation. Soc. Cogn. Affect. Neurosci. 2 (4), 303–312.
Barbeau, E.J., Taylor, M.J., Regis, J., Marquis, P., Chauvel, P., Liegeois-Chauvel, C., 2008. Spatio temporal dynamics of face recognition. Cereb. Cortex 18 (5), 997–1009.
Batty, M., Taylor, M.J., 2003. Early processing of the six basic facial emotional expressions. Cogn. Brain Res. 17 (3), 613–620.
Boles, D.B., 1983. Hemispheric interaction in visual field asymmetry. Cortex 19 (1), 99–113.
Breiter, H.C., Etcoff, N.L., Whalen, P.J., Kennedy, W.A., Rauch, S.L., Buckner, R.L., Strauss, M.M., Hyman, S.E., Rosen, B.R., 1996. Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17 (5), 875–887.
Bush, G., Luu, P., Posner, M.I., 2000. Cognitive and emotional influences in anterior cingulate cortex. Trends Cogn. Sci. 4 (6), 215–222.
Cauquil, A.S., Edmonds, G.E., Taylor, M.J., 2000. Is the face-sensitive N170 the only ERP not affected by selective attention? NeuroReport 11 (10), 2167–2171.
Chau, W., McIntosh, A.R., Robinson, S.E., Schulz, M., Pantev, C., 2004. Improving permutation test power for group analysis of spatially filtered MEG data. NeuroImage 23 (3), 983–996.
Cheyne, D., Bakhtazad, L., Gaetz, W., 2006. Spatiotemporal mapping of cortical activity accompanying voluntary movements using an event-related beamforming approach. Hum. Brain Mapp. 27 (3), 213–229.
Cheyne, D., Bostan, A.C., Gaetz, W., Pang, E.W., 2007. Event-related beamforming: a robust method for presurgical functional mapping using MEG. Clin. Neurophysiol. 118 (8), 1691–1704.
Cornwell, B.R., Carver, F.W., Coppola, R., Johnson, L., Alvarez, R., Grillon, C., 2008. Evoked amygdala responses to negative faces revealed by adaptive MEG beamformers. Brain Res. 1244, 103–112.
Conde, F., Maire-Lepoivre, E., Audinat, E., Crepel, F., 1995. Afferent connections of the medial frontal cortex of the rat. II: cortical and subcortical afferents. J. Comp. Neurol. 352, 567–593.
Drevets, W.C., Raichle, M.E., 1998. Reciprocal suppression of regional cerebral blood flow during emotional versus higher cognitive processes: implications for interactions between emotion and cognition. Cognition Emotion 12 (3), 353–385.
Eimer, M., Holmes, A., 2002. An ERP study on the time course of emotional face processing. NeuroReport 13 (4), 427–431.
Eimer, M., Holmes, A., 2007. Event-related brain potential correlates of emotional face processing. Neuropsychologia 45 (1), 15–31.
Enns, J.T., Kingstone, A., 1997. Hemispheric coordination of spatial attention. In: Christman, S. (Ed.), Cerebral Symmetries in Sensory and Perceptual Processing. Elsevier, New York, pp. 197–231.
Ewbank, M.P., Lawrence, A.D., Passamonti, L., Keane, J., Peers, P.V., Calder, A.J., 2009. Anxiety predicts a differential neural response to attended and unattended facial signals of anger and fear. NeuroImage 44 (3), 1144–1151.
Felmingham, K., Kemp, A., Williams, L., Das, P., Hughes, G., Peduto, A., et al., 2007. Changes in anterior cingulate and amygdala after cognitive behavior therapy of posttraumatic stress disorder. Psychological Science 18, 127–129.
Funayama, E.S., Grillon, C., Davis, M., Phelps, E.A., 2001. A double dissociation in the affective modulation of startle in humans: effects of unilateral temporal lobectomy. J. Cogn. Neurosci. 13 (6), 721–729.
Furey, M.L., Tanskanen, T., Beauchamp, M.S., Avikainen, S., Uutela, K., Hari, R., Haxby, J.V., 2006. Dissociation of face-selective cortical responses by attention. Proc. Natl. Acad. Sci. U. S. A. 103 (4), 1065–1070.
Gläscher, J., Adolphs, R., 2003. Processing of the arousal of subliminal and supraliminal emotional stimuli by the human amygdala. J. Neurosci. 23 (32), 10274–10282.
Haas, B.W., Omura, K., Constable, R.T., Canli, T., 2007. Emotional conflict and neuroticism: personality-dependent activation in the amygdala and subgenual anterior cingulate. Behavioral Neuroscience 121, 249–256.
Hardee, J.E., Thompson, J.C., Puce, A., 2008. The left amygdala knows fear: laterality in the amygdala response to fearful eyes. Soc. Cogn. Affect. Neurosci. 3 (1), 47–54.
Herd, S.A., Banich, M.T., O'Reilly, R.C., 2006. Neural mechanisms of cognitive control: an integrative model of Stroop task performance and fMRI data. J. Cogn. Neurosci. 18, 22.
Holmes, A., Vuilleumier, P., Eimer, M., 2003. The processing of emotional facial expression is gated by spatial attention: evidence from event-related brain potentials. Cogn. Brain Res. 16 (2), 174–184.
Johnson, M.H., 2005. Subcortical face processing. Nat. Rev. Neurosci. 6, 766–774.
Lamme, V.A., Roelfsema, P.R., 2000. The distinct modes of vision offered by feedforward and recurrent processing. Trends Neurosci. 23 (11), 571–579.
LeDoux, J.E., 1996. The Emotional Brain: The Mysterious Underpinnings of Emotional Life. Simon & Schuster, New York.
Linke, R., De Lima, A.D., Schwegler, H., Pape, H.C., 1999. Direct synaptic connections of axons from superior colliculus with identified thalamo-amygdaloid projection neurons in the rat: possible substrates of a subcortical visual pathway to the amygdala. J. Comp. Neurol. 403 (2), 158–170.
Liu, J., Harris, A., Kanwisher, N., 2002. Stages of processing in face perception: an MEG study. Nat. Neurosci. 5 (9), 910–916.
Liu, L., Ioannides, A.A., 2006. Spatiotemporal dynamics and connectivity pattern differences between centrally and peripherally presented faces. NeuroImage 31 (4), 1726–1740.
Luo, Q., Holroyd, T., Jones, M., Hendler, T., Blair, J., 2007. Neural dynamics for facial threat processing as revealed by gamma band synchronization using MEG. NeuroImage 34, 839–847.
Luo, Q., Mitchell, D., Cheng, X., Mondillo, K., McCaffrey, D., Holroyd, T., Carver, F., Coppola, R., Blair, J., 2009. Visual awareness, emotion, and gamma band synchronization. Cereb. Cortex 19, 1896–1904.
Margulies, D.S., Kelly, A.M., Uddin, L.Q., Biswal, B.B., Castellanos, F.X., Milham, M.P., 2007. Mapping the functional connectivity of anterior cingulate cortex. NeuroImage 37 (2), 579–588.
McCarthy, G., Puce, A., Gore, J.C., Allison, T., 1997. Face-specific processing in the human fusiform gyrus. J. Cogn. Neurosci. 9 (5), 605–610.
Mohanty, A., Engels, A.S., Herrington, J.D., Heller, W., Ho, M.R., Banich, M.T., et al., 2007. Differential engagement of anterior cingulate cortex subdivisions for cognitive and emotional function. Psychophysiology 44, 343–351.
Morris, J.S., Frith, C.D., Perrett, D.I., Rowland, D., 1996. A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 383 (6603), 812–815.
Morris, J.S., Öhman, A., Dolan, R.J., 1998. Conscious and unconscious emotional learning in the human amygdala. Nature 393 (6684), 467–470.
Morris, J.S., Öhman, A., Dolan, R.J., 1999. A subcortical pathway to the right amygdala mediating "unseen" fear. Proc. Natl. Acad. Sci. 96 (4), 1680–1685.
Nichols, T.E., Holmes, A.P., 2002. Nonparametric permutation tests for functional neuroimaging: a primer with examples. Hum. Brain Mapp. 15 (1), 1–25.
Noesselt, T., Driver, J., Heinze, H.J., Dolan, R., 2005. Asymmetrical activation in the human brain during processing of fearful faces. Curr. Biol. 15 (5), 424–429.
Ochsner, K.N., Bunge, S.A., Gross, J.J., Gabrieli, J.D.E., 2002. Rethinking feelings: an fMRI study of the cognitive regulation of emotion. J. Cogn. Neurosci. 14 (8), 1215–1229.
Pessoa, L., 2005. To what extent are emotional visual stimuli processed without attention and awareness? Curr. Opin. Neurobiol. 15 (2), 188–196.
Petrovic, P., Carlsson, K., Petersson, K.M., Hansson, P., Ingvar, M., 2004. Context-dependent deactivation of the amygdala during pain. J. Cogn. Neurosci. 16, 1289–1301.
Puce, A., Allison, T., Gore, J.C., McCarthy, G., 1995. Face-sensitive regions in human extrastriate cortex studied by functional MRI. J. Neurophysiol. 74, 1192–1199.
Puce, A., Allison, T., Asgari, M., Gore, J.C., McCarthy, G., 1996. Differential sensitivity of human visual cortex to faces, letterstrings, and textures: a functional magnetic resonance imaging study. J. Neurosci. 16 (16), 5205–5215.
Robinson, S.E., Vrba, J., 1999. Functional neuroimaging by synthetic aperture magnetometry. In: Yoshimoto, T., Kotani, M., Kuriki, S., et al. (Eds.), Recent Advances in Biomagnetism. Tohoku University Press, Sendai, p. 302.
Rossion, B., Caldara, R., Seghier, M., Schuller, A., Lazeyras, F., Mayer, E., 2003. A network of occipito-temporal face-sensitive areas besides the right middle fusiform gyrus is necessary for normal face processing. Brain 126 (11), 2381–2395.
Sesack, S.R., Deutch, A.Y., Roth, R.H., Bunney, B.S., 1989. Topographical organization of the efferent projections of the medial prefrontal cortex in the rat: an anterograde tract-tracing study with Phaseolus vulgaris leucoagglutinin. J. Comp. Neurol. 290, 213–242.
Shattuck, D.W., Leahy, R.M., 2002. BrainSuite: an automated cortical surface identification tool. Med. Image Anal. 6 (2), 129–142.
Singh, K.D., Barnes, G.R., Hillebrand, A., 2003. Group imaging of task-related changes in cortical synchronisation using nonparametric permutation testing. NeuroImage 19 (4), 1589–1601.
Vuilleumier, P., Armony, J.L., Driver, J., Dolan, R.J., 2003. Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nat. Neurosci. 6, 624–631.
Vuilleumier, P., Pourtois, G., 2007. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia 45 (1), 174–194.
Whalen, P.J., Kagan, J., Cook, R.G., Davis, F.C., Kim, H., Polis, S., et al., 2004. Human amygdala responsivity to masked fearful eye whites. Science 306, 2061.
Williams, L.M., Das, P., Liddell, B.J., Kemp, A.H., Rennie, C.J., Gordon, E., 2006. Mode of functional connectivity in amygdala pathways dissociates level of awareness for signals of fear. J. Neurosci. 26 (36), 9264–9271.
Wright, C.I., Fischer, H., Whalen, P.J., McInerney, S.C., Shin, L.M., Rauch, S.L., 2001. Differential prefrontal cortex and amygdala habituation to repeatedly presented emotional stimuli. NeuroReport 12 (2), 379–383.