
Event-related potential and eye tracking evidence of the developmental dynamics of face processing

Emilie Meaux,* Nadia Hernandez, Isabelle Carteau-Martin, Joëlle Martineau, Catherine Barthélémy, Frédérique Bonnet-Brilhault and Magali Batty

Université François-Rabelais de Tours, Inserm, Imagerie et Cerveau UMR U 930, CHRU de Tours, Centre Universitaire de PédoPsychiatrie, Tours, France

Keywords: development, eye-tracking, faces, visual event-related potentials

Abstract

Although the wide neural network and specific processes related to faces have been revealed, the process by which face-processing ability develops remains unclear. An interest in faces appears early in infancy, and developmental findings to date have suggested a long maturation process of the mechanisms involved in face processing. These developmental changes may be supported by the acquisition of more efficient strategies to process faces (theory of expertise) and by the maturation of the face neural network identified in adults. This study aimed to clarify the link between event-related potential (ERP) development in response to faces and the behavioral changes in the way faces are scanned throughout childhood. Twenty-six young children (4–10 years of age) were included in two experimental paradigms, the first exploring ERPs during face processing, the second investigating the visual exploration of faces using an eye-tracking system. The results confirmed significant age-related changes in visual ERPs (P1, N170 and P2). Moreover, an increased interest in the eye region and an attentional shift from the mouth to the eyes were also revealed. The proportion of early fixations on the eye region was correlated with N170 and P2 characteristics, highlighting a link between the development of ERPs and gaze behavior. We suggest that these overall developmental dynamics may be sustained by a gradual, experience-dependent specialization in face processing (i.e. acquisition of face expertise), which produces a more automatic and efficient network associated with effortless identification of faces, and allows the emergence of human-specific social and communication skills.

Introduction

Human faces are stimuli we are exposed to every day, and they provide a wide range of information crucial to understanding others and to adapting to social life. Several studies have revealed that adults process faces at an astonishing level of proficiency (Bindemann et al., 2005; Hershler & Hochstein, 2005; Theeuwes & Van der Stigchel, 2006; Cerf et al., 2008; Fletcher-Watson et al., 2008; Crouzet et al., 2010). One of the current hypotheses to explain this efficiency suggests that we are ‘experts’ in face processing (the ‘expertise hypothesis’) (Gauthier & Tarr, 1997; Gauthier et al., 2000; Gauthier & Nelson, 2001; Rossion et al., 2002; Pascalis et al., 2011; Hoehl & Peykarjou, 2012). Throughout childhood, having more experience of discriminating faces than of individuating members of other object categories, children progressively acquire extensive face experience and then expertise, which may lead to the development of the face-specific behavioral and neural processes reported in adulthood. In other words, this hypothesis suggests that the remarkable ability to detect and recognize faces in adults is not innate and may be the result of a long period of expertise acquisition (over many years) (Lee et al., 2013). However, this view has been disputed, and it is still a matter of debate whether there is a sensitive period for acquiring face expertise in infancy (McKone et al., 2007). Event-related potentials (ERPs) and gaze behaviors appear to be powerful indices to provide insight into how face processing matures throughout normal development.

Event-related potentials recorded from the scalp are particularly useful to elucidate the evolution of the cerebral timing underlying face processing throughout normal development. Extensive studies using this technique have consistently indexed a negative face-specific activity over temporo-occipital sites; the N170 component provides an established measure of early stages of face processing in both adults and children (Bentin, Taylor et al., 2007; Joyce & Rossion, 2005; Kovacs et al., 2006; Rossion & Jacques, 2008) and also appears to be sensitive to the different facial features, especially the eyes (Bentin et al., 1996, 2006). Although less ‘face-specific’, several other visual ERP components over posterior–occipital sites, such as P1 and P2, also appear to be associated with face processing. P1, recorded at around 90–130 ms at occipital sites, is known to reflect early and basic visual processing and shows some sensitivity to faces (Taylor, 2002; Dering et al., 2011) and face manipulation (Halit et al., 2000; Batty & Taylor, 2003), particularly inversion (Linkenkaer-Hansen et al., 1998; Itier & Taylor, 2002, 2004a,b; Itier et al., 2004). A few researchers have suggested that this sensitivity may be based on low-level visual feature analysis, needed in the first step of face perception (Latinus & Taylor, 2006; Rossion & Jacques, 2008). Later (at around 200 ms), P2 appears to be sensitive to changes in facial configuration (e.g. Thatcherization: Boutsen et al., 2006; elongation: Halit et al., 2000), suggesting that the component is involved in a later stage of face processing, i.e. in processing configural relationships between features (Boutsen et al., 2006; Latinus & Taylor, 2006; Pascalis et al., 2011). Early face-sensitive ERP investigations thus allow the study of the brain representation of the different mechanisms related to face processing.

In contrast to ERP methodology, the eye-tracking technique is a powerful tool to analyse the strategies used to look at faces. Eye movements modulate our retinal input so that we see some things better (i.e. placing the salient elements of our environment in foveal position) and others worse or not at all, indicating that gaze behavior has consequences for the visual perception of faces. Moreover, it has been assumed that saccades and repeated fixations on the same locations select informative image regions (Schutz et al., 2011). Investigating gaze behaviors when exploring a face therefore seems to be a relevant technique to examine whether certain facial features are privileged. In this context, several studies have shown that human observers are capable of very rapidly detecting faces in a scene, saccades to faces being performed before 150 ms (Crouzet et al., 2010). In addition, many studies monitoring eye movements and analysing fixation patterns in humans and monkeys have found that the eyes are the main target in the face (Caldara et al., 2005; Henderson et al., 2005; Hernandez et al., 2009). This preference for eyes is evident very early after stimulus onset (Vinette et al., 2004; Gamer & Buchel, 2009; Scheller et al., 2012), probably because the eyes are the main diagnostic feature used to assess facial information and constitute an important tool for social communication (Itier & Batty, 2009).

Event-related potential and eye-tracking findings both indicate that faces have ‘something special’. Furthermore, these two methodologies have the advantage of providing complementary information: whereas ERPs reflect the cerebral processes involved in face perception, eye-tracking makes it possible to determine what visual information individuals acquire when processing faces. A possible link between these two indices is suggested by functional magnetic resonance imaging (fMRI) studies (Dalton et al., 2005, 2007, 2008; Zurcher et al., 2013) showing that the time dedicated to the eyes is closely associated with a heightened brain response in the amygdala and the fusiform gyrus during processing of human faces in individuals with autism. Investigating the relationship between gaze behaviors and electrophysiological markers of face processing throughout development may therefore yield important complementary information on how such processing develops (Hoehl & Peykarjou, 2012).

In contrast to the literature available on adults, the body of evidence regarding face processing and brain maturation in childhood is still limited. Only a few studies have investigated the electrophysiological responses elicited by faces in infants and children. At 6 months of age, infants show a positive response to a face 400 ms after its presentation (Quinn et al., 2010). de Haan et al. (2002) also found an early ‘infant N170’ in 6-month-olds, elicited by faces at 290 ms, that precedes this positive response. To investigate these findings further, Halit et al. (2003) examined these N290 and P400 components in typically developing 9-month-olds, using familiar and unfamiliar faces. However, previous developmental ERP studies suggested that these cerebral markers involved in face processing in early infancy undergo gradual development to reach the adult level (Batty & Taylor, 2006; Joseph et al., 2011; Haist et al., 2013). Indeed, Taylor and colleagues have reported considerable age-related changes in P1/P2 and N170 from early infancy until adolescence (Taylor & Pang, 1999; Taylor et al., 2004) that have been thought to reflect cerebral maturation (Batty & Taylor, 2002; Taylor et al., 2004). These age-related ERP changes might be largely related to the development of more effective strategies to process faces and the acquisition of expertise. As suggested by the link between face scanning and brain response to faces (Dalton et al., 2005, 2007), these strategies may involve developmental changes in the way children explore faces. A gradual, experience-dependent specialization of face processing throughout childhood may reflect the common development of both the cortical systems and the gaze behaviors involved in face perception, leading to face expertise in adulthood.

Despite the significant contribution of ERP studies to our understanding of the normal development of face processing, little has been reported about changes in face scanning over childhood. Most of the available eye-tracking studies have investigated ocular behaviors toward faces in adults and infants without focusing on the developmental dynamics of face exploration in school-age children. These studies have demonstrated special sensitivity to direct eye contact as early as birth (Farroni et al., 2002; Senju & Csibra, 2008; Wheeler et al., 2011; Oakes & Ellis, 2013), suggesting an early interest in the socially relevant eye region. Moreover, children tend to explore internal features (eyes and mouth) more than external facial features (hair, face outline, ears, etc.) (Yarbus, 1961; Walker-Smith et al., 1977; Althoff & Cohen, 1999; Hunnius & Geuze, 2004; Orban de Xivry et al., 2008; Elsabbagh et al., 2013; Tenenbaum et al., 2013). Two recent studies investigated the role of cultural background in the development of face scanning in young children between the ages of 1 and 7 years (Senju et al., 2013) and 7 and 12 years (Kelly et al., 2011). The authors reported a culture-specific effect on eye movement patterns when encoding a human face from the first year of life that extended beyond infancy. The distribution of fixations on faces appears to become more ‘adult-like’ from 7 to 12 years of age. These two studies are the first to identify the development of face scanning in childhood (using a cross-cultural approach).

In summary, more developmental studies are needed to depict a complete and comprehensive picture of the maturation of this important human ability (Lee et al., 2013). Moreover, although neuronal and behavioral evidence might have important implications for understanding the development of face processing, to our knowledge no previous study has attempted to combine findings from both techniques to address this research issue.

The first aim of this study was to investigate the developmental dynamics of the neurophysiological mechanisms involved in face processing throughout childhood, using electrophysiological and eye-tracking assessments. The second aim was to clarify how age-related changes in visual ERPs and face exploration behavior might be linked.

Correspondence: Dr E. Meaux, present address below. E-mail: [email protected]

*Present address: Laboratory for Behavioral Neurology and Imaging of Cognition (LabNIC), Department of Fundamental Neurosciences, University Medical Center, 1 Rue Michel Servet, 1211 Geneva 4, Switzerland

Received 4 February 2012, revised 12 November 2013, accepted 30 December 2013

© 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd
European Journal of Neuroscience, pp. 1–14, 2014. doi:10.1111/ejn.12496

Methods

Participants

Thirty children were included in an original protocol combining two experiments involving face processing: the first investigated the developmental changes in visual ERPs during passive viewing of emotional faces; the second used an eye-tracking system to study the age-related changes in ocular exploration behaviors when looking passively at emotional faces. Two children were excluded for excessive ocular and/or muscular artifacts and unsatisfactory attempts during the electroencephalographic (EEG) experiment, and two others were not included because the gaze-tracking signal was not sufficient during the eye-tracking experiment. The data presented here were therefore collected from the same 26 children for the two experiments. The 26 children were divided into three age groups (4–6 years, 6–8 years, 8–10 years), as summarized in Table 1.

All participants had normal or corrected-to-normal eyesight and none presented any kind of disorder. All children successfully completed four subtests of the EDEI-R (Perron-Borelli, 1996) (vocabulary, conceptualization, categorical analysis and practical adaptation) or of the WISC-IV (Wechsler, 2004) (vocabulary, word reasoning, block design, picture concepts), which assess intellectual abilities, to ensure that all were in the normal cognitive range for their age (Developmental Quotient – Verbal: mean = 102.9 ± 9.8; range 84.3–118.7; median = 105.3. Developmental Quotient – non-Verbal: mean = 106.9 ± 8.9; range 89.8–119.5; median = 108.6). The parents gave written informed consent. The experimental procedure was approved by the local Ethics Committee (CPP) and conformed with the Code of Ethics of the World Medical Association (Declaration of Helsinki, 1964).

Experiment 1: development of visual ERPs throughout childhood

Experiment 1 investigated the developmental changes in visual ERPs evoked by faces.

Stimuli and procedure: electrophysiological paradigm

In total, 210 photographs of adult faces (neutral and emotional, i.e. Happiness, Sadness, Disgust, Fear, Surprise and Anger) were presented twice in three blocks and in random order (Batty & Taylor, 2003). Each block contained both male and female photographs. The mean luminance and local contrast were equalized across stimuli. All photographs were 11 × 8 cm and were presented using Stim2 software on a light gray background on a computer screen in front of the subject (visual angle 4°) for 500 ms, with a random interstimulus interval (ISI; 1200–1600 ms). Children were comfortably seated on a chair, and were told that they would be shown a set of photographs without any further information (implicit task) and that they had to remain silent and still during the experiment.

Electrophysiological measurements

Electroencephalographic data were recorded from 18 active Ag/AgCl electrodes fixed on the scalp according to the international 10–10 system (Oostenveld & Praamstra, 2001): Fz, Cz, F3, C3, O1, T3, T5 and their homologous locations on the right hemiscalp. Additional electrodes were placed at M1 and M2 (left and right mastoid sites), and vertical electrooculograms (EOGs) were recorded with electrodes at the superior and inferior orbital ridges. A nose electrode was chosen as reference during the recording (Joyce & Rossion, 2005), an average reference being calculated off-line (Picton et al., 2000). Impedance was maintained below 5 kΩ. The EEG and EOG were amplified, band-pass filtered (0.1–70 Hz) and digitized at a sampling rate of 500 Hz. The whole experiment was controlled by a Compumedics NeuroScan EEG system (Synamps amplifier, Scan 4.3 and Stim2 software). After visual inspection of the overall signal, EEG periods contaminated with ocular and motor activity were manually rejected before averaging. Trials were then averaged according to the emotions expressed by the faces (a threshold of 50 trials per subject per emotion was required) over a 1100-ms analysis period, including a 100-ms prestimulus baseline, and were digitally filtered (0–30 Hz).

The ELAN software package was used for analysis and visualization of EEG-ERPs (Aguera et al., 2011). Maximum amplitudes and peak latencies of the P1, P2 and N170 components, previously shown to be differentially sensitive to faces, were measured for each subject within a 40- or 60-ms time window around the peak of the grand-average waveform. The components were measured for each subject at the electrode sites of interest: P1 and P2 were measured at O1 and O2 (between 90–130 and 270–310 ms, respectively); N170 was measured at T5 and T6 (between 170 and 230 ms).
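As a minimal illustration of the peak-picking step described above, the following sketch measures the maximum amplitude and peak latency of a component within a fixed post-stimulus window on an averaged waveform sampled at 500 Hz with a 100-ms prestimulus baseline. The function and variable names are illustrative only; the actual measurements were performed with the ELAN package.

```python
import numpy as np

FS = 500           # sampling rate (Hz)
BASELINE_MS = 100  # prestimulus baseline included in each epoch

def peak_in_window(erp, t_start_ms, t_end_ms, polarity=+1):
    """Return (amplitude, latency_ms) of the extreme value of an averaged
    ERP waveform within [t_start_ms, t_end_ms] post-stimulus.
    Use polarity=-1 for negative components such as the N170."""
    i0 = int((t_start_ms + BASELINE_MS) * FS / 1000)
    i1 = int((t_end_ms + BASELINE_MS) * FS / 1000) + 1
    i_peak = int(np.argmax(erp[i0:i1] * polarity))
    latency_ms = (i0 + i_peak) * 1000 / FS - BASELINE_MS
    return erp[i0 + i_peak], latency_ms

# Synthetic example: a positive 'P1-like' deflection peaking at 110 ms.
t = np.arange(-BASELINE_MS, 1000, 1000 / FS)           # time axis (ms)
erp = 20 * np.exp(-((t - 110) ** 2) / (2 * 15 ** 2))   # amplitude (µV)
amp, lat = peak_in_window(erp, 90, 130)                # P1 window at O1/O2
# -> amp = 20.0, lat = 110.0
```

An N170 measurement would correspondingly call `peak_in_window(erp, 170, 230, polarity=-1)` on the waveform recorded at T5 or T6.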

Electrophysiological analyses

Previous findings have revealed effects of emotional expression on the P1 and N170 components in adults (Batty & Taylor, 2003), but the few studies undertaken in childhood have reported inconsistent results (Batty & Taylor, 2006; Batty et al., 2011). Moreover, the current study did not focus on emotional effects. After confirming the absence of an emotional effect on P1 (amplitude P = 0.82, latency P = 0.16), N170 (amplitude P = 0.062, latency P = 0.23) and P2 (amplitude P = 0.81, latency P = 0.76), the recorded data were therefore averaged independently of the emotions expressed by the faces.

Visual ERP (P1, N170, P2) characteristics (amplitude and latency) were submitted to repeated-measures analysis of variance (ANOVA) with hemisphere (2; i.e. left vs. right) as the within-subject factor, and age group (3; i.e. 4–6, 6–8 and 8–10 years) and gender (2; i.e. boys vs. girls) as between-subject factors. Degrees of freedom were corrected using the Greenhouse–Geisser procedure where appropriate. Significant interactions were followed by Newman–Keuls post-hoc comparisons to determine where the differences lay. An alpha level of 0.05 was used for all statistical tests.

Experiment 2: development of face exploration behavior throughout childhood

The second experiment examined whether face exploration behavior in children changed with age, which should provide information about the maturation of the strategies used to look at faces during development.

Stimuli and procedure: eye-tracking paradigm

This study was performed using 15 photographs of neutral and emotional male faces [Happiness (five), Sadness (five) or Neutral (five)]. All stimuli were transformed with Photoshop 7.0 software to harmonize the colors, luminance, background, and the position and size of the faces.

The stimuli were presented on a 17-inch computer screen at about 90 cm from the eyes of the subject. The stimuli were 37.7 × 30 cm in size, with a resolution of 1024 × 768 pixels (visual angle = 27°). Each stimulus was presented once, in random order, for 4 s with a fixed ISI of 500 ms. During presentation of the stimuli, children were comfortably seated on a chair and their ocular exploration of the faces was recorded. They were told that they would be shown a set of photographs without any further information. The children had no instructions except to pay attention to the photographs presented, to remain silent and to limit head movements during the experiment.

Table 1. Distribution and description of the 26 children in all age groups

                            Age group
                            4–6 years    6–8 years    8–10 years
No. of children tested      10           10           10
No. of children analysed    8            8            10
Mean age (years)            5.08         6.93         8.98
Standard deviation          0.37         0.39         0.72
Min–max (years)             4.75–5.87    6.42–7.5     8–9.92
Sex ratio (M:F)             5:3          5:3          5:5

Eye-tracking measurements

Recordings were made with a free-head eye-tracking system comprising a computer equipped with two cameras able to film the eyes of a subject looking at images on the computer screen. Gaze location was monitored by the software driving the camera system, which captures the illumination of the cornea by infrared diodes at an acquisition frequency of 60 Hz. This corneal reflection method (Salapatek & Kessen, 1966) allows rapid calibration and relatively good tolerance of head movements (Gredeback et al., 2010), eliminating constraints on the subject, who remains relatively free to move. Before data registration, children underwent a calibration procedure. FaceLAB software recorded the eye position every 0.017 s during the 4 s of presentation. Gaze Tracker software then provided measurement and real-time analysis of the time spent on various parts of the image. The eye-tracking system sometimes failed to track ocular behavior over the full 4 s while the stimulus was being presented on the screen (the mean loss of signal was estimated at around 8 ± 7.3% of the 4 s). We therefore only accepted stimuli for which at least 50% of the presentation time was tracked effectively (mean number of stimuli = 13 ± 2 per subject).

Most previous studies performed analyses over full, long exposure times (typically longer than 2 s), which only allow characterization of explicit face perception mechanisms that are presumably highly under control. To differentiate fast, potentially automatic eye movements from more elaborate scanning of faces, the data were analysed here according to two types of assessment.

First, to obtain an overview of how children scan faces across development, we calculated the percentage of time spent on the different regions of interest, called LookZones (LZ), within the 4 s of presentation. This measure provides access to the strategy and the zones of specific interest when elaborate scanning of the face is performed. The percentage of time spent exploring outside the screen (‘Off-screen’ time) and the percentage of time spent on the screen (‘Screen’ time) relative to the time actually tracked were measured. The ‘Screen’ time was then broken down into time spent looking at the face (= LZ1, ‘Face’ time) and time spent on the stimulus background (‘Background’ time) (Fig. 1A). Finally, the percentages of time spent looking at the eyes (LZa) and the mouth (LZb) in relation to the whole time spent on the face (LZ1) were measured. Thus, three regions of interest (LZs) were established (i.e. head, eyes and mouth) (Fig. 2A), and the times spent exploring these regions were analysed according to subject age and sex.

Second, to assess automatic capture of the children’s gaze, and hence probably of their attention, when a face pops up, we also focused on the initial fixations that occurred immediately after stimulus onset and on whether the children showed a preference for the eye region (as mentioned in the Introduction, the eyes are expected to be the main target in face exploration). We therefore calculated the number of gaze fixations on the eyes (LZa) in relation to the total number of valid gaze fixations on the face (LZ1) within 300 ms after stimulus onset. In other words, we obtained and analysed the proportion of early gaze fixations on the eyes. In contrast to the measurement of time spent on the LZs (which took into account all gaze samples every 17 ms), fixations were counted only when the position of the eye was maintained on a single location (three gaze points stored closer than 40 pixels) for at least 50 ms.
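The two assessments above can be sketched as simple operations on the 60-Hz gaze samples: a cumulative-time count per LookZone, and a dispersion-based grouping implementing the fixation criterion (three successive samples, i.e. at least ~50 ms, stored within 40 pixels of one another). The function names, the (x, y) sample format and the zone predicates are illustrative, not the actual Gaze Tracker output.

```python
import math

SAMPLE_MS = 1000 / 60  # one gaze sample every ~16.7 ms at 60 Hz
MAX_DIST_PX = 40       # samples within 40 px count as the same location
MIN_SAMPLES = 3        # 3 samples ~= 50 ms, the minimum fixation duration

def time_in_zone(samples, in_zone):
    """Percentage of tracked time spent in a LookZone (first assessment):
    every 17-ms gaze sample counts, fixated or not."""
    return 100 * sum(in_zone(p) for p in samples) / len(samples)

def detect_fixations(samples):
    """Group consecutive (x, y) gaze samples into fixations.

    Returns a list of (onset_ms, duration_ms, x_mean, y_mean) tuples.
    A fixation is a run of >= MIN_SAMPLES consecutive samples, each
    within MAX_DIST_PX of the run's first sample.
    """
    fixations = []
    start = 0
    for i in range(1, len(samples) + 1):
        # close the current run when the gaze moves away (or at the end)
        if i == len(samples) or math.dist(samples[i], samples[start]) > MAX_DIST_PX:
            if i - start >= MIN_SAMPLES:
                xs = [p[0] for p in samples[start:i]]
                ys = [p[1] for p in samples[start:i]]
                fixations.append((start * SAMPLE_MS, (i - start) * SAMPLE_MS,
                                  sum(xs) / len(xs), sum(ys) / len(ys)))
            start = i
    return fixations

def early_eye_fixation_ratio(fixations, in_eyes, in_face, window_ms=300):
    """Second assessment: proportion of early (< window_ms) face fixations
    that land on the eye region. `in_eyes` / `in_face` are predicates
    on an (x, y) position."""
    early = [f for f in fixations
             if f[0] < window_ms and in_face((f[2], f[3]))]
    if not early:
        return 0.0
    return sum(in_eyes((f[2], f[3])) for f in early) / len(early)
```

For example, five samples at one location followed by four at another yield two fixations of ~83 and ~67 ms; if only the first location falls in the eye LookZone, the early-eye ratio is 0.5.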

Eye-tracking analyses

Previous studies have failed to identify an overall effect of emotion on face exploration using the same eye-tracking paradigm in adults (Hernandez et al., 2009). Therefore, after confirming the absence of an emotional effect on the three LZs (i.e. Head: P = 0.84; Eyes: P = 0.41; Mouth: P = 0.45), the recorded data were averaged independently of the emotions expressed by the faces.

Fig. 1. Effects of development on early visual ERPs. Grand average ERPs in response to faces for the three age groups from two electrodes of interest in the right hemisphere (O2 and T6) and the left hemisphere (O1 and T5). The arrows indicate the components measured: P1 and P2 were recorded at occipital sites (O1 and O2) and N170 in temporal regions (T5 and T6). ANOVAs revealed significant age effects on P1 amplitude and on N170 and P2 amplitude and latency, suggesting maturation of cerebral face correlates during childhood. The head image indicating electrode positions and labels in the 10–20 system (black circles) was extracted from Oostenveld & Praamstra (2001).

Analysis of time spent on different LZs within the 4 s of face presentation

Analyses were performed on the percentage of cumulative time spent looking at the three previously defined LZs (LZ1, LZa, LZb). Three distinct repeated-measures ANOVAs were performed using gender (2; i.e. boys vs. girls) and age group (3; i.e. 4–6, 6–8 and 8–10 years) as between-subject factors. Different LZs were used as the within-subject factor for the three ANOVAs: the first compared the time spent on the screen with the time spent off-screen; the second compared the time spent on the head with the time spent exploring the background; and the third compared the time spent on the eyes with the time spent on the mouth, in relation to the whole time each participant spent on the head.

Analysis of the proportion of gaze fixations on the eyes within 300 ms after face onset

Analyses were performed on the ratio of gaze fixations (i.e. when the position of the eye is maintained on a single location for at least 50 ms) on the eyes (LZa) to the total number of valid gaze fixations within the first 300 ms after stimulus onset. An ANOVA was performed using age group (3; i.e. 4–6, 6–8 and 8–10 years) and gender (2; i.e. boys vs. girls) as between-subject factors.

As in experiment 1, all eye-tracking ANOVA results were corrected for non-sphericity using the Greenhouse–Geisser procedure. Significant interactions were followed by Newman–Keuls post-hoc comparisons to determine where the differences lay. An alpha level of 0.05 was used for all statistical tests.
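The early-preference measure described above (a fixation is valid if gaze is held on one location for at least 50 ms; the ratio counts fixations landing on the eye zone among all valid fixations beginning within 300 ms of face onset) can be sketched as follows. This is an illustrative reconstruction, not the authors' analysis code; the tuple layout for fixation records and the rectangular zone test are assumptions.

```python
from typing import List, Tuple

# Hypothetical fixation record: (onset_ms, duration_ms, x, y)
Fixation = Tuple[float, float, float, float]
# Hypothetical rectangular look zone: (x_min, y_min, x_max, y_max)
Zone = Tuple[float, float, float, float]


def in_zone(x: float, y: float, zone: Zone) -> bool:
    """True if the gaze position falls inside the rectangular zone."""
    x_min, y_min, x_max, y_max = zone
    return x_min <= x <= x_max and y_min <= y <= y_max


def early_eye_fixation_ratio(fixations: List[Fixation], eye_zone: Zone,
                             window_ms: float = 300.0,
                             min_dur_ms: float = 50.0) -> float:
    """Proportion of valid fixations (duration >= min_dur_ms) starting
    within window_ms of face onset that land on the eye zone."""
    valid = [f for f in fixations
             if f[0] < window_ms and f[1] >= min_dur_ms]
    if not valid:
        return float("nan")
    on_eyes = sum(in_zone(f[2], f[3], eye_zone) for f in valid)
    return on_eyes / len(valid)
```

For example, with two valid early fixations of which one falls in the eye zone, the function returns 0.5, i.e. 50% of initial fixations on the eyes.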

Results

Experiment 1: development of visual ERPs throughout childhood

P1 (mean amplitude 20.2 ± 7.05 µV; mean latency 108.7 ± 6.74 ms), N170 (mean amplitude −6.16 ± 3.07 µV; mean latency 210.7 ± 15.36 ms) and P2 (mean amplitude 11.8 ± 3.24 µV; mean latency 290.3 ± 14.1 ms) were recorded and measured at occipital and occipito-temporal sites in the three age groups of children, as shown in Fig. 1. Statistical results are summarized in Table 2.

P1

P1 amplitude showed a significant main age effect (F(2,20) = 4.75, P = 0.02), decreasing dramatically with increasing age (Table 2, Fig. 1). P1 amplitude also varied overall according to the electrode location: the component was smaller in the left hemisphere (O1) than in the right (O2) (F(1,20) = 11.3, P = 0.003). While the ANOVA failed to reveal any main effect of gender on P1 amplitude, the analysis showed a significant sex × age interaction (F(2,20) = 6.67, P = 0.006), and post-hoc comparison indicated that older girls (8–10 years) exhibited a smaller P1 than younger girls (4–6 years) (P = 0.028), suggesting that the modification of P1 amplitude throughout childhood was especially supported by girls.

Analyses of P1 latency showed no main effect of age, gender or hemisphere (Fig. 1, Table 2).

N170

A main age effect was measured on N170 amplitude (F(2,20) = 4.72, P = 0.02) (Fig. 1, Table 2). N170 amplitude decreased significantly between 4 and 10 years of age (Table 2, Fig. 1); post-hoc tests revealed that the 4- to 6-year-old group differed from the middle (P = 0.030) and the older (P = 0.033) groups.

N170 latency also decreased with age (F(2,20) = 7.28, P = 0.004) (Fig. 1, Table 2). The N170 in the 4- to 6-year-old group was delayed compared with the N170 recorded in the middle group (P = 0.011) and the older group (P = 0.015) (Fig. 1). Thus, no significant differences were found between the two older


Fig. 2. Effects of development on face exploration. (A) Representation of the different Look Zones (LZs) of interest: face, eyes and mouth. (B) Mean and standard deviation of the time spent on the face, eye and mouth regions are presented throughout development. Eye (LZa) and mouth (LZb) values were obtained from time spent on the face region (LZ1): for example, 4–6-year-old children looked at the eye region 30% of the time attributed to the head (96% of the total time tracked). With increasing age, children spent significantly more time on the eyes and less time on the mouth region. Post-hoc comparisons revealed between-group differences (*P < 0.05). (C) Mean and standard deviation of the proportion of initial eye fixations within 300 ms after face onset are represented throughout development. With increasing age, the early, automatic gaze preference for the eyes is intensified.


Normal maturation of face processing 5

groups (6- to 8- vs. 8- to 10-year-olds) for either N170 amplitude or latency.

No effects of hemisphere or gender were found for this component.

P2

Both P2 amplitude (F(2,20) = 3.46, P = 0.05) and latency (F(2,20) = 9.12, P = 0.0015) were significantly affected by age (Table 2, Fig. 1). These overall effects of age were supported by a decrease in amplitude and latency with increasing age. Newman–Keuls tests showed that P2 in the younger group was larger than in the older group (P = 0.027). Post-hoc comparisons for P2 latency revealed that the P2 of the 4- to 6-year-old group was delayed compared with the two older groups (vs. 6–8 years, P = 0.013; vs. 8–10 years, P = 0.033).

This component did not demonstrate any effect of hemisphere or gender.

Experiment 2: development of face exploration behavior throughout childhood

Development of time spent on different LZs within 4 s of face presentation

The first ANOVA revealed that children of all ages spent more time looking at the screen (F(1,20) = 30227, P = 0.00) compared with the time spent off the screen (3.2 ± 1.3% of the total time tracked), indicating that the instructions were respected.

The second ANOVA showed that all the children spent more time looking at the face region (F(1,20) = 3074.8, P = 0.00) than at the background of the stimulus (4 ± 3.8% of the time spent on the screen).

Finally, the last analysis showed that, within the face LZs and regardless of age, children preferred to look at the eye region rather than the mouth region (F(1,20) = 35.28, P = 0.00) (Fig. 2B, Table 3). Moreover, the time spent on these two LZs of interest was differentially modulated by age (F(2,20) = 3.62, P = 0.04): the time spent exploring the eye region increased with age whereas the time spent on the mouth region decreased with age (Fig. 2B). Post-hoc analyses revealed that the children of the older group spent significantly more time on the eye region than children of the younger group (P = 0.043) and less time exploring the mouth region than the younger group (P = 0.008) or the middle group (P = 0.007) (Fig. 2B). For example, whereas the oldest children looked at the eyes for almost 45% of the time spent on the face, the youngest children focused on this social region for only 30% of the time. A significant LZ (Eyes; Mouth) × Age (4–6, 6–8, 8–10 years) interaction (F(1,20) = 4.95, P = 0.017) was also found: during childhood, the time spent on the eyes increased while the time spent on the mouth decreased. This final ANOVA failed to reveal a main sex effect. However, a significant gender × age interaction (F(1,20) = 4.01, P = 0.03) was found for the time spent on the mouth, indicating that boys spent more time exploring the mouth than girls only in the oldest age group (8–10 years).
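The nested percentages reported in this section (eye and mouth dwell times expressed relative to the time spent on the head, which is itself expressed relative to the total tracked time) can be made concrete with a small sketch. The per-sample zone labels and the counting scheme below are illustrative assumptions, not the study's pipeline.

```python
def dwell_percentages(samples):
    """samples: one zone label per gaze sample, recorded at a fixed
    sampling rate, so sample counts are proportional to dwell time."""
    # Tracked time excludes samples where gaze left the screen.
    tracked = [s for s in samples if s != "off_screen"]
    # Head time is the union of the facial zones (eyes, mouth, rest of head).
    head = [s for s in tracked if s in ("eyes", "mouth", "other_head")]

    def pct(part, whole):
        return 100.0 * len(part) / len(whole) if whole else 0.0

    return {
        "head_pct_of_tracked": pct(head, tracked),
        "eyes_pct_of_head": pct([s for s in head if s == "eyes"], head),
        "mouth_pct_of_head": pct([s for s in head if s == "mouth"], head),
    }
```

Expressing eye and mouth times as a share of head time, rather than of total time, is what allows statements such as "children looked at the eye region 30% of the time attributed to the head".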

Development of the proportion of gaze fixation on eyes within 300 ms after face onset

We observed a statistically significant increase in early preference for the eyes throughout development (F(2,22) = 4.5, P = 0.023). The proportion of fixations on the eye LZ went from 28% in the 4- to 6-year-old children to 54% (more than half) in the 8- to 10-year-olds (Fig. 2C), suggesting that children became more quickly and automatically attracted by the eyes with increasing age. Post-hoc analyses revealed that this main age effect was governed by more eye fixations within 300 ms displayed by the older than the younger group (P = 0.032). No effect of gender was revealed (P = 0.86).

Complementary analyses: combined eye-tracking and electrophysiological paradigm data

Developmental changes in relation to faces may therefore be characterized by decreases in visual ERP amplitude and latency as well as by increases in time spent on the eyes and decreases in time spent on the mouth region. Furthermore, we also highlighted increases in early gaze fixation on the eyes. To determine whether the developmental changes observed on ERPs might be related to changes in the

Table 2. Summary of the main effects on visual ERPs

ANOVA         P1                        N170                        P2
              Amplitude    Latency      Amplitude    Latency        Amplitude   Latency

Sex           n.s.         n.s.         n.s.         n.s.           n.s.        n.s.
Age           *P = 0.02    n.s.         *P = 0.02    *P = 0.004     P = 0.05    *P = 0.001
Hemisphere    *P = 0.003   n.s.         n.s.         n.s.           n.s.        n.s.

P1, N170 and P2 characteristics (amplitude and latency) were submitted to repeated-measures ANOVAs according to sex (2), hemisphere (2) and age (3; i.e. 4–6, 6–8 and 8–10 years of age). *P < 0.05.

Table 3. Summary of statistical effects on face exploration

ANOVA      Time spent on LZa/b        Early fixation on LZa

Gender     Age*Gender (P = 0.03)      n.s.
Age        *P = 0.04                  *P = 0.02
           Age*Gender (P = 0.03)
           Age*LZ (P = 0.01)
LZ         *P = 0.00
           Age*LZ (P = 0.01)

Percentages of time spent on the different regions of interest were used in the analyses (column 1). Main effects of LZ (P = 0.00) and Age (P = 0.04) were found: when they explored the face region (LZ1), children spent more time exploring the eye (LZa) than the mouth region (LZb), and the time spent on these two LZs of interest was differentially modulated by age (Age*LZ, P = 0.01). No main effect of sex was reported, but an Age*Gender interaction was found (P = 0.03). Initial fixations to the eyes within the 300 ms after face onset were also analysed (column 2). A main effect of age was found (P = 0.02). The age effects on the time spent on the internal facial features (LZa/b) and on the initial fixations to the eyes (LZa) are detailed in Fig. 2(B and C).


ocular exploration of a face, we combined findings from the two experiments. Correlations were performed to investigate how early visual ERPs involved in face perception were related to the subjects’ gaze behavior during face exploration.

First, Pearson correlations were computed between the ERP characteristics (Experiment 1) and the time spent on the eye and mouth regions during the 4 s of face presentation (Experiment 2, first measure, elaborate scanning of faces). Pearson analysis revealed that neither the amplitude nor the latency of P1 and P2 varied significantly according to the time spent exploring the eyes or the mouth. However, a significant positive correlation indicated that greater time spent on the eye region was associated with smaller (less negative) N170 amplitude (r = 0.48, P < 0.012). In other words, N170 amplitude became less negative with increasing eye exploration. This correlation was observed in 6- to 8-year-old (r = 0.81, P = 0.013) and 8- to 10-year-old (r = 0.75, P = 0.012) children, but analyses failed to reach significance in the younger group. To evaluate the relative role of development in this effect, a partial correlation controlling for age was computed. The correlation then no longer remained significant (P = 0.07), suggesting that the development of these indices might explain their correlation. No significant correlation was found between the time spent exploring the eye region and N170 latency. Neither the amplitude nor the latency of this component appeared to be correlated with time spent on the mouth region.

To fit more precisely with the timing of the ERPs, Pearson correlations were then computed between the visual ERP (N170, P1, P2) characteristics and the proportion of gaze fixations dedicated to the eyes within the first 300 ms after stimulus onset (Experiment 2, second measure, early preference for eyes). Strong significant negative Pearson correlations were found between the proportion of initial gaze fixations on the eyes and (1) N170 latency (r = −0.58; P = 0.002), (2) P2 amplitude (r = −0.41; P = 0.033) and (3) P2 latency (r = −0.44; P = 0.022). A high rate of initial gaze fixations on the eyes was associated with faster N170 and P2 and with a smaller P2 wave (Fig. 3). To check whether these correlations were related to development, partial correlations controlling for age were also performed. The negative correlation previously found with N170 latency lost its significance (P = 0.063), again indicating that the shared developmental dynamics of N170 and gaze behavior might explain their correlation. However, when the age effect was partialled out, the correlation between the P2 component and the rate of initial gaze fixations on the eyes remained strongly significant (amplitude, P = 0.007; latency, P = 0.021), suggesting that development alone cannot drive these statistical correlations.
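The partialled-out correlations used here, which test whether an ERP–gaze correlation survives once the shared influence of age is removed, can be computed by residualizing both variables on age and correlating the residuals. Below is a minimal numpy/scipy sketch of this standard technique, not the authors' code.

```python
import numpy as np
from scipy import stats


def partial_corr(x, y, covariate):
    """Pearson correlation between x and y after regressing out a
    covariate (e.g. age) from both variables. Returns (r, p-value)."""
    # Design matrix with an intercept column.
    Z = np.column_stack([np.ones_like(covariate), covariate])
    # Residuals of x and y after ordinary least-squares fits on Z.
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return stats.pearsonr(rx, ry)
```

When the residual correlation stays significant, as reported above for P2, age alone cannot account for the raw association; when it vanishes, as for N170 latency, a common developmental trend is the more parsimonious explanation.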

Discussion

Experiment 1: development of visual ERPs throughout childhood

Our study revealed significant age-related changes in visual ERPs in response to faces (P1, N170 and P2) throughout childhood. To date, only a few ERP studies have investigated face processing in typically developing populations of school-age children (Taylor et al., 2001, 2004; Itier & Taylor, 2004a,b; Batty & Taylor, 2006; Kuefner et al., 2010; Vlamings et al., 2010; Hileman et al., 2011). Almost all of these studies have described modulation of the amplitude and latency of P1 and N170 in response to faces with age.

Our study also revealed a significant decrease in P1 amplitude, as classically reported in face processing (Taylor et al., 2004) as well as in general visual processing (Brecelj et al., 2002; Crognale, 2002; Doucet et al., 2005). Whereas several earlier studies reported a decrease in P1 latency with increasing age, our findings did not reveal any developmental effect on P1 latency. However, in a meta-analysis, Taylor et al. (2004) suggested that the age effect on P1 latency could be task-dependent, and the previous studies revealing age-related changes in P1 latency in response to faces involved older children than in our study. For example, Kuefner et al. (2010)


Fig. 3. Significant Pearson correlations between N170/P2 characteristics and the proportion of initial fixations on eyes within 300 ms after face onset. (A) N170 latency at temporal sites decreased with an increasing percentage of first fixations dedicated to the eyes; i.e. N170 became earlier with increasing early gaze preference for eyes (r = −0.58, P = 0.002) (N170 Lat. = 232.93 − 53.94 × % of Initial Fixations on Eyes). (B) P2 amplitude (r = −0.41, P = 0.033) (P2 Amp. = 15.037 − 7.868 × % of Initial Fixations on Eyes) and (C) latency (r = −0.44, P = 0.022) (P2 Lat. = 305.11 − 35.43 × % of Initial Fixations on Eyes) decreased with an increasing percentage of initial fixations dedicated to the eyes; i.e. P2 became smaller and earlier with increasing early gaze preference for eyes. Partial correlation analyses revealed that the correlation found with the P2 component is not dependent on the age of the participants, whereas the correlation observed between initial fixations on eyes and the N170 appears to be partially driven by common developmental dynamics.


showed a moderate decrease in P1 latency with age, the most significant decrease appearing between 9 and 12 years of age. Similarly, Batty & Taylor (2006) found that children aged from 4 to 9 years differed significantly from 10- to 15-year-old children in P1 latency. It seems therefore that P1 latency undergoes age modulation after 10 years of age, explaining why no age effect was found in our study. We also showed that P1 was larger over the right hemisphere across all three age groups. This hemispheric effect has been widely reported both in children (Field et al., 1998; de Haan et al., 2004; Batty & Taylor, 2006) and in adults (Batty & Taylor, 2003), suggesting that greater right hemisphere activation to faces appears early during development.

In the meta-analysis by Taylor et al. (2004), a considerable decrease in N170 latency was also reported from 4 years of age until adulthood. N170 peaked 100 ms earlier in adults than in the 4-year-old group, with the steepest decrease occurring before 10–11 years. The amplitude of N170 was reported to have an inverted U-shaped developmental trajectory, the least negative amplitude being for children aged 10–11 years. Children both older and younger than 10–11 years were reported to have greater (more negative) N170 amplitude (Taylor et al., 2004). Our results confirm the decrease in N170 latency and the first step of this U-shaped developmental trajectory, N170 amplitude decreasing significantly from 4 to 8 years of age.

Finally, our study also assessed the development of the P2 component in response to faces and revealed a decrease in P2 latency and amplitude from 4 to 10 years of age. To our knowledge, no study has previously investigated age-related changes in the P2 component in response to faces. However, in response to more basic visual stimuli (radially modulated concentric patterns), P2 characteristics became adult-like by 13 years of age whereas the distribution was still changing until late adolescence (Doucet et al., 2005). This rarely studied component therefore seems to indicate a slow maturation process, in agreement with our findings.

Age-related changes in visual ERPs have commonly been interpreted as reflecting both structural and functional brain maturation. The variations in latency suggest increasing general speed and efficiency in early visual processing throughout childhood and may reflect increasing myelinization in visual cortical areas (Nelson, 1997). Variations in amplitude and decreases in neural activity have been related to changes in brain structure, the age-related reduction in slow wave activity mirroring an age-related reduction in gray matter volume (Whitford et al., 2007), in particular synaptic density (Huttenlocher, 1990). fMRI studies have reported developmental changes in ‘face-specific’ brain areas such as the fusiform face area (FFA) (Aylward et al., 2005; Golarai et al., 2007; Passarotti et al., 2007; Scherf et al., 2007), the occipital face area (OFA) and the lateral occipital complex (LOC) (Golarai et al., 2007; Scherf et al., 2007), and also throughout the extended network for face processing (Joseph et al., 2011; Taylor et al., 2011).

However, these developmental ERP changes in response to faces

and their origins have recently been questioned by Kuefner et al. (2010). The main focus of their study was whether age-related improvement in face-processing tasks is specific to our ability to perceive faces per se, or rather is a product of age-related improvements in general sensory and cognitive functions or general visual pattern recognition (Want et al., 2003; Crookes & McKone, 2009). Kuefner et al. (2010) investigated the development of ERPs in response to both face and non-face stimuli (face, car, scrambled face, scrambled car) between the ages of 4 and 17 years, and in a group of adults. They showed that the developmental patterns of P1 and N170 did not differ for the four types of stimuli, indicating that

the evolution with age reflects a general developmental trend that is not specific to faces (Kuefner et al., 2010). They therefore suggested that this ERP maturation may be due to overall sensory and cognitive development and not to the specific functional maturation of brain areas involved in face perception.

On the other hand, the reported age-related changes in children’s

raw ERPs to faces may also be associated with developmental behavioral changes in the way children perceive faces, linked to the acquisition of face expertise. Due to intense exposure to faces in early life, children gradually develop an expertise with these stimuli, tending to process faces using more holistic and configural information and fewer analytic processes (Carey & Diamond, 1977; Diamond & Carey, 1986; Carey, 1992; Rossion et al., 2002; Lee et al., 2013). Supporting this view, two recent studies indicated that the developmental dynamics of ERPs evoked by faces were dependent on changes in the use of low spatial frequency (SF) information (Vlamings et al., 2010; Peters et al., 2013). Using low-pass-filtered faces (in which only low SF information is available; low SFs capture the coarse cues needed for face-specific holistic/configural processing), Vlamings et al. (2010) reported a significant negative correlation between age (3–8 years of age) and N170 latency, suggesting that the strategy involved in face perception tends to be based more and more on holistic information (and less on detailed information) with increasing age.

In the current state of knowledge, age-related changes in ERPs

to faces observed during childhood may thus be related to the maturation of general sensory and cognitive functions and/or to the maturation of more specific processes involved in the way faces are perceived (as a function of the acquisition of face expertise). Because what we process is what we look at, the evolution of the way children perceive faces and the maturation of brain correlates with age should not be considered in isolation from the developmental dynamics of visual face scanning. Gaze exploration of faces is indeed crucial to obtain all the facial information required by complex perceptual facial processes. This view is in accordance with the hypothesis of Elsabbagh et al. (2012) in a fine follow-up ERP study of infants ‘at risk’ for autism and control infants (from 6–10 to 36 months). In addition to ERP analyses, the authors used a separate eye-tracking task to assess whether the differences occurring in the neural response to eye gaze in infants at risk for autism might be attributed to decreased scanning of, or visual attention to, the eye region. However, they failed to find any differences in the eye scanning patterns between at-risk and control infants, and suggested that atypical brain function may precede the onset of overt behavioral signs and subsequent symptoms of autism. By investigating for the first time the dynamics of ocular exploration changes with age during face perception, in the second experiment we adopted a comparable approach throughout normal development. Furthermore, the correlation analyses performed between the data collected from the first and second experiments allowed us to clarify the involvement of visual face scanning in ERP development.

Experiment 2: development of face exploration behavior throughout childhood

Only a few studies have previously reported the evolution of scanning patterns when children were exposed to a face, and these studies were mainly focused on early infancy. They revealed rapid development during the first year of life: children tend to divide their attention between the particularly meaningful internal areas of the face, i.e. the eyes and the mouth (Maurer & Salapatek, 1976; Haith et al., 1977; Bronson, 1994; Hunnius & Geuze, 2004;


Wheeler et al., 2011; Elsabbagh et al., 2013; Oakes & Ellis, 2013; Tenenbaum et al., 2013). These studies revealed that infants (from 4.5 to 15 months of age) focused on internal features more than external features, and more on the eye region than on other internal features. In addition, an increase in the focus on the mouth was reported throughout infancy (Oakes & Ellis, 2013; Tenenbaum et al., 2013). Similarly, increased scanning of the mouth region compared with the eyes at 7 months was found to be predictive of superior expressive language abilities at 36 months (Elsabbagh et al., 2013). It is likely that within the early developmental period when language skills are emerging, mouth cues play an important role compared with eye and hand cues (Lewkowicz & Hansen-Tift, 2012). Recent studies have reported developmental eye-tracking findings in older children (1–12 years of age) (Kelly et al., 2011; Senju et al., 2013). Although the main aim of these studies was specifically to assess the developmental change in culture-specific patterns of face exploration, they reported that children almost exclusively directed fixations towards internal facial features rather than external features, irrespective of their culture (British/Chinese). Senju et al. (2013) reported a significant increase in the relative visit duration on the eye region from 1 to 7 years of age in British male children. Kelly et al. (2011) also reported a non-significant but informative trend toward more ‘adult-like’ fixation strategies between 7 and 12 years of age.

Previous findings have suggested that face scanning becomes

adult-like, i.e. closer to 60% of gaze fixation time spent on the eyes in a face-recognition task and about 70–90% of all fixations directed at the eyes and mouth, only after a long maturation period (Walker-Smith et al., 1977; Henderson et al., 2001). Our findings confirmed the progressive and slow evolution of visual face scanning patterns during childhood. At 8–10 years of age, children appear to devote at least 50% of their attention to the eyes and mouth, suggesting that face scanning is not yet adult-like. We identified both an increase in the time spent exploring the eyes and a decrease in the time spent looking at the mouth with increasing age, relative to the whole time participants spent on the head. The significant interaction between LZs and age indicates that these two changes were linked: attention to the eye and mouth regions throughout childhood appears to be connected. The proportion of time spent on the eyes increased from 31% in the younger group to 43% in the older group, while the time spent on the mouth decreased from 18.5 to 10%, suggesting a shift of attention from the mouth to the eyes during normal development. For example, between 4 and 6 years of age, the increase in time spent on the eyes was approximately the same as the decrease in time spent on the mouth (5 vs. 5.5%). Thus, the results of our study revealed a balance between the eyes and mouth: the increase in time spent on the eyes is to the detriment of time spent on the mouth. Recent findings present opportunities to improve the understanding of this attentional shift with age.

It is widely accepted that the acquisition of communication skills in general (verbal and non-verbal) relies on infants’ and children’s ability to orient to relevant cues, ignore irrelevant ones and understand their referential nature (Elsabbagh et al., 2013). Throughout childhood, the relevance of the different facial features changes according to the developmental level of the child, i.e. according to what children need most at any one moment of their development to understand and be understood. Thus, selectively attending to the mouth during infancy may be crucial for reading the lips during language acquisition, whereas after this period finer socio-communicative indices may be obtained from the eye region. This change from the mouth to the eyes may therefore reflect a shift of visual attention toward the facial feature most relevant to developing social

and communication skills, according to the age of the child. Our findings also pointed out that this strong interest in the eye region is observable from the first 300 ms after face stimulus onset. This result is consistent with recent evidence suggesting that briefly presented faces also trigger very early, potentially reflexive, eye movements that are sensitive to the distribution of diagnostic facial features (Gamer & Büchel, 2009). In their study, faces were presented briefly (150 ms) so that observers were only able to accomplish one saccade after stimulus offset. The authors showed that reflexive gaze changes toward the eye region occurred much more frequently than fixation changes leaving the eye region. Other studies have also reported evidence for this rapid preference for the eyes and have proposed that it may reflect a reflexive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them (Vinette et al., 2004; Scheller et al., 2012).

This assumption is consistent with the fact that gaze is a crucial

part of human non-verbal communication and interaction. The eyes are highly important for the social value conveyed by faces (Itier & Batty, 2009), and hold essential information for the recognition of facial identity (McKelvie, 1976; Schyns et al., 2002; Sekuler et al., 2004; Vinette et al., 2004) and emotion (Calder et al., 2000; Smith et al., 2005; Fox & Damjanovic, 2006), which makes it possible to assess the mental states of others and adapt social behavior. This view is also supported by a recent study that assessed visual scanning of faces in autism. Falck-Ytter et al. (2010) revealed that a higher level of socio-emotional behaviors was correlated with more fixation on the eye region in children (5 years) with autism spectrum disorder (ASD). This finding can be interpreted developmentally, suggesting that the longer time spent on the eyes with age may be associated with increased socio-emotional behavior, making it possible to draw inferences from another’s eyes. This proposition fits with behavioral results showing that the ability to decode the feelings and thoughts of others from the eyes develops before early adolescence (Gunther Moor et al., 2012).

Also accounting for the role of socio-emotional development in

the evolution of face scanning during childhood, our experiment revealed that boys spent more time exploring the mouth than girls, but only in the oldest age group (8–10 years). This result is consistent with previous studies reporting better non-verbal abilities and socio-emotional skills in females than in males (Boyatzis et al., 1993; McClure, 2000; Leppanen & Hietanen, 2001; Grispan et al., 2003). Moreover, women are better than men at performing tasks inferring the mental state of a person from eye information alone (Baron-Cohen et al., 1997). In our study, the gender effect only appeared in the 8- to 10-year-old children, probably due to the slow maturation of such social abilities (Gosselin, 1995; Leppanen & Hietanen, 2001). However, due to the unequal sex ratio in the two youngest groups of our study, this gender effect must be treated with caution.

In view of the results of this second experiment, we suggest that

the shift from mouth to eye scanning and the increase in early gaze preference for the eye region from 4 to 10 years of age may be interpreted as indicators of the dynamics of face expertise acquisition (Diamond & Carey, 1986). This claim is supported by recent insights showing that the amount of time spent looking at faces predicts face-processing skills in typically developing (TD) and ASD children (Parish-Morris et al., 2013). The authors argued that visual attention to faces predicts face expertise. Interests and skills for socio-emotional behaviors are enhanced from birth to adulthood, generating increasing attention to highly relevant stimuli, i.e. faces and eyes. After the first year of intense exposure to these social cues, children develop a strategy to process them more quickly and


efficiently, i.e. they become expert in face processing. Acquisition of this expertise may be sustained by the development of a rapid and automatic gaze attraction to the eyes and by more frequent fixation on this region when scanning a face. Moreover, previous studies have also demonstrated that the acquisition of face expertise may involve changes in cognitive encoding, from local to holistic (Tanaka & Farah, 1993; Hole, 1994) and configural processing (Leder & Bruce, 2000). Face inversion, which is known to affect configural processing, was recently reported to disrupt initial and subsequent fixations on faces (Barton et al., 2006; Hills et al., 2012, 2013). This finding highlights a link between the cognitive encoding and gaze behaviors involved in face processing, both of which may be related to an ‘expert-like’ approach to faces.

Combined eye-tracking and electrophysiological paradigm data

Early evoked brain responses and ocular exploration of faces were strongly affected by age and appeared to be partly correlated. Only the N170 and the P2 characteristics were linked to both early, automatic gaze fixation on the eyes and more elaborate scanning of this social region. Moreover, the rate of initial gaze fixation on the eyes was strongly correlated with P2 characteristics independently of age, in contrast to the correlation observed with N170, which seemed to be partially driven by common developmental dynamics.

P1 has been widely reported to reflect basic and early visual processing. Some studies have revealed face-sensitive effects on this early component (Batty & Taylor, 2003; Itier & Taylor, 2004a), but these effects appear to reflect low-level systematic differences between faces and other complex visual stimuli (Rossion & Jacques, 2008). Taken together, the absence of correlation between this component and measurements of ocular exploration of faces, and previous reports in the literature, suggest that age-related changes in P1 may not reflect the evolution of face-specific perceptual processes but may rather be the result of age-related improvements in general sensory and/or cognitive functions (Want et al., 2003; Crookes & McKone, 2009), such as visual acuity (Skoczenski & Norcia, 2002), sustained attention (Betts et al., 2006), and the ability to narrow the focus of visual attention (Pastò & Burack, 1997).

In contrast, the N170 component was correlated with eye exploration. A well-documented body of literature has reported that, although

delayed, the N170 component is even larger for eyes than faces(Bentin et al., 1996; Itier & Batty, 2009; Itier, Latinus, & Taylor,2006; Taylor et al., 2001). This strong N170 response to the eyeregion is consistent with our findings revealing that the time spenton the eye region influences the N170 characteristics. In addition,according to Dalton and colleagues, the activation of the fusiformgyrus (FG), which is considered to be the main generator of theN170 component (Corrigan et al., 2009), was positively correlatedwith the time spent exploring the eyes in 15- to 17-year-old adoles-cents (Dalton et al., 2005, 2007). Similarly, another recent studyargued that when individuals with autism were explicitly cued toperform visual scanpaths that involved fixing on the eyes of neutraland fearful face, a ‘normalization’ of activity in the right FG couldbe produced (Perlman et al., 2011; Zurcher et al., 2013), suggestingagain a link between FG activation and eye exploration.However, our results also indicated that this relationship between

N170 and eye exploration in children is not driven by a specific/direct correlation between these two indices but rather by their com-mon developmental changes occurring between 4 and 10 years ofage. We thus assume that our results may be interpreted in terms of

acquisition of face expertise. We suggest that a gradual, experience-dependent specialization in face processing throughout childhoodmay infer simultaneous development of brain correlates (N170) andvisual scanning (eye preference) involved in face perception. Thesedevelopmental dynamics refine the early interest in face observed ininfants, generating more automatic and efficient mechanisms associ-ated with effortless identification of faces throughout childhoodwhich finally lead to face expertise. Once the maturation of faceprocesses and expertise acquisition are over (around 10 years ofage) the N170 characteristics and the scan paths of eyes may thenstrongly and directly impact each other (according to Dalton et al.,2005, 2007; and Perlman et al., 2011).Finally, the P2 component was strongly correlated with eye scan-

ning behavior but, in contrast to the N170, this relationship may besustained more by complex cognitive functions than by maturationof face processing.The P2 component is generated by the re-activation of early

visual areas reflecting re-entrant feedback from higher to lowervisual areas (Kotsoni et al., 2007) and is documented as the reflec-tion of deeper processing engaged in visual processing than P1 (Lat-inus & Taylor, 2005). In addition, P2 has been not only functionallyassociated with sustained visual processing but also with more com-plex cognitive processing such as early attentional capture andmobilization of resources on salient stimuli to be processed (Schuppet al., 2003, 2004; Bar-Haim et al., 2005; Mercado et al., 2006; El-dar et al., 2010; Rossignol et al., 2012). Similarly, the P2 compo-nent may reflect attentional fixation (Fox, 2004; Rossignol et al.,2012). P2 elicited by faces in children in this study might mirror aglobal capture of attention on face cues, especially socially relevantcues (i.e. the eyes). The study’s paradigm was the simple andsequential presentation of faces on a blank gray screen, and thusappeared compatible with this suggestion (i.e. no task or distractorstimuli could divide visual attention). Moreover, the negative corre-lation observed in the current study between the P2 characteristicsand the proportion of initial fixation on the eyes also supports thisassumption; the P2 component became smaller and earlier withincreasing automatic attentional orienting toward the eyes. Concor-dant evidence for this comes from previous studies showing that incases of eye avoidance during social interaction in patients sufferingfrom social anxiety disorders (Horley et al., 2003), the P2 compo-nent is enhanced (Rossignol et al., 2012, 2013). 
In this study, thefact that the correlation performed reached significance only for theinitial fixation on the eyes and not for the total time spent on theeyes suggests that in school-age children, P2 may reflect rapidattentional orienting rather than sustained attention toward the eyes.Taking these arguments together, the ability of attentional shifttoward salient stimuli (eyes and face), sustained by P2, could alsobe considered as an indicator of face expertise. The greater mobili-zation of automatic attentional resources on the significant eyeregion may certainly facilitate the rapid attentional engagement onthis target and thus participate in improvement of face processingefficiency.
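The claim that the P2/eye-fixation link holds "independently of age", whereas the N170 link is partly driven by shared developmental change, follows the logic of partial correlation: correlate the two measures after regressing age out of both. A minimal sketch with simulated data — all names and numbers here are illustrative, not the study's analysis pipeline:

```python
import numpy as np
from scipy import stats

def partial_corr(x, y, covar):
    """Pearson correlation between x and y after linearly regressing out covar."""
    res_x = x - np.polyval(np.polyfit(covar, x, 1), covar)
    res_y = y - np.polyval(np.polyfit(covar, y, 1), covar)
    return stats.pearsonr(res_x, res_y)

rng = np.random.default_rng(0)
age = rng.uniform(4, 10, 26)  # 26 children, 4-10 years (sample size from the study; values simulated)

# Toy measures: eye-fixation rate grows with age, and a P2-latency-like
# variable is tied to eye fixations over and above any age effect.
eye_fix = 0.05 * age + rng.normal(0, 0.05, 26)      # proportion of first fixations on the eyes
p2_lat = 300 - 200 * eye_fix + rng.normal(0, 5, 26)  # hypothetical latency in ms

r_raw, _ = stats.pearsonr(eye_fix, p2_lat)
r_part, _ = partial_corr(eye_fix, p2_lat, age)
print(f"raw r = {r_raw:.2f}, partial r (age controlled) = {r_part:.2f}")
```

In these simulated data the correlation survives the age adjustment (the P2-like pattern); a link driven only by common maturation (the N170-like pattern) would instead shrink toward zero once age is partialled out.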

Conclusion

Our findings together demonstrate a long period of maturation for the mechanisms involved in face processing, as highlighted by the evolution of both ERPs and face-scanning behaviors throughout childhood when looking at faces. The major result of this study is that these age-related changes are linked. The interaction between the development of what is seen and what is processed is obvious, but the way one impacts the other is not yet clear. We suggest that this overall maturation process may be underpinned by a gradual, experience-dependent specialization of face processing throughout childhood (i.e. acquisition of face expertise), which is fundamental to the development of social and communication skills with our peers.

Limitations and perspectives

There are certain methodological issues that need to be taken into consideration. First, our protocol did not combine simultaneous recording of the EEG and eye-tracking data. The duration of stimulus presentation required in an experimental design for ERP studies differs from that of eye-tracking studies. Previous findings have demonstrated that 4 s is a sufficient time window to undertake an exploration of the whole face without looking back at regions already explored (Hernandez et al., 2009). Moreover, when eye contact increases (up to 5 s), direct gaze can be considered as having an aversive effect (Brooks et al., 1986). While this 4-s presentation of the stimulus has been recommended to record the time spent on the different facial elements, it would have led to a very long EEG protocol (given the number of stimuli required), making it inappropriate for young children. Secondly, we believe that experience and expertise play a significant role in the development of neural and behavioral indices of face processing. However, the exact causal relationship between expertise and the maturation of these processes needs further investigation. For example, a developmental follow-up study using a training paradigm in which exposure to different types of faces could be manipulated and controlled may help to elucidate this question. Finally, this study used emotional as well as neutral faces. Although no emotional effects were found here, the implicit emotional processing could have affected the results obtained in some way. Furthermore, neurophysiological and behavioral investigations are required on a larger population from 4 years of age to adulthood to take into account the effects of emotion and to improve description of the developmental 'norm' of face perception over the life span. Such information will provide a valuable basis for assessing face perception impairment in social disorders.

Acknowledgement

This research was supported by grants from the ORANGE Foundation. Above all, we thank all the subjects for their time and effort spent participating in this study. The manuscript was revised while EM was hosted by Patrik Vuilleumier at the Department of Fundamental Neurosciences of Geneva University [http://labnic.unige.ch/] and sponsored by a BRIDGE fellowship of the Marie Curie foundation. There was no conflict of interest.

Abbreviations

ANOVA, analysis of variance; ASD, autism spectrum disorder; EEG, electroencephalogram; EOG, electrooculogram; ERP, event-related potential; ISI, interstimulus interval; LZ, LookZone; SF, spatial frequency.

References

Aguera, P.E., Jerbi, K., Caclin, A. & Bertrand, O. (2011) ELAN: a software package for analysis and visualization of MEG, EEG, and LFP signals. Comput. Intell. Neurosci., 2011, 158970.

Althoff, R.R. & Cohen, N.J. (1999) Eye-movement-based memory effect: a reprocessing effect in face perception. J. Exp. Psychol. Learn., 25, 997–1010.

Aylward, E.H., Park, J.E., Field, K.M., Parsons, A.C., Richards, T.L., Cramer, S.C. & Meltzoff, A.N. (2005) Brain activation during face perception: evidence of a developmental change. J. Cognitive Neurosci., 17, 308–319.

Bar-Haim, Y., Lamy, D. & Glickman, S. (2005) Attentional bias in anxiety: a behavioral and ERP study. Brain Cognition, 59, 11–22.

Baron-Cohen, S., Jolliffe, T., Mortimore, C. & Robertson, M. (1997) Another advanced test of theory of mind: evidence from very high functioning adults with autism or Asperger syndrome. J. Child Psychol. Psyc., 38, 813–822.

Barton, J.J., Radcliffe, N., Cherkasova, M.V., Edelman, J. & Intriligator, J.M. (2006) Information processing during face recognition: the effects of familiarity, inversion, and morphing on scanning fixations. Perception, 35, 1089–1105.

Batty, M. & Taylor, M.J. (2002) Visual categorization during childhood: an ERP study. Psychophysiology, 39, 482–490.

Batty, M. & Taylor, M.J. (2003) Early processing of the six basic facial emotional expressions. Brain Res. Cogn. Brain Res., 17, 613–620.

Batty, M. & Taylor, M.J. (2006) The development of emotional face processing during childhood. Developmental Sci., 9, 207–220.

Batty, M., Meaux, E., Wittemeyer, K., Roge, B. & Taylor, M.J. (2011) Early processing of emotional faces in children with autism: an event-related potential study. J. Exp. Child Psychol., 109, 430–444.

Bentin, S., Allison, T., Puce, A., Perez, E. & McCarthy, G. (1996) Electrophysiological studies of face perception in humans. J. Cognitive Neurosci., 8, 551–565.

Bentin, S., Golland, Y., Flevaris, A., Robertson, L.C. & Moscovitch, M. (2006) Processing the trees and the forest during initial stages of face perception: electrophysiological evidence. J. Cognitive Neurosci., 18, 1406–1421.

Bentin, S., Taylor, M.J., Rousselet, G.A., Itier, R.J., Caldara, R., Schyns, P.G., Jacques, C. & Rossion, B. (2007) Controlling interstimulus perceptual variance does not abolish N170 face sensitivity. Nat. Neurosci., 10, 801–802; author reply 802–803.

Betts, J., McKay, J., Maruff, P. & Anderson, V. (2006) The development of sustained attention in children: the effect of age and task load. Child Neuropsychol., 12, 205–221.

Bindemann, M., Burton, A.M., Hooge, I.T., Jenkins, R. & de Haan, E.H. (2005) Faces retain attention. Psychon. B. Rev., 12, 1048–1053.

Boutsen, L., Humphreys, G.W., Praamstra, P. & Warbrick, T. (2006) Comparing neural correlates of configural processing in faces and objects: an ERP study of the Thatcher illusion. NeuroImage, 32, 352–367.

Boyatzis, C.J., Chazan, E. & Ting, C.Z. (1993) Preschool children's decoding of facial emotions. J. Genet. Psychol., 154, 375–382.

Brecelj, J., Strucl, M., Zidar, I. & Tekavcic-Pompe, M. (2002) Pattern ERG and VEP maturation in schoolchildren. Clin. Neurophysiol., 113, 1764–1770.

Bronson, G.W. (1994) Infants' transitions toward adult-like scanning. Child Dev., 65, 1243–1261.

Brooks, C.I., Church, M.A. & Fraser, L. (1986) Effects of duration of eye contact on judgments of personality characteristics. J. Soc. Psychol., 126, 71–78.

Caldara, R., Schyns, P., Mayer, E., Smith, M.L., Gosselin, F. & Rossion, B. (2005) Does prosopagnosia take the eyes out of face representations? Evidence for a defect in representing diagnostic facial information following brain damage. J. Cognitive Neurosci., 17, 1652–1666.

Calder, A.J., Keane, J., Manes, F., Antoun, N. & Young, A.W. (2000) Impaired recognition and experience of disgust following brain injury. Nat. Neurosci., 3, 1077–1078.

Carey, S. (1992) Becoming a face expert. Philos. T. Roy. Soc. B., 335, 95–103.

Carey, S. & Diamond, R. (1977) From piecemeal to configurational representation of faces. Science, 195, 312–314.

Cerf, M., Harel, J., Einhauser, W. & Koch, C. (2008) Predicting human gaze using low-level saliency combined with face detection. In Platt, J.C., Singer, Y. & Roweis, S. (Eds), Advances in Neural Information Processing Systems. MIT Press, Cambridge, MA, pp. 241–248.

Chung, M.S. & Thomson, D.M. (1995) Development of face recognition. Brit. J. Psychol., 86, 55–87.

Corrigan, N.M., Richards, T., Webb, S.J., Murias, M., Merkle, K., Kleinhans, N.M., Johnson, L.C., Poliakov, A., Aylward, E. & Dawson, G. (2009) An investigation of the relationship between fMRI and ERP source localized measurements of brain activity during face processing. Brain Topogr., 22, 83–96.

Crognale, M.A. (2002) Development, maturation, and aging of chromatic visual pathways: VEP results. J. Vision, 2, 438–450.

Crookes, K. & McKone, E. (2009) Early maturity of face recognition: no childhood development of holistic processing, novel face encoding, or face-space. Cognition, 111, 219–247.

Crouzet, S.M., Kirchner, H. & Thorpe, S.J. (2010) Fast saccades toward faces: face detection in just 100 ms. J. Vision, 10, 16.1–16.17.


Dalton, K.M., Nacewicz, B.M., Johnstone, T., Schaefer, H.S., Gernsbacher, M.A., Goldsmith, H.H., Alexander, A.L. & Davidson, R.J. (2005) Gaze fixation and the neural circuitry of face processing in autism. Nat. Neurosci., 8, 519–526.

Dalton, K.M., Nacewicz, B.M., Alexander, A.L. & Davidson, R.J. (2007) Gaze-fixation, brain activation, and amygdala volume in unaffected siblings of individuals with autism. Biol. Psychiat., 61, 512–520.

Dalton, K.M., Holsen, L., Abbeduto, L. & Davidson, R.J. (2008) Brain function and gaze fixation during facial-emotion processing in fragile X and autism. Autism Res., 1, 231–239.

Dering, B., Martin, C.D., Moro, S., Pegna, A.J. & Thierry, G. (2011) Face-sensitive processes one hundred milliseconds after picture onset. Front. Hum. Neurosci., 5, 93.

Diamond, R. & Carey, S. (1986) Why faces are and are not special: an effect of expertise. J. Exp. Psychol. Gen., 115, 107–117.

Doucet, M.E., Gosselin, F., Lassonde, M., Guillemot, J.P. & Lepore, F. (2005) Development of visual-evoked potentials to radially modulated concentric patterns. NeuroReport, 16, 1753–1756.

Eldar, S., Yankelevitch, R., Lamy, D. & Bar-Haim, Y. (2010) Enhanced neural reactivity and selective attention to threat in anxiety. Biol. Psychol., 85, 252–257.

Elsabbagh, M., Mercure, E., Hudry, K., Chandler, S., Pasco, G., Charman, T., Pickles, A., Baron-Cohen, S., Bolton, P., Johnson, M.H. & the BASIS Team (2012) Infant neural sensitivity to dynamic eye gaze is associated with later emerging autism. Curr. Biol., 22, 338–342.

Elsabbagh, M., Bedford, R., Senju, A., Charman, T., Pickles, A., Johnson, M.H. & the BASIS Team (2013) What you see is what you get: contextual modulation of face scanning in typical and atypical development. Soc. Cogn. Affect. Neur., doi: 10.1093/scan/nst012 [Epub ahead of print].

Falck-Ytter, T., Fernell, E., Gillberg, C. & von Hofsten, C. (2010) Face scanning distinguishes social from communication impairments in autism. Developmental Sci., 13, 864–875.

Farroni, T., Csibra, G., Simion, F. & Johnson, M.H. (2002) Eye contact detection in humans from birth. Proc. Natl. Acad. Sci. USA, 99, 9602–9605.

Field, T., Pickens, J., Fox, N.A., Gonzales, J. & Nawrocki, T. (1998) Facial expressions and EEG responses to happy and sad faces/voices by 3-month-old infants of depressed mothers. Brit. J. Dev. Psychol., 16, 485–494.

Fletcher-Watson, S., Findlay, J.M., Leekam, S.R. & Benson, V. (2008) Rapid detection of person information in a naturalistic scene. Perception, 37, 571–583.

Fox, E. (2004) Maintenance or capture of attention in anxiety-related biases. In Yiend, J. (Ed.), Cognition, Emotion, and Psychopathology: Theoretical, Empirical and Clinical Directions. Cambridge University Press, Cambridge, UK, pp. 86–105.

Fox, E. & Damjanovic, L. (2006) The eyes are sufficient to produce a threat superiority effect. Emotion, 6, 534–539.

Gamer, M. & Buchel, C. (2009) Amygdala activation predicts gaze toward fearful eyes. J. Neurosci., 29, 9123–9126.

Gauthier, I. & Nelson, C.A. (2001) The development of face expertise. Curr. Opin. Neurobiol., 11, 219–224.

Gauthier, I. & Tarr, M.J. (1997) Becoming a 'Greeble' expert: exploring mechanisms for face recognition. Vision Res., 37, 1673–1682.

Gauthier, I., Skudlarski, P., Gore, J.C. & Anderson, A.W. (2000) Expertise for cars and birds recruits brain areas involved in face recognition. Nat. Neurosci., 3, 191–197.

Golarai, G., Ghahremani, D.G., Whitfield-Gabrieli, S., Reiss, A., Eberhardt, J.L., Gabrieli, J.D. & Grill-Spector, K. (2007) Differential development of high-level visual cortex correlates with category-specific recognition memory. Nat. Neurosci., 10, 512–522.

Gosselin, P. (1995) The development of the recognition of facial expressions of emotion in children. Can. J. Behav. Sci., 27, 107–119.

Gredeback, G., Johnson, S. & von Hofsten, C. (2010) Eye tracking in infancy research. Dev. Neuropsychol., 35, 1–19.

Grispan, D., Hemphill, A. & Nowicki, S.J. (2003) Improving the ability of elementary school-age children to identify emotion in facial expression. J. Genet. Psychol., 164, 88–100.

Gunther Moor, B., Op de Macks, Z.A., Güroğlu, B., Rombouts, S.A., Van der Molen, M.W. & Crone, E.A. (2012) Neurodevelopmental changes of reading the mind in the eyes. Soc. Cogn. Affect. Neur., 7, 44–52.

de Haan, M., Pascalis, O. & Johnson, M.H. (2002) Specialization of neural mechanisms underlying face recognition in human infants. J. Cognitive Neurosci., 14, 199–209.

de Haan, M., Belsky, J., Reid, V., Volein, A. & Johnson, M.H. (2004) Maternal personality and infants' neural and visual responsivity to facial expressions of emotion. J. Child Psychol. Psyc., 45, 1209–1218.

Haist, F., Adamo, M., Han, J., Lee, K. & Stiles, J. (2013) The functional architecture for face-processing expertise: fMRI evidence of the developmental trajectory of the core and the extended face systems. Neuropsychologia, 51, 2893–2908.

Haith, M.M., Bergman, T. & Moore, M.J. (1977) Eye contact and face scanning in early infancy. Science, 198, 853–855.

Halit, H., de Haan, M. & Johnson, M.H. (2000) Modulation of event-related potentials by prototypical and atypical faces. NeuroReport, 11, 1871–1875.

Halit, H., de Haan, M. & Johnson, M.H. (2003) Cortical specialisation for face processing: face-sensitive event-related potential components in 3- and 12-month-old infants. NeuroImage, 19, 1180–1193.

Henderson, J.M., Falk, R.J., Minut, S., Dyer, F.C. & Mahadevan, S. (2001) Gaze control for face learning and recognition in humans and machines. In Shipley, T. & Kellman, P. (Eds), From Fragments to Objects: Segmentation Processes in Vision. Elsevier, New York, pp. 463–481.

Henderson, J.M., Williams, C.C. & Falk, R.J. (2005) Eye movements are functional during face learning. Mem. Cognition, 33, 98–106.

Hernandez, N., Metzger, A., Magné, R., Bonnet-Brilhault, F., Roux, S., Barthelemy, C. & Martineau, J. (2009) Exploration of core features of a human face by healthy and autistic adults analysed by visual scanning. Neuropsychologia, 47, 1004–1012.

Hershler, O. & Hochstein, S. (2005) At first sight: a high-level pop out effect for faces. Vision Res., 45, 1707–1724.

Hileman, C.M., Henderson, H., Mundy, P., Newell, L. & Jaime, M. (2011) Developmental and individual differences on the P1 and N170 ERP components in children with and without autism. Dev. Neuropsychol., 36, 214–236.

Hills, P.J., Sullivan, A.J. & Pake, J.M. (2012) Aberrant first fixations when looking at inverted faces in various poses: the result of the centre-of-gravity effect? Brit. J. Psychol., 103, 520–538.

Hills, P.J., Cooper, R.E. & Pake, J.M. (2013) First fixations in face processing: the more diagnostic they are the smaller the face-inversion effect. Acta Psychol. (Amst.), 142, 211–219.

Hoehl, S. & Peykarjou, S. (2012) The early development of face processing – what makes faces special? Neurosci. Bull., 28, 765–788.

Hole, G.J. (1994) Configurational factors in the perception of unfamiliar faces. Perception, 23, 65–74.

Horley, K., Williams, L.M., Gonsalvez, C. & Gordon, E. (2003) Social phobics do not see eye to eye: a visual scanpath study of emotional expression processing. J. Anxiety Disord., 17, 33–44.

Hunnius, S. & Geuze, R.H. (2004) Developmental changes in visual scanning of dynamic faces and abstract stimuli in infants: a longitudinal study. Infancy, 6, 231–255.

Huttenlocher, P.R. (1990) Morphometric study of human cerebral cortex development. Neuropsychologia, 28, 517–527.

Itier, R.J. & Batty, M. (2009) Neural bases of eye and gaze processing: the core of social cognition. Neurosci. Biobehav. R., 33, 843–863.

Itier, R.J. & Taylor, M.J. (2002) Inversion and contrast polarity reversal affect both encoding and recognition processes of unfamiliar faces: a repetition study using ERPs. NeuroImage, 15, 353–372.

Itier, R.J. & Taylor, M.J. (2004a) Effects of repetition learning on upright, inverted and contrast-reversed face processing using ERPs. NeuroImage, 21, 1518–1532.

Itier, R.J. & Taylor, M.J. (2004b) Face recognition memory and configural processing: a developmental ERP study using upright, inverted, and contrast-reversed faces. J. Cognitive Neurosci., 16, 487–502.

Itier, R.J., Taylor, M.J. & Lobaugh, N.J. (2004) Spatiotemporal analysis of event-related potentials to upright, inverted, and contrast-reversed faces: effects on encoding and recognition. Psychophysiology, 41, 643–653.

Itier, R.J., Latinus, M. & Taylor, M.J. (2006) Face, eye and object early processing: what is the face specificity? NeuroImage, 29, 667–676.

Joseph, J.E., Gathers, A.D. & Bhatt, R.S. (2011) Progressive and regressive developmental changes in neural substrates for face processing: testing specific predictions of the Interactive Specialization account. Developmental Sci., 14, 227–241.

Joyce, C. & Rossion, B. (2005) The face-sensitive N170 and VPP components manifest the same brain processes: the effect of reference electrode site. Clin. Neurophysiol., 116, 2613–2631.

Kelly, D.J., Liu, S., Rodger, H., Miellet, S., Ge, L. & Caldara, R. (2011) Developing cultural differences in face processing. Developmental Sci., 14, 1176–1184.


Kotsoni, E., Csibra, G., Mareschal, D. & Johnson, M.H. (2007) Electrophysiological correlates of common-onset visual masking. Neuropsychologia, 45, 2285–2293.

Kovacs, G., Zimmer, M., Banko, E., Harza, I., Antal, A. & Vidnyanszky, Z. (2006) Electrophysiological correlates of visual adaptation to faces and body parts in humans. Cereb. Cortex, 16, 742–753.

Kuefner, D., de Heering, A., Jacques, C., Palmero-Soler, E. & Rossion, B. (2010) Early visually evoked electrophysiological responses over the human brain (P1, N170) show stable patterns of face-sensitivity from 4 years to adulthood. Front. Hum. Neurosci., 3, 67.

Latinus, M. & Taylor, M.J. (2005) Holistic processing of faces: learning effects with Mooney faces. J. Cognitive Neurosci., 17, 1316–1327.

Latinus, M. & Taylor, M.J. (2006) Face processing stages: impact of difficulty and the separation of effects. Brain Res., 1123, 179–187.

Leder, H. & Bruce, V. (2000) When inverted faces are recognized: the role of configural information in face recognition. Q. J. Exp. Psychol. A, 53, 513–536.

Lee, K., Quinn, P.C., Pascalis, O. & Slater, A. (2013) Development of face-processing ability in childhood. In Zelazo, P.D. (Ed.), The Oxford Handbook of Developmental Psychology, Vol. 1: Body and Mind. Oxford University Press, New York, pp. 338–370.

Leppanen, J.M. & Hietanen, J.K. (2001) Emotion recognition and social adjustment in school-aged girls and boys. Scand. J. Psychol., 42, 429–435.

Lewkowicz, D.J. & Hansen-Tift, A.M. (2012) Infants deploy selective attention to the mouth of a talking face when learning speech. Proc. Natl. Acad. Sci. USA, 109, 1431–1436.

Linkenkaer-Hansen, K., Palva, J.M., Sams, M., Hietanen, J.K., Aronen, H.J. & Ilmoniemi, R.J. (1998) Face-selective processing in human extrastriate cortex around 120 ms after stimulus onset revealed by magneto- and electroencephalography. Neurosci. Lett., 253, 147–150.

Maurer, D. & Salapatek, P. (1976) Developmental changes in the scanning of faces by young infants. Child Dev., 47, 523–527.

McClure, E.B. (2000) A meta-analytic review of sex differences in facial expression processing and their development in infants, children, and adolescents. Psychol. Bull., 126, 424–453.

McKelvie, S.J. (1976) The role of eyes and mouth in the memory of a face. Am. J. Psychol., 89, 311–323.

McKone, E., Kanwisher, N. & Duchaine, B.C. (2007) Can generic expertise explain special processing for faces? Trends Cogn. Sci., 11, 8–15.

Mercado, F., Carretie, L., Tapia, M. & Gomez-Jarabo, G. (2006) The influence of emotional context on attention in anxious subjects: neurophysiological correlates. J. Anxiety Disord., 20, 72–84.

Nelson, C.A. (1997) Electrophysiological correlates of memory development in the first year of life. In Reese, H.W. & Franzen, M.D. (Eds), Biological and Neuropsychological Mechanisms. Erlbaum, Mahwah, NJ, pp. 95–131.

Oakes, L.M. & Ellis, A.E. (2013) An eye-tracking investigation of developmental changes in infants' exploration of upright and inverted human faces. Infancy, 18, 134–148.

Oostenveld, R. & Praamstra, P. (2001) The five percent electrode system for high-resolution EEG and ERP measurements. Clin. Neurophysiol., 112, 713–719.

Orban de Xivry, J.J., Ramon, M., Lefevre, P. & Rossion, B. (2008) Reduced fixation on the upper area of personally familiar faces following acquired prosopagnosia. J. Neuropsychol., 2(Pt 1), 245–268.

Parish-Morris, J., Chevallier, C., Tonge, N., Letzen, J., Pandey, J. & Schultz, R.T. (2013) Visual attention to dynamic faces and objects is linked to face processing skills: a combined study of children with autism and controls. Front. Psychol., 4, 185.

Pascalis, O., de Vivies, X.D., Anzures, G., Quinn, P.C., Slater, A.M., Tanaka, J.W. & Lee, K. (2011) Development of face processing. Wiley Interdiscip. Rev. Cogn. Sci., 2, 666–675.

Passarotti, A.M., Smith, J., DeLano, M. & Huang, J. (2007) Developmental differences in the neural bases of the face inversion effect show progressive tuning of face-selective regions to the upright orientation. NeuroImage, 34, 1708–1722.

Pastò, L. & Burack, J.A. (1997) A developmental study of visual attention: issues of filtering efficiency and focus. Cognitive Dev., 12, 523–535.

Perlman, S.B., Hudac, C.M., Pegors, T., Minshew, N.J. & Pelphrey, K.A. (2011) Experimental manipulation of face-evoked activity in the fusiform gyrus of individuals with autism. Soc. Neurosci., 6, 22–30.

Perron-Borelli, M. (1996) Echelle Différentielle d'Efficience Intellectuelle. Forme révisée. Editions et Applications Psychologiques (E.A.P.), Issy-les-Moulineaux.

Peters, J.C., Vlamings, P. & Kemner, C. (2013) Neural processing of high and low spatial frequency information in faces changes across development: qualitative changes in face processing during adolescence. Eur. J. Neurosci., 37, 1448–1457.

Picton, T.W., Bentin, S., Berg, P., Donchin, E., Hillyard, S.A., Johnson, R.Jr., Miller, G.A., Ritter, W., Ruchkin, D.S., Rugg, M.D. & Taylor, M.J.(2000) Guidelines for using human event-related potentials to study cogni-tion: recording standards and publication criteria. Psychophysiology, 37,127–152.

Quinn, P.C., Doran, M.M., Reiss, J.E. & Hoffman, J.E. (2010) Neural mark-ers of subordinate-level categorization in 6- to 7-month-old infants. Devel-opmental Sci., 13, 499–507.

Rossignol, M., Philippot, P., Bissot, C., Rigoulot, S. & Campanella, S.(2012) Electrophysiological correlates of enhanced perceptual processesand attentional capture by emotional faces in social anxiety. Brain Res.,1460, 50–62.

Rossignol, M., Campanella, S., Bissot, C. & Philippot, P. (2013) Fear ofnegative evaluation and attentional bias for facial expressions: an event-related study. Brain Cognition, 82, 344–352.

Rossion, B. & Jacques, C. (2008) Does physical interstimulus varianceaccount for early electrophysiological face sensitive responses in thehuman brain? Ten lessons on the N170. NeuroImage, 39, 1959–1979.

Rossion, B., Gauthier, I., Goffaux, V., Tarr, M.J. & Crommelinck, M. (2002)Expertise training with novel objects leads to left-lateralized facelike elec-trophysiological responses. Psychol. Sci., 13, 250–257.

Salapatek, P. & Kessen, W. (1966) Visual scanning of triangles by thehuman newborn. J. Exp. Child Psychol., 3, 155–167.

Scheller, E., Buchel, C. & Gamer, M. (2012) Diagnostic features ofemotional expressions are processed preferentially. PLoS One, 7, e41792.

Scherf, K.S., Behrmann, M., Humphreys, K. & Luna, B. (2007) Visual cate-gory-selectivity for faces, places and objects emerges along different devel-opmental trajectories. Developmental Sci., 10, F15–F30.

Schupp, H.T., Junghofer, M., Weike, A.I. & Hamm, A.O. (2003) Attentionand emotion: an ERP analysis of facilitated emotional stimulus processing.NeuroReport, 14, 1107–1110.

Schupp, H.T., Ohman, A., Junghofer, M., Weike, A.I., Stockburger, J. &Hamm, A.O. (2004) The facilitated processing of threatening faces: anERP analysis. Emotion, 4, 189–200.

Schutz, A.C., Braun, D.I. & Gegenfurtner, K.R. (2011) Eye movements andperception: a selective review. J. Vision, 11, 1–30.

Schyns, P.G., Bonnard, L. & Gosselin, F. (2002) Show me the features!Understanding recognition from the use of visual information. Psychol.Sci., 13, 402–409.

Sekuler, A.B., Gaspar, C.M., Gold, J.M. & Bennett, P.J. (2004) Inversionleads to quantitative, not qualitative, changes in face processing. Curr.Biol., 14, 391–396.

Senju, A. & Csibra, G. (2008) Gaze following in human infants depends oncommunicative signals. Curr. Biol., 18, 668–671.

Senju, A., Vernetti, A., Kikuchi, Y., Akechi, H. & Hasegawa, T. (2013)Cultural modulation of face and gaze scanning in young children. PLoSOne, 8, e74017.

Skoczenski, A.M. & Norcia, A.M. (2002) Late maturation of visual hypera-cuity. Psychol. Sci., 13, 537–541.

Smith, M.L., Cottrell, G.W., Gosselin, F. & Schyns, P.G. (2005) Transmit-ting and decoding facial expressions. Psychol. Sci., 16, 184–189.

Tanaka, J.W. & Farah, M.J. (1993) Parts and wholes in face recognition. Q.J. Exp. Psychol. A, 46, 225–245.

Taylor, M.J. (2002) Non-spatial attentional effects on P1. Clin. Neurophysi-ol., 113, 1903–1908.

Taylor, M.J. & Pang, E.W. (1999) Developmental changes in early cognitiveprocesses. Eeg. Cl. N. Su., 49, 145–153.

Taylor, M.J., Edmonds, G.E., McCarthy, G. & Allison, T. (2001) Eyes first!Eye processing develops before face processing in children. NeuroReport,12, 1671–1676.

Taylor, M.J., Batty, M. & Itier, R.J. (2004) The faces of development: a review of early face processing over childhood. J. Cognitive Neurosci., 16, 1426–1442.

Taylor, M.J., Mills, T. & Pang, E.W. (2011) The development of face recognition; hippocampal and frontal lobe contributions determined with MEG. Brain Topogr., 24, 261–270.

Tenenbaum, E.J., Shah, R.J., Sobel, D.M., Malle, B.F. & Morgan, J.L. (2013) Increased focus on the mouth among infants in the first year of life: a longitudinal eye-tracking study. Infancy, 18, 534–553.

Theeuwes, J. & Van der Stigchel, S. (2006) Faces capture attention: evidence from inhibition of return. Vis. Cogn., 13, 657–665.

Vinette, C., Gosselin, F. & Schyns, P. (2004) Spatiotemporal dynamics of face recognition in a flash: it’s in the eyes. Cognitive Sci., 28, 289–301.

© 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd. European Journal of Neuroscience, 1–14
Vlamings, P.H., Jonkman, L.M. & Kemner, C. (2010) An eye for detail: an event-related potential study of the rapid processing of fearful facial expressions in children. Child Dev., 81, 1304–1319.

Walker-Smith, G.J., Gale, A.G. & Findlay, J.M. (1977) Eye movement strategies involved in face perception. Perception, 6, 313–326.

Want, S.C., Pascalis, O., Coleman, M. & Blades, M. (2003) Face facts: is the development of face recognition in early and middle childhood really so special? In Pascalis, O. (Ed.), The Development of Face Processing in Infancy and Early Childhood. Nova Science Publishers, New York, pp. 207–221.

Wechsler, D. (2004) The Wechsler Intelligence Scale for Children, 4th Edn. Pearson Assessment, London.

Wheeler, A., Anzures, G., Quinn, P.C., Pascalis, O., Omrin, D.S. & Lee, K. (2011) Caucasian infants scan own- and other-race faces differently. PLoS One, 6, e18621.

Whitford, T.J., Rennie, C.J., Grieve, S.M., Clark, C.R., Gordon, E. & Williams, L.M. (2007) Brain maturation in adolescence: concurrent changes in neuroanatomy and neurophysiology. Hum. Brain Mapp., 28, 228–237.

Yarbus, A.L. (1961) Eye movements during the examination of complicated objects. Biofizika, 6, 52–56.

Zurcher, N.R., Donnelly, N., Rogier, O., Russo, B., Hippolyte, L., Hadwin, J., Lemonnier, E. & Hadjikhani, N. (2013) It’s all in the eyes: subcortical and cortical activation during grotesqueness perception in autism. PLoS One, 8, e54313.