
HAL Id: hal-01754789
https://hal.archives-ouvertes.fr/hal-01754789

Submitted on 30 Mar 2018

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.


The Neuroergonomics of Aircraft Cockpits: The Four Stages of Eye-Tracking Integration to Enhance Flight Safety

Vsevolod Peysakhovich, Olivier Lefrançois, Frédéric Dehais, Mickaël Causse

To cite this version: Vsevolod Peysakhovich, Olivier Lefrançois, Frédéric Dehais, Mickaël Causse. The Neuroergonomics of Aircraft Cockpits: The Four Stages of Eye-Tracking Integration to Enhance Flight Safety. Safety, MDPI, 2018, 4 (1), pp. 8. doi:10.3390/safety4010008. hal-01754789


This is a publisher's version, available at: https://oatao.univ-toulouse.fr/19614

Official URL: http://dx.doi.org/10.3390/safety4010008

Peysakhovich, Vsevolod and Lefrancois, Olivier and Dehais, Frédéric and Causse, Mickaël. The Neuroergonomics of Aircraft Cockpits: The Four Stages of Eye-Tracking Integration to Enhance Flight Safety. (2018) Safety, vol. 4 (n° 1). pp. 1-15. ISSN 2313-576X

safety

Article

The Neuroergonomics of Aircraft Cockpits: The Four Stages of Eye-Tracking Integration to Enhance Flight Safety

Vsevolod Peysakhovich *, Olivier Lefrançois, Frédéric Dehais and Mickaël Causse

Département Conception et Conduite des véhicules Aéronautiques et Spatiaux (DCAS), ISAE-SUPAERO, Université de Toulouse, 10 Avenue Edouard Belin, 31055 Toulouse CEDEX 4, France; [email protected] (O.L.); [email protected] (F.D.); [email protected] (M.C.)
* Correspondence: [email protected]; Tel.: +33-(0)5-61-33-87-46

Received: 2 November 2017; Accepted: 20 February 2018; Published: 27 February 2018

Abstract: Commercial aviation is currently one of the safest modes of transportation; however, human error is still one major contributing cause of aeronautical accidents and incidents. One promising avenue to further enhance flight safety is Neuroergonomics, an approach at the intersection of neuroscience, cognitive engineering and human factors, which aims to create better human–system interaction. Eye-tracking technology allows users to “monitor the monitoring” by providing insights into both pilots’ attentional distribution and underlying decisional processes. In this position paper, we identify and define a framework of four stages for the step-by-step integration of eye-tracking systems in modern cockpits. Stage I concerns Pilot Training and Flight Performance Analysis on the ground; Stage II proposes On-board Gaze Recordings as extra data for the “black box” recorders; Stage III describes Gaze-Based Flight Deck Adaptation, including warning and alerting systems; and, eventually, Stage IV envisions Gaze-Based Aircraft Adaptation, including authority taking by the aircraft. We illustrate the potential of these four steps with descriptions of incidents or accidents that eye-tracking could arguably have helped avoid. Estimated milestones for the integration of each stage are also proposed, together with a list of some implementation limitations. We believe that research institutions and industrial actors in the domain will all benefit from the integration of eye-tracking systems into cockpits along this framework.

Keywords: eye-tracking; eye movements; assistive technologies; human factors; neuroergonomics; aircraft cockpit

1. Introduction

Commercial aviation is nowadays one of the safest modes of transportation, with less than one accident per million departures [1]. This relatively low accident rate, which was 20–30 times higher in the 1960s, is mostly due to the technological progress of aeronautical systems, but also to improvements in pilot training, flight crew and air traffic control procedures. Technical failure today is the cause of only about 10% of accidents, leaving a significant percentage attributable to human factors. The exact values vary over years and sources; as illustrated in Figure 1, approximately 60 to 80 percent of aviation accidents involve human error [2]. In particular, today’s low rate of overall accidents is due to the introduction of automation [3]. However, automation has also shifted the role of the crew from direct (manual) controllers to supervisors. Unfortunately, automation is not always fully understood nor monitored correctly [4], and this can generate complacency [5]. This phenomenon can promote the failure of the crew to monitor flight instruments properly because of over-reliance on automation. Thus, adequate, active visual monitoring of the flight parameters in the cockpit is an essential piloting skill and has become one of the most critical issues for flight safety [6].



Figure 1. Aeronautical accidents from 1970 to 2014 with human factors, technical failure and weather conditions as contributory factors. Data retrieved from Bureau of Aircraft Accidents Archives (www.baaa-acro.com).

The supervision of the flight deck is performed by two pilots, namely the Pilot Flying (PF) and the Pilot Monitoring (PM), whose roles are defined by standard operating procedures [7]. The PF is responsible for controlling the flight (e.g., trajectory and energy) and gives orders to the PM. He/she fulfills these responsibilities either by supervising the auto-pilot and the auto-thrust when engaged, or by hand-flying the aircraft. The PM is responsible for systems-related tasks and the monitoring of the flight instruments. He/she also performs actions requested by the PF, such as radio communications, systems configuration and automatic mode selections. Finally, the PM is responsible for monitoring the PF to cross-check his/her actions, to point out any deviation of the flight parameters and to back up if necessary.

However, maintaining excellent monitoring performance is not easy. As Casner and Schooler [8] showed in their recent study of airline pilots in a high-fidelity simulator, when pilots are free to monitor, i.e., there is no particular defined task, monitoring lapses often occur. In this sense, the National Transportation Safety Board (NTSB), in its review of major flight-crew-involved accidents between 1978 and 1990 [9], found that inadequate monitoring and cross-checking had occurred in more than 80% of accidents. However, even though pilots are nowadays aware of the critical importance of adequate monitoring, accidents with monitoring lapses as contributing factors still occur. One recent example was the Asiana Airlines flight 214 accident in 2013. A change in the autopilot mode (which resulted in the deactivation of the automatic airspeed control) went unnoticed by the crew, as did the subsequent drastic reduction of the speed. This lack of monitoring led to a destabilized approach, with a glide path below acceptable limits and an excessive descent rate. The landing gear finally struck a seawall at San Francisco International Airport [10]. Three passengers were fatally injured and 49 other people received serious injuries; the airplane was destroyed by the impact and the subsequent fire.

Recently, the Federal Aviation Administration (FAA) published a final training rule that requires enhanced pilot monitoring training to be incorporated into existing air carriers’ training programs [11], with a compliance date of March 2019. Also, the Bureau d’Enquêtes et d’Analyses (BEA), the French investigation agency, recommended studying pilots’ monitoring with eye-tracking to improve piloting procedures [12]. In a recent survey [13], 75% of pilots answered that the publication of detailed information regarding the required visual patterns for the different phases of flight (for example, take-off, approach) could help them to enhance their monitoring skills.

Consequently, the great challenge is to improve the pilot–aircraft interaction by considering the complex attentional and cognitive processes underlying piloting. Neuroergonomics [14] proposes using physiological tools to find valid and robust measures of human behavior and cognitive processes, such as attention. For example, electroencephalography [15], electrocardiography [16] or functional near-infrared spectroscopy [17] are often used in flight/drive simulators to study cognitive and attentional states. However, none of these techniques can properly take measurements remotely, without direct contact with the human body. Meanwhile, the remote character of such a measurement device is one of the essential criteria for measuring human attentional and cognitive behavior inside the cockpit, especially during emergency situations. To highlight the importance of this point, let us take the example of the tragic, well-known flight AF-447 [18] from Rio de Janeiro to Paris that, in 2009, after an aerodynamic stall, crashed into the Atlantic Ocean, taking 228 lives. As the captain interrupted his in-flight rest and returned in an emergency to the cockpit, it is difficult to imagine him calmly putting on a wearable device and proceeding to a calibration while vital decisions were to be made. It is worth noting that there exist signal processing techniques that allow heart rate or body temperature to be estimated remotely from video frames. Nevertheless, these data are difficult to interpret and, in particular, it is challenging to disentangle emotional reactions from cognitive activity. There exist wireless alternatives, such as watches or smart shirts, that can monitor these parameters. However, wireless data transfer may create cybersecurity vulnerabilities and is therefore not recommended for transferring information relevant to flight safety.

Since eye-tracking devices can be directly embedded in the cockpit panel, one promising avenue is to consider this technology. Researchers have been publishing results of eye movement studies for over 60 years. However, the market for commercial eye trackers has only become widely accessible during the last decade, and such devices today are non-invasive (they can be used remotely, without direct contact with the body), relatively inexpensive, reliable and, importantly, eye-tracking data can be available in near real time. All these advantages make the integration of eye trackers into modern flight decks realistic.

Because “high definition” images are restricted to a small retinal region called the fovea, humans need to constantly shift their eyes from one area to another to explore and monitor their environment. The raw data samples from an eye tracker can serve to detect basic oculomotor events [19] such as fixations, during which we acquire useful visual information, and saccades, which serve to change the location of the visual focus. These events allow the monitoring of ocular behavior and give insights into the perceptual and cognitive processes [20] underlying reading, free viewing and decision-making (see, for example, [21,22]).
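To make the notion of oculomotor event detection concrete, the following minimal Python sketch shows one common way of segmenting raw gaze samples into fixations, a dispersion-threshold (I-DT) algorithm. The thresholds, sample format and function name are illustrative assumptions for this example, not values taken from the paper or from any particular eye tracker.

```python
# Minimal dispersion-threshold (I-DT) fixation detection sketch.
# Thresholds and the sample layout are illustrative assumptions only.

def detect_fixations(samples, max_dispersion=1.0, min_duration=0.100):
    """samples: list of (t, x, y) tuples, gaze position in degrees of visual angle.
    Returns fixations as (t_start, t_end, mean_x, mean_y) tuples."""
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window until it spans at least min_duration.
        while j < len(samples) and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= len(samples):
            break
        xs = [s[1] for s in samples[i:j]]
        ys = [s[2] for s in samples[i:j]]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= max_dispersion:
            # Extend the window while the dispersion stays below threshold.
            while j < len(samples):
                xs.append(samples[j][1]); ys.append(samples[j][2])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                    xs.pop(); ys.pop()
                    break
                j += 1
            fixations.append((samples[i][0], samples[j - 1][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j
        else:
            i += 1  # No fixation starts here; intervening samples belong to saccades.
    return fixations
```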

The use of eye-tracking in aeronautics is far from new. Starting with the pioneering studies by Fitts and colleagues around 1950, conducted in a flight simulator [23–25], many researchers have performed experiments on pilots equipped with an eye-tracker system. In this work, based on data from forty pilots, Fitts concluded that different flight instruments were checked with variable frequencies and required different fixation lengths. He also found that more experienced pilots tended to make shorter fixations. Researchers have reproduced these results since then, but it was the first time that the recording of eye movements was demonstrated to be a valuable method of evaluating pilots’ ocular behavior. At the time, researchers were confronted by the complexity of the recording set-up and elaborate manual analysis procedures. Nowadays, on the contrary, an eye-tracker system can easily be embedded in a flight simulator or even a real aircraft. Thus, eye-tracking has already been successfully used to conduct high-fidelity flight simulator experiments [26–28] and real flight experiments [29,30].

These recent and promising advances in eye-tracking demonstrate the need for conceptualizing its integration into the cockpit. In the following section, we identify and define a framework of four stages of eye-tracking integration in modern cockpits. These four stages are (I) Pilot Training, (II) On-board Gaze Recording, (III) Gaze-Based Flight Deck Adaptation and (IV) Gaze-Based Aircraft Adaptation. We support each stage with a description of relevant incidents or accidents and explain in what way eye-tracking integration can enhance flight safety. An estimated milestone for the integration of each stage is also proposed, together with a list of some implementation limitations.

2. Four Stages of Eye-Tracking Integration

In this section, we identify and define four stages of eye-tracking technology integration in the piloting activity. The first stage relates to pilot training on the ground, whereas the other three stages correspond to operational flight situations. We highlight the interest of eye-tracking in each of the four stages (with examples of aeronautical incidents/accidents for stages III and IV), give an estimated integration date milestone in years, and consider some implementation and use limitations. Figure 2 presents a flowchart of this eye-tracking integration, showing the positioning of each of the four stages.


Figure 2. Flowchart of the eye-tracking integration into modern cockpits, including all four stages of the framework. The eye and flight data are recorded (Stage II) and proceed to form a visual behavior database. This database can be used to enhance pilots’ training (Stage I) and to check the consistency of the visual behavior according to the flight context. If an inconsistency is detected, the flight deck (Stage III) or the aircraft systems (Stage IV) can be adapted. Blue rectangles correspond to existing elements. Green indicates the elements that are to be integrated.

2.1. Stage I: Pilot Training/Flight Performance Analysis

In line with the recent training rule published by FAA in 2013 [11], which requires airlines to include a specific training program to improve monitoring by March 2019, one major axis of flight safety improvement is pilot training. In stage I of pilot training and flight performance analysis, eye-tracking data can be exploited in three different ways: as a flight expertise estimate, for incident debriefing and for example-based learning.

First of all, the eye-tracking data can be used to obtain an estimate of pilots’ monitoring (flight) skills. In their review of eye-tracking research in various domains (such as radiology, car driving, sports, and chess), Gegenfurtner and his colleagues [31] concluded that experts (compared to less experienced professionals) have, in general, shorter fixation durations during the comprehension of visualizations. This echoes Fitts’ earlier conclusion about experienced pilots and their shorter fixation durations [23]. In the helicopter domain, during a simulated overland navigation task, Sullivan et al. [32] estimated the effect of expertise on scan management skill. Their model predicted that, on average, for every additional thousand flight hours, the median dwell time will decrease by 28 ms, and the number of transitions between zones of interest (out-the-window and the navigational map) will significantly increase. In their review of the literature on eye movements in medicine and chess, Reingold and Sheridan [33] called this the “superior perceptual encoding” of domain-related patterns. However, this perceptual advantage is not the only thing gained with expertise. It is closely related to a global processing advantage and knowledge of relevant visual patterns. Thus, Gegenfurtner et al. [31] found, for example, that experts fixate more on task-relevant areas and spend less time fixating on task-irrelevant information. This visual strategy of “information-reduction” [34], which helps to optimize visual information processing by separating task-relevant from task-irrelevant information, was also found to differentiate expert from novice pilots in aviation [35,36]. For example, Schriver and colleagues [37] used eye-tracking to compare the distribution of visual attention between experienced and novice pilots during problem diagnosis in a flight simulator. They found, in particular, that faster correct decisions by experts during trouble diagnosis were accompanied by correct attending to the cues relevant to that failure. Van Meeuwen et al. [38] compared the visual strategies of air traffic controllers with different levels of expertise and found that experts had more effective strategies and that, furthermore, scan paths were more similar between expert controllers than between novices. More recently, a study of both PF and PM eye movements during the approach revealed that the PM’s attentional allocation was not optimal, especially during the short final: it corresponded to a high percentage of dwell time out of the window to the detriment of the processing of flight parameters such as the energy of the aircraft [39]. Similar measurements performed by Lefrançois and colleagues [13] showed that the gaze allocation of pilots who failed to stabilize the approach was not optimal compared to that of pilots who succeeded. These results suggest that there is an optimal visual scan path for a given visual problem that can be acquired with years of expertise, but that can also be learned with eye-movement educational examples.

Secondly, recorded movie clips where the eye gaze position is superimposed on the visual scene could be used in training programs for example-based learning. Such eye movement modeling examples have already been successfully applied to cooperative problem-solving [40] and clinical visual observation [41,42]. Gaze following is a natural and innate learning tool [43], including for causal learning. It allows the learner to discover what information is relevant and how to guide their attention for a given goal by following the expert’s gaze. Jarodzka and colleagues [44] argued that attentional guidance by gaze following is not only effective at improving current performance, but also at fostering learning. Earlier, Nalanagula and colleagues [45] also suggested that gaze following improves training for visual inspection tasks. Recently, Leff and colleagues [46] showed that trainees’ performance is enhanced when experts’ gaze guides their visual attention in a surgery context.

Finally, movie clips replaying pilots’ eye movements could be used for subsequent analysis and debriefing. Pilots often request the recordings of flight parameters after a non-routine event to better understand what happened during an incident and how to prevent it in the future. It is impossible (and unnecessary) to remember all the eye movements we perform. Moreover, the majority of fixation patterns are automatic and unconscious. Therefore, a replay of the attentional behavior during a simulated session, or a real flight (in the case of later integration of eye-tracking systems), would allow attentional errors by pilots to be pointed out, such as an excessive focus on a particular flight parameter or, on the contrary, its disregard. Let us note, however, that a user-friendly and synthetic representation of eye movement recordings is not straightforward [47], and that further research is needed to provide intuitive tools to support the analysis and review of eye-tracking data. Thorough research is also needed to provide evidence that these eye movement examples can effectively foster pilots’ learning.

To conclude this stage I, we suggest that eye-tracking technology is fairly useful for pilots’ training on the ground because (1) we can debrief their behavior by studying gaze patterns; (2) visual scan paths can be learned by gaze following; and (3) monitoring skills can be estimated using eye-tracking data. The present stage can already be integrated and adopted by airline companies around the world by using the relatively inexpensive head-mounted eye trackers available on the market today or by incorporating a remote eye-tracker system into their training flight simulators.
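As a concrete illustration of the kind of debriefing statistics this stage relies on (see also Table 1 below), the following minimal Python sketch computes dwell-time percentages per area of interest after a session. The area-of-interest names and the fixation record layout are hypothetical, chosen only for the example.

```python
from collections import defaultdict

# Post-session dwell-time summary for debriefing (Stage I) - illustrative only.

def dwell_percentages(fixations):
    """fixations: list of (duration_s, aoi) pairs, where aoi is the instrument
    the fixation was mapped to (e.g., 'PFD', 'ND', 'FMA', 'OTW').
    Returns {aoi: percentage of total dwell time}."""
    totals = defaultdict(float)
    for duration, aoi in fixations:
        totals[aoi] += duration
    grand_total = sum(totals.values()) or 1.0  # avoid division by zero
    return {aoi: 100.0 * t / grand_total for aoi, t in totals.items()}

# Hypothetical debrief output for a short approach segment:
session = [(0.45, "PFD"), (0.30, "OTW"), (0.25, "PFD"), (0.60, "ND"), (0.20, "FMA")]
for aoi, pct in sorted(dwell_percentages(session).items(), key=lambda kv: -kv[1]):
    print(f"{aoi}: {pct:.1f}% of dwell time")
```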

2.2. Stage II: On-Board Gaze Recording

Eye-tracking has potential not only on the ground, for pilot training and flight skills analysis, but also in operational settings. The first step of operational use is the recording of the crew’s points of gaze to facilitate incident/accident investigations. All modern aircraft are equipped with a black box comprising a cockpit voice recorder and a flight data recorder, which record the information required for crash investigation purposes. Obviously, pilots do not verbalize all their actions. Therefore, flight parameters and voice recordings are sometimes simply not enough to accurately reconstruct the course of events. Eye-tracking technology can enhance these devices by also recording gaze data. First, the tracking of pilots’ gaze is useful in the aftermath of an accident to verify which flight instruments the pilots looked at. The analysis of these data would help to understand in depth the failures in the human–system interaction (“Did the crew fail to perceive a critical warning? If so, why?”) that led to an incident, and to avoid future repetition of the same circumstances. Secondly, pilots do not look exclusively outside or at the cockpit instrument panel; they also monitor one another’s activity. In particular, they observe one another’s hands when they press a button, move the engine lever or point at a display [48]. Pointing in the airline cockpit is an important part of piloting; it guides one’s own and the other’s visual attention and helps to establish shared situation awareness [49]. Thus, a recording of the pilots’ gaze could help investigation agencies to infer joint awareness and crew coordination. Moreover, these data would be beneficial for cockpit manufacturers, revealing the effectiveness of their user interfaces and leading to design improvements. Similar benefits for the design of operational procedures are expected.

However, although technologically feasible, we recognize that the process of introducing on-board gaze recording might take some time. For example, the Transportation Safety Board of Canada indicates, in a recent reassessment of the responses to aviation safety recommendations concerning cockpit image-recording systems [50], that “since 1999, the NTSB has issued 14 recommendations to the FAA related to the installation of cockpit image-recording systems. To all but a single recommendation, the NTSB has assessed the FAA responses as unacceptable”. The recent NTSB safety recommendation [51] requires that each aircraft be retrofitted with a crash-protected cockpit image recording system that should “be capable of recording, in color, a view of the entire cockpit including each control position and each action (such as display selections or system activations) taken by people in the cockpit.” Let us also note that, contrary to cockpit image-recording systems, this stage does not necessarily interfere with the pilots’ privacy. The gaze recordings are impersonal and should record exclusively a gaze point within an established 3D model of a given cockpit, or the name of the corresponding area of interest chosen from an exhaustive list of all the flight deck instruments and visual displays. Thus, the eye-tracking recording would not entail any privacy infringement (such as the recording of pilots’ faces).

In the future, the cockpit voice recorder should be promoted to a human data recorder by also recording gaze data. These data should comprise two channels to keep the points of gaze of both the pilot flying and the pilot monitoring. Such an application is already possible today because it would not have any safety effects and, therefore, does not have to meet any failure probability requirement. A significant limitation of this stage is that the recordings should provide an accurate testimony of flight events. Therefore, it is mandatory to have an eye-tracking system of excellent quality, high precision and robustness. Such on-board gaze recordings would provide valuable documentation for safety boards to improve incident and accident investigations and may take the form of an additional column (e.g., “currently gazed area”) of data in the flight data recorder. These data can be used by the flight crew or by the airline companies for post-flight debriefing. For example, if a particular event occurs during a flight, the flight crew or companies might want to review the visual circuits at the time of the problem. In any case, as for the training stage, there is a need for a tool that eases data analysis and review. Such volumes of data are difficult to process immediately and further research is needed to handle the large size of eye-tracking datasets. Altogether, we speculate that the early adoption of such gaze recording will emerge within 5 years, the principal requirement being high reliability, precision and accuracy. If such a device necessitates a calibration procedure, the setup should be quick and simple (for example, performed during the preflight checklist).
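For illustration only, the sketch below shows what an impersonal “currently gazed area” channel might look like as an extra column of recorded data, with one channel per crew member. The field names, sampling layout and CSV format are assumptions made for the example, not a proposed or certified recorder format.

```python
from dataclasses import dataclass
import csv

# Illustrative sketch of an impersonal gaze channel for the flight data recorder
# (Stage II). Field names and the CSV layout are assumptions, not a standard.

@dataclass
class GazeSample:
    time_s: float       # time since recording start
    crew_member: str    # "PF" or "PM"
    gazed_area: str     # AOI name from an exhaustive cockpit list, or "NONE"

def append_gaze_channel(samples, path="gaze_channel.csv"):
    """Write gaze samples alongside flight data, one row per sample."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "crew_member", "gazed_area"])
        for s in samples:
            writer.writerow([f"{s.time_s:.3f}", s.crew_member, s.gazed_area])

append_gaze_channel([
    GazeSample(1201.250, "PF", "ATTITUDE_INDICATOR"),
    GazeSample(1201.250, "PM", "FLAP_POSITION_INDICATOR"),
    GazeSample(1201.300, "PF", "FMA"),
])
```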

2.3. Stage III: Gaze-Based Flight Deck Adaptation

Gaze recording is the simplest, but not the only, possible use of eye-tracking technology in operational settings. Numerous studies have successfully used eye-tracking to detect different degraded cognitive states such as spatial disorientation [52], fatigue [53,54], attentional tunneling [55] or automation surprise [26]. Beyond adverse cognitive states, eye-tracking can also be used to infer one’s intentions. Thus, for example, Peysakhovich et al. [56] showed that eye movements and pupil size could be predictive of upcoming decision-making in a simulated maritime environment. In the automobile domain, Zhou et al. [57] proposed a method to infer the intention of a truck driver to change lanes based on eye movements. Ha et al. [58] showed that it is possible to infer the thoughts of a nuclear power plant operator using his or her eye movements. All these results let us imagine an operational support system that assists flight crews using human eye data as an input. Such an enhancement of the flight deck would take the human psycho-physiological state into account using eye-tracking data collected in real time. An aircraft would thus adapt itself explicitly using the information derived from the crew’s gaze. Note, nevertheless, that if such a support tool were considered part of the aircraft system, safety objectives might apply. Thus, if the failure of such a system would have a minor safety effect (a minor safety condition may include a slight reduction in functional capabilities and a small increase in crew workload), then aviation certification imposes an allowable failure probability of no more than 10−3 [59]. As long as modern video-based eye-tracking systems cannot even guarantee eye capture 99.9% of the time (because of luminosity and geometry problems), it will take some time before the eye-tracking technology, and the corresponding market, evolves and achieves sufficient reliability. Moreover, much high-quality research remains to be done to properly detect critical and complex cognitive states. However, eye-tracking can be used in a simpler manner to detect the presence or absence of an eye fixation on relevant information in the cockpit.

To better understand how useful an eye-tracking system can be in operational flight situations, let us consider the following incident. An Air France Boeing 777-200 performing flight AF-471 was about to land at Paris Charles de Gaulle on 16 November 2011 [60]. During the approach, while the aircraft was stabilized on the descent path, a Master Warning alarm indicated that the automatic landing mode had changed, and this was called out by the relief pilot. According to the operator’s instructions, in this case a go-around should be called out and initiated, as was done by the PM. Then, according to the protocol, the PF should handle, in particular, the thrust and the pitch, while the PM should manage the configuration change of the flaps. However, the PF erroneously and unintentionally pushed the auto-throttle disconnection switch instead of the Takeoff/Go-around switch that engages the go-around modes. Therefore, his nose-up inputs were contradictory to the auto-pilot system, which was trying to keep the airplane on the descent trajectory. Meanwhile, the PM fully concentrated his attention on monitoring the retraction of the flaps, a process which takes about ten seconds. The PM’s failure to properly monitor the aircraft’s attitude (orientation) and energy state resulted in a brief loss of the flight path and a late adoption of the go-around pitch. Thankfully, a third relief pilot was on board and made two deviation callouts of “pitch attitude” that led the pilots to apply the nose-up input. In the report’s conclusion, the BEA underlined that “the serious incident was due to the inadequate monitoring of flight parameters by the flight crew” [60]. Indeed, without the relief pilot on board who called out the pitch attitude, such behavior in ground proximity could have resulted in a controlled flight into terrain, the second most recurrent type of accident [1].


One of the reasons for the temporary loss of control was the PM’s excessive focus on the flaps’ position indicator. Thus, he was ignoring the information on the aircraft’s attitude indicator (showing the artificial horizon with bank and pitch angles), the flight mode annunciator (to verify whether the automatic modes were consistent with the actions) and the engine parameters. Both the lack of pitch monitoring and the excessive focus on the flap configuration could be detected with an eye-tracking system: the former would be expressed as an absence of fixation on the attitude indicator, while the latter would induce an excessively long fixation on the flap control. Once such an event is detected, and, in particular, when the aircraft is about to exceed the nominal flight envelope, a counter-measure could be applied to the crew. For instance, a brief retraction of the display of the flap configuration could disengage the PM from the flaps and make him allocate some attention to the energy state and pitch attitude. A discreet, but visible, highlight of these two latter parameters to attract the crew’s attention is also feasible.

A Skippers Aviation de Havilland Dash 8-300 performing a charter flight to Laverton on 17 May 2012 provides another example of an incident with degraded situation awareness as a contributing factor [61]. While the crew was conducting a visual circling approach to the runway, a high descent rate triggered the aircraft’s warning system. Despite the alarm, and although the operator’s procedures stipulated a go-around initiation in this situation, the crew continued their approach. As noted in the safety board report [61], “the crew’s monitoring of the aircraft’s rate of descent and altitude relative to the minimum stabilization height was secondary to their monitoring of their position in relation to the ground. This external focus degraded the crew’s overall situation awareness”. Thus, the crew was focused on descending through a break in the cloud, did not respond correctly to the sink rate alert and continued their approach. As in the previous situation, this unstable approach incident could have been avoided by re-engaging the captain’s and the first officer’s attention on the rate of descent and altitude monitoring. The first necessary step to do so is to detect the pilots’ excessive fixation on the runway environment. Next, a brief counter-measure could be applied to disengage the captain’s and the first officer’s attention from the outside. Such a counter-measure would reorient their attention to the rate of descent and the altitude, which are the parameters necessary to determine whether or not the approach is stabilized.

When eye-tracking systems are reliable enough to capture the crew’s gaze in all conditions (different visual angles, head movements, turbulence, luminance), stage III of gaze-based flight deck adaptation will be relatively easy to implement, at least at some basic level of logic. For example, for each instrument within the cockpit, with knowledge of the system and according to the flight phase (determined by real-time analysis of relevant flight parameters such as position, altitude and speed), one may define the following: (1) a maximum allowed dwell time; and (2) a maximum allowed time without monitoring. Then, if we detect that one of the rules is broken, a counter-measure could be applied. For instance, if the descent rate has not been monitored (with a fixation) for some time by either the Pilot Flying or the Pilot Monitoring during the descent phase, an auditory alert may be played. Alternatively, if we detect an extremely long fixation on the flap configuration, for instance, the information can be retracted for a short period to re-engage the pilot’s attention. Nevertheless, before such an alerting system can be implemented, it is necessary to sort all the possible sources of information in priority order according to each flight phase and aircraft configuration. Indeed, information overload is one of the factors that promote pilots’ inadequate monitoring. Therefore, the system should carefully choose what information should be provided and verify that it is truly essential for flight safety. It is also worth noting that gazing at an instrument does not guarantee that the information is processed, which is often referred to as “looked but failed to see”. However, it is possible to reduce the risk of not seeing the information and to prevent “not looked therefore failed” using eye-tracking [62]. Figure 3 illustrates this simple logic in the form of a flowchart of stage III (a minimal code sketch of these checks is given after the figure). Taking all of this into account, we speculate that the early adoption of such gaze-based support systems may occur within a decade.


Figure 3. An example of a possible flowchart of stage III “Gaze-Based Flight Deck Adaptation”. The red pathway concerns an excessive focus on a particular instrument. The green pathway concerns a lack of instruments’ monitoring. In this example, the visual behavior database consists of a set of rules concerning maximum allowed values (a fixation duration, a time between two fixations) as a function of both instrument and flight context.
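To make the rule-based logic of Figure 3 more tangible, here is a minimal Python sketch of the two checks described above (maximum allowed dwell time and maximum allowed time without monitoring). Instrument names, flight phases and threshold values are illustrative assumptions; a real system would derive them from a validated visual behavior database for each aircraft type and flight phase.

```python
# Illustrative Stage III rule check (see Figure 3). All names and thresholds
# are assumptions for the example, not validated operational values.

RULES = {
    # (flight_phase, instrument): (max_dwell_s, max_unmonitored_s)
    ("APPROACH", "FLAP_POSITION_INDICATOR"): (5.0, None),
    ("APPROACH", "ATTITUDE_INDICATOR"):      (None, 10.0),
    ("APPROACH", "VERTICAL_SPEED"):          (None, 15.0),
}

def check_monitoring(flight_phase, current_dwell, time_since_last_fixation):
    """current_dwell: {instrument: ongoing fixation duration in s}
    time_since_last_fixation: {instrument: seconds since either pilot fixated it}
    Returns (instrument, issue) pairs for which a counter-measure
    (retraction, highlight, auditory alert) could be triggered."""
    issues = []
    for (phase, instrument), (max_dwell, max_gap) in RULES.items():
        if phase != flight_phase:
            continue
        if max_dwell is not None and current_dwell.get(instrument, 0.0) > max_dwell:
            issues.append((instrument, "excessive_focus"))
        if max_gap is not None and time_since_last_fixation.get(instrument, 0.0) > max_gap:
            issues.append((instrument, "not_monitored"))
    return issues

# Example: the PM has fixated the flap indicator for 8 s during the approach,
# and neither pilot has looked at the attitude indicator for 12 s.
print(check_monitoring("APPROACH",
                       {"FLAP_POSITION_INDICATOR": 8.0},
                       {"ATTITUDE_INDICATOR": 12.0, "VERTICAL_SPEED": 3.0}))
```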


2.4. Stage IV: Gaze-Based Aircraft Adaptation

Finally, the most distant step (because of immature technology) of eye-tracking integration in the cockpit is gaze-based aircraft adaptation. To compensate for humans’ limited attentional and cognitive resources, different counter-measures and automation changes can be used and triggered using the pilot’s estimated attentional state. This stage can be combined with the previous stage III to provide the crew with some feedback about aircraft system changes. At stage IV, we propose to allow the automation to change the aircraft’s configuration or to take authority based on the crew’s gaze, and not only to provide visualization enhancement and sonification as at stage III. For example, the automation could temporarily take authority over the crew in the case of incapacity of the human operator [63–65]. Such an authority policy, for instance, already exists in F-16 fighters and will be adopted in the next standard version of Dassault Rafale fighters: the aircraft automatically performs an auto-pull maneuver if the pilot does not react properly to avoid an upcoming controlled flight into terrain. In this example, however, the automatic maneuver is triggered at the last moment to leave more freedom to the pilot. Eye-tracking, on the other hand, can help to anticipate erroneous visual circuits that may precede a degradation of the situation [56]. The important safety issues raised by such pilot–aircraft interaction imply significant constraints on the probability of a failure condition. If such a system change could lead to hull loss and fatal injury of the flight crew, then the maximum allowable probability of the failure condition (classified as “catastrophic” in this case) should be between 10−6 and 10−9, depending on the aircraft class [59].

Let us consider the following two examples to figure out how such gaze-based aircraft adaptation could improve flight safety. A Flybe de Havilland Dash 8-400 was performing flight BE-1794 to Exeter on 11 September 2010 [66]. During the approach, the crew, distracted by an unfamiliar avionics failure, forgot to monitor the flight path and descended below its cleared altitude. As the report states, “both pilots became distracted from the primary roles of flying and monitoring the aircraft and did not notice that “altitude select” and “vertical speed” modes were no longer engaged”. A terrain proximity warning system alerted the crew that the airplane was more than 700 ft below the selected altitude. After a difficult recovery, the aircraft continued to a safe landing.

Another similar incident occurred with Flybe flight BEE247S on approach to Edinburgh Airport on 23 December 2008 [67]. The aircraft descended below the glide slope because the appropriate flight director mode was not selected. An approach controller noticed it before the flight crew did. As contributory factors, the safety board report cited “an absence of appropriate monitoring of the flight path and the FMA (Flight Mode Annunciator)”. As mentioned earlier, this lack of flight parameter surveillance can be detected by an eye-tracking system: it would be expressed as an absence of fixations on the FMA, the descent rate and the altitude for some time. Within integration stage III, an auditory verbal alarm could remind the crew of the required visual controls during the approach to palliate this oversight. In the case of stage IV, the difference is that the automation can take the initiative. For example, in the case of the BE-1794 flight, the captain encountered a failure of an Input–Output Processor and lost some of the primary flight display elements. He tried to regain the indications of the primary flight display by switching the air data computer, which disengaged the “altitude select” and “vertical speed” modes. That did not help to recover the indications and, as the pilot did not verify the FMA status, he was unaware of the disengagement of these modes. One could imagine that, if eye-tracking detected that neither the pilot flying nor the pilot monitoring had verified the FMA, the aircraft would re-engage the previously selected modes automatically (perhaps with some auditory feedback). This authority policy is an example of stage IV eye-tracking integration in the cockpit, where the aircraft takes over from the pilot’s input if degraded situational awareness is detected. Bearing in mind both the present insufficient reliability of eye-tracking systems and the significant consequences of a false detection of degraded situational awareness, we speculate that the early adoption of this stage IV is not to be expected within the next 20 years. Table 1 concludes this section by providing a few examples of possible eye-tracking usage within each of the four integration stages.
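As a purely illustrative sketch, a Stage IV authority policy could be layered on top of a Stage III detector along the following lines. The names, thresholds and envelope-margin input are hypothetical, and certifying such logic would have to meet the failure-probability objectives discussed above.

```python
# Hypothetical Stage IV authority policy built on the Stage III detector
# (see the previous sketch). Names and thresholds are assumptions only.

def stage_iv_policy(issues, envelope_margin, alert_acknowledged):
    """issues: output of the Stage III check, e.g. [("ATTITUDE_INDICATOR", "not_monitored")].
    envelope_margin: normalized distance to the flight envelope limit (0 = at limit).
    alert_acknowledged: True if the crew responded to the Stage III alert."""
    critical = any(kind == "not_monitored" for _, kind in issues)
    if critical and not alert_acknowledged and envelope_margin < 0.1:
        return "AUTOMATION_TAKES_AUTHORITY"   # e.g., re-engage protective modes
    if critical:
        return "STAGE_III_COUNTER_MEASURE"    # alert or highlight only
    return "NO_ACTION"

print(stage_iv_policy([("ATTITUDE_INDICATOR", "not_monitored")],
                      envelope_margin=0.05, alert_acknowledged=False))
```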


Table 1. Gaze data usage examples within each of the four integration stages.

Stage I: Pilot Training/Flight Performance Analysis
• Movie clips with eye-movement modeling for example-based learning
• Statistics of dwell percentages on different instruments after a training session
• Comparison of scan paths and fixation durations to evaluate the progress of pilot trainees/estimate pilots’ skills

Stage II: On-Board Gaze Recording
• Reconstruction of pilots’ visual attentional distribution during a flight for incident/accident investigation
• Analysis of the crew’s joint attention and shared situational awareness

Stage III: Gaze-Based Flight Deck Adaptation
• Display a notification at the point of the pilot’s gaze to ensure its visual processing
• Counteract an excessive focus on a parameter (expressed as an extremely long fixation) by retracting the information for a short period or by highlighting the relevant parameters to attract attention
• An auditory notification if a critical parameter (according to the flight phase) is left without monitoring for a long period

Stage IV: Gaze-Based Aircraft Adaptation
• Turn on an automation mode if the crew is considered unable to pursue the operation
• Perform an automatic maneuver

3. Conclusions

The aviation domain can benefit from the growth and maturation of eye-tracking technology to further reduce the rate of incidents due to human factors. This paper is the first of its kind to define and retrace a framework consisting of four main stages for the integration of eye-tracking systems into cockpits. We ordered these four steps in the chronological order of their possible implementation, thus retracing the trajectory of the eye-tracking integration process into cockpits. We believe that, if used wisely, eye-tracking technology can facilitate and accelerate pilot training, facilitate the investigation of in-flight incidents and considerably enhance flight safety. While this paper focuses on the aviation domain, we also believe that this work will be useful for practitioners in other domains such as automobile driving, air traffic control, and command and control centers.

Acknowledgments: This work was supported by a French Ministry of Higher Education and Research scholarship. The publication fees were funded by a Toulouse Tech Integration Jeunes Recrutés (TTI-JR INP-INSA-ISAE) 2017 individual grant.

Author Contributions: Vsevolod Peysakhovich wrote the paper; Olivier Lefrançois, Frédéric Dehais, and Mickaël Causse revised the paper.

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Boeing. Statistical Summary of Commercial Jet Airplane Accidents Worldwide Operations, 1959–2014; Aviation Safety, Boeing Commercial Airplanes: Seattle, WA, USA, 2015. Available online: http://www.boeing.com/news/techissues/pdf/statsum.pdf (accessed on 2 November 2017).

2. Shappell, S.; Detwiler, C.; Holcomb, K.; Hackworth, C.; Boquet, A.; Wiegmann, D.A. Human error and commercial aviation accidents: An analysis using the human factors analysis and classification system. Hum. Factors 2007, 49, 227–242. [CrossRef] [PubMed]


3. Lee, J.D. Human factors and ergonomics in automation design. In Handbook of Human Factors and Ergonomics, 3rd ed.; John Wiley & Sons: Hoboken, NJ, USA, 2006; pp. 1570–1596.

4. Mumaw, R.J.; Sarter, N.; Wickens, C.D. Analysis of pilots’ monitoring and performance on an automated flight deck. In Proceedings of the 11th International Symposium on Aviation Psychology, Columbus, OH, USA, 8–11 May 2001.

5. Parasuraman, R.; Manzey, D.H. Complacency and bias in human use of automation: An attentional integration. Hum. Factors 2010, 52, 381–410. [CrossRef] [PubMed]

6. Sumwalt, R.; Cross, D.; Lessard, D. Examining How Breakdowns in Pilot Monitoring of the Aircraft Flight Path. Int. J. Aviat. Aeronaut. Aerosp. 2015, 2. [CrossRef]

7. Federal Aviation Administration (FAA). Safety Alert for Operators No. 15011, Roles and Responsibilities for Pilot Flying (PF) and Pilot Monitoring (PM); Flight Standards Service: Washington, DC, USA, 2015. Available online: https://www.faa.gov/other_visit/aviation_industry/airline_operators/airline_safety/safo/all_safos/#2015 (accessed on 2 November 2017).

8. Casner, S.M.; Schooler, J.W. Vigilance impossible: Diligence, distraction, and daydreaming all lead to failures in a practical monitoring task. Conscious. Cognit. 2015, 35, 33–41. [CrossRef] [PubMed]

9. National Transportation Safety Board (NTSB). A Review of Flightcrew-Involved, Major Accidents of U.S. Carriers, 1978 through 1990; Safety Study NTSB/SS-94/01; NTSB: Washington, DC, USA, 1994.

10. National Transportation Safety Board (NTSB). Descent Below Visual Glidepath and Impact with Seawall, Asiana Airlines Flight 214, Boeing 777-200ER, HL7742, San Francisco, California, July 6, 2013; Report No. AAR-14/01; NTSB: Washington, DC, USA, 2014. Available online: http://www.ntsb.gov/investigations/AccidentReports/Reports/AAR1401.pdf (accessed on 2 November 2017).

11. Federal Aviation Administration (FAA). 14 CFR Part 121 Qualification, Service, and Use of Crewmembers and Aircraft Dispatchers; Final Rule; Federal Register: Washington, DC, USA, 2013; Volume 78, Issue 218. Available online: http://www.gpo.gov/fdsys/granule/FR-2013-11-12/2013-26845 (accessed on 2 November 2017).

12. Bureau d’Enquêtes et d’Analyses (BEA). Study on Aeroplane State Awareness during Go-Around. 2013. Available online: https://www.bea.aero/etudes/asaga/asaga.study.pdf (accessed on 2 November 2017).

13. Lefrançois, O.; Matton, N.; Causse, M.; Gourinat, Y. The role of pilots’ monitoring strategies in flight performance. In Proceedings of the 32nd EAAP Conference (European Association for Aviation Psychology), Cascais, Portugal, 26–30 September 2016.

14. Mehta, R.K.; Parasuraman, R. Neuroergonomics: A review of applications to physical and cognitive work. Front. Hum. Neurosci. 2013, 7, 889. [CrossRef] [PubMed]

15. Giraudet, L.; St-Louis, M.-E.; Scannella, S.; Causse, M. P300 event-related potential as an indicator of inattentional deafness? PLoS ONE 2015, 10, e0118556. [CrossRef] [PubMed]

16. Borghini, G.; Astolfi, L.; Vecchiato, G.; Mattia, D.; Babiloni, F. Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness. Neurosci. Biobehav. Rev. 2014, 44, 58–75. [CrossRef] [PubMed]

17. Gateau, T.; Durantin, G.; Lancelot, F.; Scannella, S.; Dehais, F. Real-Time State Estimation in a Flight Simulator Using fNIRS. PLoS ONE 2015, 10. [CrossRef] [PubMed]

18. Bureau d’Enquêtes et d’Analyses (BEA). Final Report on the Accident on 1st June 2009 to the Airbus A330-203 Registered F-GZCP Operated by Air France Flight AF 447 Rio de Janeiro–Paris; Report No. f-cp090601.en; 2012. Available online: https://www.bea.aero/docspa/2009/f-cp090601.en/pdf/f-cp090601.en.pdf (accessed on 2 November 2017).

19. Holmqvist, K.; Nyström, M.; Andersson, R.; Dewhurst, R.; Jarodzka, H.; Van de Weijer, J. Eye Tracking: A Comprehensive Guide to Methods and Measures; Oxford University Press: Oxford, UK, 2011.

20. McCarley, J.S.; Kramer, A.F. Eye movements as a window on perception and cognition. In Neuroergonomics: The Brain at Work; Parasuraman, R., Rizzo, M., Eds.; Oxford University Press: New York, NY, USA, 2007; pp. 95–112.

21. Rayner, K. Eye movements and attention in reading, scene perception, and visual search. Q. J. Exp. Psychol. 2009, 62, 1457–1506. [CrossRef] [PubMed]

22. Orquin, J.L.; Loose, S.M. Attention and choice: A review on eye movements in decision making. Acta Psychol. 2013, 144, 190–206. [CrossRef] [PubMed]

23. Fitts, P.M.; Jones, R.E.; Milton, J.L. Eye movements of aircraft pilots during instrument-landing approaches. Aeronaut. Eng. Rev. 1950, 9, 24–29.


24. Jones, R.E.; Milton, J.L.; Fitts, P.M. Eye Fixations of Aircraft Pilots, I. A Review of Prior Eye-Movement Studies anda Description of a Technique for Recording the Frequency, Duration and Sequences of Eye-Fixations during InstrumentFlight; USAF Technical Report 5837; Wright-Patterson Air Force Base: Dayton, OH, USA, 1949.

25. Milton, J.L.; Jones, R.E.; Fitts, P.M. Eye Movements of Aircraft Pilots: II. Frequency, Duration, and Sequenceof Fixation When Flying the USAF Instrument Low Approach System (ILAS); USAF Technical Report 5839;Wright-Patterson Air Force Base: Dayton, OH, USA, 1949.

26. Dehais, F.; Peysakhovich, V.; Scannella, S.; Fongue, J.; Gateau, T. Automation Surprise in Aviation: Real-TimeSolutions. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems,Seoul, Korea, 18–23 April 2015; ACM: New York, NY, USA, 2015; pp. 2525–2534.

27. Van de Merwe, K.; van Dijk, H.; Zon, R. Eye movements as an indicator of situation awareness in a flightsimulator experiment. Int. J. Aviat. Psychol. 2012, 22, 78–95. [CrossRef]

28. Björklund, C.M.; Alfredson, J.; Dekker, S.W. Mode monitoring and call-outs: An eye-tracking study oftwo-crew automated flight deck operations. Int. J. Aviat. Psychol. 2006, 16, 263–275. [CrossRef]

29. Dahlstrom, N.; Nahlinder, S. Mental workload in aircraft and simulator during basic civil aviation training. Int. J. Aviat. Psychol. 2009, 19, 309–325. [CrossRef]

30. Dehais, F.; Causse, M.; Pastor, J. Embedded eye tracker in a real aircraft: New perspectives on pilot/aircraft interaction monitoring. In Proceedings of the 3rd International Conference on Research in Air Transportation, Fairfax, VA, USA, 1–4 July 2008; Federal Aviation Administration: Washington, DC, USA, 2008.

31. Gegenfurtner, A.; Lehtinen, E.; Säljö, R. Expertise differences in the comprehension of visualizations: A meta-analysis of eye-tracking research in professional domains. Educ. Psychol. Rev. 2011, 23, 523–552. [CrossRef]

32. Sullivan, J.; Yang, J.H.; Day, M.; Kennedy, Q. Training simulation for helicopter navigation by characterizing visual scan patterns. Aviat. Space Environ. Med. 2011, 82, 871–878. [CrossRef] [PubMed]

33. Reingold, E.M.; Sheridan, H. Eye Movements and Visual Expertise in Chess and Medicine. In The Oxford Handbook of Eye Movements; Oxford University Press: Oxford, UK, 2011; pp. 528–550.

34. Haider, H.; Frensch, P.A. Eye movement during skill acquisition: More evidence for the information-reduction hypothesis. J. Exp. Psychol. Learn. Mem. Cognit. 1999, 25, 172–190. [CrossRef]

35. Bellenkes, A.H.; Wickens, C.D.; Kramer, A.F. Visual scanning and pilot expertise: The role of attentional flexibility and mental model development. Aviat. Space Environ. Med. 1997, 68, 569–579. [PubMed]

36. Kasarskis, P.; Stehwien, J.; Hickox, J.; Aretz, A.; Wickens, C. Comparison of expert and novice scan behaviors during VFR flight. In Proceedings of the 11th International Symposium on Aviation Psychology, Columbus, OH, USA, 2001; pp. 1–6.

37. Schriver, A.T.; Morrow, D.G.; Wickens, C.D.; Talleur, D.A. Expertise differences in attentional strategies related to pilot decision making. Hum. Factors 2008, 50, 864–878. [CrossRef] [PubMed]

38. Van Meeuwen, L.W.; Jarodzka, H.; Brand-Gruwel, S.; Kirschner, P.A.; de Bock, J.J.; van Merriënboer, J.J. Identification of effective visual problem solving strategies in a complex visual domain. Learn. Instr. 2014, 32, 10–21. [CrossRef]

39. Reynal, M.; Colineaux, Y.; Vernay, A.; Dehais, F. Pilot Flying vs. Pilot Monitoring during the Approach Phase: An Eye Tracking Study; HCI-Aero: Paris, France, 2016.

40. Velichkovsky, B.M. Communicating attention: Gaze position transfer in cooperative problem solving. Pragmat. Cognit. 1995, 3, 199–223. [CrossRef]

41. Jarodzka, H.; Balslev, T.; Holmqvist, K.; Nyström, M.; Scheiter, K.; Gerjets, P.; Eika, B. Conveying clinical reasoning based on visual observation via eye-movement modelling examples. Instr. Sci. 2012, 40, 813–827. [CrossRef]

42. Litchfield, D.; Ball, L.J.; Donovan, T.; Manning, D.J.; Crawford, T. Viewing another person's eye movements improves identification of pulmonary nodules in chest X-ray inspection. J. Exp. Psychol. Appl. 2010, 16, 251–262. [CrossRef] [PubMed]

43. Flom, R.E.; Lee, K.E.; Muir, D.E. Gaze-Following: Its Development and Significance; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 2007.

44. Jarodzka, H.; van Gog, T.; Dorr, M.; Scheiter, K.; Gerjets, P. Learning to see: Guiding students' attention via a model's eye movements fosters learning. Learn. Instr. 2013, 25, 62–70. [CrossRef]

45. Nalanagula, D.; Greenstein, J.S.; Gramopadhye, A.K. Evaluation of the effect of feedforward training displays of search strategy on visual search performance. Int. J. Ind. Ergon. 2006, 36, 289–300. [CrossRef]

46. Leff, D.R.; James, D.R.; Orihuela-Espina, F.; Kwok, K.W.; Sun, L.W.; Mylonas, G.; Athanasiou, T.; Darzi, A.W.; Yang, G.Z. The impact of expert visual guidance on trainee visual search strategy, visual attention and motor skills. Front. Hum. Neurosci. 2015, 9. [CrossRef] [PubMed]

47. Peysakhovich, V.; Hurter, C. Scanpath visualization and comparison using visual aggregation techniques. J. Eye Mov. Res. 2018, 10. [CrossRef]

48. Hutchins, E.; Palen, L. Constructing Meaning from Space, Gesture, and Speech. In Discourse, Tools and Reasoning: Essays on Situated Cognition; Springer: Berlin/Heidelberg, Germany, 1997; pp. 23–40.

49. Nevile, M. Seeing the point: Attention and participation in the airline cockpit. In Interacting Bodies, Proceedings of the 2nd International Conference of the International Society for Gesture Studies, Lyon, France, 15–18 June 2005; Mondada, L., Markaki, V., Eds.; ENS LSH & ICAR Research Lab: Lyon, France, 2007.

50. Transportation Safety Board of Canada. Reassessment of the Responses to Aviation Safety Recommendation A03-08. 2014. Available online: http://www.bst-tsb.gc.ca/eng/recommandations-recommendations/aviation/2003/rec_a0308.pdf (accessed on 2 November 2017).

51. National Transportation Safety Board (NTSB). Safety Recommendation Ref. A-15-1 through -8. 2015. Available online: http://www.ntsb.gov/safety/safety-recs/recletters/a-15-001-008.pdf (accessed on 2 November 2017).

52. Cheung, B.; Hofer, K. Eye tracking, point of gaze, and performance degradation during disorientation. Aviat. Space Environ. Med. 2003, 74, 11–20. [PubMed]

53. McKinley, R.A.; McIntire, L.K.; Schmidt, R.; Repperger, D.W.; Caldwell, J.A. Evaluation of eye metrics as a detector of fatigue. Hum. Factors 2011, 53, 403–414. [CrossRef] [PubMed]

54. Caldwell, J.A.; Mallis, M.M.; Caldwell, J.L.; Paul, M.A.; Miller, J.C.; Neri, D.F. Fatigue countermeasures in aviation. Aviat. Space Environ. Med. 2009, 80, 29–59. [CrossRef] [PubMed]

55. Régis, N.; Dehais, F.; Rachelson, E.; Thooris, C.; Pizziol, S.; Causse, M.; Tessier, C. Formal Detection of Attentional Tunneling in Human Operator–Automation Interactions. IEEE Trans. Hum. Mach. Syst. 2014, 44, 326–336. [CrossRef]

56. Peysakhovich, V.; Vachon, F.; Vallières, R.B.; Dehais, F.; Tremblay, S. Pupil dilation and eye movements can reveal upcoming choice in dynamic decision-making. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2015, 59, 210–214. [CrossRef]

57. Zhou, H.; Itoh, M.; Inagaki, T. Eye movement-based inference of truck driver's intent of changing lanes. SICE J. Control Meas. Syst. Integr. 2009, 2, 291–298. [CrossRef]

58. Ha, J.S.; Byon, Y.J.; Baek, J.; Seong, P.H. Method for inference of operators' thoughts from eye movement data in nuclear power plants. Nucl. Eng. Technol. 2015, 48, 129–143. [CrossRef]

59. Federal Aviation Administration (FAA). Advisory Circular No. 23.1309-1E "System Safety Analysis and Assessment for Part 23 Airplanes". 2011. Available online: http://www.faa.gov/documentLibrary/media/Advisory_Circular/AC%2023.1309-1E.pdf (accessed on 2 November 2017).

60. Bureau d'Enquêtes et d'Analyses (BEA). Momentary Loss of Control of the Flight Path during a Go-Around (Report No. f-pp111116.en). 2014. Available online: http://www.bea.aero/docspa/2011/f-pp111116.en/pdf/f-pp111116.en.pdf (accessed on 2 November 2017).

61. Australian Transport Safety Bureau (ATSB). Unstable Approach Involving de Havilland Canada Dash 8 VH-XFZ (Report No. AO-2012-070). 2013. Available online: http://www.atsb.gov.au/media/4471254/ao-2012-070_final.pdf (accessed on 2 November 2017).

62. Fletcher, L.; Zelinsky, A. Driver inattention detection based on eye gaze—Road event correlation. Int. J. Robot. Res. 2009, 28, 774–801. [CrossRef]

63. Tessier, C.; Dehais, F. Authority Management and Conflict Solving in Human-Machine Systems. AerospaceLab 2014, 4, 1–8.

64. Inagaki, T. Design of human–machine interactions in light of domain-dependence of human-centered automation. Cognit. Technol. Work 2006, 8, 161–167. [CrossRef]

65. Inagaki, T.; Sheridan, T.B. Authority and responsibility in human–machine systems: Probability theoretic validation of machine-initiated trading of authority. Cognit. Technol. Work 2012, 14, 29–37. [CrossRef]

66. Air Accidents Investigation Branch (AAIB). DHC-8-402 Dash 8, G-JECF (Report No. EW/C2010/09/04). 2012. Available online: https://assets.digital.cabinet-office.gov.uk/media/5422fc25ed915d1371000899/DHC-8-402_Dash_8_G-JECF_06-12.pdf (accessed on 2 November 2017).

67. Air Accidents Investigation Branch (AAIB). DHC-8-402 Dash 8, G-JECI (Report No. EW/C2008/12/05). 2010. Available online: https://assets.digital.cabinet-office.gov.uk/media/5423028e40f0b61342000a93/DHC-8-402_Dash_8__G-JECI_03-10.pdf (accessed on 2 November 2017).

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).