This project investigates the flight behavior of airline pilots and the factors affecting that behavior in a series of challenging simulated flights. We need software that can play back the state of the simulator so that events can be viewed for coding, and that can display the coordinates of eye fixations on that display; some additional functionality is also needed. This is a continuing project with development in Python. Our study produces a large, heterogeneous data set that includes a variety of written records, a large corpus of simulator log files, and associated eye-tracking data. We want to compare pilot activity on different flights on a variety of measures, such as the time between two events (e.g., an initiating challenge and the following response) and the frequency of particular patterns of behavior. The overall project will require identifying, coding, and analyzing the occurrence and timing of behaviors, and linking behaviors across the different data types (captured by the simulator, by the eye tracker, or by manual annotation).
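As one concrete, hypothetical example of the kind of analysis the playback tooling needs to support, the sketch below pairs each initiating challenge event in a simulator log with the pilot's next matching response and reports the latency. The column names (timestamp, event), the event labels, and the CSV layout are illustrative assumptions, not the project's actual log format.

```python
# Hypothetical sketch, not the project's code: pair each "challenge" event in a
# simulator log with the pilot's next matching response and report the latency.
# Column names, event labels, and the CSV layout are illustrative assumptions.
import pandas as pd

def response_latencies(log_path, challenge="ENGINE_FIRE", response="THROTTLE_IDLE"):
    log = pd.read_csv(log_path).sort_values("timestamp")   # one row per logged event
    latencies = []
    pending = None                                          # timestamp of the open challenge
    for _, row in log.iterrows():
        if row["event"] == challenge:
            pending = row["timestamp"]
        elif row["event"] == response and pending is not None:
            latencies.append(row["timestamp"] - pending)    # challenge-to-response time
            pending = None
    return latencies

# latencies = response_latencies("flight_042.csv")
# print(sum(latencies) / len(latencies))                    # mean response latency
```

The same event-pairing loop generalizes to counting how often a particular pattern of behavior occurs, which is the other class of measure described above.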

The Project 

[[{"fid":"1917","view_mode":"width_400","fields":{"format":"width_400","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"1":{"format":"width_100","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false},"2":{"format":"width_400","field_file_image_alt_text[und][0][value]":false,"field_file_image_title_text[und][0][value]":false}},"attributes":{"style":"float: left;","class":"media-element file-width-400","data-delta":"2"}}]] Student researchers in this project teamed up with their project partner to come up with a more cost efficient and simpler way of visualizing and interpreting eye tracking data of pilots in commercial and defense aircrafts. They decided to do this by overlaying the eye-tracking data onto a flight panel.

They tackled mathematical inconsistencies in their data, addressed frequency mismatches, synchronized hundreds of individual components, and found the best way to overlay the eye-tracking data onto a meticulously designed, accurately proportioned digital panel modelled on a pilot's cockpit. A demo can be seen on the left.
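One plausible way to handle the frequency mismatch they describe is to interpolate the eye tracker's gaze samples onto the simulator log's timestamps before overlaying them. The sketch below illustrates that idea under assumed sample rates and array names; it is not the team's actual pipeline.

```python
# Hedged sketch of one way to resolve a frequency mismatch: the eye tracker and
# the simulator log at different rates, so gaze samples are interpolated onto
# the simulator's timeline before overlaying. Rates and names are assumptions.
import numpy as np

def align_gaze_to_sim(gaze_t, gaze_xy, sim_t):
    """Interpolate gaze (x, y) samples, recorded at the tracker's own rate,
    onto the simulator log's timestamps."""
    x = np.interp(sim_t, gaze_t, gaze_xy[:, 0])
    y = np.interp(sim_t, gaze_t, gaze_xy[:, 1])
    return np.column_stack([x, y])

# Example: a 120 Hz eye tracker aligned to a 30 Hz simulator log.
gaze_t = np.arange(0, 10, 1 / 120)                        # tracker timestamps (s)
gaze_xy = np.column_stack([np.sin(gaze_t), np.cos(gaze_t)])
sim_t = np.arange(0, 10, 1 / 30)                          # simulator timestamps (s)
aligned = align_gaze_to_sim(gaze_t, gaze_xy, sim_t)       # one gaze point per sim frame
```

Interpolation like this assumes both streams share a common clock; in practice any offset between the tracker's and the simulator's clocks would need to be corrected first.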

The red lines are dynamic and indicate the movement of the eyes, updating as the data play back. This makes for a much clearer and simpler visualization of how a pilot's eyes scan the control panel, and it will help NASA and other agencies optimize the layout of the control panel so that pilots can navigate and operate the controls more easily.
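A rough illustration of the dynamic red-line overlay is sketched below using a matplotlib animation, a synthetic gaze track, and a placeholder panel image; the real tool's rendering approach and assets may differ.

```python
# Illustrative only: draw the recent gaze path as a red line over a panel image
# and update it frame by frame. The gaze track is synthetic and the panel image
# is a placeholder, not a project asset.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

gaze = np.cumsum(np.random.randn(600, 2) * 3, axis=0) + 200   # fake gaze track (pixels)

fig, ax = plt.subplots()
ax.set_xlim(0, 400)
ax.set_ylim(400, 0)                                   # image-style coordinates, origin at top-left
# ax.imshow(plt.imread("panel.png"))                  # the drawn cockpit panel would go here
line, = ax.plot([], [], color="red", linewidth=1)

def update(frame):
    recent = gaze[max(0, frame - 30):frame]           # show roughly the last 30 samples
    line.set_data(recent[:, 0], recent[:, 1])
    return line,

anim = FuncAnimation(fig, update, frames=len(gaze), interval=33, blit=True)
plt.show()
```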

In this way, their work is much more than a visualization. It gives researchers better insight into the psychology of pilots and lets them assess pilots' actions with respect to external stimuli. Understanding the behaviour and vision of pilots has been an important topic ever since a number of high-profile aircraft crashes were caused by erratic and confused behaviour on the part of the pilots. Many advanced simulators cost millions of dollars, and this group's research has the potential to cut those costs dramatically.

Term: Spring 2020
Topic: Data Visualizations, Physical Science/Engineering