Tannaz Mirchi


How Much Should Air Traffic Controllers Trust New Flight Management Systems?

Understanding how people interact within complex systems

Published: Wednesday, December 21, 2016 - 16:38

With airfares at their lowest point in seven years and airlines adding capacity, this year’s holiday air travel is slated to be 2.5 percent busier than last year. The system we use to coordinate all those flights, however, is decades old, and mostly depends on highly trained air traffic controllers, who keep track of where all the planes are, where they’re heading, how fast they’re going, and at what altitude.

As the national airspace gets more crowded, and as technology improves, the Federal Aviation Administration (FAA) has begun upgrading the air traffic control systems. The new system is called NextGen, and some of its capabilities are already being rolled out across the country. It is intended to make air traffic faster, more efficient, more cost-effective, and even, through fuel savings, less damaging to the environment. It will also help air traffic controllers and pilots alike handle potential hazards, whether they involve weather, other aircraft, or equipment problems.

But we the traveling public will be able to realize all these benefits only if future air traffic controllers make the most of the technology. As a human factors researcher seeking to understand how people interact within complex systems, I have found that there are challenges for controllers learning to properly trust the computer systems keeping America in the air.

Use as directed

The NextGen system is designed for humans and computers to work in tandem. For example, one element involves air traffic controllers and pilots exchanging digital text messages between the tower and airplane computer systems, as opposed to talking over the radio. This arrangement has several benefits, including eliminating the possibility someone might mishear a garbled radio transmission.

Human controllers will still give routing instructions to human pilots, but computers monitoring the airspace can keep an eye on where planes are and automatically compare that to where they are supposed to be, as well as how close they get to each other. Automated conflict detection tools can alert controllers to possible trouble and offer safer alternatives.
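The separation check at the heart of such a tool can be sketched in a few lines. This is an illustrative sketch only: the aircraft identifiers, the flat x/y coordinates, and the 5-nautical-mile threshold are simplifying assumptions, not the FAA's actual NextGen conflict-detection logic.

```python
import math

# Illustrative conflict detection: flag any pair of aircraft whose
# horizontal separation falls below a minimum threshold. Positions are
# given in nautical miles on a flat plane for simplicity.
MIN_SEPARATION_NM = 5.0

def separation(a, b):
    """Horizontal distance between two aircraft, in nautical miles."""
    return math.hypot(a["x"] - b["x"], a["y"] - b["y"])

def detect_conflicts(aircraft):
    """Return every pair of aircraft closer than the separation minimum."""
    conflicts = []
    for i in range(len(aircraft)):
        for j in range(i + 1, len(aircraft)):
            if separation(aircraft[i], aircraft[j]) < MIN_SEPARATION_NM:
                conflicts.append((aircraft[i]["id"], aircraft[j]["id"]))
    return conflicts

traffic = [
    {"id": "UAL12", "x": 0.0, "y": 0.0},
    {"id": "DAL77", "x": 3.0, "y": 4.0},  # exactly 5 NM from UAL12: at the limit
    {"id": "SWA31", "x": 2.0, "y": 2.0},  # within 5 NM of both others
]
print(detect_conflicts(traffic))  # → [('UAL12', 'SWA31'), ('DAL77', 'SWA31')]
```

A real system would also project each aircraft's trajectory forward in time; this sketch shows only the instantaneous distance check that underlies such alerts.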

In addition, air crews will be able to follow routing instructions more quickly, accepting the digital command from the ground directly into the plane’s navigation system. This, too, requires human trust in automated systems. That is not as simple as it might sound.

Trust in automation

When the people who operate automated tools aren’t properly informed about their equipment—including what exactly it can and can’t do—problems arise. When humans expect computerized systems to be more reliable than they are, tragedy can result. For example, the driver killed in the fatal Tesla crash while the car was in Autopilot mode may have become over-reliant on the technology or used it in ways beyond its intended design. Making sure human expectations match technical abilities is called “calibration.”

When people and machinery are properly calibrated to each other, trust can develop. That’s what happened during a 16-week course training air traffic controller students on a desktop air traffic control simulator.

Researchers typically measure trust in automated systems by asking questions about the operator’s evaluations of the system’s integrity, the operator’s confidence in using the system, and how dependable the operator thinks the system is. There are several types of questionnaires that ask these sorts of questions; one of them, a trust scale aimed at the air traffic management system as a whole, was particularly sensitive to discerning changing trust in the student group I studied.

I asked the air traffic controller students about their trust in the automated tools such as those provided by NextGen on the first day, at the midterm exam during week nine of their course, and at the final exam at the end of the training. Overall, the students’ trust in the system increased, though some trusted it more than others.
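Scoring such a questionnaire typically means averaging each student's ratings into a single trust score per administration. The sketch below is hypothetical: the four Likert items, the 1–7 scale, and the example responses are made up for illustration and are not the actual instrument or data from the study.

```python
# Hypothetical scoring of a trust-in-automation questionnaire: each
# respondent rates several statements on a 1-7 Likert scale, and the
# trust score is the mean of the item ratings.

def trust_score(ratings):
    """Mean of Likert item ratings (1 = strongly distrust, 7 = strongly trust)."""
    if not ratings:
        raise ValueError("no ratings supplied")
    return sum(ratings) / len(ratings)

# One student's (invented) responses at the three measurement points.
day_one = [3, 4, 3, 2]   # first day of the course
midterm = [4, 5, 4, 4]   # week nine
final   = [6, 5, 6, 5]   # end of training

for label, ratings in [("day one", day_one), ("midterm", midterm), ("final", final)]:
    print(f"{label}: trust = {trust_score(ratings):.2f}")
```

Comparing the three scores per student is what makes it possible to say whether trust rose, fell, or held steady over the 16 weeks.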

Too much trust or too little?

There is such a thing as trusting technology too much. In this study, the students who trusted the system more were actually less aware than their less-trusting classmates of what was going on in the airspace during high-traffic simulated scenarios at the final exam. One possible explanation is that those with more trust in the system became complacent and did not bother expending the effort to keep their own independent view (or “maintain the picture,” as air traffic controllers say).

These more-trusting students might have been more vulnerable to errors if the automation required them to manually intervene. Correlation analyses suggested that students with more trust were less likely to engage in what might be called “nontrusting” behaviors, like overriding the automation. For example, they were less likely to step in and move aircraft that the automated conflict-detection tools determined were far enough apart, even if they personally thought the planes were too close together. That showed they were relying on the automation appropriately.
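A correlation analysis of this kind relates each student's trust score to a count of "nontrusting" interventions; a negative Pearson r means higher trust went with fewer overrides. The data and the helper below are illustrative assumptions, not the study's actual figures or analysis code.

```python
# Hypothetical correlation analysis: relate each student's trust score
# to the number of times they overrode the automation. A negative
# Pearson r means more trust went with fewer overrides.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

trust     = [3.0, 4.2, 5.5, 6.1, 6.8]  # questionnaire scores (invented)
overrides = [9,   7,   4,   3,   1]    # "nontrusting" interventions (invented)

r = pearson_r(trust, overrides)
print(f"r = {r:.2f}")  # strongly negative in this invented sample
```

Whether a strong negative correlation reflects appropriate reliance or creeping complacency is exactly the calibration question the article raises.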

These trust disparities and their effects became clear only at the final exam. This suggests that as the students became familiar with the technology, both their trust in it and their behavior when using it changed.

Previous research has shown that providing specific training in trusting the automation may reduce students’ likelihood of engaging in nontrusting behaviors. Training should aim to make trainees more aware of their potential to overly trust the system, to ensure they remain aware of critical information. Only when the users properly trust the system—neither too much nor too little—will the public benefits of NextGen truly be available to us all.

This article was originally published on The Conversation. Read the original article.


About The Author


Tannaz Mirchi

Tannaz Mirchi is a human factors engineer at Pacific Science & Engineering and an online lecturer for an introductory human factors course at California State University, Long Beach. Mirchi has a bachelor’s degree in psychology and a master’s degree in human factors.



Training is important, but until the system gets set up, which will cost $$$$, it won't make a whole lot of difference. Many government programs also become obsolete before implementation. We need to put a priority on some and less on others, and pay attention to wasteful spending (using the Senator from Oklahoma's waste report as a guideline to start). I don't have all of the information needed to make that decision, nor am I in a position to influence it either.

In all cases we need to remember what a French philosopher said back in the 1800s: Everyone wishes to live at the expense of the state, forgetting that the state lives at the expense of everyone (a paraphrase of Frédéric Bastiat's quote). Someone painted it on a wall in downtown Denver back in the early 1980s, and I have quoted and requoted it many times since then.

Thanks for highlighting the subject.