Opinion: AI Cannot Replace Human Responsibility In Aviation

FAA standards for uncrewed aircraft systems still put human responsibility at the forefront.

Credit: Peachaya Tanomsup/Alamy Stock Photo

A popular joke about artificial intelligence compares its risks to those of natural stupidity. The implication is that humans have caused so much trouble that it cannot be worse to turn decisions over to the machine.

In aviation, human competence is essential to the safety rules governing design, production, operations and maintenance. Under Part 43, air-breathing individuals are granted authority, collectively or individually, to perform, supervise and/or approve for return to service work performed on civil aircraft. Owners, operators and pilots carry responsibility for safe operation under Parts 91, 121, 135 and others while flying equipment manufactured by “persons” subject to the rules.

Even Part 107’s standards for uncrewed aircraft systems put human responsibility front and center: §107.12 requires FAA certification for remote pilots in command of such aircraft. Although no person is aboard the aircraft, a human remains a required part of the system directing its flight.

ARSA has written about the excitement stirred by new and “emerging” technologies, which regularly exceeds the attention such equipment deserves given that much of it was well established in non-aerospace fields before it “emerged” for us. Machine learning tools can simplify tasks, improve analysis and remove burdens from the humans bearing responsibility for safety. What AI cannot do is take on direct responsibility, which the rules vest in the “persons” certificated or authorized by the FAA.

Rather than seek artificial replacements for humans, the industry’s challenge is to improve the lives of the individuals on whom system safety relies. A 2021 report accepted by the FAA’s Safety Oversight and Certification Advisory Committee (SOCAC) provided a road map toward that investment. Still the only official SOCAC output, the analysis was produced by the body’s Workforce Development and Training Working Group, which had been charged with developing standards of knowledge and skill for government personnel overseeing aviation certificates.

“The Federal Aviation Administration (FAA) and industry must develop an aviation safety workforce that can accommodate and respond to modern oversight methodologies and technology,” the report’s executive summary said. “To aid that goal, the SOCAC Subcommittee Working Group examined strategies and methods for attaining knowledge and critical thinking skills to support current and future aviation safety duties.”

The result of that examination was a tiered system of knowledge assessment for regulatory compliance, technology and professionalism. Across these disciplines, effective training ranges from beginning to advanced knowledge and provides specific in-depth experience for those with relevant job responsibilities. The report projected knowledge retention using a “cone of learning” model—from minimally retained reading of information to combining mental and motor skills to ensure the fullest understanding and memory of a lesson.

It is fun to joke about human frailty. And, kidding aside, artificial tools can be a great help. But finding and bolstering natural intelligence is vital to aviation safety.


Brett Levanto is vice president of operations at Obadal, Filler, MacLeod & Klein, managing firm and client communications in conjunction with regulatory and legislative policy initiatives. He provides strategic and logistical support for the Aeronautical Repair Station Association.