It’s one of the conundrums of the drone industry: autonomy is necessary to maximize the benefits of uncrewed systems. Anyone who has watched a science fiction movie or read the news, however, understands that a high level of autonomy comes with a level of risk: risk that the decisions made by a computer may, someday, be incorrect.
That was the topic of a session at the AUVSI NE UAS and AAM Summit October 26, led by Dr. Javier de Luis, an aerospace consultant. Dr. de Luis is also the brother of Graziella de Luis, who was killed on board a 737 Max in Ethiopia on March 10, 2019. He has researched deeply what went wrong with the 737 Max to cause two tragic crashes, including the one that killed his sister. While no single issue or problem led to the outcome, Dr. de Luis makes a compelling argument for focusing on systemic problems whose correction could help avoid a similar tragedy in the drone industry, or in any highly automated system.
The Technical Problem with the 737 Max
The identified fault in the 737 Max was found in the MCAS: the Maneuvering Characteristics Augmentation System. “It’s a system most of you probably haven’t heard of,” said de Luis. “More unfortunately, it was a system most of the pilots hadn’t heard of either.” The MCAS was designed to prevent a stall by activating when the angle between the wing and the airflow rises too high.
It would be simplistic, however, to blame the crash on the MCAS autonomous system alone. Many failures in the development and certification of the aircraft contributed to the ultimate outcome. “MCAS relied on one single sensor. And when activated, it activated repeatedly,” explains de Luis. “All of that would have been OK if the pilots had been trained on what to do if it failed, but MCAS was downplayed to fast-track certification and reduce training requirements. And the engineers sort of convinced themselves that it was going to be OK.”
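The fragility de Luis describes – a safety-critical system triggered by a single sensor – can be illustrated with a minimal sketch. Everything below is hypothetical (names, thresholds, and logic are invented for illustration, not drawn from the actual MCAS design); it only shows why majority voting across redundant sensors is more robust than trusting one input.

```python
# Hypothetical illustration: why single-sensor activation is fragile.
# All names and thresholds are invented for this sketch.

STALL_AOA_DEG = 14.0  # hypothetical angle-of-attack trigger threshold


def activate_single_sensor(aoa_reading: float) -> bool:
    """Trigger on one sensor: a single faulty reading causes activation."""
    return aoa_reading > STALL_AOA_DEG


def activate_voted(aoa_readings: list[float]) -> bool:
    """Trigger only if a majority of independent sensors agree."""
    votes = sum(1 for r in aoa_readings if r > STALL_AOA_DEG)
    return votes > len(aoa_readings) / 2


# One failed sensor reads an impossible 74.5 degrees while the aircraft
# is actually flying normally (the other sensors read ~2 degrees):
readings = [74.5, 2.1, 2.3]
print(activate_single_sensor(readings[0]))  # True  -> erroneous activation
print(activate_voted(readings))             # False -> faulty sensor is outvoted
```

The voted version fails safe in this scenario: the bad sensor is outvoted, and the system also gains a natural place to flag the disagreement to the crew rather than acting silently.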
Diving into the Issues: What Went Wrong with Development?
The accident that cost Dr. de Luis his sister had a complex origin – and there are lessons to be learned from the tragedy that go beyond the autonomous system that caused it. The issue cannot be thoroughly and fairly covered in a short article or speech, but some predictable and obvious problems stand out – and they are relevant to the development and certification process for new advanced aircraft and systems.
“This was a disaster,” said de Luis. “Technical, managerial, and regulatory deficiencies all contributed. Accidents rarely have one single cause… Autonomy has all kinds of feedback loops that start to go wrong when you encounter a situation that you haven’t thought of.”
In the development process, de Luis explained, poor strategic planning created a catch-up mentality in Boeing management. Surprised by the release and success of a competing Airbus aircraft, Boeing was driven by the need to produce a new aircraft in a short time frame. While “no one set out to produce a bad aircraft,” de Luis pointed out, Boeing management was removed – both physically and culturally – from the engineering teams. Any engineers who may have had concerns about the system were unable to bring those concerns to the table for consideration by the right people.
Immediately following the tragedy, individual pilots were blamed for the accident. Dr. de Luis strongly disagrees with that premise.
Boeing was committed to getting the 737 Max certified as a follow-on to an existing design, rather than as a new aircraft. “That meant that ‘no new training’ was non-negotiable,” de Luis pointed out. “You can’t blame pilots for not knowing something you didn’t tell them in the first place.”
“I have no doubt that excellent piloting skills can sometimes overcome bad design,” de Luis said. “But it still doesn’t excuse bad design.”
In the drone industry, aircraft certification is a hot topic. The FAA’s careful process may be in part the result of lessons learned – and it is an important part of preventing the next tragedy.
For the Boeing 737 Max, certification was expected to be a rubber stamp. “There was a mentality that regulatory requirements were fungible,” said de Luis. “Boeing felt that if there were any problems on the regulatory side they could fix it by talking to the right person.”
This is in part due to a shift over time to an ODA (Organization Designation Authorization) methodology, replacing the traditional DER (Designated Engineering Representative) methodology. In a large organization, who is ultimately responsible for signing off? “ODA dilutes individual responsibility,” de Luis said. “At the FAA, political appointees can overrule technical recommendations. Decisions are often based on non-public information, not available for independent review.”
There was no one cause for the 737 Max failure. That doesn’t mean it’s not worth studying to inform the design and certification of future aircraft. “It’s easy to get frustrated by the complexity of the problem,” said de Luis. “It’s human nature to look for simple answers. Complexity hides the true cause and responsibility.”
“But when you go back and think about what you know for sure: airplanes shouldn’t fall out of the sky because one sensor fails. Gravity never gives up.”
Miriam McNabb is the Editor-in-Chief of DRONELIFE and CEO of JobForDrones, a professional drone services marketplace, and a fascinated observer of the emerging drone industry and the regulatory environment for drones. Miriam has penned over 3,000 articles focused on the commercial drone space and is an international speaker and recognized figure in the industry. Miriam has a degree from the University of Chicago and over 20 years of experience in high tech sales and marketing for new technologies.