Boeing 737 MAX Raises Concerns Over How FAA Will Ensure The Safety Of
Autonomous Aircraft
[Photo: Uber air taxi] Jeff Holden, Uber's chief product officer. Uber envisions a fleet of electric-powered 'flying taxis.' They would be piloted at first, but to achieve scale, urban air taxis will need to be autonomous.
After two deadly crashes of Boeing's 737 MAX, believed to be linked to design flaws in a flight control system,
policymakers and investigators are examining why the U.S. Federal Aviation
Administration didn't spot the problems during the plane's certification. Some
aerospace observers say it underlines a looming problem: that the agency may not
be equipped to vet the safety of the much more complicated software that will
enable the next generation of flight, including autonomous drones and pilotless
urban air taxis.
"We need to have an oversight agency that has modernized
in a way that allows them to engage deeply with these technical experts at the
companies asking for certification," says Ella Atkins, an aerospace engineering
professor at the University of Michigan whose research is focused on autonomous
systems.
Boeing has halted deliveries of the 737 MAX after the March 10
crash of a model of the plane operated by Ethiopian Airlines, the second crash
of a 737 MAX in five months. CEO Dennis Muilenburg has acknowledged that the maneuvering characteristics augmentation system (MCAS), a new flight control feature on the latest version of its bestselling plane, contributed to the crashes, which together killed 346 people. The Chicago-based jet maker will give its next
briefing on its business during its quarterly earnings report Wednesday.
Analysts are expecting a drop in adjusted profit.
The FAA is also under
scrutiny, both for lagging other countries' aviation regulators in grounding the
737 MAX after the Ethiopia crash, and for its certification of MCAS. Congress,
an international panel of aviation authorities, the Department of
Transportation's inspector general and federal prosecutors are all examining
MCAS, which was designed to automatically push the MAX's nose down during certain maneuvers to counter the plane's tendency to pitch upward due to the placement of its new, larger engines. MCAS was classified as not critical to
safety, allowing it to be triggered by a single, non-redundant sensor. The FAA
and Boeing assumed that if MCAS malfunctioned, pilots would recognize it as a
problem with the plane's automatic stabilizer trim system, which was on the
previous version of the 737, and switch it off using previously established
procedures.
But according to preliminary reports from the investigations into the Ethiopian crash and the prior accident (the loss of a Lion Air jet off Indonesia on Oct. 29), sensor failures in both cases improperly triggered MCAS, and the pilots were unable to counteract it, leading to fatal dives.
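The failure mode described in those reports can be made concrete with a small sketch: a hypothetical nose-down command driven by a single angle-of-attack (AoA) sensor, next to one gated by a cross-check between two sensors. This is illustrative Python with assumed names, thresholds, and units, not Boeing's actual logic:

```python
# Hypothetical sketch (not Boeing's code): a nose-down command driven by a
# single angle-of-attack sensor, versus one gated by a two-sensor cross-check.
# Thresholds and units are illustrative assumptions.

AOA_ACTIVATION_DEG = 15.0   # assumed activation threshold, degrees
DISAGREE_LIMIT_DEG = 5.5    # assumed allowable sensor disagreement, degrees

def single_sensor_command(aoa_left: float) -> bool:
    """One sensor, no redundancy: a failed-high reading triggers nose-down."""
    return aoa_left > AOA_ACTIVATION_DEG

def cross_checked_command(aoa_left: float, aoa_right: float) -> bool:
    """Two sensors: disagreement inhibits the command instead of acting on it."""
    if abs(aoa_left - aoa_right) > DISAGREE_LIMIT_DEG:
        return False  # sensors disagree -> distrust the data, do nothing
    return min(aoa_left, aoa_right) > AOA_ACTIVATION_DEG

# A stuck-high left sensor (74.5 deg) alongside a healthy right sensor (4.2 deg):
print(single_sensor_command(74.5))       # True  -> erroneous nose-down command
print(cross_checked_command(74.5, 4.2))  # False -> command inhibited
```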
Despite the 737 MAX crashes, many observers say the current safety
certification process for aircraft software has generally worked well. Safety-critical programming rarely fails to operate as designed; the problems that have occurred have tended to stem from failures to foresee danger points in the design specifications, including the unexpected ways that pilots can interact with the system, as appears to have been the case with MCAS.
"It's very
hard to analyze mixed human-automation systems, in part because humans don't
behave in a reliable way," says R. John Hansman, a professor of aeronautics at
the Massachusetts Institute of Technology.
Software problems appear to
have contributed to only a handful of air accidents (including one case
involving an Airbus A330 that bears some similarities to the two Boeing 737 MAX
crashes). Safety has been aided by the slow pace of change in aviation - new
generations of airliners tend to be spaced out by 10 to 20 years - and plane
makers' tendency to reduce costs by reusing already certified code rather than
completely rewriting software.
Even the flight controls of Boeing's most advanced airliner, the 787, incorporate programming developed for much older planes, says Hansman.
With a limited budget and staff, the FAA has for
decades relied on industry to shoulder most of the burden of certifying the
safety of aircraft. As of 2013, more than 90% of the work was being done by
deputized consultants and employees at the manufacturers it oversees, according
to a report from the Government Accountability Office. With software, the certification process focuses on spelling out what the code is required to do and ensuring that it meets those requirements. Designated representatives of the FAA guide the developer's testing scheme and audit some of those tests, but the FAA itself doesn't execute any of the code.
Daniel Elwell, the FAA's acting administrator, told a
congressional panel after the Ethiopian crash that it would require roughly
10,000 more employees and another $1.8 billion for the agency to do the
certification job by itself.
Nonetheless, experts warn that the agency
will need a more sophisticated approach to assess the algorithms being developed
to direct autonomous drones and urban air taxis.
With current flight
control systems, "you just verify that given certain inputs you get an expected
output," says Mykel Kochenderfer, an aerospace professor at Stanford who's the
co-director of the university's Center for AI Safety and the director of the
SAIL-Toyota Center for AI Research.
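That style of verification can be pictured as a requirements-based test: each case traces a given input to an expected output. A minimal sketch, assuming a hypothetical requirement that a pitch command be limited to plus or minus 20 degrees; the function, values, and test are illustrative, not from any real flight system:

```python
# Minimal sketch of requirements-based, input-output verification.
# Assumed requirement: "the pitch command shall be limited to +/-20 degrees."

def limit_pitch_command(commanded_deg: float) -> float:
    """Clamp a pitch command to the assumed +/-20 degree authority limit."""
    return max(-20.0, min(20.0, commanded_deg))

def test_pitch_limit_requirement():
    # Each case traces back to the written requirement: input -> expected output.
    assert limit_pitch_command(5.0) == 5.0      # in range, passed through
    assert limit_pitch_command(35.0) == 20.0    # above limit, clamped
    assert limit_pitch_command(-90.0) == -20.0  # below limit, clamped

test_pitch_limit_requirement()
print("all requirement-derived cases pass")
```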
The software controlling autonomous
cars and aircraft will have to be capable of learning from experience and
reacting to situations the designers couldn't anticipate, and its decisions may
be hard to interpret.
"When our autonomous system is doing something
counterintuitive, is it doing something wrong or right? Sometimes the
explanation for that behavior is very complicated," says
Kochenderfer.
Companies developing urban air taxis like Boeing's Aurora
Flight Sciences, Textron unit Bell Helicopter and billionaire Larry Page's Kitty
Hawk are in a dialogue with the FAA to establish a roadmap for bringing their
vehicles to market. The University of Michigan's Atkins says that a vast computer science talent gap between the companies and the FAA slants that conversation toward industry.
She says that it's critical for the agency to hire more
computer science experts and develop the ability to independently validate and
verify code.
An FAA R&D and engineering advisory committee chaired by
Hansman has warned repeatedly over the years that the agency needs more
expertise in software, among other technical areas. However, the agency faces a
tall task in competing for computer science graduates with Silicon Valley and
the urban air mobility startups that offer the excitement of building new things
and the possibility of striking it rich.
The FAA said officials were
unavailable for an interview. A spokesman told Forbes by email that the agency
has significant capabilities in computational and automation systems, and that
it will take incremental steps toward introducing autonomous
aircraft.
Experts say they're confident that the technical problems of making autonomous systems safe are solvable; whether they can be solved affordably is another question.
The FAA requires redundancy of safety-critical sensors and systems on passenger aircraft - for example, airliners typically have three independent flight guidance computers. Those backup systems mean higher cost and weight, which could reduce payload. Given the small size of most autonomous air taxi concepts, that could blow up the business case.
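Why three computers rather than two? With three independent channels, the system can outvote a single faulty one. A minimal sketch of mid-value (median) selection, one common voting technique; the channel values are illustrative:

```python
# Minimal sketch of triple-redundancy voting: three independent channels
# compute the same quantity, and the median is selected as the output.

def mid_value_select(a: float, b: float, c: float) -> float:
    """Mid-value voting: the median is immune to any single channel failing."""
    return sorted([a, b, c])[1]

# Channel B fails high; the voted output still tracks the two healthy channels.
print(mid_value_select(101.2, 999.0, 100.8))  # -> 101.2
```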
"I'm skeptical we can provide the same level of integrity in a
small autonomous vehicle at a price point we can afford," says
Hansman.
Continued progress in shrinking the size of electronics will be
necessary - Honeywell, for example, has developed radar sensors for drones that
are the size of a paperback book. And researchers are developing ways to
substitute physics-based models for redundant sensors. But much work remains to be done, and the spotlight that the 737 MAX crashes have put on flight controls, and on the fact that components can fail with catastrophic consequences, serves as a sobering reminder of the stakes.
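The idea of substituting a physics-based model for a redundant sensor can be sketched simply: predict what the sensor should read from other known quantities, and flag it when measurement and prediction diverge. A hedged Python sketch, with assumed names, quantities, and tolerance:

```python
# Illustrative sketch of a physics-based model standing in for a second sensor:
# predict the next altitude from the last altitude and vertical speed, then
# flag the altitude sensor if the measurement strays from the prediction.

def altitude_sensor_ok(prev_alt_ft: float, vs_fpm: float,
                       measured_alt_ft: float, dt_s: float = 1.0,
                       tolerance_ft: float = 50.0) -> bool:
    """Compare a measured altitude against a model-predicted altitude."""
    predicted = prev_alt_ft + vs_fpm * (dt_s / 60.0)  # feet per minute -> feet per dt
    return abs(measured_alt_ft - predicted) <= tolerance_ft

print(altitude_sensor_ok(10000.0, 600.0, 10010.0))  # True: consistent reading
print(altitude_sensor_ok(10000.0, 600.0, 12050.0))  # False: flag the sensor
```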
"It's fun to talk about Amazon package
delivery and urban air mobility through Uber," says Peter Seiler, an aerospace
professor at the University of Minnesota. "But you can't just anticipate that
you've gotten on airplanes all your life and it's rare for these things to fail
and that's all going to be fine. There are technical issues that have to be
worked out to maintain that track record of safety and reliability."