Investigators Probe Software's Role In Deadly Boeing 737 Max Crashes | Connecticut Public Radio
WNPR

Mar 22, 2019
Originally published on March 22, 2019 8:04 am
Copyright 2019 NPR. To see more, visit https://www.npr.org.

STEVE INSKEEP, HOST:

The investigation of two Boeing plane crashes points, for the moment, at software. There's evidence that a program called MCAS pointed the planes' noses down repeatedly without the pilots even knowing, ultimately leading to the deaths of hundreds of people. Boeing and the FAA have promised a programming patch, as it's called. This is a reminder that software is designed into almost everything around us, and not always for the best. Here's NPR's Martin Kaste.

MARTIN KASTE, BYLINE: The first time it really sank in for most people that software is in everything was during the VW scandal, when we found out that a program inside the cars had been quietly cheating on emissions tests. At the time, WBUR in Boston had Ray Magliozzi of Car Talk on, and the conversation quickly turned to things that have minds of their own.

(SOUNDBITE OF ARCHIVED BROADCAST)

RAY MAGLIOZZI, BYLINE: So I bought myself a wonderful television that decides at 2 o'clock in the morning that it wants to turn itself on.

ANTHONY BROOKS, BYLINE: (Laughter).

MAGLIOZZI: But we are frustrated by the technology around us because unlike, you know, televisions of old where you could give them a belt on the side...

BROOKS: That's right.

MAGLIOZZI: ...Or whatever, or take a tube out and fix it, those days are gone.

KASTE: Chris Gerdes would agree with that. Though he'd say that when it comes to cars, software has mostly been a good thing.

CHRIS GERDES: I think software has given us some amazing performance that we've become used to in our daily lives.

KASTE: Gerdes is a professor of mechanical engineering at Stanford. He says software has generally made cars cleaner and safer. But he says other aspects of design can get short shrift.

GERDES: A lot of times, you see systems that would be much easier to control if somebody had been thoughtful about the mechanical design.

KASTE: He thinks this is actually more true in other industries, for instance, medical devices, where he sometimes does consulting. He'll encounter a poorly designed moving part on some machine, and the assumption is that the problem will eventually be fixed by the control software.

GERDES: You know, there have been a few cases where it's been exactly that conversation of - wait a minute, why did you do that?

KASTE: Sometimes software can make a medical device potentially deadly. The most notorious case was in the '80s when a new design of a radiation therapy machine replaced the mechanical safety controls with software - buggy software, it turned out. And patients were injured or killed by excessive radiation. And just last year, software problems were the single biggest cause of medical device recalls. Even when lives aren't at stake, software solutions on physical things can sometimes be just plain annoying.

KARA PERNICE: I was driving my parents' car the other day, and I could not figure out how to turn down the air conditioning on a touch screen. Like, are you kidding me?

KASTE: Kara Pernice is senior VP of the Nielsen Norman Group, a company that specializes in user experience design - UX, they call it. She's not against software. But too often, she says, she sees it just slapped on top of other systems, like the car's dashboard, without considering whether maybe a mechanical solution might have been better.

PERNICE: Many times, hardware-software creation is disjointed. So I think this idea that software people are brought in later is common, and it's a huge problem.

KASTE: In the case of the 737 Max, the suspect software was added to compensate for a compromise in the physical design. Boeing had added newer, bigger engines to an older airframe, and that made the nose tip up more. So the software was added to push the nose back down again.

KRISTI MORGANSEN: I am very carefully not commenting on the ongoing investigation because it's an ongoing investigation. We don't know.

KASTE: This is Kristi Morgansen, professor of aeronautics and astronautics at the University of Washington in Seattle. Without passing judgment on the 737, she says it is normal to use software to improve on older legacy parts of a plane.

MORGANSEN: It would be so cost-prohibitive to start from scratch, to build an entirely new software system - a new avionics system to go with it. It would take so long that the business model doesn't work.

KASTE: Business models are what concern Eben Moglen. He's a law professor at Columbia and champion of software transparency. He says industry sometimes uses software to cut corners on things, for instance, image-enhancement software in camera phones, which lets them use cheaper lenses.

EBEN MOGLEN: Because it allows them to make pictures that consumers think are just wonderful out of tiny bits of bad glass, right?

KASTE: But he says that's a minor concern compared to the 737 situation. He's scandalized that the pilots may not have known about the software or how it worked. And for Moglen, this is a much deeper problem.

MOGLEN: What we're looking at, in the case of some aerodynamic software taking over from pilots without telling them, is an example of why, even if you didn't think this had anything to do with politics, it is still true that systems that don't explain themselves to the human beings who interact with them are dangerous.

KASTE: It's politics to him because it's about political values, like freedom. He says if we in the 21st century still believe in human autonomy, then we need to put that value into the software that runs our world. Martin Kaste, NPR News.

(SOUNDBITE OF JOASIHNO'S "EFOM")

Transcript provided by NPR, Copyright NPR.