121pilot, a U.S. commercial airline pilot and contributor to Live and Let’s Fly, writes about automation.
Automation Of Commercial Airlines? It’s Not Going To Happen…
Matthew’s recent blog entry about Boeing’s desire to cut pilots out of flying is just the latest in a long string of articles in aviation journals about how airliners are moving toward being fully automated and eliminating pilots. There is a good deal of wishful thinking on the subject, and some of it comes from people who should know better. It’s one thing for the Federal Aviation Administration (FAA) to contemplate allowing small drones to fly around delivering packages to your door. It’s quite another to contemplate a large airliner full of people operating autonomously. Frankly, it’s not going to happen for many years to come, if at all. To understand why, we need to look at the rules under which aircraft are certified.
Examining The Rules
Airliners are designed according to a strict set of rules intended, first and foremost, to ensure safety of flight. The requirements an aircraft must meet in order to be certified to fly are listed in Part 25 of Title 14 of the Code of Federal Regulations, otherwise known as the Federal Aviation Regulations (FARs). The section most pertinent to our discussion is 25.1309, Equipment, Systems, and Installations. This section of the FARs specifies:
“The airplane systems and associated components, considered separately and in relation to other systems, must be designed so that – (1) The occurrence of any failure condition which would prevent the continued safe flight and landing of the airplane is extremely improbable.”
Determining whether a proposed design meets this requirement requires failure analysis. How that must be done is spelled out in another document, Advisory Circular (AC) 25.1309-1. It’s worth noting that the current released version of this document is 25.1309-1A, but the latest draft, not yet officially published, is the B version. Since the FAA and EASA are accepting the use of the B version in aircraft certification programs, it is the one I’m going to reference.
The AC goes on to classify various failures into five possible categories:
- No Safety Effect
- Minor
- Major
- Hazardous
- Catastrophic
Each failure category has a probability associated with it that spells out the allowable likelihood of such a failure occurring. Let’s look at what the one we are most concerned with means. A catastrophic failure is one that would result in fatalities, usually with the loss of the aircraft. Such failures must be shown by analysis to be Extremely Improbable, which means that the odds of such a failure occurring are no worse than 10⁻⁹ (1 chance in a billion) per flight hour.
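To make the ladder of categories concrete, here is a minimal sketch in Python. The probability thresholds are the standard per-flight-hour values from AC 25.1309; the helper function and the example probability are purely illustrative, not taken from any real certification analysis.

```python
# Maximum allowable average probability per flight hour for each
# failure-condition class (thresholds per AC 25.1309; the code itself
# is an illustrative sketch, not a certification tool).
THRESHOLDS = {
    "No Safety Effect": 1.0,   # no quantitative requirement
    "Minor":            1e-3,  # may be "probable"
    "Major":            1e-5,  # must be "remote"
    "Hazardous":        1e-7,  # must be "extremely remote"
    "Catastrophic":     1e-9,  # must be "extremely improbable"
}

def meets_requirement(severity: str, prob_per_flight_hour: float) -> bool:
    """Check an analyzed failure probability against its severity class."""
    return prob_per_flight_hour <= THRESHOLDS[severity]

# A hypothetical autoflight failure analyzed at 1e-7 per flight hour
# clears the Hazardous bar but falls two orders of magnitude short of
# the Catastrophic one.
print(meets_requirement("Hazardous", 1e-7))     # True
print(meets_requirement("Catastrophic", 1e-7))  # False
```

The point of the two orders of magnitude between Hazardous and Catastrophic is exactly what makes full automation so hard to certify: any failure that crashes the airplane must clear the very bottom rung of this ladder.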
Consider then an automated airliner. Clearly a failure of the automation that flies the airplane is going to be classified as a catastrophic event because it’s almost certainly going to result in a crash with fatalities. Which means that the automation is going to have to be shown to have a chance of failure that is extremely improbable. That’s going to include its ability to deal with various system malfunctions that can be expected to occur.
If you could limit the range of probable malfunctions to the more normal types of events like an engine failure on takeoff you could certainly program a system that could react properly to such an event and safely fly the airplane back around to a landing. The problem comes when such systems are inevitably exposed to the messy real world. It’s simply not going to be possible to program an automated system that never needs human oversight or decision making. The range of variables is simply too high. Which means on some level you’re going to have to have humans in the loop.
A Single Pilot Solution?
This raises the question of how you do that. One proposal is to go to a single-pilot airliner where the pilot is essentially there to supervise the automation and deal with the unexpected. There are, however, numerous problems with this idea. First and foremost, the human body isn’t sufficiently reliable. Pilot incapacitations, although rare, do occur, and people have died at the controls. It’s one of the principal reasons that airliners and large aircraft are not already operated single-pilot.
Second, you do have to consider the possibility of pilot suicide. With only one pilot in the plane, there is nothing to stop an unbalanced individual from taking others with him.
Third is the issue of proficiency. Flying is a perishable skill that, if not practiced regularly, can be lost. The industry is already working to address issues of automation dependency, and a single-pilot airliner is only going to make that worse. Remember, the single pilot isn’t there to fly the airplane; he’s there to take control in case the automation fails.
Fourth is the issue of qualifications. With only one pilot in the airplane, you’re going to need people with more experience and even more rigorous training than what is required today. But if future airliners are flown by only one person, how will we produce the next generation of pilots with that experience?
Lastly, even with a very highly trained single pilot, the sort of event that is going to dump the airplane in his lap really needs two pilots to manage. The workload when you’re facing major systems failures simply gets too high. It’s why Cockpit Resource Management, which trains crews to act as a team, has proven so successful at reducing the accident rate from the days when the Captain was god on high, not to be questioned.
Remote Systems Prone To Errors, Hacking
So, if an automated airliner with a single pilot onboard isn’t the answer, what about some system of remote control like what is used with drones right now? It would certainly address issues of incapacitation and suicide, for example, and a remote pilot would also, in theory, be able to respond to the unexpected. The major problem comes with the data link and its associated systems. If they fail, the results could be catastrophic, which means the chance of failure must be shown to be less than 10⁻⁹. There is another problem to consider the moment we allow remote control of an airliner: hacking. If you have the ability to remotely control an airliner, there will be people who want to exploit that for malicious ends. Which means the chance of that data link being hacked must also be shown to be less than 10⁻⁹. I feel pretty comfortable stating that a data link that is both that reliable and that un-hackable doesn’t exist and isn’t likely to exist, possibly ever.
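A quick back-of-the-envelope calculation shows why the remote-control idea compounds the problem. If the link can be lost either by technical failure or by a successful hack, the two risks add (to first order), so each one individually needs to be well below the 10⁻⁹ bar. The per-flight-hour probabilities below are made-up placeholders for illustration, not real estimates.

```python
# Two independent ways to lose control of a remotely piloted airliner.
# Both rates are invented placeholders, each set right at the 1e-9 bar.
p_link_failure = 1e-9  # assumed technical data-link failure, per flight hour
p_hack         = 1e-9  # assumed successful hack, per flight hour

# Probability that at least one of the two independent events occurs:
# 1 minus the probability that neither happens.
p_loss_of_control = 1 - (1 - p_link_failure) * (1 - p_hack)

print(p_loss_of_control)          # roughly 2e-9
print(p_loss_of_control <= 1e-9)  # False: the combined risk busts the bar
```

In other words, even if each individual threat were engineered exactly to the catastrophic threshold, the combined chance of losing the airplane would be about twice the allowable limit.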
Consider too that even if you developed a link that was believed to be reliable and secure enough, what would happen if it were compromised? You’d have to immediately ground the entire fleet, and it’s possible it might never fly again. Imagine the effect if your automated airliner became as common as the 737 is today and you suddenly had to ground the worldwide fleet, perhaps permanently.
“Black Swan” Events
Finally, we have to discuss the probability of a black swan type of event: an unpredictable event beyond the bounds of what is normally expected, possibly with severe consequences. Qantas 32 was such an event, in that the shrapnel from the engine did so much damage and disabled so many systems that it was well outside the bounds of anything that had been considered. Continental flight 120 was another such event.
In this case the 757 was flying from Anchorage to Seattle with one of its generators inoperative. Mid-flight, at night, over water off the coast of Alaska, the remaining engine-driven generator failed. That left the APU carrying the electrical load, which it should have been able to do, until a few minutes later it overheated and shut down. At this point the last line of defense, an emergency backup generator powered by aircraft hydraulics, should have come on, but it too failed, leaving the crew with nothing more than battery power. Certifying an automated airliner means creating a system robust enough to handle such events. The problem is that, by definition, a black swan event is unpredictable, which again argues that on some level you’re going to have to have a human in the loop.
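This incident also shows why black swans slip through the probability analysis. If you treat each electrical source as failing independently, multiplying the individual rates makes total power loss look comfortably "extremely improbable" on paper. The rates below are invented for illustration; the lesson is that the independence assumption, not the arithmetic, is where reality can diverge.

```python
# Naive independence model of the 757's electrical sources. All failure
# rates are invented placeholders, per flight hour.
p_engine_gen = 1e-4  # assumed rate for each engine-driven generator
p_apu_gen    = 1e-3  # assumed rate for the APU generator
p_hyd_gen    = 1e-3  # assumed rate for the hydraulic backup generator

# If the four sources fail independently, the chance of losing all of
# them is simply the product of the individual rates.
p_total_loss = p_engine_gen * p_engine_gen * p_apu_gen * p_hyd_gen

print(p_total_loss)  # about 1e-14: far below the 1e-9 catastrophic bar
```

Yet the crew still ended up on battery power, because real-world failures are not always independent: common causes, deferred maintenance, and cascading effects can defeat a paper analysis that multiplies small numbers together.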
When an accident occurs in which pilot error plays a role, it’s understandable that people would look to eliminate the source of such errors. It seems easy to conclude that a computer would not have made the same mistakes and the accident would not have happened. This, of course, ignores events where the automated systems have failed and the difference between success and failure was a well-trained crew at the controls. The real question the MAX crashes give rise to isn’t one of automation; it’s one of pilot training and proficiency. But that’s a larger subject for another day.
Like Matthew, when I read chatter of automated airliners and how the piloting profession is going to go the way of the TV repairman or the video rental store, I’m drawn to a media image. In the first trailer released for the upcoming Top Gun sequel, the Admiral (played by Ed Harris) tells Maverick, “The end is inevitable, Maverick. Your kind is headed for extinction.” Maverick’s response says it all: “Maybe so, sir. But not today.”