In light of the recent Boeing 737 MAX crashes and ongoing recertification efforts, filmmaker Tom Goodall wonders what might happen if over-automation in aviation goes too far.
With the release of the US Congressional Committee report on the Boeing 737 MAX crashes, the issue of automation in aviation is once again on our minds. The causes of those crashes were complex, but the events ultimately gave us a taste of what we fear over-automation could lead to: a computer malfunctioned, and the result was fatal. Autopilots, colloquially called “George”, are in control of our planes more than ever.
This has been the trend since the advent of human flight, and it has undeniably made flying safer. Even pilots skeptical of automation will admit that these advances have made their duties simpler and more focused. Accompanying this trend is the assumption that every step down this road is an improvement. With unmanned military drones already in the air, many believe automated cargo planes are next and self-flying passenger airliners are on the horizon. Some say we are approaching the pinnacle of aviation, but I am not so sure.
I’ve always loved flying. I’m one of the few people I know who’s excited by exploring airports and setting up camp in my economy seat, simply because the miracle of human flight never fails to leave me in awe. However, the 737 MAX incidents stirred something in me that I couldn’t ignore.
As a filmmaker by trade, I felt compelled to tell a cautionary tale about automation in aviation. My team and I began work on a film project titled Duration 13:00, which will explore the perils of technology taking the reins away from experienced pilots and the lengths the flight crew must go to in order to save everyone on board.
The response to our film project launch has been overwhelming. Our crowdfunding campaign exceeded expectations, surpassing its first goal in just two weeks, and it is now close to reaching our “unrealistic dream goal”. I’ve heard from pilots, commentators, enthusiasts, YouTube-influencer aviators, and members of the public, all praising the endeavour.
With automation apparently making flying ever safer, it may seem strange that our cautionary tale is resonating so deeply. It seems that something about this trend disconcerts many of us at a fundamental level. I’m not suggesting we halt this positive trend, only that it has a tipping point. Proponents of total automation might say we resist the idea out of some primitive instinct that needs control to feel secure.
But control isn’t all we’re giving up. If we give up control, we give up responsibility.
If a human makes a mistake, we own that. If a machine makes a mistake, we allowed that. If most auto-planes flew safely but even one crashed, then we could have done more. We could have tried to stop it, if only there were a human there to try to fix the problem. Human flight is a miracle and a marvel, but it takes only one little glitch to turn a soaring plane into a falling tin can. With stakes like that, I’m not sure this is a responsibility we ever want to let go of.
Automation goes too far when we give total control to a machine, when we no longer have the final say. Whether in a plane of tomorrow that can fly itself or in designing software for the planes of today, no matter how good the machines are, there should always be a pilot in the seat able to take control. That control, that responsibility, is ours.
Written by Tom Goodall