October 7th, 2008. An Airbus A330 quietly cruises from Singapore to Perth at 37,000 feet.
All of a sudden, the aircraft pitches down without any order from the pilot. In fact, the pilot flying is trying to regain control and to understand what is going on. Five minutes later, and without any prior warning, the aircraft pitches down for a second time. One hour after these events, the airplane lands at Learmonth Airport, Western Australia.
Kevin Sullivan, the Captain on that flight, said after the incident: “One thing is certain, the computers blocked my control inputs. For a pilot, loss-of-control is the ultimate threat. It is our job to control the aircraft, and if computers and their software, by design can remove that functionality from the pilot, then nothing good is going to come out of that.”*
Here we are – a bit more than a decade later – discussing how to take those pilots out of the flight deck. The narrative that humans create problems and machines solve them is flawed. Yet this argument is used to promote the European agenda of introducing Artificial Intelligence (AI) technology across a number of sectors.
It is true that humans make mistakes. But humans have innate characteristics that even the most advanced machines cannot match. Creativity, resilience and teamwork are human capacities that AI or machine learning (ML) cannot replicate at this moment in time.
Incorporating artificial intelligence in commercial aircraft with the aim of helping pilots might be a wise step. But the idea of completely replacing a pilot with AI raises all kinds of safety questions and opens an ethical debate about the role of humans that is equally, if not more, important.
When the first aviation legislation was developed at an international level, the need for humans to be in control was built in. This is why a pilot may deviate from the rules in order to preserve the safety of the flight. We, as professional pilots, accept responsibility for the safety of hundreds of people because we are in control of the machine.
I firmly believe that our society is not ready to accept that a machine, no matter how “intelligent” it may be, could take control of our lives and take decisions on our behalf. But I fear that some of our aviation authorities may not feel the same. And this is a conversation we need to have, before we give up control.
* Excerpt from the book “No Man’s Land” (Kevin Sullivan, ABC Books, 2019)
by Juan Carlos Lozano, ECA Vice-President
Airbus A320 Captain, Iberia