The Boeing 737 Max and the risks of automation over-reliance



Around the world, Boeing 737 Max planes sit unused, collecting dust. The grounded planes are awaiting permission to return to service, following two crashes within five months of each other. In both cases, it seems that the pilots lost a battle against an automated system, known as MCAS — with fatal consequences.

These tragedies have highlighted an issue that’s not restricted to the Boeing 737 Max. Indeed, they carry a lesson for automation users across all industries: the risks of automation over-reliance.

Here’s what we can (and should) learn from the Boeing 737 crashes.


What happened?

October 29, 2018, and March 10, 2019, have a sad occurrence in common: a Boeing 737 Max crash. Both incidents resulted in a loss of life, and both appear to relate to the planes’ MCAS automation.

MCAS is short for ‘Maneuvering Characteristics Augmentation System’. It is automation software designed to stop the plane from stalling while gaining altitude. The plane’s larger, more powerful engines risked pushing its nose too high. To avoid this, sensors would inform MCAS when a climb was too steep, and the system would automatically adjust the nose of the plane down to restore a safe angle.

Unfortunately, a faulty angle-of-attack sensor fed incorrect data to the system, falsely triggering it. As a result, the software repeatedly pushed the planes’ noses down. Despite their best efforts, the pilots could not override the system, and, after a struggle, the planes went down.
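
To make that failure pattern concrete, here’s a deliberately simplified sketch in Python. It is not Boeing’s implementation; the thresholds, step sizes and function names are illustrative assumptions. What matters is the shape of the logic: one sensor is trusted blindly, and nothing lets the humans intervene.

```python
STEEP_CLIMB_DEG = 15.0     # assumed "climb too steep" threshold (illustrative)
NOSE_DOWN_STEP_DEG = 2.5   # assumed nose-down adjustment per cycle (illustrative)


def faulty_sensor():
    """A stuck sensor: always reports a dangerously steep climb."""
    return 21.0


def over_reliant_loop(read_sensor, cycles):
    """Trust a single sensor and act on it, with no pilot override path."""
    pitch = 5.0  # the plane's actual, healthy pitch angle in degrees
    for _ in range(cycles):
        if read_sensor() > STEEP_CLIMB_DEG:   # faulty input, trusted blindly
            pitch -= NOSE_DOWN_STEP_DEG       # ...so the nose keeps dropping
        # No cross-check against a second sensor, no limit on total correction,
        # and no branch that lets the pilots disengage the system.
    return pitch


print(over_reliant_loop(faulty_sensor, cycles=6))  # -10.0: a dive
```

A few cycles of bad data are enough to trim a perfectly healthy plane into a dive, because the design assumed the sensor could not be wrong.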


Automation over-reliance

The fault, however, doesn’t lie with automation itself, but with over-reliance on it. In both crashes, the automation did exactly what it was designed to do: it responded correctly to the input it received. But the input was faulty, and the pilots couldn’t disengage the system.

In other words, automation over-reliance led to powerless pilots and two tragedies.

This over-reliance on automation can cause problems in any industry. There’s a real danger in placing too much trust in your automation system and forgetting your human team in the process. When that happens, humans aren’t equipped to take over and handle faulty input or an anomaly, and chaos can ensue.

The crashes serve as a stark example. The MCAS system was put in place to prevent a problem that a human pilot could have handled in most cases. Instead, the system was relied on entirely, to the point that it was impossible for the pilots to fly without it active. One piece of faulty input later, everything was out of control.


Automation is great, but it isn’t infallible

Automation software has proved a boon across every industry, including aviation. It’s reduced manual error, improved the efficiency of customer service, and boosted productivity. It’s helped feed conversational UI, giving chatbots their early start. And it’s even created new jobs.

Used correctly, as a tool to help humans rather than replace them, automation has very little downside. It’s perfect for handling repetitive, routine tasks.

But even in business process automation, machines are not infallible. Human flexibility and problem-solving should still sit at the core of your workflows. This is where the risks of automation over-reliance come in.


Man vs machine

The core characteristic of automation over-reliance is the use of automation systems as the first port of call. That is, we rely on automation to do the job, while humans monitor it. Rather than automation acting as a safety net — a tool to help humans — humans are acting as a safety net for automation systems.

As a result, humans feel displaced from their roles. With little more to do than monitor and feed the automation, they’re at risk of growing complacent. When this happens, they’re less likely to notice mistakes.

Worse, a culture of automation over-reliance grows unhealthier over time. It can mean automation is invariably used for every new, non-routine task, even those that a human team member can and should handle.

It’s a vicious cycle. Automation over-reliance causes complacency, and complacency causes more over-reliance. Software is eating the world, but we shouldn’t let it eat our ability to act on our own.


Automation and a human-centred approach

The key to avoiding automation over-reliance lies in changing your automation approach. Instead of automating everything you can, put human ability at the core of your business. That is, your automation should weave around your human team, with automated processes offering a safety net against manual error.

In other words, you should automate only the repetitive processes that will support a human team. (And not those that will replace them.) Human team members should also have adequate training, so that they know the best ways to use automation to support their workflows. 

The key takeaway here is the need for a human-centred approach to automation. For the Boeing 737 Max’s MCAS, this approach would have meant the pilots remained in control. The automation would act only as a safety net in the event of a manual error, and it could be turned off easily in the event of a fault.
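
As a rough illustration, here’s the earlier sketch rebuilt around the human. The numbers and names are still illustrative assumptions, not a real flight control design; the point is the priority order: pilot first, then a disengage switch, then a sensor cross-check, and only then a bounded correction.

```python
STEEP_CLIMB_DEG = 15.0         # assumed threshold, as before (illustrative)
NOSE_DOWN_STEP_DEG = 2.5       # assumed correction step (illustrative)
SENSOR_DISAGREEMENT_DEG = 5.0  # assumed tolerance between redundant sensors


def safety_net_step(pitch, sensor_a, sensor_b, pilot_command, engaged):
    """One control cycle: the pilot's command always comes first."""
    if pilot_command is not None:
        return pitch + pilot_command       # the human stays in the driver's seat
    if not engaged:
        return pitch                       # a single switch disengages the system
    if abs(sensor_a - sensor_b) > SENSOR_DISAGREEMENT_DEG:
        return pitch                       # sensors disagree: do nothing, alert the crew
    if sensor_a > STEEP_CLIMB_DEG:
        return pitch - NOSE_DOWN_STEP_DEG  # a bounded, single-step correction
    return pitch


# The stuck sensor (21.0) disagreeing with a healthy one (5.0) changes nothing:
print(safety_net_step(pitch=5.0, sensor_a=21.0, sensor_b=5.0,
                      pilot_command=None, engaged=True))  # prints 5.0
```

The same faulty input that doomed the earlier loop is now caught by a redundant sensor, and even if it weren’t, the pilots could switch the system off with a single control.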


Learning from tragedy

The automation lesson behind these tragedies is one that should not go unnoticed.

When deploying automation, it’s important to keep humans at the core of what you do. Keep humans in the driver’s seat and let automation act as their seat belt. Weave automation into their tasks, processes and workflows. Avoid automation over-reliance.