Commentary: You know who should study the Boeing 737 crashes? Driverless car makers

The two disasters serve as a warning for industries where technology is taking over, including self-driving cars, says the Financial Times' Brooke Masters.

A boy looks on as forensic investigators work at the crash site of an Ethiopian Airlines Boeing 737 MAX aircraft. (Photo: AFP/TONY KARUMBA)

LONDON: I am not normally a nervous flyer. But the similarities between the recent Ethiopian Airlines disaster and the October crash of another Boeing 737 Max 8 flown by Indonesia’s Lion Air were enough to make me applaud the decision to ground the planes while authorities figure out what went wrong.

Boeing has been working on fixes for the 737 Max’s flight control software ever since the October crash was blamed in part on an anti-stall system going awry.

The software, known as MCAS, appears to have repeatedly forced the plane’s nose down because an “angle of attack” sensor misread the angle of the plane’s nose relative to the oncoming air. The pilots tried repeatedly to pull the nose up, but the plane fell into a fatal dive.

STILL WAITING

We are still waiting for a report on the cause of the Ethiopian crash, which killed 157 people. But on Monday, the US Federal Aviation Administration poured cold water on the company’s hopes of getting the jets back in the air quickly. 

“Time is needed for additional work ... to ensure that Boeing has identified and appropriately addressed all pertinent issues,” it said.

Initial reports suggest that flight recorder data from the Ethiopian crash shows resemblances to the Lion Air tragedy. 

If the two disasters do turn out to have similar roots, that should serve as a warning in other areas where technology is taking over part, though not all, of crucial tasks from human experts.

CRITICAL LESSONS FOR CAR MAKERS

The Lion Air pilots’ desperate struggle with the anti-stall software holds critical lessons for car makers experimenting with self-driving technology. 

It has emerged that Boeing and the FAA had agreed MCAS could be installed without extensive retraining of pilots who had flown other 737 models.

Questions are now being raised about whether that made it harder for human pilots to take control when MCAS erroneously forced the nose down.

Relatives react at the scene where the Ethiopian Airlines Boeing 737 Max 8 crashed shortly after takeoff on Sunday killing all 157 on board, near Bishoftu, south of Addis Ababa, in Ethiopia on Mar 13, 2019. (Photo: AP/Mulugeta Ayene)

Boeing argues that pilots were always able to override the system by flipping switches in the cockpit. But some pilot unions have said their members did not know enough about it. (US regulators are looking into the approval process.)

Similar issues have started to arise as car makers launch vehicles that can sometimes operate without human intervention, but are not fully self-driving. Tesla’s “autopilot” software keeps a car in lane, matches its speed to the traffic and leaves a motorway at the right exit. 

But drivers are told to stay alert, keep hold of the wheel and take over in tricky situations. That has not always happened, leading to fatal accidents.

INTRODUCING SEMI-AUTOMATION 'IRRESPONSIBLE'

The chief executive of Volvo Cars, Hakan Samuelsson, warned last week that introducing such semi-automation can be “irresponsible” and cause accidents when misplaced confidence leads to “over-reliance” by consumers.

What we know so far about the Boeing crashes offers another perspective on the same issue. Asking a highly skilled pilot to seize back control from a system he knew little about proved deeply problematic in the Lion Air case.

It would be that much harder for an unwary and ill-prepared driver to seize control just as a problem looms. Consider how scary it is to have to brake suddenly after a long period of monotonous driving.

Having to do it after a period of not driving at all feels like a recipe for disaster.

Boeing 737 MAX 8 aircraft have been grounded around the world following a deadly crash in Ethiopia that killed 157 people. (Photo: AFP/JOE RAEDLE)

ASSISTIVE TECHNOLOGY

Yet the reach of semi-autonomous cars is expanding. EU officials last week considered requiring new cars to be fitted with devices that cap a vehicle’s speed at the legal limit.

Such systems, which can be overridden by pushing hard on the accelerator, could cut fatalities by 20 per cent overall, but could also create hazards for drivers who are unprepared for the need to speed up. The EU ultimately shied away, and new cars will instead warn drivers when they are speeding.

But I can’t help wondering how many “assistive technologies” are being added to daily life without our fully understanding their effect. As Daimler chief executive Dieter Zetsche warned on Tuesday (Apr 2), it only “takes one spectacular incident” to undermine confidence, even in systems that on balance make us safer.

Boeing has agreed to equip all of its planes with alarms to alert pilots when key sensors disagree. Car makers should start thinking now about what they need to do to prevent the accidents that could befall unwary drivers.

Source: Financial Times/sl
