This semantic shift is having dangerous consequences that shouldn’t be overlooked.


When Lawrence Sperry demonstrated the first aviation autopilot in 1914, flying along the Seine in Paris with his hands off the controls, the crowd was amazed, most likely.


The system connected the hydraulics that control the plane’s elevators and rudder to its navigational systems.

Doing this enabled the plane to maintain course and elevation all on its own.

It’s kind of where Level 2 self-driving systems are now, but more on that later.
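In control terms, what Sperry built was an early closed feedback loop: gyroscopes sensed any deviation from the set heading and attitude, and the hydraulics applied a correction to the control surfaces. The Python sketch below is purely illustrative, with made-up gains and function names rather than Sperry’s actual mechanism or any real flight code, but it shows the general shape of such a loop.

    # A minimal, illustrative feedback loop in the spirit of an early
    # autopilot: sense the deviation from a setpoint, apply a proportional
    # correction to the control surfaces. All names and gains are hypothetical.
    def correction(setpoint, measured, gain):
        """Control-surface deflection proportional to the error."""
        return gain * (setpoint - measured)

    def autopilot_step(target_heading, target_pitch, gyro_heading, gyro_pitch):
        # Rudder counters heading drift; elevator counters pitch drift,
        # keeping course and elevation without the pilot's hands on the stick.
        rudder = correction(target_heading, gyro_heading, gain=0.5)
        elevator = correction(target_pitch, gyro_pitch, gain=0.8)
        return rudder, elevator

    # Example: the aircraft has drifted 3 degrees right and 2 degrees nose-down.
    print(autopilot_step(90.0, 0.0, 93.0, -2.0))  # -> (-1.5, 1.6)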


Despite massive developments in aviation autopilot systems, one thing has remained constant: human pilots.

Pilots are highly trained specialists in their field; car drivers, generally speaking, are not.

The incident

Take the recently closed case of Apple employee Walter Huang.


In 2018, Huang’s Tesla Model X was driving on Autopilot when it crashed.

Before the crash, Autopilot had been active for 18 minutes.

He was transported to hospital where he later died from his injuries.


It seems like Huang was treating his Tesla like a fully autonomous vehicle when it isn’t.

Teslas aren’t even close to being fully autonomous vehicles.

You cannot buy a self-driving car today.


We’re not there yet.

While Waymo regularly makes headlines, it’s only operating in Phoenix, Arizona at the moment.

Each level of driving automation differs subtly from the next; to group them all as one is heavy-handed.
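For reference, the levels in question are the SAE’s six levels of driving automation (J3016). The Python sketch below is a rough paraphrase of that taxonomy, not the SAE’s official wording; the point is that at Level 2 and below the human must supervise at all times.

    from enum import IntEnum

    class SAELevel(IntEnum):
        NO_AUTOMATION = 0           # the human does all the driving
        DRIVER_ASSISTANCE = 1       # one assist at a time, e.g. adaptive cruise control
        PARTIAL_AUTOMATION = 2      # steering and speed assists; the driver must supervise constantly
        CONDITIONAL_AUTOMATION = 3  # drives itself in limited conditions; the driver must take over on request
        HIGH_AUTOMATION = 4         # no driver needed within a defined area
        FULL_AUTOMATION = 5         # no driver needed anywhere; does not yet exist

    # Everything on sale to consumers at the time of writing, Autopilot
    # included, sits at Level 2 or below: the human remains responsible.
    driver_must_supervise = {level: level <= SAELevel.PARTIAL_AUTOMATION
                             for level in SAELevel}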


Indeed, it seems the public generally isn’t aware of these nuances either.

This is despite countless warnings in vehicle user manuals that specifically say these systems are not designed for full automation.

What’s more concerning, though, is that these systems rarely include any technology to prevent their misuse.


When the NTSB issued safety recommendations to manufacturers of these systems, five replied promptly, saying they were working to implement them.

Tesla, on the other hand, ignored the NTSB’s communications.

This is perhaps the defining point in every fatality that’s occurred in a Tesla whilst Autopilot was engaged.


The system can still work even if the driver isn’t paying attention.

Despite these cases, Tesla has remained adamant that its vehicles are safer than conventional cars.

One insurer’s research, however, found that Teslas were involved in up to 50% more accidents.


The insurer pointed to the instant acceleration of EVs as the most likely reason behind these incidents.

In all the Autopilot fatalities, drivers overestimated the technology and failed to adapt to its limitations.

It seems a lack of familiarity might be a larger factor in these accidents than has been acknowledged.


That said, it’s not entirely the fault of drivers.

From manufacturer to manufacturer, the way these systems are named and marketed lacks clarity.

Confusion costs lives

There are two ways to look at this.

For manufacturers, using words like Autopilot supposedly helps them differentiate their products from the competition.

But it also obscures the true nature of the features by wrapping them in misleading futuristic-sounding marketing language.

It’s clear that using terms like Autopilot as a generic catch-all for high-level driver aids is problematic.

It’s time the industry adopted consistent terminology that better describes what these systems actually do.

They should also make it impossible to use these technologies without paying full attention to the road.
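A “dead man’s handle” of the sort trams and trains have used for over a century is the obvious model. The sketch below is hypothetical: the timeout, class, and method names are invented for illustration and don’t reflect any manufacturer’s actual implementation. It simply shows how an attention watchdog could gate a driver-assistance feature.

    import time

    ATTENTION_TIMEOUT_S = 10.0  # grace period; the figure is purely illustrative

    class AttentionWatchdog:
        """Disengage the assist if the driver shows no sign of attention."""

        def __init__(self, timeout=ATTENTION_TIMEOUT_S):
            self.timeout = timeout
            self.last_attention = time.monotonic()

        def driver_input_detected(self):
            # Called whenever steering torque, eye gaze, or a button press
            # confirms the driver is paying attention.
            self.last_attention = time.monotonic()

        def assist_may_stay_engaged(self):
            # Returns False once the driver has been inattentive for too long;
            # the assist should then warn the driver and hand back control.
            return (time.monotonic() - self.last_attention) < self.timeout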

Drivers deserve safety, not chaos and confusion.

Story by Matthew Beedham

Matthew is the editor of SHIFT.

He likes electric cars, and other things with wheels, wings, or hulls.
