Tesla Autopilot ruling a “hazard light on misleading marketing”

A court in Munich has ruled that Tesla made exaggerated claims about its so-called ‘self-driving’ technology, with the carmaker’s use of the term ‘Autopilot’ coming under particular scrutiny.

As The New York Times reports: “Tesla’s use of the brand name Autopilot for its software, as well as claims the company made on its German website about the software’s function, create the false impression that the car can drive itself, a Bavarian state court ruled. In fact, the court said, Autopilot is a driver-assistance system that requires human intervention.”

Matthew Avery, research director at Thatcham Research, comments:

“We have long warned of the pitfalls of the Autopilot system. Its seemingly competent performance can encourage drivers to hand too much control to the vehicle and lose sight of their responsibilities behind the wheel.

“This over-reliance is a progressive process, and it begins when the ‘self-driving’ experience is marketed to motorists.

“Autopilot is not a self-driving system. It is there to provide driver assistance, not become an invisible chauffeur.

Naming is key

“We support the German court’s ruling. Naming is key, and Autopilot is an especially misleading term.

“How many times have movies depicted an airline captain disengaging completely when switching on autopilot – leaning back in their chair, reaching for a cup of coffee or even leaving the cockpit entirely?

“But it’s not just the name of the system. Tesla marketing frequently suggests the car is capable of ‘full self-driving’. Just recently some UK customers received an email communication stating:

Our records indicate that you haven’t upgraded your Model S… to Full Self-Driving Capability. You can upgrade now at a reduced price of £2,200.

Catastrophic outcomes

“The outcomes of driver over-reliance on the Autopilot system can be catastrophic. Reports of accidents with Autopilot engaged have become all too familiar. Many are fatal, and we don’t know whether drivers were ‘taking a chance’ or, worse still, genuinely believed their Autopilot system was fully capable of driving the car itself.

“When marketed and used sensibly, systems like this will ultimately benefit road safety. Without a safety-first principle enshrined in the adoption of new technology, however, our roads will become more dangerous, and it will take longer to reap the societal benefits these systems have the potential to bring.

Spotlight on driver assistance

“We’re therefore continuing to shine a spotlight on driver assistance systems by testing and evaluating their performance. As a member of Euro NCAP, we have developed new protocols and testing methodologies, and we’re launching the results of these tests later this year. They will show how these systems should be used, and how effective each is at offering drivers the right level of assistance.

“The resulting ratings will also consider carmaker marketing, and encourage carmakers to be prudent in how they describe system performance.

“Although the case in Germany focused on Tesla’s Autopilot, we believe it should serve as a brightly flashing hazard light on misleading marketing for all carmakers.

“If the warning is heeded, we look forward to a future where these exciting technologies can truly deliver on their promise.”

Thatcham Research is at the forefront of vehicle testing and a champion for the safe adoption of new vehicle technologies. In recent years its views on the dangers of overselling the so-called ‘self-driving’ capability of current driver assistance systems have been covered widely in the media.

In September 2019, Thatcham Research also launched 12 guidelines to minimise bumps in the road on the journey towards fully Automated Driving. Developed with the ABI, the guidelines came as part of Thatcham Research’s work with international regulators on designing new rules that will eventually allow Automated Driving Systems onto motorways.

Follow the link for video and to download the full ‘Defining Safe Automation’ report.