Tesla hit parked police car 'while using Autopilot'

Image source: Laguna Beach Police Department
Image caption: A number of Tesla vehicles have been involved in crashes.

A Tesla car has crashed into a parked police car in California.

The driver suffered minor injuries and told police she was using the car's driver-assisting Autopilot mode.

The crash has similarities to other incidents, including a fatal crash in Florida where the driver's "over-reliance on vehicle automation" was determined to be a probable cause.

Tesla has said customers are reminded they must "maintain control of the vehicle at all times".

In a statement, it added: "When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel."

It has yet to be confirmed whether the Autopilot mode was indeed engaged.


The California crash appears to be the latest example of semi-autonomous vehicles struggling to detect stationary objects. A Tesla operating in Autopilot mode hit a stationary fire engine in Utah in May.

According to a police report obtained by the Associated Press, the Tesla accelerated before it hit the vehicle.


It has also emerged that a Tesla Model 3 driver has blamed Autopilot for a crash in Greece last Friday, in which the car suddenly veered right "without warning".

The motorist, You You Xue, voiced his concerns about Autopilot on Facebook.

"The vigilance required to use the software, such as keeping both hands on the wheel and constantly monitoring the system for malfunctions or abnormal behaviour, arguably requires significantly more attention than just driving the vehicle normally," he wrote.

One influential tech industry-watcher has raised concern about Tesla's software, noting that Google's car division has claimed that an all-or-nothing approach is safer.

"There is a serious argument that the incremental, 'level 2/3' approach to autonomous cars followed by Tesla, where the human isn't driving but might have to grab the wheel at any time, is actively dangerous and a technical dead end," tweeted a partner at the venture capital firm Andreessen Horowitz.

"Waymo decided not to do this at all."

It is not the first time the Autopilot feature has been linked to dangerous behaviour.

In England, a driver was banned from driving after putting his Tesla in Autopilot on the M1 and sitting in the passenger seat.

Media caption: Bhavesh Patel was filmed by a passenger in another car

'Deceptive' naming

The news comes after two US rights groups urged the Federal Trade Commission to investigate Tesla over its marketing of the assisted driving software.

The Center for Auto Safety and Consumer Watchdog said it was "reasonable" for Tesla owners to believe their cars could drive themselves while in Autopilot mode.

The groups called the naming of Autopilot "deceptive and misleading".

Media questions

The chief executive of Tesla, Elon Musk, has previously complained about media attention on Tesla crashes. He tweeted: "It's super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage."

His comments received support from prominent academic and psychologist Steven Pinker, who has in the past voiced concerns about Tesla's Autopilot.
