Editor’s note: In a letter to Tesla founder and CEO Elon Musk, Consumer Watchdog calls on him and the company to disable Tesla’s autopilot feature in the wake of a recent death and other incidents. The California-based group has been an outspoken critic of autonomous car technology as developed by Google and other companies. You can watch a Tesla video about autopilot at: https://www.youtube.com/watch?v=jWreyC2l-dw. The full text of the letter to Musk is reprinted here:

Consumer Watchdog Calls On Elon Musk To Disable Tesla’s Autopilot And Pledge To Be Liable For Self-Driving Failures If Feature Returns

We are writing to express Consumer Watchdog’s concerns about Tesla Motors’ woefully inadequate response to the tragic May 7 fatal crash in Florida of a Tesla Model S being controlled by the autopilot feature, and about the emerging pattern of blaming the victims of crashes that occurred while the feature was in use. Our first concern is your inexplicable delay in announcing the crash. You made a public acknowledgement on June 30, only after the National Highway Traffic Safety Administration announced it was investigating. Such a delay, during which your customers continued to drive on public highways relying on autopilot and believing it to be safe, is inexcusable.

An autopilot whose sensors cannot distinguish between the side of a white truck and a bright sky simply is not ready to be deployed on public roads. Tesla should immediately disable the autopilot feature on all of its vehicles until the feature can be proven safe. At a minimum, autopilot must be disabled until the complete results of NHTSA’s investigation are released.

One of the most troubling aspects of Tesla’s deployment of autopilot is the decision to make the feature available in so-called Beta mode. That’s an admission it’s not ready for prime time. Beta installations can make sense with software such as an email service; they are unconscionable with systems that can prove fatal when something goes wrong. You are in effect using your customers as human guinea pigs.

You want to have it both ways with autopilot. On the one hand, you extol the supposed virtues of autopilot, creating the impression that, once engaged, it is self-sufficient. Your customers are lulled into believing their car can safely be left to do the driving itself. On the other hand, you walk back any promise of safety, saying autopilot is still in Beta mode and drivers must pay attention at all times. The result of this disconnect between marketing hype and reality was the fatal crash in Florida, as well as other non-fatal crashes that have now come to light.

If autopilot is ever proven safe to deploy, Tesla must assume liability for any crashes that occur when the feature is engaged. Both Volvo and Mercedes have said they will accept liability when their self-driving technology is responsible for a crash. Are you willing to make that pledge? In response to the tragic fatal crash in Florida, Tesla said in a blog post, “We would like to extend our deepest sympathies to his family and friends.” Do you accept responsibility for the crash?

Instead, based on your own statements, you appear to have a pattern of blaming the victims. In a July 6 blog post that referred to the Florida crash, Tesla said:

“To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle’s position in lane and adjusts the vehicle’s speed to match surrounding traffic.”

There are more troubling indications that you are not willing to assume responsibility when autopilot fails, but would rather blame the victims. On July 1, a Model X crashed and rolled over on the Pennsylvania Turnpike while in autopilot mode, according to the driver. Fortunately, the driver and his passenger survived the rollover. Tesla’s response: “Based on the information we have now, we have no reason to believe that Autopilot had anything to do with this accident.” Last November, according to the Wall Street Journal, a Tesla in autopilot mode rear-ended a parked truck on I-66 in Virginia. Tesla said the crash “was the result of driver error…. To an attentive driver, it would have been clear that the driver should immediately slow the vehicle to avoid the accident.”

Tesla is rushing self-driving technologies onto the highways prematurely. As the crashes demonstrate, autopilot isn’t safe, and you should disable it immediately. If autopilot can ultimately be shown to meet safety standards and is then redeployed, you must pledge to be liable if anything goes wrong while the self-driving system is engaged.