It was the crash the auto industry knew was coming but still feared.

The death of Joshua Brown, 40, who was using Tesla Motors’ semi-autonomous Autopilot mode, could add to the public’s apprehension about driverless cars even before they reach the road in big numbers. Most major automakers and technology companies, including Google and Uber, are working on fully autonomous cars and have worried that a highly publicized crash could hurt those efforts.

A driver so enamored of his Tesla Model S sedan that he nicknamed the car “Tessy” and praised the safety benefits of its sophisticated “Autopilot” system, Brown has become the first U.S. fatality in a wreck involving a car in self-driving mode.

Brown, from Canton, Ohio, died in the accident May 7 in Williston, Florida.

According to a Tesla statement issued Thursday, the cameras on Brown’s Tesla Model S failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky, and the car did not automatically apply its brakes. Brown didn’t take control and brake either, Tesla said.

“Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S,” the company said.

“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”


The National Highway Traffic Safety Administration announced the driver’s death Thursday and said it is investigating the design and performance of the Autopilot system.

Brown was an enthusiastic booster of his 2015 Tesla Model S, and in an April video he posted online, he credited its sophisticated Autopilot system with avoiding a crash when a commercial truck swerved into his lane on an interstate.

Automakers and analysts have said they need to be careful as they introduce more and more semi-autonomous features, from automatic braking to adaptive cruise control. People can quickly learn to rely on them, or assume they work better than they actually do. The possibility of a fatal accident was always a concern.

“For years people have been saying the technology is ready, and it’s one of my pet peeves, because no it’s not,” said Bryant Walker Smith, a law professor at the University of South Carolina and an expert on autonomous driving issues.

Tesla stressed that its Autopilot system is new, noting that drivers must manually enable it and that they “must maintain control and responsibility for [their] vehicle” while using the system.

“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” the Palo Alto, California-based company said in a statement.

Karl Brauer, a senior analyst with Kelley Blue Book, said the accident is a huge hit to Tesla’s reputation.

“They have been touting their safety and they have been touting their advanced technology,” he said. “This situation flies in the face of both.”

Tesla’s shares dropped 3 percent in after-hours trading to $206.25 after the government said it would investigate how Tesla’s Autopilot system performed at the time of the crash.

But beyond Tesla, the accident could increase public skepticism about semi-autonomous and autonomous driving. In a survey released last month by the University of Michigan, two-thirds of drivers said they are moderately or very concerned about riding in a self-driving vehicle. Just 16 percent of the 618 drivers surveyed said they would prefer to ride in a self-driving car.

Walker Smith said it was inevitable that a semi-autonomous or autonomous car would crash. The Brown crash can help focus the discussion among regulators and others on driverless technology and its limitations, he said. It could also remind drivers that the technology isn’t perfect and that they need to stay alert.

But Walker Smith said it would be unfortunate if public sentiment swung so far against driverless cars that people would never benefit from their lifesaving potential. On the day the Tesla driver died, he said, approximately 100 other people died on U.S. roads. No one knows how many of those deaths could have been prevented by cars that could predict crashes before they happen and brake by themselves.

“Driving today is dangerous, and there is no panacea. Every solution creates its own set of problems,” Walker Smith said.