Editor’s note: In the third of WRAL TechWire’s four-part series looking at The Human Future, our Allan Maurer reports on the future of autonomous – or robotic – vehicles. Coming soon to a street near you? Not so fast. Humans are going to be behind the wheel for some time yet.

DURHAM – Do you know why the companies developing autonomous cars do so much on-the-road testing, testing, testing?

It’s because their software is not good at dealing with novel situations, so it requires enormous amounts of experience with actual roads and driving conditions.

Duke University’s Michael Clamann, a senior research scientist at the Humans and Autonomy Lab (HAL), discussed the “Arrival of the Robot Car” in a Moogfest pub talk at Fullsteam Brewery in Durham on Saturday.

Clamann said that while the companies working on driverless cars have made great progress, they still face technical, social, and legal hurdles.

[VIDEO: Watch a report about Duke’s HAL at https://www.youtube.com/watch?time_continue=1&v=UQjFKjXMns4 ]

In 2013, the US Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) first defined levels of autonomous driving, since aligned with SAE International’s widely used five-level scale, and Clamann began by reviewing them. (A brief code sketch follows the list.)

The five levels of autonomous vehicles:

  1. Level One: At least one function, such as steering or speed control, is assisted, but the driver handles everything else.
  2. Level Two: The vehicle combines at least one acceleration/deceleration system (such as adaptive cruise control or crash avoidance) with at least one steering system (such as lane centering or automatic parking). Many of these features are already available from various automakers.
  3. Level Three: The car drives itself, but the driver must intervene in “safety critical” situations and take over manually. Clamann noted this poses challenges: a driver who may not have been paying attention must react instantly, and the car must hand over control just as quickly.
  4. Level Four: This is a fully autonomous vehicle, but only within its “operational design domain,” meaning it does not cover all driving scenarios. These cars can only drive where they have been before. Given the difficulties of Level Three, most manufacturers prefer to skip it and go directly to Level Four.
  5. Level Five: This is a fully autonomous vehicle that can respond like a human driver in all situations, including environments that stymie many autonomous vehicles now, such as gravel roads, snow, and so on.
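
To make the taxonomy concrete, here is a minimal Python sketch (my illustration, not something from the talk; the short level names are my own) encoding the five levels and the two properties Clamann emphasized: which levels still rely on a human fallback, and which are confined to a mapped operating area.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """The five levels of driving automation as reviewed in the talk."""
    ASSISTED = 1      # one function assisted; driver does everything else
    PARTIAL = 2       # combined speed and steering assistance
    CONDITIONAL = 3   # car drives, but a human must take over on request
    HIGH = 4          # fully autonomous within its operational design domain
    FULL = 5          # autonomous anywhere a human could drive

def human_fallback_required(level: AutomationLevel) -> bool:
    """Levels One through Three still depend on a human driver."""
    return level <= AutomationLevel.CONDITIONAL

def restricted_to_known_roads(level: AutomationLevel) -> bool:
    """Level Four cars can only drive where they have been before."""
    return level == AutomationLevel.HIGH
```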

The data provided by an autonomous car’s redundant sensing systems (video, radar, and a laser-based system called Lidar, which offers more resolution than radar but less than video) has to be crunched by a powerful computer. “Massive data crunching is needed to make the video effective,” Clamann said.

The radar is used to see and avoid large objects, while the Lidar, which degrades in rain or snow, backs up the video cameras (which in turn struggle at night) in some situations, such as recognizing pedestrians. The vehicle’s computer compares what it sees to a database developed during all that on-the-road testing. “They need to build a visual lexicon.”
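
As a rough illustration of why that redundancy matters, the toy sketch below (my own, not Clamann’s or any automaker’s code) treats each sensor’s pedestrian report as an independent vote and combines the confidences before the car decides to act. Real fusion pipelines are far more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar", or "lidar"
    label: str         # e.g., "pedestrian"
    confidence: float  # 0.0-1.0, as reported by that sensor's model

def fuse(detections: list[Detection], threshold: float = 0.5) -> bool:
    """Treat each sensor as an independent vote: the chance that every
    sensor missed a real pedestrian is the product of (1 - confidence)."""
    p_all_miss = 1.0
    for d in detections:
        if d.label == "pedestrian":
            p_all_miss *= (1.0 - d.confidence)
    return (1.0 - p_all_miss) >= threshold

# A camera weakened by darkness plus a Lidar backup can still clear the bar:
readings = [Detection("camera", "pedestrian", 0.40),
            Detection("lidar", "pedestrian", 0.55)]
print(fuse(readings))  # True: combined evidence is 1 - (0.60 * 0.45) = 0.73
```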

They lack human ingenuity

This presents difficulties. Pedestrians overlap and stand behind bushes, Clamann said. “They wear Halloween costumes and push walkers.” He cited one study suggesting the systems are only 75 percent accurate and have trouble seeing pedestrians at night.

“The great thing about machine learning algorithms is that they never drink. They never get tired. But they don’t know what to do in new situations, so a new situation will freeze the car,” Clamann said, a point he made repeatedly. “They lack the ingenuity of humans in novel situations. As long as a situation is built into the algorithms, it works great. Building those algorithms takes time. But they’re getting better and better every day.”
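
Here is a schematic sketch of the failure mode Clamann keeps returning to (again my illustration, with a made-up confidence threshold): when the perception system’s output is unfamiliar or low-confidence, the only safe rule available is to stop rather than guess.

```python
KNOWN_OBJECTS = {"car", "truck", "cyclist", "pedestrian"}

def plan_action(label: str, confidence: float,
                min_confidence: float = 0.8) -> str:
    """Fall back to a safe stop when the input is novel or ambiguous."""
    if label not in KNOWN_OBJECTS or confidence < min_confidence:
        # Nothing in the "visual lexicon" applies, so don't guess.
        return "slow_and_stop"
    return "continue_with_avoidance"

print(plan_action("pedestrian", 0.95))         # continue_with_avoidance
print(plan_action("halloween_costume", 0.30))  # slow_and_stop
```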

Clamann noted that a Scientific American article said “autonomous cars are still teen drivers when it comes to safety.”

Clamann asked, “How safe is safe enough?” Right now, roughly 36,000 people die in US traffic accidents each year, and about 90 percent of those accidents are due to human error. Getting that number down is a big argument for developing autonomous cars that don’t make such mistakes.

Who is responsible?

But there are also social questions involved, Clamann said. “Are people going to be OK if people get killed by an autonomous vehicle?” He pointed out that hacking also has to be addressed; researchers have already demonstrated that these systems can be hacked, and the risk may grow as vehicles begin communicating with one another.

How will insurance companies treat these driverless cars? “That will have to be worked out.”

If the car’s software or systems make an error that causes an accident, who is responsible: the manufacturer or the owner? “You weren’t driving and didn’t program it,” Clamann said. Volvo has addressed this by saying it will accept responsibility in such cases.

Then there is the question of how to program moral choices into a machine. If an autonomous vehicle is faced with killing the driver if it goes straight, a group of children if it turns right, or a group of nuns if it turns left, what choice should it make?
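
The crux is that any programmed answer has to assign explicit numbers to outcomes. The hypothetical sketch below is not a real system; it only shows that a minimum-cost rule embeds a moral judgment the moment someone writes down the costs.

```python
# Who assigns these numbers is exactly the ethical debate; the values
# below are arbitrary placeholders, not recommendations.
OUTCOME_COST = {
    "straight": 1.0,  # the driver is harmed
    "right": 5.0,     # a group of pedestrians is harmed
    "left": 5.0,      # another group of pedestrians is harmed
}

def choose_maneuver(costs: dict[str, float]) -> str:
    """Pick the action with the lowest assigned cost."""
    return min(costs, key=costs.get)

print(choose_maneuver(OUTCOME_COST))  # "straight", purely because of the numbers
```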

Such a decision may also be vanishingly rare. “Do we want to hold up this technology trying to prevent a situation that comes up every 12 years?” Clamann asked. “It is important for us to talk about these things,” he added.

Everyone wants to be first

Potential benefits of driverless cars include providing transportation to those who would not have it otherwise: the blind, the elderly, the ill. There are environmental benefits as well, since the companies developing autonomous cars are largely building fully electric vehicles. And they could prevent many of those highway deaths caused by human error. The problem there, Clamann noted, is that as long as people are still behind the wheel, “they sometimes do crazy things.”

Problems facing the industry include the splintered development by multiple manufacturers from Google to Tesla to Ford to Volvo. There is little if any cooperation in developing these systems. “They all want to be first,” Clamann said. “It’s a little scary that they are all developing different ways to solve the problems.”

Legal frameworks are equally fractured, and policies have to be in place at the state and federal levels before many of these vehicles make it onto the highways. Right now, only 13 states have laws on the books allowing driverless cars (North Carolina is not one of them).

There are also less serious considerations in making these self-driving vehicles commonplace. They perform so perfectly at staying lane-centered and making turns that people actually found it annoying, because we are used to the less perfect feel of human driving. As a result, Google “added some slop” into its autonomous car programming, Clamann said.
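
Google has not published what that change looked like, but the idea is easy to sketch: perturb a perfectly centered steering command with small, bounded noise so the ride feels human. The function name and noise magnitude below are invented for illustration.

```python
import random

def steering_command(ideal_angle_deg: float, slop_deg: float = 0.3) -> float:
    """Return the perfect lane-centering angle plus a little human-like drift."""
    return ideal_angle_deg + random.uniform(-slop_deg, slop_deg)

print(steering_command(0.0))  # e.g., 0.17: slightly off perfect center
```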

Another difficulty is how to let pedestrians know whether a driverless car sees them, is slowing to a stop, or is accelerating. Clamann described a Duke experiment using a fake driverless van fitted with a warning display on top of the vehicle. Only one in 50 pedestrians actually looked at it. “We aren’t trained to read displays on the tops of vehicles,” he explained.

So how long will it be before these autonomous vehicles become dominant? Clamann said most people keep a vehicle for about 17 years, so it might take at least that long to get most drivers to switch, even if the legal, social, infrastructure, and technology problems are all solved.

Nevertheless, a number of manufacturers expect to have fully autonomous vehicles, often for commercial purposes such as city taxis, by 2018, while others say 2020. Here’s a list of what manufacturers have planned.