It’s not technology, but humans that may not be ready for self-driving cars

06.07.2016
The news last week that the owner of a Tesla Model S was killed after the car crashed into a tractor-trailer while its Autopilot feature was engaged raises an obvious question: Is the self-driving technology safe?

The accident, however, raises an equally important question: Are people prepared to use semi-autonomous driving technology responsibly?

The accident, which took place May 7 in Williston, Fla., is the first known fatal crash involving a vehicle using autonomous technology based on computer software, sensors, cameras and radar. The Florida Highway Patrol identified the driver who was killed as Joshua Brown, 40, of Canton, Ohio.

The National Highway Traffic Safety Administration (NHTSA) has opened an investigation into the accident.

Tesla, which can retrieve driving log data from its cars, stated in a blog post that the Model S's Autopilot sensors failed to detect the white semi-truck as it turned in front of the sedan against a bright sky.

If the problems that robots encounter in extreme environments are any indicator, then self-driving cars are a bad idea, according to David Mindell, author of Our Robots, Ourselves: Robotics and the Myths of Autonomy.

Mindell, a professor in MIT's Department of Aeronautics and Astronautics, pointed to the Apollo space program, which landed U.S. astronauts on the moon six times. The moon missions were originally planned to be fully autonomous, with the astronauts as passive passengers, but after considerable pushback, the astronauts ended up handling many critical functions, including the lunar landings.

Pointing to a concept developed by MIT professor of mechanical engineering Tom Sheridan, Mindell said in an interview with MIT's Technology Review that the level of automation in any given project can be rated on a scale from 1 to 10, but that a higher level of automation doesn't necessarily lead to greater success.

"The digital computer in Apollo allowed them to make a less automated spacecraft that was closer to the perfect 5," Mindell told the Review. "The sophistication of the computer and the software was used not to push people out, but to give them true control over the landing."

As another example, Mindell cited commercial airliners, which have automated systems such as autopilot and automatic landing, but which still require highly trained pilots to manage those systems and to make critical decisions in real time.

Mindell said "it's reasonable to hope" that vehicles with autonomous features will help to "reduce the workload" of drivers in incremental ways in the future. But total automation, he said, is not the logical endpoint of vehicle development.

Tesla has a cult-like following among its vehicle owners, who rave about the quality and sophistication of the vehicles' telematics, handling, speed and, yes, Autopilot features.

Like many other Tesla Model S owners, Brown was fond of demonstrating Autopilot's capabilities and had posted videos showing it in action. The truck driver claimed Brown was watching a Harry Potter movie at the time of the fatal crash, but police reports have not confirmed that, and Tesla has stated that movies cannot be viewed on the car's central console infotainment screen.

According to published accounts, Brown loved cutting-edge technology and its embodiment in his Tesla Model S, which he nicknamed "Tessy." He even credited its Autopilot with avoiding at least one potential accident in the past: a video Brown posted on YouTube shows his Model S swerving on its own to avoid a utility truck that cut it off. Tesla founder and CEO Elon Musk reposted the video on his Twitter account as an example of Autopilot's capabilities.

According to a statement from his family, Brown was a master Explosive Ordnance Disposal (EOD) technician in the U.S. Navy, and an entrepreneur. He was a member of the Navy's elite Naval Special Warfare Development Group (NSWDG), also known as SEAL Team Six.

Tesla's Autopilot software is, however, still a beta program. It was never intended to be relied on as a fully autonomous driving technology.

Autopilot is an opt-in feature and is not on by default. The feature requires the driver to accept an agreement every time it is turned on.

While adaptive cruise control and automated braking are becoming common features in new vehicles, Tesla has taken advanced driver assistance systems (ADAS) to a higher level, with prolonged self-steering and automated lane changing. Autopilot, however, is not fully autonomous; it's a public beta program that's intended to assist with, not fully take over, the task of driving. Data from the beta program is transmitted back to Tesla, allowing the company to improve the technology.

Currently, Tesla's Autopilot conforms to level 2 autonomous functionality as defined by the NHTSA, which has established five levels to describe vehicle automation: level 0 equates to no self-driving features, while level 4 is a fully self-driving vehicle.

Level 2 involves automation of at least two primary control functions designed to work in unison, such as adaptive cruise control in combination with lane centering. Even though Tesla's Autopilot feature only meets the NHTSA's level 2 criteria, some Tesla owners have been using it as if it were level 4.
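To make that classification concrete, here is a minimal Python sketch of the NHTSA scale described above. The names used for levels 1 and 3 come from the NHTSA's published definitions rather than from this article, and the helper function is purely illustrative.

```python
from enum import IntEnum

class NHTSALevel(IntEnum):
    """NHTSA automation levels, 0 (no automation) through 4 (fully self-driving)."""
    NO_AUTOMATION = 0          # driver performs all primary control functions
    FUNCTION_SPECIFIC = 1      # a single automated function, e.g. automatic braking
    COMBINED_FUNCTION = 2      # at least two functions working in unison,
                               # e.g. adaptive cruise control plus lane centering
    LIMITED_SELF_DRIVING = 3   # the car drives itself under some conditions
    FULL_SELF_DRIVING = 4      # the vehicle handles the entire driving task

def requires_driver_supervision(level: NHTSALevel) -> bool:
    """At level 2 (where Autopilot sits) and below, the driver must stay engaged."""
    return level <= NHTSALevel.COMBINED_FUNCTION

print(requires_driver_supervision(NHTSALevel.COMBINED_FUNCTION))  # True
```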

While ADAS features such as Tesla's Autosteer can use the steering angle, steering rate and speed to determine the appropriate maneuver, Tesla has said drivers must keep their hands on the steering wheel in order to react to unexpected situations.

"When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot 'is an assist feature that requires you to keep your hands on the steering wheel at all times,' and that 'you need to maintain control and responsibility for your vehicle' while using it," Tesla stated in a blog.

Even with warnings from Tesla, just days after Autopilot was released as an over-the-air software upgrade, a plethora of videos surfaced on sites such as YouTube showing drivers with their hands off the steering wheel of their Tesla Model S. In an inexplicable act of stupidity, one driver even climbed into the back seat of his car to demonstrate confidence in the self-driving technology.

In light of such videos, and shortly after the Autopilot release last year, Musk said he would be placing more constraints on how it could be used.

Tesla is expected to release Autopilot 2.0 later this year with improvements based on the beta program's data.

Like Tesla's technology, the autonomous driving features in the vast majority of mainstream vehicles will be controlled by ADAS, or "guardian angels," that learn over time, Gill Pratt, CEO of the Toyota Research Institute, recently told reporters and analysts at MIT.

Speaking at the New England Motor Press Association Technology Conference at MIT, Pratt noted that about 30,000 motor vehicle fatalities occur in the U.S. each year. That number may seem high, but relative to the miles driven, U.S. drivers as a whole are excellent at avoiding crashes.

So, instead of taking the wheel from drivers' hands, as a fully autonomous vehicle would do, Pratt said automakers are more focused on ADAS now and will be for many years to come. Tesla, however, has stated that it will have a fully autonomous vehicle ready by 2018.

Tesla is not alone: nearly every major carmaker has announced plans for fully autonomous vehicles. BMW, for example, just announced plans to produce a fully autonomous car called iNEXT, suited for both city streets and highways, by 2021. In the meantime, though, the steering wheel will remain firmly in the driver's control.

Arguably, however, Tesla has pushed the envelope on ADAS further than any other vehicle currently available, a strategy that Musk defended in the company's blog.

Musk referred to Brown's accident as the first known fatality in just over 130 million miles driven in Tesla vehicles with Autopilot activated, and compared that statistic to roughly one fatality for every 60 million miles driven globally. By Musk's accounting, Autopilot is at "the very least" 50% safer than driving without it.
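A quick back-of-the-envelope check of that arithmetic, using only the two figures quoted above (both rough approximations), is sketched below.

```python
# Figures quoted above; both are rough approximations, not precise statistics.
autopilot_miles_per_fatality = 130_000_000   # miles per fatality with Autopilot engaged
worldwide_miles_per_fatality = 60_000_000    # miles per fatality across all driving globally

# Fatality rate = fatalities per mile driven; lower is safer.
autopilot_rate = 1 / autopilot_miles_per_fatality
worldwide_rate = 1 / worldwide_miles_per_fatality

reduction = 1 - autopilot_rate / worldwide_rate
print(f"Autopilot's fatality rate is about {reduction:.0%} lower per mile")
# Prints roughly 54%, consistent with the "at the very least" 50% claim.
```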

While Autopilot may make vehicles safer, it can also become a false security blanket. Humans will be human and will test the limits of technology, even placing themselves and others in harm's way to do it.

Perhaps ADAS needs one more advancement: a touch-sensitive steering wheel that ensures drivers keep their hands where they're supposed to be at all times. Or perhaps an inward-facing camera could be used to determine whether a driver is engaged in an unsafe activity, so the car could automatically pull to the shoulder of the road and force the driver to reset the vehicle before continuing to drive.
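As a purely hypothetical sketch of how such a safeguard might be wired together (the function, sensor inputs and the 10-second threshold below are invented for illustration, not drawn from any shipping system):

```python
def enforce_driver_engagement(hands_on_wheel: bool,
                              seconds_hands_off: float,
                              driver_distracted: bool) -> str:
    """Hypothetical decision logic for the safeguards proposed above.

    Inputs could come from a touch-sensitive steering wheel and an
    inward-facing camera; the threshold is invented for illustration.
    """
    if driver_distracted:
        # The camera flags an unsafe activity: pull over and require a reset.
        return "pull to shoulder and require reset"
    if not hands_on_wheel and seconds_hands_off > 10:
        # Prolonged hands-off driving: warn the driver, then disengage assistance.
        return "warn, then disengage assistance"
    return "continue"

print(enforce_driver_engagement(hands_on_wheel=False,
                                seconds_hands_off=15,
                                driver_distracted=False))
# -> "warn, then disengage assistance"
```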

In one fashion or another, safety advancements should be put in place to ensure drivers are unable to abuse ADAS, while still allowing humans to maintain ultimate control over their vehicles.

(www.computerworld.com)

Lucas Mearian