Self-Driving Cars – The Promise and the Risk


Most of us have heard of Google self-driving cars. Since 2009, their fleet of custom vehicles has put in more than one million self-driven miles. Although few of us have ever seen one of these vehicles, the same technology has been slowly making its way into the vehicles we are buying and driving today. I believe that we are reaching a turning point, where this increasingly sophisticated technology, which offers immense promise in the long run, is becoming available to consumers more rapidly than we can safely use it. This became unfortunately clear on May 7, 2016, when a Tesla Model S was involved in a fatal accident while Autopilot, Tesla’s self-driving technology, was apparently in control of the vehicle.

The Self-Driving Car, Today and Tomorrow

Some self-driving technologies are actually quite common. Adaptive cruise control, which maintains a safe distance from the car in front, was originally offered only in expensive vehicles, but is now an affordable option. Many cars will warn you if you are leaving your lane or automatically brake if they sense an impending collision. Some cars even parallel park themselves.

Along with Google, Tesla Motors has become synonymous with the state of the art in self-driving cars, but unlike Google’s, Tesla’s advanced technology is already available to consumers. Every Tesla Model S manufactured after September 2014 has the hardware to support self-driving, and this capability was enabled via software Version 7.0 shortly afterwards. The Model S features adaptive cruise control, hands-free lane keeping, on-demand lane changing, automatic parallel parking, and 360-degree collision warning. Tesla calls it Autopilot.

Tesla sees today’s Autopilot as just the beginning. For example, they have released an app, called Summon, that allows a Tesla owner to summon his or her car. The idea is that you stand in your driveway and press a button, and your garage door opens and your car comes to you. Their vision: “During this Beta stage of Summon, we would like customers to become familiar with it on private property. Eventually, your Tesla will be able to drive anywhere across the country to meet you, charging itself along the way. It will sync with your calendar to know exactly when to arrive” (Tesla Motors, 2016).

While Teslas are certainly the best-known commercially available self-driving cars, other companies are working hard to catch up. Toyota, BMW, Nissan, Ford, General Motors, and even Apple and Uber have announced programs to develop self-driving cars. Hackers are at it as well – the first person to have hacked the iPhone claims to have built a self-driving car in his garage in a month (Bloomberg Business, 2015).

We’re entering a brave new world.

Pushing the Limits

Tesla is of course aware of the risks of this technology and their web site states their position: “Tesla Autopilot relieves drivers of the most tedious and potentially dangerous aspects of road travel. We’re building Autopilot to give you more confidence behind the wheel, increase your safety on the road, and make highway driving more enjoyable… The driver is still responsible for, and ultimately in control of, the car” (Tesla Motors, 2015).

The need for the driver has not gone away; as Slate (Brogan, 2015) quotes a Reddit poster, “Your Tesla Model S is not going to drive you home drunk anytime soon.” Or, as the automotive web site Jalopnik puts it, the Model S “doesn’t really act like a self-driving car; it acts like a regular car that has the world’s best cruise control” (Orlove, 2016). Tesla’s Autopilot works best on the freeway. It does not function well in heavy rain. The current version doesn’t recognize traffic lights, signs, or traffic cones. The fatal accident on May 7 was apparently caused by Autopilot confusing a turning truck with an overhead sign. YouTube videos also show problems. A video titled “Tesla Autopilot tried to kill me!” shows a swerve into oncoming traffic (on a two-lane road). In another, the reviewer discusses a speeding ticket received while self-driving.

Yet people are testing the limits. Car and Driver reports: “Tesla’s legal eagles are quick to note that, ‘Autopilot is a hands-on feature. You must keep your hands on the steering wheel at all times.’ Flouting those admonitions, we quickly discovered it works just fine hands-free” (Sherman, 2015). One YouTube video, while admonishing, “THIS IS A JOKE,” shows a driver reading a newspaper and unable to see out the windshield as the car is cruising down the highway. Another, warning “DO NOT ATTEMPT,” shows the driver’s seat empty as a Tesla proceeds down the road. These drivers claim to be on private roads and to have spotters, but they show what can be done and, to a certain extent, advocate pushing the capabilities of the technology.

Tesla recognizes that there are issues and has restricted Autopilot functionality in its first software update since rollout (Version 7.1). According to Mashable (Jaynes, 2015), “Elon Musk admitted that Tesla is going to add ‘some additional constraints’ to the recently unveiled Autopilot system in order to ‘minimize the possibility of people doing crazy things with it.’”

The Risks

I believe we are entering a dangerous time, but my concern is not drivers using self-driving technology in obviously inappropriate ways. People are smarter than this. We trust drivers to operate vehicles in a safe manner, and most of us do. YouTube stunts aside, I hope such scenarios are unlikely in the real world – although there are allegations that the driver in the Tesla fatality was watching a movie.

I think the real danger is subtler. I am all for the development of “driverless cars,” as I do think they will ultimately be much safer than human drivers, but I am very concerned about this interim situation, where the responsibilities of the driver will be reduced significantly, but not entirely. There’s a gray zone, somewhere between using Autopilot in a dark, rainy school zone and on a wide-open eight-lane freeway with well-marked lanes. In that gray zone there is a chance that the driver may need to intervene to prevent an accident, and the driver will need to stay alert in order to jump in and regain active control of the vehicle. Distraction and inattention are issues when we have to do 100% of the driving; how will we maintain sufficient attention when our vehicles are driving autonomously, so that we are able to intervene quickly and successfully on the rare occasions when we have to? Google recognizes this issue, and their goal is to develop a fully self-driving car – no steering wheel or brake pedal. In other words, they intend to skip the stage that I feel is most dangerous.

Many drivers have bad habits behind the wheel. We talk on the phone, text, eat, and perform other activities that are distracting and take our attention off of the road. Currently, self-driving cars are new and exciting, and drivers are more likely to be attentive because of the novelty of the experience. But what happens when self-driving becomes routine, and drivers lapse into those bad habits because they believe the autopilot is safely in control? Will they look up from Facebook, or a movie, in time to hit the brakes?

I’m seeing an overconfidence that could contribute to this problem. YouTube reviewers love Autopilot, like to experiment with it, and are prone to trust the technology once they see it in action. One says it “drives better than most of the drivers in Miami” – although this was the reviewer who was pulled over for speeding. Another, on a four-lane road, has to take the wheel to prevent his car from plowing into traffic cones, but proclaims, “Tesla Autopilot: It is amazing!” CNET says a coast-to-coast drive in a Tesla, made primarily on Autopilot, “should put to rest the notion (however foolish) that Tesla’s Autopilot is inherently unsafe or even remotely dangerous” (Kolk, 2015).

I’m also seeing a tolerance of the issues that do arise. There is a tendency to view the car as software, where bugs are expected. Elon Musk has, in fact, referred to the software as a “public beta test” (Ramsey, 2015). With overconfidence and risk taking evident, the consequence of a software issue could be a fatal vehicle accident, rather than an annoying reboot.

Proceeding Carefully

Despite these concerns, I’m quite excited about self-driving cars. They have the potential to be safer than those driven by people. They could also relieve the tedium of commuting, use roads more efficiently, and help the elderly remain mobile. That said, there are issues, and in a follow-up article I’ll describe some of the human factors concerns behind these issues and how they might be addressed.


Dr. Craig Rosenberg is an entrepreneur, human factors engineer, computer scientist, and expert witness. You can learn more about Dr. Rosenberg and his expert witness consulting business at www.ui.expert

Bibliography