Technology Misuse Endangering Automated Driving


If we’ve learned anything from the era of reality television and user-generated online video, it’s that a surprising number of people will risk great harm, to themselves or others, by misusing technology to get some online attention. Whether it’s blowing up a microwave, eating laundry detergent pods, or misusing driver assist features on a car, too many are willing to abandon common sense in search of the dopamine hit that comes with watching the view count ratchet higher. I shake my head in bewilderment when I hear of someone swallowing a detergent pod, but at least they are not putting others in harm’s way.

Vehicle Travel Should Be Serious

More concerning are videos of people using today’s partial automation systems, like Tesla AutoPilot, beyond the scope of their capabilities, or trying to trick them into functioning as more highly automated systems. I have no issue with hardware hacking of stationary devices, or of vehicle systems not related to driving. Repurposing hardware you have purchased to provide added functionality can be fun and educational, and it lets you extract more value from what you own.

But modifying or tricking a vehicle’s guidance system puts innocent bystanders at risk, with potentially disastrous consequences. Drivers who override driver assistance systems or pay little attention to the vehicle’s operation could also set back the adoption of automated vehicles.

Consumers Shouldn’t Overestimate Vehicle Autopilot

Tesla AutoPilot and similar systems from General Motors, Volvo, Mercedes-Benz, Nissan, and others are not automated driving systems; in SAE terms, they are Level 2 partial automation. Except for GM’s Super Cruise, none of these systems can reliably hold a vehicle in its lane well enough to allow hands-off operation. All of them, including GM’s, require the driver to remain engaged, eyes on the road and ready to take over.

Overconfident Users Are Misusing Existing Driver Assist Capabilities

While Tesla CEO Elon Musk often talks about software updates that will give AutoPilot full self-driving capability, that day has not arrived and may never arrive with the current generation of hardware. Despite the well-known flaws and limitations of AutoPilot, Tesla owners continue to ignore warnings from the system and the company, using it in ways, and in places, where it should be disabled. One owner who has posted dozens of videos to YouTube recently tried to demonstrate that wedging oranges between the steering wheel rim and spokes could fool the system into thinking the driver’s hands were on the wheel. Had this been done on a closed track, it might have been an interesting stunt. On a public road, with other vehicles around, it was downright reckless.

An Apple engineer recently died when his Tesla, operating in AutoPilot mode, ran into a highway barrier in California. The system clearly failed to hold the vehicle in its lane, and the driver had previously complained to Tesla service about the car exhibiting the same behavior. Since the accident, several other Tesla owners have recreated the situation while recording video on a hand-held phone, risking further injuries.

A pedestrian was killed by an Uber autonomous test vehicle in yet another instance of a safety driver not paying attention as instructed while the technology was pushed beyond its limits. Automakers need to keep clarifying the vast difference between the driver assist technologies of today and the driver-not-needed technologies of tomorrow.

Holding Out Hope for Progress

A number of studies have already shown that a majority of people don’t trust automated driving systems. Automation has the potential to provide enormous societal benefits by saving lives and preventing property damage. However, if the actions of those chasing views erode public trust in the technology even as it improves, those benefits may remain out on the horizon.
