Do You Want a Self-Driving Car That Can’t Deal With Weather?


Mockup of Google’s prototype low-speed autonomous car

Over the last several years, Google has made a lot of noise about its progress in developing self-driving cars. However, throughout that time one thing has remained absolutely consistent in all of its progress reports – the sun is always shining and the roads are clear and dry. Unlike the automakers based here in Michigan, Google exists in a seemingly magical environment that is rarely subject to the vagaries of weather. As I pumped some gas and cleaned the salt crust off my headlights the other day, I began to think about the autonomous vehicle prototypes I’ve seen recently.

Tech companies like Google are often accused of living in a bubble where they aren’t subject to the issues that we mere humans have to deal with on a daily basis. Rarely has this been more true than with autonomous vehicles, which Google thinks will be on the road in the next few years. I say fat chance.

Technology is best applied where it can accomplish tasks better, faster, cheaper and more reliably than a human. One of the primary arguments for the development of self-driving vehicles is the desire to eliminate accidents and the more than 30,000 deaths that result in the United States alone every year. This is certainly a laudable goal, and human misbehavior is the primary factor behind the vast majority of traffic accidents. Unfortunately, the current reality of the technology is far from being able to achieve it.

Earlier this month, a winter storm swept across the Great Lakes region leaving behind two separate, massive pileups of more than 100 vehicles each in whiteout conditions. This was close to a worst case scenario, but it’s far from uncommon in this part of the country during at least one-quarter of the year. None of the autonomous vehicles currently being tested are even remotely capable of dealing with these sorts of conditions, instead throwing full control back in the hands of often hapless human drivers.

For example, last September at the ITS World Congress in Detroit, several automakers including General Motors, Toyota and Honda provided demonstrations of fully or semi-autonomous vehicles. GM’s latest prototype, an Opel Insignia with new, more affordable lidar sensors integrated into the bodywork, was the most advanced. At CES earlier this month, BMW showed an i3 with similar sensor integration doing self-parking demonstrations. Each of these cars has four laser sensors behind windows built into the corners of the car, with the GM car having two more in the grille and trunk lid.

Autonomous Opel Insignia prototype demonstrated at 2014 ITS World Congress

During the winter months in northern states, falling snow is soon followed by snow plows and trucks spreading tons of salt on the pavement in an attempt to liquefy the white stuff. While this certainly helps the tires get some purchase on the pavement, as those tires rotate, they invariably throw up a spray of salty slush that gets deposited on the vehicles behind.

Within a few minutes of driving in these conditions, headlight lenses become increasingly opaque, leaving the road ahead nearly dark. Under those same conditions, the sensor covers on an autonomous vehicle would become similarly opaque to the lasers.

Lasers are only part of the required sensor package for a self-driving vehicle. Cameras that monitor lane markings and signs and distinguish between humans, animals and other vehicles are equally suspect in poor weather. At ITS, GM CEO Mary Barra announced that in 2016, the company would launch its semi-autonomous Super Cruise system on the new Cadillac CT6 and vehicle-to-vehicle (V2V) communications on the CTS.

“On Super Cruise, if the painted lane lines on the road are snow covered, the system will not work,” acknowledged Dan Flores, GM technology communications manager. “Due to the current technology limitations, the driver will still need to be engaged for some time.”

The same caveat applies to any of the dozens of vehicles now available with lane departure warning and prevention systems. These systems are equally ineffective even on dry roads where no lane markings exist, a common condition in rural areas.

Interestingly, the V2V radios that will debut on the CTS and other vehicles in the coming years may actually prove to be far more useful in these most challenging conditions. When vehicles are able to broadcast real-time alerts about road conditions, speed, acceleration and more, the human drivers following behind will get instant warnings when something bad is happening, often with enough time to react and either avoid or mitigate the hazard.

“During the recent massive pileups in Michigan, V2V communications could have alerted the vehicles behind to what was happening before the drivers arrived at the crash location,” said Scott McCormick, president, Connected Vehicles Trade Association. “This is a situation where current sensor technology would not have helped.”
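The broadcast-and-react pattern McCormick describes can be roughed out in a few lines of code. The sketch below is purely illustrative and greatly simplified – real V2V systems exchange standardized basic safety messages over dedicated radios, and the `HazardAlert` message, the 0.35 g braking figure and the 2.5-second reaction buffer are my own assumptions, not anything GM or the trade association has specified. The idea is simply that a following vehicle can compare the distance to a reported hazard against its own stopping distance and warn the driver before the crash scene is even visible.

```python
import math
from dataclasses import dataclass

# Hypothetical, simplified V2V hazard message (real systems use
# standardized basic safety messages over DSRC or C-V2X radios).
@dataclass
class HazardAlert:
    lat: float          # sender position, degrees
    lon: float          # sender position, degrees
    hard_braking: bool  # sender is decelerating sharply

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in meters; fine at V2V ranges."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_warn(alert, my_lat, my_lon, my_speed_mps, reaction_s=2.5):
    """Warn if the hazard lies within our stopping distance plus a
    driver-reaction buffer. Assumes ~0.35 g braking (slick pavement)."""
    if not alert.hard_braking:
        return False
    decel = 0.35 * 9.81  # m/s^2, conservative for wet or icy roads
    stopping = my_speed_mps * reaction_s + my_speed_mps ** 2 / (2 * decel)
    return distance_m(alert.lat, alert.lon, my_lat, my_lon) <= stopping
```

At highway speed (30 m/s, roughly 67 mph) the assumed stopping distance works out to about 200 meters, so a hard-braking alert from 150 meters ahead would trigger a warning well before the driver could see the pileup through blowing snow – which is exactly the advantage McCormick claims over onboard sensors.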

V2V, along with vehicle-to-pedestrian communications, is expected to proliferate through the vehicle ecosystem much more quickly than autonomous capability in the next decade. With the hardware expected to add less than $100 to the cost of new vehicles, a variety of companies are also looking at affordable retrofit solutions for existing vehicles.

It’s very important for Google, automakers and suppliers to continue development and testing of automated driving technology. There are still a great many hardware and software problems to resolve before we can hop into our cars, tell them where to go and then sit back and relax.

In the meantime, we need testing in real-world conditions that include bad weather, unpaved rural roads, and interaction with the more than 200 million human-driven vehicles on American roads. Until autonomous driving technology can reliably outperform a human when it is needed most, there is little reason to pay a big price premium to get it.
