#nissan old and new #carsandcoffee
Instagram filter used: Normal
Photo taken at: Chrysler Headquarters and Technology Center
When was the last time you actually read an end-user license agreement or terms of service before clicking “Accept” to install a piece of software or join the latest social network? Odds are that unless you are a lawyer, the answer is never. The technology companies that make these products would probably like it to stay that way. However, in the world of the self-driving car, that is not an acceptable policy. The tragic death of a Tesla Model S driver in Florida highlights the need for all automakers to be more open and transparent about the limitations of autonomous technology.
It seems that barely a day goes by when we don’t get a breathless press release from an automaker, supplier, technology company, or Silicon Valley startup about the amazing progress that they are making on self-driving technology. You can already go out today and purchase vehicles from a number of brands that promise at least partial autonomous capability, and full autonomy is being targeted by the end of this decade. While Tesla Autopilot, Volvo Pilot Assist, and other similar systems seem truly magical when they work as advertised, there are far more scenarios where these systems do not function at all.
Unfortunately, we have not seen Tesla CEO Elon Musk stand on a stage and tell people not to use Autopilot in the city, on curving rural roads, or in the snow. GM CEO Mary Barra stood on the stage at the 2014 ITS World Congress in Detroit and promised a Cadillac with hands-off Super Cruise capability in 2016. I’ve experienced prototype systems from Toyota and Honda and driven production systems from Tesla and Volvo, and when they work, they are incredibly impressive.
I am an engineer by training and a technology analyst by trade, and I have a much greater understanding than the average consumer of how these systems work. As a result, I can never truly relax with these systems because I am always on the lookout for the failure modes, and they are numerous. Unless told very explicitly otherwise, the average consumer will be so excited by the prospect of turning over control to a computer that they will pay no attention to the warning, shown before Autopilot can be enabled, that the system is very much in beta. Volvo doesn’t even give that warning before allowing Pilot Assist to be engaged.
Tesla is fortunate that many of its existing customers are early adopters who expect technology to be imperfect, although most of them probably don’t expect to be at risk of injury when it fails. When the Model 3 arrives and mainstream consumers try Autopilot and discover its limitations, they aren’t likely to be as forgiving, and the same is true for every other automaker offering autonomous features. Navigant Research’s Autonomous Vehicles report projects more than 4 million autonomous-capable vehicles to be sold by 2025. Those customers need to know what the systems can do—and, more importantly, what they cannot.
We don’t yet know all the details of what happened in the tragic crash in Florida. Similar accidents where one vehicle crosses a highway divider happen all the time, and fatalities occur when humans are in control. What we do know is that we are far from a time when we can just sit back and relax and let the computer do the driving. Every company involved in this space needs to be far more upfront with consumers about what this technology can and cannot do, or risk poisoning the market.
I made my debut on Al Jazeera today to talk about autonomous driving https://youtu.be/uD_c8nXqRqU
It's past time for automakers, suppliers, and tech companies (I'm looking at you, Google and Geohot) to be more candid and transparent about the real limitations of today's vehicle automation technologies.
Automakers Need to Start Being More Candid About the Limits of Autonomous Technology
Accurate and precise map data will be a crucial component of a robust and reliable transportation ecosystem
Whoever Owns the Maps Owns the Future of Self-Driving Cars
Inside the battle to own the data that will drive the automotive industry of tomorrow.
The first fatality in a +Tesla vehicle under #autopilot mode demonstrates that truly autonomous vehicles need a more robust sensor suite than what Tesla and Mobileye have provided. It also shows why everyone should stop calling Autopilot "self-driving." It is not self-driving; it is advanced driver ASSISTANCE!
First Tesla AutoPilot Fatality Demonstrates Why Lidar And V2V Probably Will Be Necessary
2016 Tesla Model S 70D with Auto Pilot (photo credit: Sam Abuelsamid) The news this week of the first traffic fatality while using the semi-autonomous Tesla AutoPilot system highlights a number of important issues. First and foremost is that AutoPilot does not magically transform a current Model S or X into […]