The future of driving may not be as autonomous as we think
- According to a survey conducted by the American Automobile Association (AAA), 73 per cent of US drivers say they would be afraid to ride in an autonomous car
- People’s excitement about self-driving cars is diminishing mainly because of serious accidents involving autonomous vehicles
- Along with safety issues, self-driving tech also raises cybersecurity issues. Autonomous cars rely on sensor data, and where there’s data, there’s always the chance of a cyber-attack
- Self-driving cars can’t even be washed at traditional automated car washes, because the hard brushes can damage the sensors of a self-driving vehicle, while soap and water could potentially ‘blind’ the sensors
- Governments need to introduce clear regulations for self-driving cars before they fully take over our streets
There’s been a lot of hype around self-driving vehicles over the past few years. Inefficiencies in traditional transportation systems such as traffic congestion, high carbon emissions, and a lack of parking space are making autonomous cars more appealing. This encouraged several car manufacturing companies to start working on this new technology.
But self-driving cars aren’t perfect. In fact, they might be scarier than initially thought. So, before this emerging technology enters our roads, it’s time for a reality check. Are self-driving systems truly our future, or just hype?
Uber’s self-driving car killed a pedestrian
The main reason why people’s excitement about self-driving cars is diminishing year after year is safety. For instance, earlier this year, an Uber-owned autonomous vehicle killed a pedestrian in Arizona. This was the first pedestrian fatality caused by an autonomous vehicle in the US. The car, identified as a Volvo XC90, had a safety driver but was in its autonomous mode at the time of the accident. It was around 10 p.m. when a 49-year-old pedestrian, Elaine Herzberg, was hit by the vehicle, which was moving at a speed of 64 kilometres per hour. Herzberg was crossing a stretch of road that’s over 1.6 kilometres long with no pedestrian crossings, making it dangerous for pedestrians in general. The car, just like other self-driving vehicles, was equipped with lidar technology, which allows it to detect its surrounding environment. Lidar relies on sensors to detect pedestrians, cars, and objects on the road and is designed to perform well at night.
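The core idea behind lidar is simple: the sensor fires a laser pulse, times how long the reflection takes to come back, and converts that round-trip time into a distance. As a rough illustration (a toy sketch, not any manufacturer’s actual software), the calculation looks like this:

```python
# Toy time-of-flight calculation, for illustration only.
# Real lidar units fire millions of pulses per second and build a 3D
# point cloud, but each point boils down to this arithmetic.
C = 299_792_458.0  # speed of light in metres per second

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to an object, given the pulse's round-trip travel time.

    The pulse travels to the object and back, so we halve the total
    distance covered.
    """
    return C * t_seconds / 2.0

# A pulse that returns after roughly 66.7 nanoseconds bounced off an
# object about 10 metres away.
d = distance_from_round_trip(66.7e-9)
print(round(d, 2))
```

Because the measurement depends only on the pulse’s travel time, not on ambient light, lidar works as well at night as it does during the day, which is why the Uber accident drew so much scrutiny of the system’s object detection.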
The authorities have released a video of the accident, showing that the safety driver inside the vehicle wasn’t paying attention to the road and didn’t have her hands on the wheel. Safety drivers are advised to do both, so that they can take control of the vehicle in case of an emergency. Besides Arizona, Uber was testing its vehicles in California, Pennsylvania, and Canada. Soon after the accident, the company temporarily suspended all of its testing projects. While autonomous car manufacturers have claimed such tech is completely safe, the Guardian reports that self-driving vehicles are “entering the most dangerous phase”.
In March, one of Tesla’s Model X vehicles crashed into a concrete divider on a highway in California. The vehicle soon caught fire. Unfortunately, the driver wasn’t able to make it out in time and was killed in the accident. All this happened while the vehicle was operating on Tesla’s autopilot system. While it’s not hard to believe that this happened due to a technology failure, the company blames the driver for the accident. The driver didn’t respond to any warnings provided by the vehicle, and his hands weren’t detected on the wheel before the accident occurred. Tesla’s autopilot is designed to maintain speed and change lanes, but it still requires the driver to intervene to avoid accidents, the company explains.
The company further defended its tech, stating that the safety levels of its vehicles are very high. Every year, there are approximately 1.25 million deaths caused by car accidents. According to the company, a person driving a Tesla car with the autopilot system is 3.7 times less likely to end up in a car accident. While it seems that Tesla is in denial, accidents involving its vehicles are becoming more common. In Utah, another Tesla car failed to stop at a red light and crashed into a fire truck. Fortunately, the driver suffered only minor injuries.
All these accidents show that self-driving technology is still in the experimental stages, and a lot needs to be done before we can say it’s completely safe.
The threat of hacking
Along with safety issues, self-driving tech also raises cybersecurity issues. Autonomous cars rely on sensor data, and where there’s data, there’s always the chance of a cyber-attack. Imagine you’re enjoying a ride in your self-driving car. You’re about to arrive at your destination, but the car suddenly turns and takes you somewhere completely different. Or even worse, what if that same car starts acting strange during the ride and stops in the middle of nowhere? This is what a team from the University of Michigan thinks could happen if cyber criminals manage to hack into self-driving systems. Just like any other technology, autonomous cars are vulnerable, too. As the team explains, “Without robust, fool-proof cybersecurity for autonomous vehicles, systems and infrastructure, a viable, mass market for these vehicles simply won’t come into being.” And automakers should be aware of this.
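One standard building block for protecting that sensor data is message authentication: the vehicle rejects any reading whose cryptographic tag doesn’t match, so an attacker who injects or alters messages can’t silently feed the car false data. The sketch below is purely illustrative (the message format and key handling are invented for the example, not taken from any automotive standard), using Python’s standard `hmac` module:

```python
# Illustrative sketch: authenticating sensor messages with an HMAC so
# tampered data is rejected. Message names and key handling here are
# hypothetical, not a real automotive protocol.
import hashlib
import hmac

SHARED_KEY = b"demo-key-for-illustration"  # in practice, provisioned securely

def sign(message: bytes) -> bytes:
    """Compute an authentication tag for a sensor message."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Accept the message only if its tag matches (constant-time compare)."""
    return hmac.compare_digest(sign(message), tag)

# An authentic reading passes verification...
reading = b"obstacle_distance_m=9.99"
tag = sign(reading)
print(verify(reading, tag))

# ...but a reading altered in transit is rejected.
print(verify(b"obstacle_distance_m=99.9", tag))
```

Authentication alone doesn’t make a vehicle hack-proof, of course; it’s one layer among many, which is the Michigan team’s broader point about needing robust cybersecurity across vehicles, systems, and infrastructure.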
Are lawmakers preventing the rollout of self-driving vehicles?
Though some believe that self-driving cars will be on our roads by 2020, not all countries are ready for their arrival. The US federal government, for instance, has been quite slow in issuing laws regarding autonomous vehicles. The main reason some laws are still on hold is safety concerns. New rules issued across US states only add to the confusion. This is especially the case with accidents involving autonomous vehicles. If an accident happens, who’s to blame? With traditional cars, the driver is responsible, but with self-driving vehicles, things aren’t that simple.
Michigan issued a law under which a self-driving system is regarded as the vehicle’s driver. In case of an accident, the car manufacturer is legally responsible. Nevada issued a different regulation, declaring the car’s owner responsible for any incident. In Australia, the authorities passed a law requiring a human driver to be present in the vehicle. This creates barriers to the adoption of autonomous vehicles. “With automated vehicles, there will be times when an ‘automated driving system’, rather than a human, will be in control of the vehicle,” says Paul Retter, the chief executive of the National Transport Commission (NTC). Retter claims that Australia needs “nationally consistent law to know who is in control of a motor vehicle at any point in time”.
Autonomous cars aren’t that convenient after all
If you thought that having a self-driving car would bring convenience to your life, think again. While autonomous cars promise to change the transportation industry, when it comes to simple tasks, such as washing them, they could become your worst nightmare. And this is no exaggeration. Automated car washes are widely available and have been a part of our routine for years. But they can be dangerous for self-driving vehicles. CNN reveals that the hard brushes used in car washes can damage the vehicle’s sensors, while soap and water could potentially ‘blind’ them. Just bear in mind we’re talking about technology worth $100,000. On the other hand, self-driving vehicles need to be cleaned more often, because dirt or water residue can affect the sensors’ performance and impact the safety of the drive.
Self-driving technology has come a long way, but more needs to be done. Though the future is unpredictable, one thing is certain: issues such as safety and cybersecurity, as well as regulations for autonomous cars need to be addressed before the tech becomes widely available. Until that happens, we’ll still be getting from point A to point B in traditional cars that can’t drive themselves.