Just as humans are prone to mistakes, so are self-driving cars. After all, it is humans who build these cars in the first place! And with so many situations that can happen on the road, it’s hard to program a vehicle to handle every scenario. Yet that is exactly what we’ll have to do over the next couple of years if we want the benefits of removing human error from the steering wheel.
To give you an idea, statistics show that more than 90% of car crashes, by some estimates up to 96%, are caused by human error. And just because we don’t often see crashes happen in front of us doesn’t mean they aren’t happening. In the United States alone, there are more than 6.4 million car crashes annually.
So, what can we do? To begin, it’s crucial to find the sweet spot between autonomous systems and human involvement. When creating a self-driving vehicle, manufacturers need to make sure their vehicles can share and learn from all the data they have ever gathered on the road. Combine that with simulations fed into the car’s software, plus the huge processing power needed to analyze real-time data during a ride, and yeah… It’s essential to account for every scenario that can happen, and that is what’s slowing down the mass implementation of self-driving cars.
And self-driving cars are not the only problem. Autonomous systems in general can make mistakes. Drones, artificial intelligence, and everything in between are not perfect. When it comes to self-driving automobiles, though, we have to pay extra attention! The risks are far higher and can result in disastrous consequences for everyone involved. In 2018, for example, Uber ran into serious trouble after one of its self-driving cars failed to assess a situation correctly and struck and killed a pedestrian.
The investigation found that the car’s software failed to recognize the person in front of it, Elaine Herzberg, partly because of the bicycle she was pushing along. Uber ended up not being held responsible for her death, but the supervising safety driver who was in the car and did not pay attention to the road was charged with negligent homicide.
It’s common sense that the terms, conditions, and instructions for a self-driving car should be made perfectly clear to buyers. We need to make sure that human drivers understand how these vehicles operate and what they should do if something goes wrong. This information should be simple to grasp and presented in a way that lets every driver make educated judgments quickly. People’s lives are at stake until we can fully trust a self-driving car’s software to function properly on its own. And that might take some more time.
Still, innovative features like the self-driving option in some automobiles can bring tremendous benefits. If a car is equipped with sensors that detect when the driver is not paying attention or is unable to drive, the autonomous system can take control of the vehicle and steer it to a safe spot, where the driver can resume control later.
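The handover logic described above can be pictured as a simple state machine. Here is a minimal, purely illustrative sketch in Python; it is not any manufacturer’s real system, and every name and threshold in it is a hypothetical assumption:

```python
# Illustrative sketch only: a minimal state machine for a driver-attention
# handover. The class, mode names, and the inattention threshold are all
# hypothetical, chosen just to make the idea concrete.

from enum import Enum, auto

class Mode(Enum):
    DRIVER = auto()    # human is in control
    TAKEOVER = auto()  # system is steering the car to a safe spot
    PARKED = auto()    # vehicle stopped, waiting for the driver

class HandoverMonitor:
    def __init__(self, inattentive_limit=3):
        # Assumed threshold: how many consecutive "inattentive" sensor
        # readings are tolerated before the system intervenes.
        self.inattentive_limit = inattentive_limit
        self.inattentive_count = 0
        self.mode = Mode.DRIVER

    def update(self, driver_attentive: bool, at_safe_spot: bool = False) -> Mode:
        if self.mode is Mode.DRIVER:
            if driver_attentive:
                self.inattentive_count = 0
            else:
                self.inattentive_count += 1
                if self.inattentive_count >= self.inattentive_limit:
                    self.mode = Mode.TAKEOVER  # system takes control
        elif self.mode is Mode.TAKEOVER and at_safe_spot:
            self.mode = Mode.PARKED            # pull over and stop
        elif self.mode is Mode.PARKED and driver_attentive:
            self.mode = Mode.DRIVER            # driver resumes control
            self.inattentive_count = 0
        return self.mode
```

A real system would fuse many sensor signals and road conditions, but even this toy version shows the core design question: how much evidence of inattention should trigger the machine to override the human?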
How we manage this is what truly matters. Car crashes are the most likely bad outcome; therefore, it’s crucial to find the balance between autonomous and human judgment. Even though self-driving cars are supposed to be safe, accidents can occur if the human driver doesn’t fully understand their car’s limitations.
And if an accident occurs, who will be responsible? It’s a vital question that we need to address. Is the autonomous system to blame? The human driver, for not following certain rules? Or maybe the car manufacturer? The only way to determine this is by analyzing the exact circumstances of the accident, and even then, we cannot be one hundred percent sure.
It’s also worth thinking about what an autonomous car should decide in extreme cases. For example, what should a self-driving car do if it had to pick between hitting a person and hitting another car? The choice may seem obvious to us on paper, but in the heat of the moment, even the human brain fails to make the right call. This is a dilemma that manufacturers must solve before self-driving cars become widely available and adopted.
One thing’s for certain. It will take some time before we can strike a balance between autonomous vehicles and human judgment. It might never happen at all, but we will just have to wait and see. Technology has a lot in store for us.