Will driverless cars be safer?
The future of the automotive industry seems destined to be autonomous vehicles. With Google investing heavily in the technology, Tesla rumoured to be preparing their own driverless cars, and the world’s most valuable company, Apple, expected to throw their hat into the ring, the future of driving is computer controlled. But will it really be safer? With human error removed from the equation, will we become too comfortable trusting an emotionless, calculating computer to transport us safely from A to B?
Initial teething problems
Autonomous cars are already appearing on the roads, with approval for testing granted across several US states. The result so far has been a number of crashes, as other road users adjust to sharing the road with driverless cars. Google have announced approximately a dozen crashes since testing began, the majority of them the result of human error: members of the public keep rear-ending the Google cars. Quite why is unclear, though Google blames it solely on the human drivers.
This raises an important question. Will driverless cars truly be safe whilst human drivers are on the road? There is little doubt that not everyone will embrace the autonomous lifestyle. Many people mistrust technology, whilst others enjoy the act of driving itself. Even if it becomes mandatory to switch to a driverless car, what would the cost be? And will big car companies sit back and allow their catalogue of popular vehicles to be made redundant by the new technology?
Another potential issue is the threat of hacking. Because a driverless car is completely computer controlled, anyone nefarious who can compromise its software could take control of a person’s car, leaving the occupant powerless to resist. Not only could this cause a lot of accidents; there is potential for even greater threats.
The FBI have already set up a taskforce to analyse the potential threats of terrorists and criminals taking over driverless cars for their own ends. A car can become a battering ram if programmed to ignore traffic signs and plough into pedestrians or buildings. There is a worry that autonomous cars could become mobile gun platforms, with criminals able to shoot from the vehicles with ease. A hacked car could kidnap its occupants, driving them wherever the hacker wants. And the biggest threat is the placement of a bomb in a driverless car, driven to the target and detonated without harming the attacker.
Picture the scene. You are in your driverless car and an accident appears ahead. There are two options. The car swerves to avoid the crash but endangers a crowd of pedestrians nearby. Or the car swerves the other way and hits a wall, endangering the passenger’s life. What would you do? Well, with driverless cars, the decision is out of your hands. A computer program would quickly calculate the best possible outcome. So how do you set up the computer to make this decision?
Would a computer choose to hit a child as opposed to a crowd of people? What price do you set on each person’s life? It is a moral conundrum that could mean a car puts you at risk to save other people’s lives. A noble principle in general, but is this what you want from your car?
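To see why this is so uncomfortable, it helps to sketch what such a decision rule might look like in code. The snippet below is a deliberately simplified, hypothetical illustration (not any real manufacturer’s logic): the car scores each available manoeuvre by the harm it risks and picks the lowest. The entire moral conundrum ends up hiding in a single parameter, `occupant_weight`, which sets how much the passenger’s safety counts against a bystander’s.

```python
# A hypothetical, deliberately simplified sketch of the dilemma above.
# All names, weights, and scenarios are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Manoeuvre:
    name: str
    bystanders_at_risk: int   # how many other people this option endangers
    occupant_at_risk: bool    # does it endanger the car's own passenger?


def expected_harm(m: Manoeuvre, occupant_weight: float) -> float:
    """Score a manoeuvre: each endangered bystander counts as 1.0,
    and the occupant counts as occupant_weight. Choosing that weight
    is exactly the moral question the article raises."""
    return m.bystanders_at_risk + (occupant_weight if m.occupant_at_risk else 0.0)


def choose(manoeuvres: list[Manoeuvre], occupant_weight: float = 1.0) -> Manoeuvre:
    # Pick the option with the lowest expected harm.
    return min(manoeuvres, key=lambda m: expected_harm(m, occupant_weight))


options = [
    Manoeuvre("swerve into crowd", bystanders_at_risk=5, occupant_at_risk=False),
    Manoeuvre("swerve into wall", bystanders_at_risk=0, occupant_at_risk=True),
]

# Weighting everyone equally, the car sacrifices its own passenger:
print(choose(options).name)                        # -> swerve into wall
# Weight the occupant heavily enough, and it endangers the crowd instead:
print(choose(options, occupant_weight=10.0).name)  # -> swerve into crowd
```

Notice that neither output is obviously "correct": the same program defends or sacrifices its passenger depending on one tunable number, which is why the question of who sets that number matters so much.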
The future is driverless
There is no doubt that the future of driving is autonomous. As the technology progresses, more teething problems will be encountered; solutions will be proposed and implemented, and the technology will become safer as time goes on. The question is not “if” but “when”. Soon we will be seeing driverless cars on the road. That doesn’t mean we should discount the potential dangers, just be aware that it will take time to perfect. And when it is perfected, we can look forward to safer, less congested roads ahead of us.
published: 25/09/2015 08:54:43