Autonomous cars are getting ever closer to becoming an everyday sight on our roads. Audi recently pledged to spend close to $16 billion (£12.63 billion) on electric mobility and self-driving technology through 2023, while Volkswagen stated only last summer that its driverless car technology, complete with Level 4/5 capability (meaning no attention is required from the driver), is on course to reach a VW dealership near you in 2021.
However, several issues still need to be resolved before driverless vehicles become a reliable and trustworthy way to get from A to B, says Lee Dover.
Autonomous vehicles can’t predict human behaviour as well as other humans can
Eye contact helps enormously when you’re behind the wheel. A driver navigating a busy area such as a city centre will constantly be keeping an eye out for pedestrians and trying to predict their behaviour. It is those subtle glances that help avoid accidents on our roads, as motorists can react quickly if someone suddenly steps off a pavement for whatever reason.
Then there is the eye contact made between motorists. For example, a driver can often tell that another person behind the wheel is upset or distracted simply by glancing at them while driving alongside one another. When an issue is noticed, most drivers will either keep their distance from the problematic motorist or be better placed to anticipate sporadic movements.
Behind the wheel, eye contact and the correct reading of body language both prove invaluable when drivers are making turns at intersections, merging into lanes and reacting to unexpected changes in traffic patterns.
Bearing all the above in mind, the question that must be asked is: will autonomous vehicles possess enough emotional intelligence to make predictions based on the way humans often act? Human drivers frequently act subconsciously behind the wheel, but expecting the same from technology may be asking a little too much, even with sensors and algorithms coming into play.
The problem with snow
As a driver, you’ll know all too well the dread of heading out onto the road when it’s snowing. If it’s not the snowstorm itself severely limiting your view of the road, it will be the slippery surfaces and the difficulty of knowing where a stretch of road turns.
The last of those problems has raised particular concern for a future filled with driverless cars. Self-driving vehicles rely on cameras to track road markings and read the road signs they pass. What happens if a layer of snow causes lane dividers to temporarily disappear? And what about when a drifting snowstorm covers signs entirely?
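To see why snow is such a headache for camera-based systems, consider a deliberately simplified sketch of the idea behind lane-marking detection (a hypothetical illustration, not any manufacturer's actual pipeline): painted lines are found because they are much brighter than the surrounding tarmac, and a blanket of snow destroys exactly that contrast.

```python
# Hypothetical, heavily simplified lane-marking detector: treat any pixel
# that is much brighter than the darkest (road) pixel as painted marking.
# Intensity values below are made up for illustration (0-255 greyscale).
ROAD, MARKING, SNOW = 40, 200, 230

def detect_marking_columns(row, contrast=100):
    """Return the column indices of one image row whose brightness exceeds
    the road level by at least `contrast` -- a stand-in for real detection."""
    road_level = min(row)
    return [i for i, px in enumerate(row) if px - road_level >= contrast]

# Clear day: dark tarmac with painted lines at columns 2 and 7.
clear_row = [ROAD, ROAD, MARKING, ROAD, ROAD, ROAD, ROAD, MARKING, ROAD]
print(detect_marking_columns(clear_row))   # -> [2, 7]

# Snow cover: everything is uniformly bright, the contrast between paint
# and road collapses, and the detector finds no lane at all.
snowy_row = [SNOW] * 9
print(detect_marking_columns(snowy_row))   # -> []
```

Real systems use far more sophisticated techniques than a brightness threshold, but they share the underlying dependence on the markings being visibly distinct from the road surface.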
Standstills are to be expected when there’s snow in the air, but it remains to be seen whether autonomous vehicles will even be able to start a journey when harsh wintry weather hits.
Preventing driverless cars from panicking in normal situations that can look problematic
Paul Newman, professor of robotics at the University of Oxford and founder of Oxbotica, a firm building driverless cars, has highlighted another issue facing autonomous vehicle manufacturers by setting out a situation we may all have encountered as drivers.
Imagine a scenario where two vehicles travelling at speed approach one another from opposite directions along a gently curved country road. As humans, we are confident that each car will stick to its own lane and eventually pass safely a few feet from the other.
However, Mr Newman pointed out: “But for the longest time, it does look like you’re going to hit each other.”
With this in mind, how can a self-driving car be taught that it doesn’t need to panic in such a situation? On the one hand, you want to avoid the vehicle veering off the road to dodge a collision that is never going to occur. On the flipside, no-one wants driverless vehicles to become so complacent that they fail to react when they actually are heading towards a head-on incident.
Currently, Mr Newman states that getting an autonomous vehicle to guess right every single time “is a hard, hard problem”.
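One way to picture the judgement involved is a toy closest-approach check (an assumed model for illustration, not Oxbotica's method): rather than reacting to the fact that the two cars currently point roughly at each other, project both of them forward along the curve and ask how small the gap between them ever actually gets. All the numbers below (road radius, lane offset, speed) are invented for the sketch.

```python
import math

ROAD_RADIUS = 200.0   # metres -- a gently curved country road (assumed)
LANE_OFFSET = 1.75    # metres from centreline to each lane centre (assumed)
SPEED = 20.0          # m/s for both cars, roughly 45 mph (assumed)

def position(radius, angle):
    """Point on a circular arc of the given radius, in metres."""
    return (radius * math.cos(angle), radius * math.sin(angle))

def min_future_separation(horizon=10.0, dt=0.1):
    """Sample both cars' lane-following trajectories over `horizon` seconds
    and return the smallest distance between them (closest approach)."""
    r_a, r_b = ROAD_RADIUS - LANE_OFFSET, ROAD_RADIUS + LANE_OFFSET
    theta_a, theta_b = 0.0, 0.5           # starting angles along the arc
    w_a, w_b = SPEED / r_a, -SPEED / r_b  # opposite directions of travel
    best = float("inf")
    steps = int(horizon / dt)
    for step in range(steps + 1):
        t = step * dt
        ax, ay = position(r_a, theta_a + w_a * t)
        bx, by = position(r_b, theta_b + w_b * t)
        best = min(best, math.hypot(ax - bx, ay - by))
    return best

# The cars close head-on along the curve, yet the predicted gap never
# shrinks below the lane separation of 3.5 m -- so there is no reason
# to swerve, provided both vehicles keep their lanes.
print(round(min_future_separation(), 2))   # -> 3.5
```

The "hard, hard problem" Mr Newman describes lives in the proviso at the end: the prediction is only safe if the other car really does keep its lane, and deciding how much to trust that assumption is exactly where human intuition still beats the machine.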
For driverless cars to be truly regarded as the future of motoring, problems like those detailed in this article will surely need to be addressed.