New regulations and changes to the UK Highway Code are expected to be needed before driverless cars can be used fully on public roads.
But the technology has thrown up a number of conceptual and safety challenges, including how to account for real-world driving conditions.
“Thousands and thousands of people are killed in car accidents every year,” Dolgov told Reuters, in relation to the, er, “speeding” feature. The point is that on a motorway or highway, cars travelling much slower than those around them are at more risk than cars breaking an arbitrary speed limit.
The cars currently undergoing tests in the US are essentially normal vehicles, fitted with a large cylinder of sensors and lasers on the roof and various other tech around the edges of the car.
But Google has also announced a separate line of electric self-driving cars it intends to build, which will be limited to 25mph, presumably ruling them out of breaking the law in most circumstances.
Meanwhile, a separate but related debate has recently focused on whether self-driving cars should be programmed to protect the occupant at all costs, or whether they should prioritise the greater number of pedestrians outside the vehicle… i.e. whether your Google car should be programmed to kill you.
The announcement might seem like a tiny detail in the evolution of the Google X project, but driverless cars could be on the roads within a decade, and they are already raising thorny legal and ethical issues. If you are using an autonomous car, are you legally liable under the terms of your insurance? How can you be sure that your car, in its efforts to protect its occupants, won’t take out surrounding pedestrians if it has to take evasive action?
And perhaps most important of all, if Cecilia Abadie, who was ticketed for wearing Google Glass while driving, had been the lead passenger in a Google driverless car instead, would she have got a ticket then?