The promise of self-driving cars remains unfulfilled, as the technology still requires drivers to co-pilot the vehicles to avoid collisions.
Between driverless cars, autonomous weapons and AI-powered medical diagnostic tools, it seems there will be no shortage of ethically complex situations involving AI in the future.
An autonomous vehicle expert explains how Tesla’s Autopilot works, what prompted US authorities to investigate the system and what changes might be in store for the company.
The public holds self-driving cars to incredibly high safety standards – and we’re working to meet them.
Tesla has a lot more going for it than just its electric cars. VW must think more broadly and boldly to save the business.
Self-driving cars may someday drop off their owners downtown and then leave to find free parking. What would that mean for cities of the future?
The real ethical challenge of driverless cars is not deciding how they respond in emergencies – it’s facing up to the failings of human drivers.
How will people respond once they realise they can rely on autonomous vehicles to stop whenever someone steps out in front of them? Human behaviour might stand in the way of the promised ‘autopia’.
An increase in the use of self-driving cars will change parking infrastructure in cities, and hopefully give rise to neighbourhoods with more colour and character.
Autonomous cars need to learn how to drive just like people do: with real-world practice on public roads. It’s key to safety, and to public confidence in the new technologies.
Transport policies in European cities are on a collision course with the tech industry’s ambitions for self-driving cars.
Governments have started to see automation as the key to brighter urban futures. But what will this look like?
The mathematician and Essonne deputy Cédric Villani delivers his report on artificial intelligence today.
The biggest ethical challenges for self-driving cars arise in mundane situations, not when crashes are unavoidable.