There is plenty of talk about what we want from artificially intelligent systems, but what do we actually mean by AI? From a legal and regulatory point of view, we need a definition.
Have questions about artificial intelligence or the future of robotics? Wondering if your job is vulnerable to automation? Concerned about superintelligent AI? Now’s your chance to ask.
The challenge in making AI machines appear more human.
There is much debate on the ethics of artificial intelligence machines that are designed to kill. But who’s responsible when a non-lethal AI system causes damage, harm or even death?
A ban on killer robots is useless if your enemy doesn’t play by the rules.
The thousands of people who signed an open letter calling for a ban on autonomous killer weapons and robots are misguided. We already have such killing machines and we should embrace them.
The robots in the DARPA Robotics Challenge can open doors and drive cars. But developing machines that can think for themselves is trickier.
There are a lot fewer workers on the assembly line today. And it’s not just car manufacturing that has seen jobs lost to automation.
The more we automate jobs, the more we need to find new jobs for people, especially if the government wants us to stay in the workforce longer. That’s going to take some clever thinking.
Tomorrow’s engineers? Unlikely.
The debate over whether lethal autonomous weapon systems (LAWS) – often called "killer robots" – should be banned continues, and it is far from settled.