When people think about how AI might ‘go wrong’, most probably picture malevolent computers trying to cause harm. But what if we should be more worried about them seeking pleasure?
Natural language coding means that people won’t need to learn specialized coding languages to write programs or design websites. But large corporations will control the means of translation.
Undergraduate programs are springing up across the US to meet the burgeoning demand for workers trained in big data. Yet many of the programs lack training in the ethical use of data science.
If you see the Tesla Bot as a joke or a harbinger of a dystopian future, you could be missing the real threat, which has more to do with Elon Musk’s power than robots run amok.
Politicians of all stripes, computer professionals and even big-tech executives are calling on government to hit the brakes on using these algorithms. The feds are hitting the gas.
New Zealanders are worried about autonomous weapons. But military alliances with the US and Australia, and potential economic gains from local robotics research, mean NZ won’t yet take a tough stand.
Common sense is a broad and diverse set of abilities that help define what it means to be human. AI researchers are struggling to endow computers with it.
Like atomic bombs and chemical and biological weapons, deadly drones that make their own decisions must be tightly controlled by an international treaty.
Lawyers were thought to be mostly immune from the coming AI revolution, but two legal experts explain why jobs that rely on human ingenuity can still be affected.
Shang Gao, University of Illinois Chicago and Jalees Rehman, University of Illinois Chicago
Machine learning is great at finding patterns but doesn’t know what those patterns mean. Combine it with knowledge gained from genetic research and you have a powerful view into the workings of cells.
The decision aligns with the government’s policy direction in recent years, which has aimed to increase innovation and views technology as a way to achieve it.
Harisu Abdullahi Shehu, Te Herenga Waka — Victoria University of Wellington; Hedwig Eisenbarth, Te Herenga Waka — Victoria University of Wellington, and Will Browne, Queensland University of Technology
Robots are more likely than people to misclassify emotions when reading faces that are partially covered. This could lead to unexpected behaviours when they interact with people wearing masks.
It’s difficult to tell a shipwreck from a natural feature on the ocean floor in a scan taken from a plane or ship. This project used deep learning to get it right 92% of the time.