A machine learning expert predicts a new balance between human and machine intelligence is on the horizon. For that to be good news, researchers need to figure out how to design algorithms that are fair.
Is this face just an assembly of computer bits?
A legal loophole could grant computer systems many of the legal rights people have, threatening human rights and dignity and raising serious legal and moral problems.
The past and present of Google – what’s next?
It could seem attractive to try to teach computers to detect harassment, threats and abusive language. But it’s much more difficult than it might appear.
What algorithm turned these lights red?
New research has uncovered a previously unknown weakness in smart city systems: devices that trust each other. That could lead to some pretty terrible traffic, among other problems.
Computers today are fast and powerful but they still can’t think like a human when it comes to some tasks we find easy. That’s why tech companies are turning to neuroscience for help.
Can an algorithm explain itself?
A European Union law will require human-understandable explanations for algorithms’ decisions. A team of researchers has found a way to provide that, even for complex calculations.
There are reasons to believe the promise of people analytics may not live up to the hype.
Unrestricted access to information is vital to a vibrant democracy. But if this information is inaccurate, biased or falsified, the fundamental freedom of informed choice is denied.
News delivery via social media is based on a business model that exploits our need for self-validation.
Changes in news media distribution and the impartiality of news sources provide good reason for concern. But digital inequality is not the right way to understand or measure the problem.
How fast can it get here?
Math isn’t prejudiced, goes the argument. But algorithms can learn bias from the data human beings feed them, leading to unfair treatment and discrimination.
Programs like Hour of Code introduce computer programming to students in an engaging manner.
If machines run by artificial intelligence take over the world, it will only be because we programmed them to do so. So how can fuzzy logic help us prevent that?