It can be complicated to teach a computer to detect harassment and threats.
It might seem attractive to teach computers to detect harassment, threats and abusive language. But it's much more difficult than it appears.
What algorithm turned these lights red?
New research has uncovered a previously unknown weakness in smart city systems: devices that trust each other. That could lead to some pretty terrible traffic, among other problems.
A branch of AI research promises to deliver computers that evolve their own software, but the tech industry has yet to catch on.
Two Stanford researchers used a deep neural network to detect sexuality from profile pictures on a US dating website.
We have far more to worry about from outdated science that embodies dubious prejudices than we do from deep learning networks.
Trust in me.
We prefer to go with our guts.
All those neurones: if only a machine could really think like a human.
Computers today are fast and powerful, but they still can't think like a human when it comes to some tasks we find easy. That's why tech companies are turning to neuroscience for help.
Can an algorithm explain itself?
A European Union law will require human-understandable explanations for algorithms' decisions. A team of researchers has found a way to provide that, even for complex calculations.
There are reasons to believe the promise of people analytics may not live up to the hype.
Despite its promises, people analytics has serious ethical implications and can adversely affect organisations and how people are treated at work.
Unrestricted access to information is vital to a vibrant democracy. But if that information is inaccurate, biased or falsified, the fundamental freedom of informed choice is denied.
News delivery via social media is based on a business model that exploits our need for self-validation.
Changes in how news is distributed and in the impartiality of news sources give good reason for concern. But digital inequality is not the right way to understand or measure them.
How fast can it get here?
Algorithms can discriminate, even when their designers don't intend that to happen. But they also can make detecting bias easier.
It’s all just data – how can it be prejudiced?
Math isn’t prejudiced, goes the argument. But these programs can learn bias from the data human beings feed into them, leading to unfair treatment and discrimination.
Programs like Hour of Code introduce computer programming to students in an engaging manner.
If we want students to be well prepared for the 21st century, then we should be teaching coding in school.
A model of the Terminator from the popular movie series in which machines take over the world.
If machines run by artificial intelligence take over the world, it will only be because we programmed them to do so. So how can fuzzy logic help us prevent that?
Much harder than scoring a goal.
When the English football fixtures were announced in June, many fans would have studied them from their own perspective. Are the fixtures fair to their team? Why do they have to travel the full length…