President Trump’s ban on immigration from several mostly Muslim countries was ultimately upheld by the Supreme Court. President Biden revoked it on his first day in office.
Andrew Harnik/AP Photo
A civil rights group is suing Facebook for its failure to stop the spread of anti-Muslim hate speech on the platform.
Algorithms help lots of people discover new music.
Music recommendation algorithms are more likely to suggest music by male artists than by female artists.
Search engines often serve up a distorting blend of information and misinformation.
Crispin la valiente/Moment via Getty Images
Search engines, like social media algorithms, get you to click on links by learning what other people click on. Enticing misinformation often comes out on top.
If the historical data used to train an AI system disadvantages certain minority groups, the system can learn to reproduce those patterns in its own decision-making.
A-level students protest the use of algorithms to determine their grades.
Jonathan Brady/PA Wire/PA Images
Problems in the underlying data mean we can’t leave algorithms to decide things on their own.
Scientists are arguing over how YouTube might help turn people into extremists.
Emotion recognition technology, an outgrowth of facial recognition technology, continues to advance quickly.
A report calls for banning the use of emotion recognition technology. An AI and computer vision researcher explains the potential and why there’s growing concern.
When algorithms make decisions with real-world consequences, they need to be fair.
A machine learning expert predicts a new balance between human and machine intelligence is on the horizon. For that to be good news, researchers need to figure out how to design algorithms that are fair.
Algorithms can reinforce existing biases in society.
The fundamental problem with AI is that it is often riddled with society’s existing biases and prejudices.
Technology firms should use more design fiction to explore and avoid potential negative consequences, such as AI bias.
Specialist machine learning and narrow AI could help us start removing the “four Ds” – dirty, dull, difficult and dangerous – from our daily work.
Artificial intelligence is predicted to contribute some US$15.7 trillion to the global economy by 2030. A new report looks at issues specific to New Zealand.
Social biases in digital tech create racist face recognition software and sexist hiring tools, but more data collection isn’t the answer.
Sometimes the questions become too much for artificial intelligence systems.
When algorithms are at work, there should be a human safety net to prevent them from harming people. Artificial intelligence systems can be taught to ask for help.
An ethicist on why fixing algorithms may not be the best response to algorithmic bias.
The “gilets jaunes” (yellow vest) protesters. Joan Mora.
What do the Carlos Ghosn scandal, the rising power of algorithms and the “gilets jaunes” have in common? The need to extend the spatial and temporal definitions of responsibility.
What can an algorithm find when it reads a book?
Some AI technologies aren’t advanced enough to provide useful insights, but simpler tools can yield new opportunities to explore the humanities.
New technology, old flaws.
Expecting algorithms to perform perfectly might be asking too much of ourselves.
How well will artificial intelligence balance the human concept of fairness?
Artificial intelligence poses opportunities as well as dangers; understanding them – and regulating carefully – will help avoid harm to individuals and society as a whole.
Biometric Mirror is an interactive application that takes your photo and analyses it to identify your demographic and personality characteristics.
A new tool called Biometric Mirror exposes the need for public debate about the ethics of AI.
People who share potential misinformation on Twitter (in purple) rarely get to see corrections or fact-checking (in orange).
Shao et al.
Information on social media can be misleading because of biases in three places – the brain, society and algorithms. Scholars are developing ways to identify and display the effects of these biases.