Think it's a mere coincidence the first two letters of "algorithm" hint at Artificial Intelligence?
Some have suggested that deracialising the academy requires all researchers, teachers and students to link knowledge and identity. What might this mean for mathematics?
We need better surveillance systems to detect epidemics early. But while social media has been flagged as a potential solution, we're not quite there yet.
Is the rise of big data and the use of algorithms by businesses to blame for modern society's ills?
Changes in news media distribution and the impartiality of news sources provide good reason to be concerned. However, digital inequality is not the way to understand or measure it.
Algorithms that learn from large data sets can pick up inherent social biases. That could perpetuate the biases, or even worsen them.
Making decisions about what people do and don't read is the traditional role of an editor, no matter what Facebook claims.
Business Briefing: trusting an algorithm with investment decisions.
Financial advice was once the realm of bankers and brokers; now startups are developing digital platforms to take advantage of how trusting we are of investment advice from computers.
Algorithms can discriminate, even when their designers don't intend that to happen. But they also can make detecting bias easier.
Machine learning is being used to see if it's possible to predict whether someone will commit a crime some time in the future. But does this risk condemning people for a crime they haven’t committed?
Imagine a CEO who could bridge international workdays across country markets, working 24 hours a day.
New research sheds light on whether a new type of exchange that slows trading down slightly could attract enough traders to be effective.
Data-driven algorithms drive decision-making in ways that touch our economic, social and civic lives. But they contain inherent biases and assumptions that are too often invisible to the public.
Software is eating everything in this online, digital world. We need to design code that uses as little energy as possible.
Imagining possible futures can help us plan a secure information technology environment for the years to come.
If smart cities run on big data and algorithms that channel only 'relevant' information and opinions to us, how do we maintain the diversity of ideas and possibilities that drives truly smart cities?
Disruption driven by algorithms is happening all around us.
We increasingly depend on algorithms applied to big data, but even algorithms make mistakes that could label us in worrying ways.
Math isn’t prejudiced, goes the argument. But these programs can learn bias from the data fed into them by human beings, leading to unfair treatment and discrimination.
Humans are no longer the only judges of creativity. Computers can perform the same task – and may even be more objective.