Surveillance software that identifies people from CCTV is eroding human rights and democracy.
If your fingerprint is stolen, you can't change it the way you'd change a password.
Legal bans and moratoriums on other emerging technologies need not be permanent or absolute, but the more powerful a technology is, the more care it requires to operate safely.
Research has found ways to detect deepfakes through flaws that can't be fixed easily by the fakers.
Campaigners in the UK are pushing to protect privacy and make the security services more accountable.
Social biases in digital tech create racist face recognition software and sexist hiring tools, but more data collection isn't the answer.
AI can help make government more efficient – but at what cost? Citizens' lives could be better or worse, based on how the technology is used.
People – individually and in groups – were not as good at facial recognition as an algorithm. But five people plus the algorithm, working together, were even better.
When algorithms are at work, there should be a human safety net to prevent harm. Artificial intelligence systems can be taught to ask for help.
Current techniques to protect biometric details, such as faces or fingerprints, from hacking are effective, but advances in AI are rendering these protections obsolete.
Even the world’s best available training – used to train police, border control agents and other security personnel – does not compensate for natural talent in face recognition.
New technologies like facial recognition are coming – whether we like it or not. We can't turn back the tide, but we can manage new technology to do the least harm and most good.
New research on facial recognition technology trials by the police calls for tighter regulation to protect human rights.
For those who still consider memes like the #10yearchallenge harmless, innocent information sharing, perhaps it's time to reconsider.
The government can access your phone metadata, driver's licence photo and much more. And new research shows Australians are OK with it. But that might change.
A new study shows that facial recognition software assumes that black faces are angrier than white faces, even when they're smiling.
By looking closely at traits like wing feathers and spot patterns, a computer scientist trained an algorithm to recognize individual woodpeckers.
Some AI technologies aren't advanced enough to provide useful insights, but simpler tools can yield new opportunities to explore the humanities.
Over the 12 months of the research, more than 100 arrests and charges were – at least in part – assisted by automated facial recognition (AFR).
Australia's parliament will soon decide on a bill to try to regulate facial recognition technology, but it leaves a lot of questions unanswered.