Fake videos pose a risk to democratic representation, participation, and discussion. Canadians need to be mindful of their existence as we head towards the federal election.
The Russians won’t be alone in spreading disinformation in 2020. Their most likely imitator will be Iran. And Instagram could become even more saturated with intentional misinformation than it already is.
A key element of the battle between truth and propaganda has nothing to do with technology. It has to do with how people are much more likely to accept something if it confirms their beliefs.
The law is out of step with technology that lets anyone manipulate your images in hyper-realistic ways.
A new technique for detecting deepfakes conceives of videos as flip-books and looks for changes in successive frames of a sequence.
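The flip-book idea above can be illustrated with a minimal sketch: measure the pixel change between successive frames and flag frames whose change deviates sharply from the clip's norm. This is only an illustration of the general approach, not the published method; the function names are hypothetical, and frames are assumed to be NumPy arrays (e.g. decoded grayscale images).

```python
import numpy as np

def frame_differences(frames):
    """Mean absolute pixel change between each pair of successive frames."""
    return [float(np.mean(np.abs(b.astype(float) - a.astype(float))))
            for a, b in zip(frames, frames[1:])]

def flag_anomalies(diffs, factor=5.0):
    """Indices of frame transitions whose change far exceeds the median.

    Uses the median as a robust baseline so a few large spikes
    (candidate manipulated frames) don't mask themselves.
    """
    med = float(np.median(diffs))
    return [i for i, d in enumerate(diffs) if d > factor * max(med, 1e-9)]

# Illustrative usage with synthetic frames: a smooth sequence with one jump.
frames = [np.full((2, 2), float(v)) for v in [0, 1, 2, 3, 4, 5, 60]]
diffs = frame_differences(frames)   # [1.0, 1.0, 1.0, 1.0, 1.0, 55.0]
suspects = flag_anomalies(diffs)    # the abrupt transition stands out
```

Real detectors are far more sophisticated (they examine faces, lighting, and learned artifacts, not raw pixel deltas), but the frame-to-frame comparison is the shared intuition.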
Laws against 'revenge porn', upskirting and deepfake pornography are piecemeal, and a review will take years to conduct. Here are three things government can do now to support victims.
Research has found ways to detect deepfakes through flaws that can't be fixed easily by the fakers.
People fall for fake photos regardless of whether they seem to come from Facebook or The New York Times. What actually helps?
We know that social media platforms have an incentive to promote whatever gets the most attention, regardless of its authenticity. We're more reluctant to admit that the same is true of people.
When artificial intelligence systems try to behave like humans and make mistakes, they show their limits – but also their startling advances.
It's easier than ever to create a fake image and spread it far and wide online. But there are steps that you can take to protect yourself from fishy photos.
More democratic forms of politics, journalism and fact-checking will be needed when we can no longer trust any video footage.
Protecting democracy requires more than just technical solutions. It includes education, critical thinking and members of society working together to agree on problems and find solutions.
Will we soon no longer be able to discern which videos are real and which are fake?
The new technology behind machine learning-enhanced fake videos has a crucial flaw: Computer-generated faces don't blink as often as real people do.
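A hedged sketch of how that flaw could be exploited: given a blink count from some upstream eye-tracking detector (not shown here), compare the observed blink rate against the typical resting human rate of roughly 15–20 blinks per minute and flag clips that fall far below it. The constants and function name are illustrative assumptions, not the researchers' actual pipeline, which used learned models on eye regions rather than a simple threshold.

```python
# Typical resting blink rate is roughly 15-20 per minute (approximate figure).
TYPICAL_BLINKS_PER_MIN = 17.0

def blink_rate_suspicious(blink_count, duration_seconds, min_fraction=0.25):
    """Return True if the observed blink rate is implausibly low.

    blink_count: blinks detected in the clip by some upstream detector
    (hypothetical here); duration_seconds: clip length in seconds;
    min_fraction: fraction of the typical rate below which we flag.
    """
    if duration_seconds <= 0:
        raise ValueError("duration must be positive")
    rate_per_min = blink_count * 60.0 / duration_seconds
    return rate_per_min < TYPICAL_BLINKS_PER_MIN * min_fraction

# One blink in a minute is well below a quarter of the typical rate.
low = blink_rate_suspicious(1, 60)    # flagged
normal = blink_rate_suspicious(15, 60)  # within normal range
```

Note that this cue is fragile: once fakers train on footage that includes blinking, the statistical gap closes, which is why such tells tend to be short-lived.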
People can now use artificial intelligence to swap the faces of actors in pornographic videos with those of people they know, raising fears about a new form of revenge porn.