A key element of the battle between truth and propaganda has nothing to do with technology. It has to do with how people are much more likely to accept something if it confirms their beliefs.
A new technique for detecting deepfakes conceives of videos as flip-books and looks for changes in successive frames of a sequence.
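The teaser does not describe the researchers' actual algorithm, but the flip-book idea can be illustrated with a minimal sketch: score how much each frame changes from the previous one, so that spliced or manipulated frames show up as spikes. The function name and the use of plain pixel differences are assumptions for illustration only.

```python
import numpy as np

def frame_anomaly_scores(frames):
    """Mean absolute pixel change between successive frames.

    frames: array of shape (T, H, W), one grayscale image per frame.
    Returns T-1 scores; unusually large values may indicate an
    edited or inserted frame. Illustrative only -- real detectors
    use learned features, not raw pixel differences.
    """
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean(axis=(1, 2))
```

For example, a sequence of identical frames with one frame replaced produces two large scores, at the transitions into and out of the altered frame.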
The new technology behind machine learning-enhanced fake videos has a crucial flaw: Computer-generated faces don't blink as often as real people do.
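Once a per-frame measure of how open the eyes are has been extracted (for instance from facial landmarks), comparing blink frequency against human norms is straightforward. The sketch below assumes such a signal already exists; the function name and threshold are hypothetical, not taken from the research described.

```python
def count_blinks(eye_openness, threshold=0.2):
    """Count closed-eye episodes in a per-frame eye-openness signal.

    eye_openness: sequence of floats, one per video frame, where
    small values mean the eyes are closed. A blink is counted each
    time the signal drops below the (assumed) threshold.
    """
    blinks = 0
    closed = False
    for value in eye_openness:
        if value < threshold and not closed:
            blinks += 1       # signal just dropped: new blink starts
            closed = True
        elif value >= threshold:
            closed = False    # eyes reopened; ready for the next blink
    return blinks
```

A real video of a person typically shows a blink every few seconds; a generated face whose count stays near zero over a long clip would be suspicious under this heuristic.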
Photos are full of information, from your location to phone model, and digital forensics can help extract it.
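Much of that information lives in a photo's EXIF metadata, which common imaging libraries can read directly. A minimal sketch using Pillow (the helper name is an assumption; `Image.getexif` and the `TAGS` lookup table are part of Pillow's API):

```python
import io

from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(image_file):
    """Return EXIF metadata as a dict of human-readable tag names.

    image_file: a path or file-like object. Tags such as Model
    (camera/phone model) or GPSInfo (location) appear here when
    the device recorded them.
    """
    img = Image.open(image_file)
    exif = img.getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
```

Note that many social-media platforms strip this metadata on upload, so forensic analysts generally want the original file.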
Cyberdetectives look for digital doors or windows left unlocked, find electronic footprints in the dirt and examine malicious software for clues about who broke in, what they took and why.
A new technique could help the police identify more criminals from just their footprints.
This week's hack of the Bureau of Meteorology appeared to come from China, but how do we know? The problem is that it's notoriously difficult to pinpoint the origin of a hack.
Paris police were able to use information found on a phone, but what details can investigators recover that could help prevent future attacks?
So much of modern life involves our digital devices – including crime. As the field of digital forensics gains prominence, practitioners need practical and ethical guidelines.