Scholars discuss the Facebook-Cambridge Analytica scandal: what happened, what's at stake, how to fix it, and what could come next.
When building a smart city, it's vital that governments and citizens know up-front who will control the collected data.
The silver lining to the Cambridge Analytica case is that more people are recognising that we pay for online services not only with our own privacy, but with that of our friends, family and colleagues.
The Cambridge Analytica scandal wasn't a data breach – it was a violation of academic ethics. Maybe it's universities, not social networks, that need to update their privacy settings.
Smartphones are key elements of two-factor authentication processes. Weakening their security threatens people's digital identities.
What happens to your Facebook account, your iTunes purchases and your email messages when you die?
Companies are compiling your smartphone data into shockingly intimate profiles that can be used against you.
It's not just fitness trackers – mobile phones can reveal users' whereabouts too, even with location tracking turned off.
What scholars know, are learning and are predicting about the privacy of electronic data, online activity, smartphone use and electronic records.
Should police be able to use cellphone records to track suspects – and law-abiding citizens?
Consumers can't read, understand or use information in companies' privacy policies. So they end up less informed and less protected than they'd like to be. New research shows a better way.
The COAG agreement to share our biometric data – including some photo ID – is an erosion of our privacy and will give people a false sense of security.
The modern world depends on critical systems, networks and data repositories that are not as secure as they should be. Breaches will continue until society as a whole makes some big changes.
What governments and companies think they know about us – whether or not it's accurate – has real power over our actual lives.
A freedom of information request reveals that Google wants its AI company DeepMind to get involved in the 100,000 Genomes Project.
Nobody can understand the legal language in privacy policies. Can artificial intelligence digest the text and produce a human-readable explanation?
BCI devices that read minds and act on intentions can change lives for the better. But they could also be put to nefarious use in the not-too-distant future. Now's the time to think about risks.
UK politicians are planning very different approaches to data privacy, security and surveillance.
You need to start thinking about what will happen to your online data when you die.
When smartphone apps get permission to access your location or other activity, they often share that data with other companies that can compile digital profiles on users.