The terms of the Australian Privacy Principle 3.6 are quite clear. So why is there not a single published case of this law being enforced?
The Tim Hortons consumer app was found to have collected detailed user information, including location data. As a privacy violation, this challenges the perception of Tim Hortons as a trusted brand.
A professor of digital society and an ethics researcher discuss COVID passes and what they mean for the UK.
Researchers are looking into the potential technological threats to data safety and privacy from the smart supermarkets of the future.
Have you ever been targeted with ads that are scarily specific to you, and wondered how the app or website could have known?
Many businesses struggle with data security, but the new Privacy Act means they will have to make protecting customers’ personal information a priority.
It’s the biggest monopolisation case since a 1998 lawsuit against Microsoft. But it may be several years before a settlement of any kind is reached.
The new bill would open the gates for your data to freely exchange hands between any ‘accredited’ agency. The proposal is more arrogant than it is effective.
It’s not clear how individuals are being targeted. And while they’re mostly high-profile people, that doesn’t mean there’s no lesson for the average person to take away.
Technology is not neutral, as facial recognition algorithms and predictive policing have shown us. Algorithms discriminate by design, reflecting and reinforcing pre-existing biases.
In the past decade, the Australian government has commissioned data analytics projects worth more than A$200 million. We have little information about what they involved.
The watchdog has voiced concerns over the proposed US$2.1 billion merger, from which both users and Australian health services could lose out.
Most of us are probably having our data tracked in some form. And while there are regulatory safeguards in place to protect user privacy, it’s hard to say whether these are enough.
Social biases in digital tech create racist face recognition software and sexist hiring tools, but more data collection isn’t the answer.
Big tech companies compete over who can gather the most intelligence on their users. Countries like Russia and China turn this information against their citizens.
Tech companies have vowed to do better when it comes to using data ethically, but most ethics initiatives are neither enforced nor enforceable.
Biases are difficult to shed, which makes workplace diversity a powerful and necessary tool for catching unsuspected bias before it has a chance to cause damage.
Parents should inform themselves and review both their own and their children’s privacy settings.
DeepMind’s machine learning collaboration with another NHS trust (this time applying the tech to breast cancer) raises more questions of trust.