Governments can exclude certain groups of people from policies and services not only through the type of data they collect but also through how they collect, store, analyze and use that data.
ChatGPT is fuelled by our intimate online histories. It's trained on 300 billion words, yet users have no way of knowing whether their own data is among them.
The Tim Hortons consumer app was found to have collected detailed user information, including location data. As a privacy violation, this challenges the perception of Tim Hortons as a trusted brand.
Many businesses struggle with data security, but the new Privacy Act means they will have to make protecting customers’ personal information a priority.
The new bill would open the gates for your data to freely exchange hands between any ‘accredited’ agency. The proposal is more arrogant than it is effective.
It’s not clear how individuals are being targeted. And while they’re mostly high-profile people, that doesn’t mean there’s no lesson for the average person to take away.
Facial recognition algorithms are usually tested on white faces, leaving the technology unable to reliably differentiate between racialized individuals.
Technology is not neutral, as facial recognition algorithms and predictive policing have shown us. Algorithms discriminate by design, reflecting and reinforcing pre-existing biases.
In the past decade, the Australian government has commissioned data analytics projects worth more than A$200 million. We have little information about what they involved.
The watchdog has voiced concerns over the proposed US$2.1 billion merger, from which both users and Australian health services could lose out.
While leaks and whistleblowers continue to be valuable tools in the fight for data privacy, we can’t rely on them solely to keep big tech companies in check.
Most of us are probably having our data tracked in some form. And while there are regulatory safeguards in place to protect user privacy, it’s hard to say whether these are enough.
Big tech companies compete over who can gather the most intelligence on their users. Countries like Russia and China turn this information against their citizens.
Tech companies have an economic imperative to avoid grappling too seriously with the ethical issues surrounding data usage.
Tech companies have vowed to do better when it comes to using data ethically, but most ethics initiatives are neither enforced nor enforceable.
Power over business, democracy and education will likely continue to lie with data and data-dependent tools, such as machine learning and artificial intelligence.
Biases are difficult to shed, which makes workplace diversity a powerful and necessary tool for catching unsuspected bias before it has a chance to cause damage.
It’s never too early – or too late – to start talking to your children about how to protect their data from people who might misuse it.