With face masks now compulsory or recommended in various parts of the country, how are facial recognition systems functioning?
Facial recognition algorithms are typically trained and tested mostly on white faces, leaving the technology far less able to distinguish between racialized individuals.
Technology is not neutral, as facial recognition algorithms and predictive policing have shown us. Algorithms discriminate by design, reflecting and reinforcing pre-existing biases.
The federal government has used military-grade border patrol drones like this one to monitor protests in US cities.
Avoiding drones' prying eyes can be as complicated as donning a high-tech hoodie and as simple as ducking under a tree.
Facial recognition algorithms will always make mistakes. But how can we make them less discriminatory?
Questions are being raised about the legality of scanning, storing and sharing facial images. Current law doesn't prohibit even highly intrusive surveillance by private entities.
Face surveillance makes it easier to oppress vulnerable populations and violate everyone's basic rights. It's time for a moratorium.
The US is also 'looking at' banning the Chinese social media app.
Police forces have a wide range of options for monitoring individuals and crowds.
Police forces across the country now have access to surveillance technologies that were recently available only to national intelligence services. The digitization of bias and abuse of power followed.
Temperature-scanning systems are not always accurate at detecting fever, and raise a host of privacy concerns.
Facial recognition software could be applied to managing people during pandemics.
Recently, police forces have come under criticism for their use of facial recognition technologies. But pandemic response plans may increasingly incorporate surveillance.
There are few guarantees that the facial recognition system is secure or even that it is accurate.
You’d think flying in a plane would be more dangerous than driving a car. In reality it’s much safer, partly because the aviation industry is heavily regulated. Airlines must stick to strict standards…
How do you feel about eating chips or wearing your pyjamas in public? Experiments show how differently you behave when you're being observed.
The more we use facial recognition, the more we see its limits and its risks.
When it comes to faces, most of us are typical-recognisers, with just a small percentage classed as super-recognisers.
"Super-recognisers" who can identify a range of ethnicities could help increase fraud detection rates at passport control and decrease false conviction rates that have relied on CCTV.
Emotion recognition technology, an outgrowth of facial recognition technology, continues to advance quickly.
A report calls for banning the use of emotion recognition technology. An AI and computer vision researcher explains the potential and why there's growing concern.
Andrew Hastie said the broad objectives of the identity-matching system were sound, but key changes were needed to ensure privacy and transparency.
Human rights groups say the bill is an attempt to introduce mass surveillance to Australia and an egregious breach of individual privacy.
Surveillance software that identifies people from CCTV is eroding human rights and democracy.
Unlike a password, you can't change your fingerprint if it's stolen.
Is there still time to reach the ‘off’ button?
Legal bans and moratoriums on other emerging technologies need not be permanent or absolute, but the more powerful a technology is, the more care it requires to operate safely.