
Articles on Facial recognition

Displaying 1 - 20 of 91 articles

Vaccine passports may soon be required for travelling amid the COVID-19 pandemic. Like biometrics, they’ll likely become a permanent part of our daily lives — and there’s barely been any debate about them. (AP Photo/Rick Bowmer)

Why we need to seriously reconsider COVID-19 vaccination passports

COVID-19 vaccine passports are being presented as a relatively simple technological solution to our current travel woes. But meaningful public debate about their merits and problems is essential.
Simply making an effort to consider the person behind the mask can help address the biases exacerbated by wearing one. (Shutterstock)

Face masks hide our facial expressions and can exacerbate racial bias

Face masks hide our facial expressions and affect our social interactions. They make it harder for us to read other people's faces and can contribute to racist perceptions.
Many of the people who broke into the U.S. Capitol building on Jan. 6 carried cellphones, which can be tracked, and posted photos of their activities on social media. Photo by Saul Loeb/AFP via Getty Images

How law enforcement is using technology to track down people who attacked the US Capitol building

Facial recognition, social media and location tracking give law enforcement a leg up in a monumental investigation.
Facial recognition technology raises serious ethical and privacy questions, even as it helps investigators south of the border zero in on the rioters who stormed the U.S. Capitol. (Pixabay)

As U.S. Capitol investigators use facial recognition, it raises the question: Who owns our faces?

We have unwittingly volunteered our faces in social media posts and photos stored in the cloud. But we've yet to determine who owns the data associated with the contours of our faces.
Facial recognition algorithms are usually tested using white faces, which results in the technology being unable to differentiate between racialized individuals. (Shutterstock)

AI technologies — like police facial recognition — discriminate against people of colour

Technology is not neutral, as facial recognition algorithms and predictive policing have shown us. Algorithms discriminate by design, reflecting and reinforcing pre-existing biases.
Police forces have a wide range of options for monitoring individuals and crowds. Nicholas Kaeser/Flickr

High-tech surveillance amplifies police bias and overreach

Police forces across the country now have access to surveillance technologies that until recently were available only to national intelligence services. The digitization of bias and abuse of power has followed.
