
Local Pittsburgh Issue

Should #CityCouncils have the authority to #Regulate how #FacialRecognition software is used in #Police investigations?

A "Yes" opinion

"#FacialRecognition needs regulation" Aug 01, 2024

Computer programs are, by their very nature, as biased as the humans who design them. #FacialRecognition software is unreliable at identifying people of color, the elderly, women, and children. The municipal government of Pittsburgh has not only the right but the responsibility to regulate whether police investigators use facial recognition technology.

We’ve heard time and again that too much of anything is dangerous, and the use of this technology is no exception. New innovations are often seen as life-changing, and it's human nature to want to use the latest and greatest technology. Even so, we need to exercise caution, because people's lives and freedom are at stake.

Facial recognition technology has become a hotly debated topic worldwide, and it has already been banned in several cities, including San Francisco and Boston. After the Black Lives Matter protests, companies such as Amazon stopped making their facial recognition services available to law enforcement agencies, and several others are facing similar decisions.

So why is facial recognition technology so flawed in the first place? The algorithms behind the software are designed and trained by people, using data people select and label, so they inherit the same biases humans carry. The technology's accuracy across different ethnicities must improve before the government or law enforcement agencies can even think of using it in good conscience.

It is vital to look to the past for guidance to avoid repeating the same mistakes. This isn’t the first time black people and other minorities have been treated unequally by law enforcement. 

A federal study published in 2019 found that the technology performed well when identifying the faces of middle-aged white men, but that its accuracy was much lower for children, women, the elderly, and people of color. Error rates were highest for black women, consistent with earlier findings by Buolamwini, Gebru, and Raji. We have no way of knowing the scope of the devastating consequences such an error-prone system could bring to people of color.

We can't ignore the fact that facial recognition technology has the potential to jeopardize the liberty and civil rights of innocent people. It is not worth the risk of handing it over to the police without significant oversight, and even then it should be used with caution. Otherwise we run the risk of weaponizing systemic racism rather than fighting it as we should.
