Technology should be used to promote public welfare and development, not to enable racist profiling and inequitable criminal investigations. Police have a duty to protect citizens' basic constitutional rights. Mass surveillance, which is essentially what Facial Recognition Technology (FRT) amounts to, deprives people of their right to privacy and freedom of expression.
It was a great move by the Boston City Council to prohibit police from using FRT surveillance technology. Then-City Councilor (and now Mayor) Michelle Wu summed it up best: “Boston should not use racially discriminatory technology that threatens the privacy and basic rights of our residents. Community trust is the foundation for public safety and public health.”
The law has to be applied evenly to all citizens, regardless of their racial or gender identity. The biggest problem with FRT is its inaccuracy in identifying members of minority groups. Bronx public defender Kaitlyn Jackson, who represented a client jailed for six months based on a faulty FRT match, said: “I think people sometimes feel a sense of ease, like 'That would never happen to me because I'm not somebody who has had a lot of interactions with the police'... but no one can guarantee that you don't look a lot like somebody who committed a crime. Nobody is safe from poor facial recognition technology.”
Thankfully, other governments have taken a similar stance. The U.S. Senate followed the Boston City Council's line of reasoning with the introduction of the Facial Recognition and Biometric Technology Moratorium Act of 2021. Under that bill, law enforcement officers would not be allowed to use the technology in any official capacity, nor to submit FRT results as evidence in criminal proceedings. Hopefully this will make it less likely that innocent citizens spend time in jail simply because they resemble someone who committed a crime.
The implications of any new technology should be carefully reviewed before it is deployed, and that scrutiny matters especially here, given the severe consequences that can follow from a false identification. Police work needs to rest on solid evidence and sound investigative techniques; our society cannot afford shortcuts taken with assistive technology that is already known to produce erroneous results. Experts in the field say that FRT is rapidly improving and becoming more accurate as larger databases of images are built for computers to match against.
The Security Industry Association is one of the few groups that have opposed the ban on facial recognition software. What the SIA should understand is that absolute power corrupts absolutely, and giving police free rein to use this technology without any oversight is a recipe for disaster. It would only be a matter of time before the technology was used irresponsibly.
Multiple reforms have been implemented in recent years to address the persistent problems of racism and profiling in our society, and banning police use of this technology is one such step. It is essential to set in motion policies that ensure prosperity and equality for Boston residents. By banning discriminatory facial recognition technology, we have moved one step closer to countering racism.