
Facial Recognition Technology and Racial Inequities in Policing
Jan 26, 2024
3 min read
by CryptoPolitan

Facial Recognition Technology (FRT) has come under scrutiny once again over concerns that it may exacerbate racial inequities in policing. Recent research published in Scientific American has shed light on the potential biases embedded in FRT algorithms and their real-world consequences.

The study revealed that law enforcement agencies utilizing automated facial recognition disproportionately arrest Black individuals, raising critical questions about the technology’s fairness and reliability.

Flaws in FRT algorithms

The researchers behind the report argue that several factors contribute to the disproportionate impact of FRT on Black communities. One major concern is the composition of the algorithms’ training datasets, which often underrepresent Black faces. This deficiency can lead to inaccuracies in identifying individuals from minority groups.

Another contributing factor is the belief among law enforcement that these programs are infallible. This overreliance can lead officers to trust FRT results unquestioningly, even when those results are flawed.

Furthermore, the study suggests that officers’ inherent biases can magnify the issues within FRT, leading to wrongful arrests and detrimental consequences for innocent individuals.

One striking example of the real-world consequences of FRT misidentification is the case of Harvey Eugene Murphy Jr. The 61-year-old grandfather is currently suing Sunglass Hut’s parent company after the store’s facial recognition technology falsely identified him as a robber. 

The robbery occurred in a Sunglass Hut store in Houston, Texas, where two armed individuals stole both cash and merchandise.

Houston police, relying on FRT, identified Murphy as a suspect, even though he resided in California at the time of the crime. He was arrested upon returning to Texas to renew his driver’s license. While in jail, Murphy alleges that he was sexually assaulted by three men in a bathroom, resulting in life-altering injuries.

Although the Harris County District Attorney’s office eventually determined that Murphy was not involved in the robbery, the damage had already been done during his time in custody. His lawyers argue that this case exemplifies FRT’s inherent flaws and consequences in practice.

Expert insights and calls for action

Os Keyes, an Ada Lovelace Fellow and PhD candidate at the University of Washington, asserts that these systems are designed to automate and expedite existing police biases, particularly against individuals who are already marginalized or entangled in the criminal justice system.

Keyes argues that the negative outcomes of FRT are both inevitable and horrifying, underscoring the need for broader reform in policing and FRT regulation.

The researchers writing in Scientific American highlight that private companies such as Amazon, Clearview AI, and Microsoft typically develop the FRT algorithms used by law enforcement agencies.

Despite advances in deep-learning techniques, federal testing has shown that most facial recognition algorithms struggle to accurately identify individuals, especially those who are not white men.

In 2023, the Federal Trade Commission (FTC) took action by prohibiting Rite Aid from using FRT after the retailer wrongly accused individuals of shoplifting based on false matches. In one alarming incident, an 11-year-old girl was stopped and searched by a Rite Aid employee because of a false match.

Similarly, the Detroit Police Department faced a lawsuit after its FRT misidentified Porcha Woodruff, a woman who was eight months pregnant at the time, as a carjacking suspect. Woodruff was wrongfully jailed.

The FTC acknowledged that people of color are often misidentified by FRT. The overrepresentation of white males in training datasets skews the resulting algorithms, which disproportionately flag Black faces as matches for criminal suspects. This, in turn, contributes to the unjust targeting and arrest of innocent Black individuals.

Calls for accountability and reform

In light of these concerns, the researchers emphasize that companies developing FRT products must prioritize diversity, both in their staff and in their training images. They also stress that law enforcement agencies must critically examine their own methods to prevent the technology from exacerbating racial disparities and violating individuals’ rights.

As the debate surrounding facial recognition technology continues, it becomes increasingly clear that comprehensive reform and regulation are necessary to address the systemic issues surrounding its deployment.

The consequences of flawed FRT algorithms extend far beyond technical glitches, affecting the lives and well-being of individuals, particularly those from marginalized communities.

Read the article at CryptoPolitan