Kathleen Martin

Guest
Between 2018 and 2021, more than one in 33 U.S. residents were potentially subject to police patrol decisions directed by crime-prediction software called PredPol.
The company that makes it sent more than 5.9 million of these crime predictions to law enforcement agencies across the country—from California to Florida, Texas to New Jersey—and we found those reports on an unsecured server.
Gizmodo and The Markup analyzed them and found persistent patterns.
Residents of neighborhoods where PredPol suggested few patrols tended to be Whiter and more middle- to upper-income. Many of these areas went years without a single crime prediction.
By contrast, neighborhoods the software targeted for increased patrols were more likely to be home to Blacks, Latinos, and families that would qualify for the federal free and reduced lunch program.
These communities weren’t just targeted more—in some cases, they were targeted relentlessly. Crimes were predicted every day, sometimes multiple times a day, sometimes in multiple locations in the same neighborhood: thousands upon thousands of crime predictions over years. A few neighborhoods in our data were the subject of more than 11,000 predictions.
The software often recommended daily patrols in and around public and subsidized housing, targeting the poorest of the poor.
“Communities with troubled relationships with police—this is not what they need,” said Jay Stanley, a senior policy analyst at the ACLU Speech, Privacy, and Technology Project. “They need resources to fill basic social needs.”
Continue reading: https://gizmodo.com/crime-prediction-software-promised-to-be-free-of-biases-1848138977?mc_cid=1e441f5787&mc_eid=9e56404948
 
