In a new book, a sociologist who spent months embedded with the LAPD details how data-driven policing techwashes bias.
THE KILLING OF George Floyd last May sparked renewed scrutiny of data-driven policing. As protests raged around the world, 1,400 researchers signed an open letter calling on their colleagues to stop collaborating with police on algorithms, and cities like Santa Cruz, New Orleans, and Oakland variously banned predictive policing, facial recognition, and voice recognition. But elsewhere, police chiefs worked to deepen partnerships with tech companies, claiming that the answer to systemic bias and racism was simply more data.
In her new book, “Predict and Surveil: Data, Discretion, and the Future of Policing,” sociologist Sarah Brayne slays that assumption with granular detail.
#policebrutality #blacklivesmatter #racism #police #governmentreform
I took a class called "Crime Trends and Patterns" at the master's level back in 2009.
It was clear that there was no way to prevent a biased person from making the data lie.
And the data that designates "high crime" or "drug trafficking" areas can itself be an artifact of police deployment patterns. E.g., racist police deployment generates the statistics that justify more racist police deployment.
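That feedback loop can be sketched with a toy simulation (all numbers are hypothetical illustrations, not data from the book or the class): two neighborhoods have identical true crime rates, recorded crime rises faster than linearly with patrol presence (more officers means more stops and more recorded incidents), and each year's patrols are allocated in proportion to last year's recorded crime.

```python
# Toy feedback-loop simulation: two neighborhoods with IDENTICAL true
# crime rates. Recorded crime grows super-linearly with patrol share,
# and next year's deployment follows the recorded "data".
# All constants are hypothetical, chosen only to illustrate the loop.

TRUE_RATE = 10.0      # same underlying crime in both neighborhoods
EXPONENT = 1.2        # recording grows faster than linearly with patrols

patrols = [0.6, 0.4]  # initial deployment is only slightly biased

for year in range(10):
    # More patrols -> disproportionately more recorded incidents.
    recorded = [TRUE_RATE * share ** EXPONENT for share in patrols]
    total = sum(recorded)
    # The "data-driven" step: deploy where the recorded numbers are.
    patrols = [r / total for r in recorded]

print([round(p, 3) for p in patrols])
# After a decade, the slight initial bias has snowballed: the first
# neighborhood draws over 90% of patrols despite equal true crime.
```

With a purely linear recording model the initial 60/40 bias would persist unchanged; any super-linearity in how presence converts to recorded incidents makes it amplify instead.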