Predictive Policing Software Falls Flat: A Case Study of the Plainfield, NJ Debacle

Published on 14 October 2023.

Introduction:

In recent years, predictive policing software has garnered significant attention for its potential to revolutionize law enforcement. However, a recent investigation into Geolitica’s crime prediction software, formerly known as PredPol, revealed alarming inaccuracies and raised critical questions about the efficacy of such technology. In this blog post, we delve into the findings of the investigation, shedding light on the shortcomings of Geolitica’s software and the broader implications for predictive policing.

The Geolitica Debacle in Plainfield, NJ: An analysis conducted by The Markup revealed startling results concerning Geolitica’s crime prediction software in Plainfield, New Jersey. The software, which generated daily crime predictions based on historical incident reports, had a success rate of less than 1%. Out of 23,631 predictions analyzed, fewer than 100 aligned with actual reported crimes. These dismal results called into question the accuracy and effectiveness of Geolitica’s algorithm.
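
The Markup’s headline statistic is, at bottom, a hit rate: a prediction counts as a success only if a matching crime was actually reported where and when the software said it would occur. Below is a minimal sketch of that arithmetic, assuming a simplified matching rule and hypothetical field names (`date`, `box_id`, `type`); it is illustrative only, not The Markup’s actual methodology.

```python
# Minimal sketch of the hit-rate arithmetic behind the reported figure.
# The matching rule and field names here are simplified assumptions,
# not The Markup's exact methodology.

def hit_rate(predictions, reported_crimes):
    """Fraction of predictions for which a crime of the same type was
    reported in the same location box on the same date."""
    crime_keys = {(c["date"], c["box_id"], c["type"]) for c in reported_crimes}
    hits = sum(
        1 for p in predictions
        if (p["date"], p["box_id"], p["type"]) in crime_keys
    )
    return hits / len(predictions) if predictions else 0.0

# Order of magnitude reported for Plainfield: fewer than 100 matches
# out of 23,631 predictions, i.e. roughly 100 / 23,631, or about 0.4%.
print(f"{100 / 23_631:.2%}")  # ~0.42%
```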

Issues Highlighted in the Investigation:

  1. Abysmal Success Rates: Geolitica’s predictions rarely matched reported crimes, indicating a success rate of less than half a percent. Specific crime categories, such as robberies and burglaries, exhibited even lower success rates, raising concerns about the software’s reliability.
  2. Lack of Police Utilization: Despite the software’s implementation, Plainfield officials stated that they never used the system to direct patrols. The low number of arrests related to Geolitica’s predictions further undermined the software’s effectiveness in aiding law enforcement efforts.
  3. Over-Predictions vs. Actual Crimes: Geolitica’s system generated a massive number of predictions in Plainfield compared to the relatively small number of actual reported crimes. This imbalance suggested that the software’s predictions might not have been accurate reflections of the city’s crime patterns.
  4. Potential Bias and Negative Consequences: Predictive policing systems, when inaccurate, can perpetuate biases and produce harmful consequences. Sending law enforcement officers to locations where crimes are wrongly predicted can heighten distress in the communities singled out, potentially exacerbating the very problems such systems are meant to solve.

Looking Beyond Predictive Policing: The Plainfield case underscores the limitations of predictive policing and raises questions about its utility in law enforcement. Experts argue that focusing solely on policing might not be the solution. Instead, a more holistic approach that addresses the root causes of crime, engages community members, and invests in social programs could be more effective in ensuring public safety.

Conclusion:

Geolitica’s failed experiment in Plainfield serves as a stark reminder of the challenges associated with predictive policing software. As we move forward, it is imperative to critically evaluate the role of technology in law enforcement, considering its potential biases and societal impacts. A more nuanced and community-focused approach might offer a more sustainable and equitable path toward ensuring the safety and well-being of our communities.

Check out Wired for the full story: https://www.wired.com/story/plainfield-geolitica-crime-predictions/
