Is AI the Solution to Removing Systemic Bias from Police Investigations?

You can’t read a single tech blog without coming across the magical acronym “AI,” which stands for Artificial Intelligence, a rising technology trend that is pitched as the solution for everything. With the rise of Alexa and Google Assistant speakers in the home, and virtual assistants on every gadget from our phones to our cars, AI seems to be everywhere: it can speak to you, answer questions, and predict what content should be surfaced to you, what you’ll most likely want to buy, and where you’re most likely headed based on the time of day. By enabling quicker, more data-driven decisions, artificial intelligence can now outperform humans on certain tasks. Could it be the solution to removing systemic bias from police investigations?

In this week’s reading, Gruman, Schneider, and Coutts show us how social psychological theory has played a significant role in “identifying possible sources of bias and error that occurs during police investigations, and in developing procedures to increase the accuracy and integrity of the work of police officers.” Ultimately, the inequalities and biases that exist in the real world, and that have unfortunately become ingrained in our thinking through everyday exposure to them, are ever present in the police investigation setting as well. This systemic bias can invalidate investigations and can result in individuals being wrongfully convicted of crimes they did not commit. The U.S. Attorney General pulled together a panel of experts to develop Eyewitness Evidence: A Guide for Law Enforcement, which provides police officers with a set of national guidelines for the collection and preservation of eyewitness evidence in criminal investigations, yet only 56% of police agencies have changed one or more of their policies since the guide’s publication (Police Executive Research Forum, 2013). If the guidance is available but police agencies fail to apply it, perhaps the solution is AI: what if artificial intelligence could predict where and when crime will take place based on historical data? Research shows that “offenders criminalize familiar areas, and there are detectable patterns associated with the times and locations of their crimes” (Friend, 2013).

A team of researchers from the AI Now Institute investigated 13 police jurisdictions in the United States that had begun using predictive policing technology. Unfortunately, the studies do not favor this technology. AI uses historical data to build a model with which it predicts future trends and decisions. At least 9 of the 13 jurisdictions in the study appeared to “have used police data generated during periods when the department was found to have engaged in various forms of unlawful and biased police practices” (Greene, 2019). In other words, systemic bias is being baked into this new technology. Artificial intelligence is not magical, and it has no awareness: what you feed the system is what you get out of it. If there is bias in the data being fed in, there will be bias in the decisions the system delivers.
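This feedback loop can be sketched in a few lines of Python. The neighborhoods, numbers, and the toy “model” below are entirely hypothetical, not drawn from any real predictive policing system; the point is only to illustrate how skewed historical records reproduce themselves:

```python
# Hypothetical sketch: a naive "predictive" model that allocates patrols in
# proportion to historical arrest counts. If past records reflect patrol
# intensity rather than true crime rates, the skew reproduces itself.

# Assume the true underlying crime rate is identical in both neighborhoods.
true_crime_rate = {"A": 10, "B": 10}

# Historical patrols were skewed toward A, so more crime was *recorded*
# there even though the actual rates were equal.
patrol_share = {"A": 0.8, "B": 0.2}
recorded = {n: true_crime_rate[n] * patrol_share[n] for n in true_crime_rate}

for step in range(3):
    total = sum(recorded.values())
    # The "model": send next period's patrols where past records are highest.
    patrol_share = {n: recorded[n] / total for n in recorded}
    # New records again reflect where officers were sent, not true rates.
    recorded = {n: true_crime_rate[n] * patrol_share[n] for n in recorded}

print(patrol_share)  # the 80/20 skew persists despite equal true crime rates
```

Even after several rounds of “learning,” the model never discovers that the two neighborhoods have equal crime rates, because its only evidence is data that the biased patrol pattern itself generated.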

What is the solution? I believe it is not one that can be fixed simply by putting a computer or a new technology trend in place of human judgment. As Dr. Martin Luther King Jr. said, “injustice anywhere is a threat to justice everywhere.” Every single person in this world is part of the solution. We need to take active measures at every stage of life, starting with the biases we learn in school, in our textbooks, in our college education, and in our public policies, in order to train every person out of this systemic bias. When someone becomes a police officer, they do not learn systemic bias in the police station; unfortunately, they have been raised surrounded by these biases, and so the biases are ingrained in them. In addition to mandating that police departments apply the published eyewitness guide, we also need to prepare future police officers from an early age to recognize and resist this systemic bias. We are all a component of this problem, and we all need to take active steps to prevent systemic bias from existing anywhere, but especially in our criminal justice system. There will always be crimes committed and there will always be criminals, but it is in our best interest to ensure that we convict the right individuals, do not tamper with evidence, do not perform illegal or biased investigations, and get offenders the help they need by placing them in an unbiased criminal justice system where they can be supported, rehabilitated, and, if appropriate, re-enter our society properly.

 

References:

Friend, Zach. “Predictive Policing: Using Technology to Reduce Crime.” FBI Law Enforcement Bulletin, 9 Apr. 2013, leb.fbi.gov/articles/featured-articles/predictive-policing-using-technology-to-reduce-crime.

Greene, Tristan. “Predictive Policing Is a Scam That Perpetuates Systemic Bias.” The Next Web, 22 Feb. 2019, thenextweb.com/artificial-intelligence/2019/02/21/predictive-policing-is-a-scam-that-perpetuates-systemic-bias/.

Gruman, J. A., Schneider, F. W., & Coutts, L. M. (Eds.) (2012). Applied Social Psychology: Understanding and Addressing Social and Practical Problems (2nd ed.). Thousand Oaks, CA: Sage Publications.

1 comment

  1. Thanks for sharing this really interesting intersection of how technology can potentially help with removing bias from police investigations.

    Thinking more about the sociological perspective, I’m curious whether individuals’ norms are at play here when first generating an opinion or bias about a particular group of people. Having read more into the development of personality, it appears to be rather malleable over the course of our lifetime, whereas IQ is rather fixed. Thinking more about how to address this, we’ve seen more and more police officers attend ethics training to help mitigate any inherent bias they might have (James, T., 2017).

    As someone who works in technology, I’ve been close to the development of artificial intelligence and machine learning over the past couple of years. The point you make about the seed data used to train the AI model is most certainly the issue at hand. It makes me think a better approach would be to take the “most just” police departments across the country or world and use their data to train these models. Certainly, though, these models will not be able to account for every unique case, and human intervention will most likely have to occur. I do feel that artificial intelligence can help cut down on wrongful convictions and fines, but it will most likely not be fully developed for another five or so years. I believe this because of the difficulty of training the models and my personal experience with artificial intelligence.
    Your summary is great — we all need to do better at addressing our own biases. Whether it is training after these biases have formed or being more aware of how we raise our children, it will all help. Developing interventions in our schools, workplaces, and homes will help mitigate these concerns and hopefully spread throughout the fabric of our society.

    References –
    James, T. (2017, December 23). Can Cops Unlearn Their Unconscious Biases? Retrieved from https://www.theatlantic.com/politics/archive/2017/12/implicit-bias-training-salt-lake/548996/
