AI-inspired policing doesn’t make streets any safer – report


The use cases for artificial intelligence have proved far-reaching to date, though a new report from the Rand Corporation claims the technology does not work effectively in policing.

In a scenario reminiscent of the Tom Cruise film Minority Report, the City of Chicago has been testing whether predictive analytics can help address the city's high gun-crime rate. Using various metrics and algorithms, the city's police force builds a Strategic Subjects List of individuals judged likely to be shot, to be involved in a shooting or to pull the trigger themselves. According to the team at Rand, however, those on the list are no more or less likely to become victims of a homicide or shooting than those in a control group.

“It is not clear how the predictions should be used in the field,” the team said in the report. “One potential reason why being placed on the list resulted in an increased chance of being arrested for a shooting is that some officers may have used the list as leads to closing shooting cases. The results provide for a discussion about the future of individual-based predictive policing programs.”

The Strategic Subjects List consisted of 426 people deemed to be at the highest risk of gun violence. These individuals were then referred to the relevant police department for what is described as preventive intervention, though what this means in practice is unclear. As with many terms used by governments around the world, the definition is vague and open to interpretation, which could set a dangerous precedent in an area as sensitive as this.
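To make the idea concrete, the sketch below shows how an individual-based list of this kind might be cut in principle: score each person against a set of features, rank everyone, and keep a fixed number of "highest risk" entries. The features, weights and names here are entirely invented for illustration and bear no relation to the actual model behind Chicago's Strategic Subjects List.

```python
# Illustrative sketch only: a toy risk-scoring approach in the spirit of
# individual-based predictive policing. All features and weights are
# invented for demonstration purposes.
from dataclasses import dataclass

@dataclass
class Person:
    person_id: int
    prior_arrests: int             # hypothetical feature
    prior_shooting_victim: bool    # hypothetical feature
    gang_affiliated: bool          # hypothetical feature

def risk_score(p: Person) -> float:
    # Invented linear weights, purely for illustration.
    return (0.5 * p.prior_arrests
            + 2.0 * p.prior_shooting_victim
            + 1.5 * p.gang_affiliated)

def strategic_subjects(people: list[Person], list_size: int) -> list[Person]:
    # Rank everyone by score and keep the top `list_size` individuals,
    # mirroring how a fixed-length "highest risk" list might be drawn up.
    ranked = sorted(people, key=risk_score, reverse=True)
    return ranked[:list_size]

people = [
    Person(1, 4, True, False),
    Person(2, 0, False, False),
    Person(3, 7, False, True),
]
top = strategic_subjects(people, 2)
print([p.person_id for p in top])  # → [3, 1]
```

The key point the Rand report raises is downstream of this step: even if such a ranking can be produced, it says nothing about what officers should actually do with the names on it.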

Predictive policing is an area the National Institute of Justice (NIJ) has been investigating for some time, and it has contributed roughly $2 million in grants to the Chicago Police Department to fund the project in question. The team had hoped a successful launch would see such data-driven policing techniques rolled out across the United States, though the Rand Corporation's report is unlikely to help the concept grow.

Artificial intelligence and advanced analytics do have widespread use cases, though incorporating them into police work creates a moral dilemma. Profiling techniques are nothing new in policing, but artificial intelligence adds a speed that was not present before. Profiling has itself come under criticism in recent years, and layering artificial intelligence onto such a sensitive practice is unlikely to have a positive impact on the debate.

Another area brought into the limelight is the intentions of the individuals on the list. A computer algorithm can predict what an individual might do, but there is no guarantee those actions will ever be taken. This is another moral dilemma, as it could put officers in contact with individuals unnecessarily, creating an issue out of nothing.

Artificial intelligence is likely to be one of the most prominent facets of the technology industry over the next few years, though its introduction into police work seems a little premature for the moment.
