The NYPD confirmed in March that it has been using machine-learning software since December 2016 to sift through police records, find patterns, and stitch together similar crimes. Called Patternizr, the algorithm performs the thankless job of combing through tens of thousands of case files from 77 precincts across the five boroughs. In theory, this frees police officers to do more actual policing.

J. Brian Charles filed this report on the NYPD’s Patternizr for Governing:

Patternizr automates much of that process. The algorithm scours all reports within NYPD’s database, looking at certain aspects — such as method of entry, weapons used and the distance between incidents — and then ranks them with a similarity score. A human data analyst then determines which complaints should be grouped together and presents those to detectives to help winnow their investigations.
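To make the mechanics concrete, here is a minimal sketch of that kind of pairwise similarity scoring. Everything in it is an illustrative assumption rather than the NYPD's actual model: the attribute names, the hand-tuned weights, and the 5 km distance cutoff are all invented for the example. The real Patternizr, per the NYPD's published description, learns how to weigh attributes from a decade of manually identified crime patterns instead of using fixed weights like these.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Complaint:
    id: str
    method_of_entry: str   # e.g. "forced door", "window"
    weapon: str            # e.g. "none", "knife"
    lat: float
    lon: float

def haversine_km(a: Complaint, b: Complaint) -> float:
    """Great-circle distance between two incidents, in kilometers."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = (sin(dlat / 2) ** 2
         + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))

# Illustrative weights, not the NYPD's: a production system would learn
# these from historical, analyst-labeled crime patterns.
WEIGHTS = {"method_of_entry": 0.4, "weapon": 0.3, "distance": 0.3}

def similarity(seed: Complaint, other: Complaint) -> float:
    """Score how alike two complaints are, from 0 (unrelated) to 1 (near-identical)."""
    entry_match = 1.0 if seed.method_of_entry == other.method_of_entry else 0.0
    weapon_match = 1.0 if seed.weapon == other.weapon else 0.0
    # Closer incidents score higher; past ~5 km the spatial signal fades to 0.
    proximity = max(0.0, 1.0 - haversine_km(seed, other) / 5.0)
    return (WEIGHTS["method_of_entry"] * entry_match
            + WEIGHTS["weapon"] * weapon_match
            + WEIGHTS["distance"] * proximity)

def rank_candidates(seed: Complaint, database: list[Complaint], top_n: int = 10):
    """Return the top_n complaints most similar to the seed complaint."""
    scored = [(similarity(seed, c), c) for c in database if c.id != seed.id]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored[:top_n]
```

Note that the sketch stops where the article says the real workflow does: the ranked list is not a verdict, just a shortlist handed to a human analyst who decides which complaints actually belong together.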

On average, more than 600 complaints per week are run through Patternizr. The program is not designed to track certain crimes, including rapes and homicides. In the short term, the department is using the technology to track petty larcenies.

The NYPD used 10 years of manually collected historical crime data to develop Patternizr and teach it to detect patterns. In 2017, the department hired 100 civilian analysts to use the software. While the technology was developed in-house, the software is not proprietary, and because the NYPD published the algorithm, “other police departments could take the information we’ve laid out and build their own tailored version of Patternizr,” says Evan Levine, the NYPD’s assistant commissioner of data analytics.

Since the existence of the software was made public, some civil liberties advocates have voiced concerns that a machine-based tool may unintentionally reinforce biases in policing.

“The institution of policing in America is systemically biased against communities of color,” New York Civil Liberties Union legal director Christopher Dunn told Fast Company. “Any predictive policing platform runs the risks of perpetuating disparities because of the over-policing of communities of color that will inform their inputs. To ensure fairness, the NYPD should be transparent about the technologies it deploys and allow independent researchers to audit these systems before they are tested on New Yorkers.”