The NYPD is using a new pattern recognition system to help solve crimes
The New York City Police Department is using a new software system called Patternizr, which helps officers search through “hundreds of thousands” of case files, according to a report in The Washington Post.
The report says that the software was developed in-house, and allows analysts to search across a wide range of files to look for patterns or similar crimes. Previously, they would have had to go through physical files. In one example, officers used the system to connect two crimes — a man who used a syringe to steal a drill at two different Home Depot locations in New York City. Rebecca Shutt, the crime analyst who solved the case, explained to the Post that the system “brought back complaints from other precincts that I wouldn’t have known.”
This isn’t a Minority Report-like system that seeks to predict where crimes will occur, nor is it a system that uses AI to parse through CCTV footage. Rather, it’s a system that searches through the NYPD’s databases for patterns, allowing detectives to draw from a much wider pool of data in the course of an investigation. The system can bring in additional sources of information from across the NYPD, surfacing patterns in crimes that might otherwise have gone unnoticed because they occurred elsewhere.
The NYPD says the department rolled out the software in 2016, but first revealed its existence in an issue of the INFORMS Journal on Applied Analytics. According to NYPD assistant commissioner of data analytics Evan Levine and former director of analytics Alex Chohlas-Wood, the department spent two years developing the software, and the pair claim that the NYPD is the first department in the US to use such a system.
Chohlas-Wood and Levine tell the Post that they used 10 years of previously identified patterns to train the system, and in testing, it “accurately re-created old crime patterns one-third of the time and returned parts of patterns 80 percent of the time.” The Post says that the system doesn’t take a suspect’s race into account in the course of its search, as a precaution against racial bias.
The result is a tool that reduces some of the work required of investigators, partially automating a process that has until now been done manually.