Predictive policing in the United States

In the United States, the practice of predictive policing has been implemented by police departments in several states, including California, Washington, South Carolina, Alabama, Arizona, Tennessee, New York, and Illinois. [1] [2] Predictive policing refers to the use of mathematical models, predictive analytics, and other analytical techniques in law enforcement to identify potential criminal activity. [3] [4] Predictive policing methods fall into four general categories: methods for predicting crimes, methods for predicting offenders, methods for predicting perpetrators' identities, and methods for predicting victims of crime. [5]

In the United States, the technology has been described in the media as a revolutionary innovation capable of "stopping crime before it starts". [6] However, a RAND Corporation report on implementing predictive policing technology describes its role in more modest terms:

Predictive policing methods are not a crystal ball: they cannot foretell the future. They can only identify people and locations at increased risk of crime ... the most effective predictive policing approaches are elements of larger proactive strategies that build strong relationships between police departments and their communities to solve crime problems. [5]

In November 2011, TIME Magazine named predictive policing as one of the 50 best inventions of 2011, using the term "pre-emptive policing". [7]

Methodology

Predictive policing uses data on the times, locations, and nature of past crimes to provide insight to police strategists about where, and at what times, patrols should be deployed or a police presence maintained, in order to make the best use of resources and maximize the chance of deterring or preventing future crimes. This type of policing detects signals and patterns in crime reports to anticipate whether crime will spike, when a shooting may occur, where the next car will be broken into, and who the next crime victim will be. These predictions are generated by algorithms that draw on large amounts of analyzable data. [8] Algorithms make the process faster and more efficient, since they can quickly weigh many variables to produce an automated prediction. The predictions an algorithm generates are then coupled with a prevention strategy, which typically sends an officer to the predicted time and place of the crime. [9] Proponents argue that automated prediction is more accurate and efficient than relying on officers' instincts alone, because decisions are backed by data. By using information from predictive policing, police can anticipate the concerns of communities, allocate resources to particular times and places, and prevent victimization. [10]
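
The workflow described above can be illustrated with a minimal, hypothetical sketch in Python: past incidents are aggregated into spatial grid cells, recent incidents are weighted more heavily, and the highest-scoring cells are suggested for patrol. The cell size, decay rate, and scoring below are illustrative assumptions, not any department's or vendor's actual method.

```python
from collections import defaultdict
from datetime import datetime
from math import exp, floor

# Hypothetical grid-based crime forecasting: aggregate past incidents into
# spatial cells, weight recent events more heavily, and rank cells so patrols
# can be directed to the highest-scoring areas. The 0.005-degree (~500 m)
# cell size and the decay rate are illustrative assumptions only.

CELL_DEG = 0.005          # grid cell size in degrees
DECAY_PER_DAY = 0.05      # how quickly old incidents lose influence

def cell_of(lat: float, lon: float) -> tuple[int, int]:
    """Map a coordinate to a discrete grid cell."""
    return (floor(lat / CELL_DEG), floor(lon / CELL_DEG))

def rank_hotspots(incidents, now: datetime, top_k: int = 10):
    """Score each cell by exponentially time-decayed incident counts."""
    scores = defaultdict(float)
    for lat, lon, when in incidents:
        age_days = (now - when).total_seconds() / 86400
        scores[cell_of(lat, lon)] += exp(-DECAY_PER_DAY * age_days)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

# Example: two recent burglaries in the same cell outrank one older incident.
incidents = [
    (34.0522, -118.2437, datetime(2013, 6, 1)),
    (34.0525, -118.2433, datetime(2013, 6, 28)),
    (34.0524, -118.2435, datetime(2013, 6, 29)),
]
print(rank_hotspots(incidents, now=datetime(2013, 7, 1)))
```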

Police may also use data accumulated on shootings and the sounds of gunfire to identify the locations of shootings. The city of Chicago blends data from population mapping, crime statistics, and weather to improve monitoring and identify patterns. [11] PredPol, founded in 2012 by a UCLA professor, is one of the market leaders among predictive policing software companies. [12] Its algorithm is built on the near-repeat model, which infers that if a crime occurs in a specific location, the surrounding properties and land are at increased risk of subsequent crime. The algorithm takes into account crime type, crime location, and the date and time of the crime in order to predict future crime occurrences. [12] Another software program used for predictive policing is Operation LASER, which was deployed in Los Angeles in an attempt to reduce gun violence. [13] LASER was discontinued in 2019 for several reasons, most notably inconsistencies in how individuals were labeled. [14] Other police departments have likewise discontinued such programs, citing the racial biases and ineffective methods associated with them. [15] While the idea behind predictive policing is helpful in some respects, it carries the potential to technologically reproduce social biases and thereby reinforce pre-existing patterns of inequality. [16]
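
The near-repeat model underlying software like PredPol can be sketched, in rough and hypothetical form, as a risk score to which each past crime contributes an influence that decays with distance and elapsed time. The kernel shape and parameter values below are assumptions for illustration; they do not reproduce PredPol's actual patented algorithm.

```python
from math import exp

# Illustrative near-repeat scoring: each past crime raises the estimated risk
# of nearby locations for a short time afterward, with influence decaying in
# both space and time. Parameter values are assumptions for illustration.

SPACE_SCALE_M = 200.0   # spatial reach of a triggering event, in meters
TIME_SCALE_D = 14.0     # temporal reach, in days

def near_repeat_risk(location_m, day, past_crimes):
    """Sum decaying contributions from past crimes of the same type.

    location_m: (x, y) in meters; day: current day index;
    past_crimes: iterable of ((x, y), day) for prior incidents.
    """
    risk = 0.0
    for (px, py), pday in past_crimes:
        if pday >= day:
            continue  # only earlier crimes can trigger repeats
        dist = ((location_m[0] - px) ** 2 + (location_m[1] - py) ** 2) ** 0.5
        risk += exp(-dist / SPACE_SCALE_M) * exp(-(day - pday) / TIME_SCALE_D)
    return risk

# A burglary yesterday 50 m away raises today's risk far more than one
# three weeks ago on the other side of town.
crimes = [((50.0, 0.0), 9), ((5000.0, 3000.0), -11)]
print(near_repeat_risk((0.0, 0.0), day=10, past_crimes=crimes))
```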

The models used are typically not built on any direct assumptions about the data or what might cause crime. The intent is to remove human judgement, and the opportunity for bias that comes with it, from the equation. However, bias within a model may be unavoidable if the data used to build it are themselves biased, since predictive models can only replicate patterns found in existing data. [17] Furthermore, while many models avoid using race, gender, location, or other sensitive and potentially biasing variables, it is extremely difficult to eliminate all proxies for such variables, because they are correlated with much of the other data available to law enforcement that the models use. [18]
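
A small synthetic simulation can make the proxy problem concrete. In the hypothetical sketch below, the model never sees the sensitive attribute ("group"), but a correlated stand-in ("neighborhood") lets the same disparity re-emerge; all names, rates, and data are fabricated purely for illustration.

```python
import random

# Hypothetical simulation of proxy bias: the model never uses the sensitive
# attribute "group", but a correlated feature ("neighborhood") stands in for
# it. All data here are synthetic and purely illustrative.

random.seed(0)

def make_person():
    group = random.choice(["A", "B"])
    # Residential segregation: group strongly predicts neighborhood.
    neighborhood = 1 if (group == "A") == (random.random() < 0.9) else 0
    # Historical arrest records are skewed toward neighborhood 1,
    # even if underlying offending were identical across groups.
    arrested = random.random() < (0.3 if neighborhood == 1 else 0.1)
    return group, neighborhood, arrested

people = [make_person() for _ in range(100_000)]

# "Model": predicted risk = observed arrest rate of the person's neighborhood.
rate = {
    n: sum(a for _, nb, a in people if nb == n)
       / sum(1 for _, nb, _ in people if nb == n)
    for n in (0, 1)
}

for g in ("A", "B"):
    preds = [rate[nb] for grp, nb, _ in people if grp == g]
    print(g, round(sum(preds) / len(preds), 3))
# Group A receives a much higher average predicted risk even though the
# model never uses group membership directly.
```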

History

Attempts to predict crime within police departments can be traced back to work conducted by the Chicago School of Sociology on parole recidivism in the 1920s. The sociologist Ernest Burgess used that research to craft the actuarial approach, which identifies and weights factors that correlate with future offending. The approach soon spread to other parts of the justice system, leading to prediction instruments such as the Rapid Risk Assessment for Sexual Offense Recidivism (RRASOR) and the Violence Risk Appraisal Guide (VRAG). [19]

In 2008, Police Chief William Bratton at the Los Angeles Police Department began working with the acting directors of the Bureau of Justice Assistance and the National Institute of Justice to explore the concept of predictive policing in crime prevention. [20] In 2010, researchers proposed that it was possible to predict certain crimes, much like scientists forecast earthquake aftershocks. [1]

In 2009, the National Institute of Justice held its first predictive policing symposium. At the event, Kristina Rose, its acting director, said that the Shreveport, Los Angeles, D.C. Metropolitan, New York, Chicago, and Boston police departments were interested in implementing a predictive policing program. [21] Predictive policing programs are now used by police departments in several U.S. states, including California, Washington, South Carolina, Arizona, Tennessee, New York, and Illinois. [1] [2]

Beginning in 2012, the New Orleans Police Department (NOPD) engaged in a secretive collaboration with Palantir Technologies in the field of predictive policing. [22] According to James Carville, he was the impetus for the project, and "[n]o one in New Orleans even knows about this". [22]

In 2020, the Fourth Circuit Court of Appeals handed down a decision finding that predictive policing amounted to nothing more than reinforcement of a racist status quo. The court also held that granting the government an exigent-circumstances exemption in the case would be a broad rebuke of the landmark Terry v. Ohio decision, which set the standard for when stops and searches are lawful. [23] Predictive policing, typically applied to so-called "high-crime areas", "relies on biased input to make biased decisions about where police should focus their proactive efforts", [24] and without it police are still able to fight crime adequately in minority communities. [25]

Effectiveness

The effectiveness of predictive policing has been tested through multiple studies with varying findings. In 2015, the New York Times published an article that analyzed predictive policing's effectiveness, citing numerous studies and explaining their results. [26]

A study conducted by the RAND Corporation found no statistical evidence that crime was reduced when predictive policing was implemented. The study notes that prediction is only half of effective predictive policing; carefully executed human action is the other half. Both prediction and execution depend heavily on the reliability of the input data: if the data are unreliable, the effectiveness of predictive policing can be disputed. [27]

Another study, conducted by the Los Angeles Police Department in 2010, found the accuracy of predictive models to be twice that of its existing practices. [1] In Santa Cruz, California, the implementation of predictive policing over a six-month period resulted in a 19 percent drop in the number of burglaries. [1] In Kent, England, 8.5 percent of all street crime occurred in locations predicted by PredPol, compared with 5 percent for locations predicted by police analysts. [28]

An evaluation by the Max Planck Institute for Foreign and International Criminal Law of a three-year pilot of the Precobs (pre-crime observation system) software [29] concluded that no definite statements could be made about the software's efficacy. The pilot project was set to enter a second phase in 2018. [30]

According to the RAND Corporation study, the quality of data used for predictive policing can be severely compromised by data censoring, systematic bias, and lack of relevance. Data censoring occurs when crime in certain areas is omitted from the data. Systematic bias can result when collected data show a certain number of crimes but do not adequately record when the crimes took place. Relevance refers to the usefulness of the data that drive predictive policing. [27]

Such deficiencies have been documented to cause ineffective and discriminatory policing. One data analysis, reported as the "Disproportionate Risks of Driving While Black", showed that black drivers were significantly more likely than white drivers to be stopped and searched while driving. [31] These biases can be fed into the algorithms used for predictive policing and lead to higher levels of racial profiling and disproportionate arrests.

According to the RAND study, the effectiveness of predictive policing depends on input data that are high in both quality and quantity; without sufficient data, predictive policing produces inaccurate and counterproductive results. The study also notes that predictive policing has been inaccurately heralded as the "end of crime", when in fact its effectiveness depends fundamentally on the tangible action taken in response to predictions. [27]

A 2014 report on risk assessment models used to help determine conditions of parole found that the assessments were effective at reducing rates of recidivism. It argued that banning such models would not solve the problem of racial disparities in the criminal justice system but would merely shift the issue back to biased human decision-making. [32]

A 2013 report on predictive policing found that much simpler models relying on basic crime statistics have often performed comparably to more complex models, without the drawback of being difficult to interpret and evaluate, making them potentially a more reliable and trustworthy alternative. [33]

Independent evaluations of predictive policing experiments in Chicago, Illinois, and Shreveport, Louisiana found that neither program had a statistically significant impact on crime. [34] [35] The Chicago experiment was, however, found to increase the arrest rate for targeted individuals, even though they were no more likely to be involved in crime. [35] The Shreveport experiment was found to reduce law enforcement spending by six to ten percent compared to groups not in the program, and some officers reported improved community relations as a result. [35]

Hot spot policing strategy

A particular method of predictive policing called hot spot policing has been found to reduce crime. [36] Evidence compiled by the National Institute of Justice shows that this method has decreased the frequency of violent offenses and drug and alcohol offenses, among others. [37] However, without careful execution and sufficient data, the method can perpetuate implicit bias and racial profiling.

Criticisms

Criticisms of predictive policing often focus on ethical concerns: the opacity of complex algorithms limits the ability to assess their fairness, the data sources used to create the models may be biased, and individuals have constitutional rights to due process. [17] Many algorithms used by law enforcement are purchased from private companies that keep the details of their workings hidden as trade secrets, limiting the public's ability to evaluate potential bias in the predictive models. [17] Additionally, some critics regard predicting the locations and individuals associated with crime as fundamentally unconstitutional, arguing that it is contrary to the principle that everyone is presumed innocent until proven guilty. [17]

A coalition of civil rights groups, including the American Civil Liberties Union (ACLU) and the Electronic Frontier Foundation, issued a statement criticizing the tendency of predictive policing to perpetuate racial profiling. [38] The ACLU's Ezekiel Edwards argues that such software is more accurate at predicting policing practices than at predicting crimes. [39]

Some recent research is also critical of predictive policing. Kristian Lum and William Isaac examined the consequences of training such systems with biased datasets in "To predict and serve?". [40] Saunders, Hunt, and Hollywood demonstrated that, in practice, the effect of the predictions verged on negligible. [41]

In a comparison of predictive policing methods and their pitfalls, Logan Koepke concludes that predictive policing is not yet the future of policing but "just the policing status quo, cast in a new name". [42]

In testimony to the NYC Automated Decision Systems Task Force, Janai Nelson, of the NAACP Legal Defense and Educational Fund, urged NYC to ban the use of data derived from discriminatory or biased enforcement policies. She also called for NYC to commit to full transparency on how the NYPD uses automated decision systems and how those systems operate. [43]

According to an article in Significance, 'the algorithms were behaving exactly as expected – they reproduced the patterns in the data used to train them' and that 'even the best machine learning algorithms trained on police data will reproduce the patterns and unknown biases in police data'. [44]

In 2020, following protests against police brutality, a group of mathematicians published a letter in Notices of the American Mathematical Society urging colleagues to stop work on predictive policing. Over 1,500 other mathematicians joined the proposed boycott. [45]

Some applications of predictive policing have disproportionately targeted minority neighborhoods and lack feedback loops that would allow erroneous predictions to be corrected. [46]

Cities throughout the United States are enacting legislation to restrict the use of predictive policing technologies and other "invasive" intelligence-gathering techniques within their jurisdictions.

Following the introduction of predictive policing as a crime-reduction strategy, using the results of an algorithm produced by PredPol's software, the city of Santa Cruz, California experienced a decline in burglaries of almost 20 percent in the program's first six months. Nevertheless, in late June 2020, in the aftermath of the murder of George Floyd in Minneapolis, Minnesota and amid growing calls for police accountability, the Santa Cruz City Council voted for a complete ban on the use of predictive policing technology. [47]

Accompanying the ban on predictive policing was a similar prohibition of facial recognition technology. Facial recognition technology has been criticized for its reduced accuracy on darker skin tones, which can contribute to cases of mistaken identity and, potentially, wrongful convictions. [48]

In 2019, Michael Oliver of Detroit, Michigan was wrongfully accused of larceny when DataWorks Plus software registered his face as a "match" to the suspect in a video taken by the victim of the alleged crime. Oliver spent months in court arguing his innocence; once the judge supervising the case viewed the video footage, it was clear that Oliver was not the perpetrator. In fact, the perpetrator and Oliver did not resemble each other at all, except that both are African American, a group for which facial recognition technology is more likely to make identification errors. [49]

With regard to predictive policing technology, the mayor of Santa Cruz, Justin Cummings, was quoted as saying, "this is something that targets people who are like me", referencing the patterns of racial bias and discrimination that predictive policing can perpetuate rather than stop. [50]

As Dorothy Roberts explains in her Harvard Law Review article "Digitizing the Carceral State", the data entered into predictive policing algorithms to predict where crimes will occur, or who is likely to commit them, tend to contain information shaped by racism. For example, including arrest or incarceration history, neighborhood of residence, level of education, membership in gangs or organized crime groups, and 911 call records, among other features, can produce algorithms that direct over-policing toward minority or low-income communities. [51]

A 2014 report argues that the principle of using past behavior to assess future risk is itself fair, but that the records actually used are not representative of past behavior. For example, historical rates of marijuana use are generally consistent across racial lines, yet there are significant disparities in arrest rates for marijuana possession offenses, indicating unequal enforcement by police that has left minority groups with significantly more criminal records. When one group is overrepresented in historical arrest data, any model trained on that data will be biased toward considering members of that group at higher risk of committing crimes in the future. [32]
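
The mechanism the report describes can be shown with a toy, hedged calculation: if two groups engage in an offense at the same true rate but are arrested for it at different rates, a model trained on arrest records reproduces the enforcement gap as a "risk" gap. All numbers below are invented for illustration.

```python
# Toy illustration of label bias from unequal enforcement. Both groups have
# the same true rate of marijuana use, but enforcement differs; a model
# trained on arrest records inherits the enforcement disparity as "risk".
# All numbers are invented for illustration.

TRUE_USE_RATE = 0.12                               # identical actual behavior
ENFORCEMENT = {"group_1": 0.05, "group_2": 0.20}   # P(arrest | use) per group

for group, p_arrest_given_use in ENFORCEMENT.items():
    observed_arrest_rate = TRUE_USE_RATE * p_arrest_given_use
    print(f"{group}: true use {TRUE_USE_RATE:.2%}, "
          f"observed arrest rate {observed_arrest_rate:.2%}")

# A model trained on these arrests sees group_2 as 4x riskier than group_1,
# a ratio driven entirely by enforcement, not behavior.
```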

In response to citizens' privacy concerns about governmental monitoring and automated surveillance by law enforcement, Maine passed a law in 2021 prohibiting government use of facial recognition in most cases, providing exceptions only for a limited set of serious situations such as identifying missing persons. [52]

The NYPD's Patternizr model was created to streamline the work of crime analysts and investigators in identifying strings of crimes that are related to one another and potentially committed by a single perpetrator. The NYPD argues that the model's ability to rapidly detect crime patterns has led to the quick and correct identification of several serial offenders. Critics argue that the program is unfair to citizens, rests on unproven social science, and could lead to false confessions and the imprisonment of innocent individuals who are flagged and feel they have no choice but to accept a plea deal. [18]

References

  1. Friend, Zach. "Predictive Policing: Using Technology to Reduce Crime". FBI Law Enforcement Bulletin. Federal Bureau of Investigation. Retrieved 8 February 2018.
  2. Levine, E. S.; Tisch, Jessica; Tasso, Anthony; Joy, Michael (February 2017). "The New York City Police Department's Domain Awareness System". Interfaces. 47 (1): 70–84. doi:10.1287/inte.2016.0860.
  3. Rienks, R. (2015). "Predictive Policing: Taking a Chance for a Safer Future".
  4. Benbouzid, Bilel (2019). "To predict and to manage. Predictive policing in the United States". Big Data & Society. 6 (1). doi:10.1177/2053951719861703.
  5. Perry, Walter L.; McInnis, Brian; Price, Carter C.; Smith, Susan; Hollywood, John S. (25 September 2013). The Role of Crime Forecasting in Law Enforcement Operations.
  6. Rubin, Joel (21 August 2010). "Stopping crime before it starts". The Los Angeles Times. Retrieved 19 December 2013.
  7. "The 50 Best Inventions". Time. 28 November 2011. Retrieved 19 December 2013.
  8. "Predictive Policing Explained". Brennan Center for Justice. 1 April 2020. Retrieved 19 November 2020.
  9. National Academies of Sciences, Engineering, and Medicine (9 November 2017). Proactive Policing: Effects on Crime and Communities. National Academies Press. ISBN 978-0-309-46713-1.
  10. National Academies of Sciences, Engineering, and Medicine (9 November 2017). Weisburd, David; Majimundar, Malay K. (eds.). Proactive Policing: Effects on Crime and Communities. doi:10.17226/24928. ISBN 978-0-309-46713-1. S2CID 158608420.
  11. "Violent crime is down in Chicago". The Economist. 5 May 2018. Retrieved 31 May 2018.
  12. "Predict Prevent Crime | Predictive Policing Software". PredPol. Retrieved 19 November 2020.
  13. "NCJRS Abstract". National Criminal Justice Reference Service. www.ncjrs.gov. Retrieved 19 November 2020.
  14. "LAPD ends another data-driven crime program touted to target violent offenders". Los Angeles Times. 12 April 2019. Retrieved 19 November 2020.
  15. Winston, Ali (26 April 2018). "A pioneer in predictive policing is starting a troubling new project". The Verge. Retrieved 19 November 2020.
  16. Brayne, Sarah (2017). "Big Data Surveillance: The Case of Policing". American Sociological Review. 82 (5): 977–1008. doi:10.1177/0003122417725865. PMC 10846878. PMID 38322733. S2CID 3609838.
  17. Shapiro, Aaron (January 2017). "Reform predictive policing". Nature. 541 (7638): 458–460. Bibcode:2017Natur.541..458S. doi:10.1038/541458a. ISSN 0028-0836. PMID 28128275. S2CID 4466696.
  18. Griffard, Molly. "A Bias-Free Predictive Policing Tool? An Evaluation of the NYPD's Patternizr". Fordham Urban Law Journal. 47 (1): 43–.
  19. Ferguson, Andrew G. "Policing Predictive Policing". Retrieved 17 November 2020.
  20. Perry, Walter L. (2013). Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations. RAND Corporation. p. 4. ISBN 978-0833081551.
  21. Rose, Kristina (19 November 2009). "Predictive Policing Symposium: Opening Remarks" (PDF). Los Angeles, CA. p. 16. Retrieved 13 November 2020.
  22. Winston, Ali (27 February 2018). "Palantir has secretly been using New Orleans to test its predictive policing technology". The Verge. Retrieved 23 April 2020.
  23. Cushing, Tim (27 July 2020). "Appeals Court Bashes Predictive Policing And The Judge Who Argued People In High Crime Areas Want Fewer Rights". Techdirt.
  24. "Appeals Court Bashes Predictive Policing And The Judge Who Argued People In High Crime Areas Want Fewer Rights". Techdirt.
  25. https://assets.documentcloud.org/documents/6997575/predpol4thCirc.pdf, pp. 31–32, 33–37.
  26. Patel, Faiza (18 November 2015). "Be Cautious About Data-Driven Policing". The New York Times.
  27. Perry, Walter L.; McInnis, Brian; Price, Carter C.; Smith, Susan; Hollywood, John S. (25 September 2013). "Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations".
  28. "Don't even think about it". The Economist. 20 July 2013. Retrieved 20 December 2013.
  29. "IfmPt - Institut für musterbasierte Prognosetechnik". www.ifmpt.com (in German).
  30. "Predictive Policing". www.mpicc.de. Max Planck Institute for Foreign and International Criminal Law.
  31. LaFraniere, Sharon; Lehren, Andrew W. (24 October 2015). "The Disproportionate Risks of Driving While Black". The New York Times. ISSN 0362-4331. Retrieved 20 November 2020.
  32. Breaux, Justin (11 August 2014). "Could Risk Assessment Contribute to Racial Disparity in the Justice System?". Urban Institute.
  33. Perry, Walter; McInnis, Brian; Price, Carter; Smith, Susan; Hollywood, John (2013). Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations. doi:10.7249/rr233. ISBN 9780833081483. S2CID 169831855.
  34. Hunt, Priscillia; Saunders, Jessica; Hollywood, John S. (1 July 2014). "Evaluation of the Shreveport Predictive Policing Experiment".
  35. Saunders, Jessica; Hunt, Priscillia; Hollywood, John S. (September 2016). "Predictions put into practice: a quasi-experimental evaluation of Chicago's predictive policing pilot". Journal of Experimental Criminology. 12 (3): 347–371. doi:10.1007/s11292-016-9272-0. ISSN 1573-3750. S2CID 255129540.
  36. "5 Things You Need to Know About Hot Spots Policing & The "Koper Curve" Theory". National Police Foundation. 30 June 2015. Retrieved 20 November 2020.
  37. "Hot Spot Policing Can Reduce Crime". National Institute of Justice. Retrieved 20 November 2020.
  38. "Statement of Concern About Predictive Policing by ACLU and 16 Civil Rights Privacy, Racial Justice, and Technology Organizations". American Civil Liberties Union.
  39. "Predictive Policing Software Is More Accurate at Predicting Policing Than Predicting Crime". American Civil Liberties Union. 31 August 2016.
  40. Lum, Kristian; Isaac, William (October 2016). "To predict and serve?". Significance. 13 (5): 14–19. doi:10.1111/j.1740-9713.2016.00960.x.
  41. Saunders, Jessica; Hunt, Priscillia; Hollywood, John S. (12 August 2016). "Predictions put into practice: a quasi-experimental evaluation of Chicago's predictive policing pilot". Journal of Experimental Criminology. 12 (3): 347–371. doi:10.1007/s11292-016-9272-0. S2CID 152275830.
  42. Koepke, Logan (21 November 2016). "Predictive Policing Isn't About the Future". Slate.
  43. Nelson, Janai. "Testimony of Janai Nelson" (PDF). Archived from the original (PDF) on 8 June 2019. Retrieved 3 June 2022.
  44. Lum, Kristian; Isaac, William (October 2016). "To predict and serve?". Significance. 13 (5): 14–19. doi:10.1111/j.1740-9713.2016.00960.x.
  45. Linder, Courtney (20 July 2020). "Why Hundreds of Mathematicians Are Boycotting Predictive Policing". Popular Mechanics. Retrieved 3 June 2022.
  46. "Where in the World is AI? Responsible & Unethical AI Examples". map.ai-global.org.
  47. Sturgill, Kristi (26 June 2020). "Santa Cruz becomes the first U.S. city to ban predictive policing". Los Angeles Times. Retrieved 3 June 2022.
  48. Simonite, Tom (22 July 2019). "The Best Algorithms Struggle to Recognize Black Faces Equally". Wired.
  49. Cushing, Tim (14 July 2020). "Detroit PD Now Linked To Two Bogus Arrests Stemming From Facial Recognition False Positives". Techdirt. Retrieved 3 June 2022.
  50. Asher-Schapiro, Avi (17 June 2020). "In a U.S. first, California city set to ban predictive policing". Reuters. Retrieved 3 June 2022.
  51. Roberts, Dorothy (10 April 2019). "Digitizing the Carceral State". Harvard Law Review. 132.
  52. Lee, Nicol; Chin, Caitlin (12 April 2022). "Police Surveillance and Facial Recognition: Why Data Privacy Is Imperative for Communities of Color". Brookings.