April 15, 2021

Wyden, Democrats Question DOJ Funding of Unproven Predictive Policing Technology

Wyden, Clarke, Warren, Markey, Merkley, Padilla, Warnock and Jackson Lee Ask whether DOJ Ensures Programs Are Effective, Tests for Bias and Requires Transparency About Use of Predictive Policing Systems

Washington, D.C. – U.S. Senator Ron Wyden, D-Ore., joined by seven other members of Congress, asked the Justice Department to account for how it funds and oversees so-called predictive policing programs, particularly whether the programs actually reduce crime and whether they risk amplifying biased results that harm marginalized groups.

Reps. Yvette D. Clarke, D-N.Y., and Sheila Jackson Lee, D-Texas, and Sens. Elizabeth Warren, D-Mass., Edward J. Markey, D-Mass., Jeff Merkley, D-Ore., Alex Padilla, D-Calif., and Raphael Warnock, D-Ga., joined the letter to DOJ.

Predictive policing systems use algorithms to recommend where to deploy law-enforcement resources, or to identify individuals who are allegedly high-risk to commit crimes, based on historical data. However, multiple audits of these systems have found no evidence they are effective at preventing crime.

“We ask DOJ to help ensure that any predictive policing algorithms in use are fully documented, subjected to ongoing, independent audits by experts, and made to provide a system of due process for those impacted,” the members wrote. “If DOJ cannot ensure this, DOJ should halt any funding it is providing to develop and deploy these unproven tools.”

Experts have warned that using policing algorithms based on flawed law-enforcement data can reinforce discriminatory practices that harm marginalized groups, without improving public safety. 

Read the full letter here.

The members asked DOJ to answer the following questions by May 28:

  1. Has DOJ analyzed the extent to which the use of these technologies complies with the Civil Rights Act of 1964 or other relevant civil rights laws? If so, what was the result of that analysis?
  2. Please provide a detailed annual accounting of all federal funding distributed by DOJ and its related agencies in support of activities related to developing and implementing predictive policing algorithms (including for pilots and research) at federal, state, and local levels for Fiscal Years 2010-2020. Please indicate all relevant federal accounts, program activity names, and sources of funding including, but not limited to, funding from the Strategies for Policing Innovation (SPI) program and the Edward Byrne Memorial Justice Assistance Grant Program.
  3. Please also provide a detailed annual accounting of all federal funding distributed by DOJ and its related agencies in support of data collection, data linkage systems (or so-called “data fusion centers”), and databases that are used to run, develop, or test predictive policing algorithms for Fiscal Years 2010-2020. Please indicate all relevant federal accounts, program activity names, and sources of funding.
  4. Please name each jurisdiction currently or previously operating predictive policing algorithms funded, in part or in whole, by the DOJ.
    1. What specific systems or software tools are or were being used at each of these sites?
    2. Over what period of time are or were these projects or technologies funded?
    3. What specific types of technologies (such as simple formulas, predictive analytics, and machine learning) are being used by these algorithms?
  5. Does the DOJ require predictive policing projects and technologies purchased with federal funds, in part or in whole, to:
    1. Be tested for efficacy, validity, reliability, and bias generally and in relation to protected classes such as, but not limited to, race, ethnicity, and sex prior to deployment by federal, state or local agencies? If not, why not?
    2. Require vendors to submit validation studies or audits during the procurement process?
    3. Be audited for efficacy, validity, reliability, and bias generally and in relation to protected classes such as, but not limited to, race, ethnicity, and sex on both an ongoing and retrospective basis (either by the state and local agencies, the DOJ, or external researchers)?
    4. Include training for any operators of predictive policing tools regarding the limitations of these technologies and their legal obligations?
    5. Be reviewed retrospectively by police departments to assess efficacy, reliability, bias, and legal concerns regarding use?
  6. Does the DOJ require agencies and departments using these tools to perform cost-benefit analyses prior to and after the completion of these projects?
  7. Does the DOJ require agencies and departments using these tools to allow individuals to challenge a police decision that is based on the output of these tools?
  8. Has the DOJ ever stopped funding or otherwise supporting the use of predictive or automated policing due to concerns with the efficacy or impact of the program? If so, how was that decision made?
  9. Has the DOJ evaluated the nationwide impact of predictive policing on people in protected classes? If yes, what was learned from these evaluations?
  10. Does the DOJ provide guidance to agencies and departments using these tools on best practices for data sharing, legal discovery and evidentiary obligations?