Washington, DC, is the seat of the most powerful government on earth. It’s also home to 690,000 people — and 29 obscure algorithms that define their lives. City agencies use automation to screen housing applicants, predict recidivism, detect food aid fraud, determine whether a high school student is likely to drop out, inform sentencing decisions for young people, and more.
This snapshot of semi-automated urban life comes from a new report by the Electronic Privacy Information Center (EPIC). The nonprofit spent 14 months investigating the city’s use of algorithms and found them in use at 20 agencies, with more than a third deployed in policing or criminal justice. For many systems, city agencies do not disclose complete information about how their technology works or how it has been used. The project team concluded that the city is likely using still more algorithms that it was unable to uncover.
The findings are notable beyond DC because they add to the evidence that many cities have quietly put bureaucratic algorithms to work in their departments, where they can help make decisions that affect citizens’ lives.
Government agencies often turn to automation hoping to add efficiency or objectivity to bureaucratic processes, but citizens often find it difficult to learn what these systems are doing, and some have been found to discriminate and to produce decisions that upend lives. In Michigan, an unemployment fraud detection algorithm falsely accused roughly 40,000 people of fraud, with an error rate of 93 percent. A 2020 analysis by Stanford University and New York University found that nearly half of federal agencies use some form of automated decision-making system.
EPIC took a deep dive into one city’s use of algorithms to shed light on how they can affect residents’ lives and to encourage people elsewhere to undertake similar reviews. Ben Winters, who leads the nonprofit’s work on AI and human rights, says Washington was chosen in part because roughly half of the city’s residents identify as Black.
“More often than not, automated decision-making systems have a disproportionate impact on Black communities,” Winters says. The project found evidence that automated traffic cameras are disproportionately placed in neighborhoods with larger Black populations.
Cities with significant Black populations have recently played a central role in campaigns against municipal algorithms, particularly in policing. Detroit became an epicenter of the facial recognition debate after the false arrests of Robert Williams and Michael Oliver, two men whom algorithms misidentified. In 2015, Baltimore’s deployment of facial recognition after the death of Freddie Gray in police custody prompted one of the first congressional investigations into law enforcement’s use of the technology.
To surface the algorithms, EPIC combed public disclosures by city agencies and filed public records requests seeking contracts, data-sharing agreements, privacy impact assessments, and other information. Six of the 12 city agencies it queried responded, sharing documents such as a $295,000 contract with Thomson Reuters-owned Pondera Systems, whose fraud detection software, FraudCaster, is used to screen food aid applicants. Earlier this year, California officials found that more than half of the 1.1 million claims that Pondera’s software had flagged as suspicious were in fact legitimate.