New York City Passes Bill to Study Biases in Algorithms Used by the City

The bill is believed to be the first in the country to push for open sourcing of the algorithms used by courts, police, and city agencies.

New York City is taking the first steps to address algorithmic bias in city services. The City Council passed a bill last week that will require the city to address bias in algorithms used by the New York Police Department, the city’s courts, and dozens of city agencies.

The bill, which Mayor de Blasio could sign before the end of the month, would create a task force to figure out how to test city algorithms for bias, how citizens can request explanations of algorithmic decisions when they don’t like the outcome, and whether it’s feasible to make the source code used by city agencies publicly available, according to Council member James Vacca’s office.

The bill, Intro 1696-A, is a smaller step than advocates wanted. An earlier version of the bill, proposed by Vacca of the Bronx, would have mandated that all agencies that perform algorithmic decision-making—from policing to public school assignments—make their code publicly available. The passed version of the bill requires the task force to lead the “development and implementation of a procedure that may be used by the city to determine whether an agency automated decision system disproportionately impacts persons based upon age, race, creed, color, religion, national origin, gender, disability, marital status, partnership status, caregiver status, sexual orientation, alienage or citizenship status.”
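The bill does not spell out what such a procedure would look like. One common starting point in fairness auditing, offered here purely as an illustration rather than anything the legislation prescribes, is a disparate-impact check: compare the rate of favorable outcomes across groups and flag large gaps, as the “four-fifths rule” from US employment law does. A minimal sketch in Python, with invented data:

```python
from collections import defaultdict

def disparate_impact(records, group_key, favorable_key):
    """Return each group's favorable-outcome rate and its ratio to the
    best-off group; the "four-fifths rule" flags ratios below 0.8."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[group_key]] += 1
        favorable[r[group_key]] += bool(r[favorable_key])
    rates = {g: favorable[g] / totals[g] for g in totals}
    best = max(rates.values())
    return {g: (rate, rate / best) for g, rate in rates.items()}

# Hypothetical decisions from an automated system, for illustration only.
decisions = [
    {"group": "A", "released": True}, {"group": "A", "released": True},
    {"group": "A", "released": True}, {"group": "A", "released": False},
    {"group": "B", "released": True}, {"group": "B", "released": False},
    {"group": "B", "released": False}, {"group": "B", "released": False},
]
for group, (rate, ratio) in disparate_impact(decisions, "group", "released").items():
    flag = "  <- below four-fifths threshold" if ratio < 0.8 else ""
    print(f"group {group}: rate={rate:.2f}, ratio={ratio:.2f}{flag}")
```

A real procedure would also have to decide which outcomes count as favorable and which gaps are significant, which is presumably part of what the task force is for.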

Criminal justice reformers and civil liberties groups have long pointed out that despite claims of objectivity, these algorithms reproduce existing biases, disproportionately targeting people by class, race, and gender. A 2016 ProPublica investigation found, for instance, that a risk assessment tool was more likely to falsely label black defendants as future criminals than white defendants. And studies have found facial recognition algorithms were less accurate for black and female faces.
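ProPublica’s central finding concerned error rates rather than overall outcomes: among defendants who did not go on to reoffend, black defendants were far more likely to have been labeled high-risk. A sketch of that false-positive-rate comparison, again with invented records rather than the real data:

```python
def false_positive_rate(rows, group):
    # Among people in `group` who did NOT reoffend, the share the tool
    # nonetheless labeled high-risk.
    innocent = [r for r in rows if r["group"] == group and not r["reoffended"]]
    return sum(r["high_risk"] for r in innocent) / len(innocent)

# Invented records for illustration only.
rows = (
    [{"group": "black", "reoffended": False, "high_risk": h} for h in (1, 1, 0, 0)]
    + [{"group": "white", "reoffended": False, "high_risk": h} for h in (1, 0, 0, 0)]
)
for g in ("black", "white"):
    print(g, false_positive_rate(rows, g))  # 0.5 vs 0.25 in this toy data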

Advocates who support the bill in its current form still hope that the city will eventually make the algorithms open-source.

“Having it open for anyone is the best approach,” Rashida Richardson, a lawyer with the New York Civil Liberties Union, told me. But she allowed that while a task force is not ideal, it’s the best way for the city to deal with the issue.

“We don’t want a city government bulldozing through an issue like this,” Richardson said. “There may not be a one-size-fits-all model for transparency.”

An audit of NYC’s algorithms could result in criminal justice reform, and a broad group of civic programmers, public defenders, and police reformers supported the bill at an October 16 hearing.

Advocates hope the measure could shine a light on the NYPD’s controversial predictive policing program, on which the police department plans to spend $45 million over five years, according to documents released by the city.

Critics of predictive policing—which uses statistics to determine where cops should spend time on their beat—say it reinforces existing biases and brings cops back to already over-policed neighborhoods.
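The mechanics behind that critique are simple enough to simulate: if patrols go wherever past recorded incidents are highest, and crime is only recorded where officers are present, an initial gap in the data compounds on itself. A toy model of the feedback loop the critics describe, not of any vendor’s actual system:

```python
import random

random.seed(0)
true_rate = [0.3, 0.3]   # two neighborhoods with identical underlying crime
recorded = [5, 4]        # neighborhood 0 starts with slightly more records

for day in range(200):
    # "Predictive" step: patrol wherever the historical count is highest.
    patrolled = 0 if recorded[0] >= recorded[1] else 1
    # Crime is only recorded where officers are present to observe it.
    if random.random() < true_rate[patrolled]:
        recorded[patrolled] += 1

print(recorded)  # neighborhood 0 absorbs every new record
```

Every new record lands in neighborhood 0, so the predictions look ever more confident even though both neighborhoods are identical by construction.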

Rachel Levinson-Waldman, a senior counsel at the Brennan Center, testified at the October hearing that she filed a Freedom of Information request asking for documents about the NYPD’s use of predictive policing and only received them after suing the city. And the NYPD still refused to disclose the source code for the predictive policing program, claiming it would help criminals evade the cops.

“For anyone familiar with this technology, it is a ridiculous assertion,” Joshua Norkin, a lawyer with the Legal Aid Society, told me of the NYPD’s claim that opening up these algorithms would help criminals. He says the program is potentially racially biased, and there is no reason to shield its workings from public scrutiny. The hope is that a task force created by this bill could begin to address these issues.

Also under scrutiny is an algorithm used by the city’s Criminal Justice Agency, which exists to reduce pretrial detention. The agency uses a tool to estimate a defendant’s flight risk, or the likelihood that they’ll fail to appear for a court date. Judges then receive recommendations about whether a defendant should be released on bail or not released at all.

Many of the criteria are proxies for poverty, Norkin and others say: they include, for example, whether a defendant has a working telephone, whether they live with a parent or spouse, and whether they’re employed or in a training program.
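CJA does not publish its formula, so purely to make the proxy objection concrete, here is a hypothetical point-based checklist of the kind Norkin describes, in which every item tracks economic stability rather than anything about court attendance itself:

```python
# Hypothetical point-based flight-risk checklist; NOT CJA's actual tool.
# Each criterion is one of the proxies Norkin describes: a defendant
# scores lower for being poor, not for any history of missing court.
def flight_risk_points(defendant):
    points = 0
    points += defendant.get("has_working_phone", False)
    points += defendant.get("lives_with_parent_or_spouse", False)
    points += defendant.get("employed_or_in_training", False)
    return points  # higher = "safer"; a release recommendation sits on top

print(flight_risk_points({
    "has_working_phone": False,
    "lives_with_parent_or_spouse": True,
    "employed_or_in_training": False,
}))  # 1: the same defendant with a phone and a job would score 3
```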

Norkin says that the city’s Criminal Justice Agency has worked with the Legal Aid Society on testing the formulas used for flight risk and other programs. But the city could still use a more permanent method of auditing these algorithms before they’re deployed.

One obstacle to making these algorithms public is the interests of private companies with lobbying power.

Charlie Moffett, a graduate student at NYU, worked on the issue of algorithmic bias as a researcher for San Francisco’s Chief Data Officer. He said the city frequently did not fight language in contracts with third-party vendors that kept algorithms hidden from the public. The result is proprietary algorithms that cities are legally unable to release for public scrutiny.

Some of the pushback against the “open source” version of Vacca’s bill came from Tech:NYC, a trade group that lobbies city government on behalf of many of NYC’s tech companies, including Airbnb, Google, and AOL, as well as hundreds of small startups.

Taline Sanassarian, policy director of Tech:NYC, testified at an October hearing that the group was concerned about the release of proprietary algorithms, suggesting that this could have a “chilling effect” that would discourage companies from working with the city.

The suggestion raised eyebrows among advocates, including some programmers.

Sumana Harihareswara, an independent startup consultant and programmer, said that Tech:NYC did not represent her or the programmers she knew. “I personally wouldn’t feel a chilling effect,” she told me, “and I don’t think the people I know in the programming space would feel it.” Norkin said the city should protect its citizens, not corporate interests.

“The idea that there’s some obligation on behalf of the government who represents the people to protect proprietary algorithms is totally ridiculous,” Norkin told me. “There is absolutely nothing that obligates the city to look out for the interests of a private company looking to make a buck on a proprietary algorithm.”

Tech:NYC executive director Julie Samuels said in a statement that the organization is happy with the passed version of the bill: “We’re glad that the final version of this legislation creates an opportunity for all stakeholders to weigh in to help craft something that serves everyone.”

Source: Motherboard