On Tuesday, New York City released a 33-page report on city agencies’ use of algorithms to make decisions about government services and programs. The same day, Mayor Bill de Blasio announced a new position within city government, a third-party officer who will guide, oversee, and report on the use of decision-making algorithms going forward.

Experts have long raised concerns over New York’s unmonitored use of algorithms, noting that even seemingly neutral tools are only as unbiased as the data they rely on.

But de Blasio’s new oversight role carves out a broad exemption protecting the NYPD from oversight. And at least one member of the task force that created the report says its work was hindered by an uncooperative city government.

City agencies will not be required to report information to the new oversight officer “that would interfere with a law enforcement investigation or other investigative activity by an agency or would compromise public safety,” according to de Blasio’s executive order creating the role.

“The language is pretty broad. So it actually pretty much gives the NYPD a broad basis to oppose having to cooperate with the officer,” said Ángel S. Díaz, Liberty & National Security Counsel at the Brennan Center for Justice. “And really, the NYPD’s opaque and unaccountable use of algorithms is one of the biggest reasons why we need a third party to oversee the city’s use of algorithms. So it’s a real missed opportunity to have any kind of teeth to what this officer is going to be tasked with doing.”

In a statement, Public Advocate Jumaane Williams and City Council Member Brad Lander criticized the exemption, stating that while they welcome the creation of an algorithms officer, law enforcement should not be excluded from oversight. “Decades of racially discriminatory policing have left us with troves of biased data that continue to inform today’s predictive policing algorithms, creating feedback loops that recreate harmful overpolicing in communities of color,” they noted.

City agencies use automated decision systems to aid in child welfare assessments, school assignments, fire department funding allocation, landlord investigations, disease surveillance, immigration decisions, and a variety of other services and activities, according to the AI Now Institute at NYU. The NYPD uses a host of technologies, including a gang database, license plate readers, facial recognition, and predictive policing software.

Díaz said the Brennan Center supports the POST Act, city council legislation that would require the NYPD to disclose which surveillance technologies it uses.

AI Now co-founder Meredith Whittaker was one of 16 stakeholders, advocates, and experts tasked by the city council to put together the report published this week. The council was moved to establish the task force after a 2016 ProPublica investigation exposed racial bias in an algorithm widely used to predict recidivism in criminal sentencing.

But Whittaker noted on Twitter that the government did not provide task force members with the information they needed to conduct real oversight or establish meaningful recommendations. She said members were never even given specifics on all the tools city agencies use, despite repeated requests for this basic information.

“The report was written by the city. They had the pen the whole time,” Whittaker told Gothamist.

The recommendations in the resulting report are broad, such as suggesting that the city develop resources to guide the use of decision-making algorithms and establish formal ways of reporting on their use.

In a statement, a City Hall spokesperson told Gothamist that “while not everyone agreed on every issue, after 18 months the final report reflected consensus on key, actionable recommendations for the City.”

Regarding the new algorithms officer, the city stated that “the language regarding public safety in the Executive Order is very similar to that in Local Law 49 of 2018, passed overwhelmingly by the City Council, creating the Automated Decisions System Task Force. The City is accepting the Task Force’s recommendation to create a central body within City government to work with all agencies, including the NYPD, to develop best practices around their use of algorithms. No agency is exempt, but rather specific information that would harm public safety and security of New Yorkers.”

The executive order states that “personnel and other resources” for this officer will be provided within 90 days, an “algorithms advisory committee” will be established within 120 days, and that the officer will publish the first report on December 1, 2020.

Díaz said the investigations exemption to the oversight role “shows a lack of leadership from our elected officials to hold the NYPD accountable.”

“As a country, we’re becoming aware of the big problems of relying on supposedly neutral algorithms, and the way that we can automate an unequal future,” he said. “It really is inexcusable for New York City to close its eyes to that reality.”