The New Orleans City Council passed a new law on Thursday that regulates certain parts of the city’s surveillance system and places an outright ban on specific pieces of surveillance technology, including facial recognition software and predictive policing. The ordinance, which had been repeatedly delayed over the past year, passed a month after The Lens reported that the NOPD has been using facial recognition software despite years of denials.
The council has been working on the ordinance for most of the year. Over that time, it was drastically watered down from the initial draft, which was produced in partnership between Councilman Jason Williams and the local grassroots advocacy group Eye on Surveillance.
The original ordinance would have created comprehensive approval processes, oversight protocols and regular reporting requirements for every part of the city’s sprawling surveillance system. Those were removed from the final version.
Even so, the ordinance that passed Thursday places New Orleans on a short list of American cities with a blanket ban on the use of facial recognition, which has been shown to misidentify Black, brown and Indigenous people at a far higher rate than white people. And the ordinance creates a new chapter in the City Code, entitled “Surveillance Technology and Data Protection,” which will set the foundation for future surveillance regulations.
The ordinance, for example, establishes a formal definition for “surveillance technology” for the first time.
“Eye on Surveillance is proud of this important win for the people of New Orleans, and looks forward to working with neighbors and residents to explore evidence-based options for public safety,” a statement from the group said. “We will also continue to fight for further City action to address components of our original proposal that were not included in the final ordinance.”
The ordinance as passed puts outright bans on four pieces of technology — facial recognition, characteristic recognition and tracking software, predictive policing and cell-site simulators. A ban on license plate readers in the original ordinance was ultimately scrapped.
“As always, NOPD will work within the parameters set by the elected leadership of the City of New Orleans,” said a statement from NOPD spokesman Gary Scheets.
Characteristic tracking software is defined in the ordinance as “any software or system capable of tracking people and/or objects based on characteristics such as color, size, shape, age, height, speed, path, clothing, accessories, vehicle make or model, or any other trait that can be used for tracking purposes, including BriefCam and similar software.”
BriefCam is a piece of software that New Orleans used for years. It could search thousands of hours of footage nearly instantaneously to find people and vehicles that matched certain descriptions. The city claims that it stopped using the software in late 2019.
Predictive policing is defined as “the use of predictive analytics software in law enforcement to predict information or trends about criminality, including but not limited to the perpetrator(s), victim(s), locations or frequency of future crime. It does not include, for example, software used to collect or display historic crime statistics for informational purposes.”
The city has had several pieces of software over the years that, critics have argued, fit that description. One high-profile case was software from Palantir, a CIA-funded Silicon Valley company. Palantir provided the city, free of charge, with software that claimed to be able to predict which New Orleans residents were at the highest risk of committing a gun crime.
The Palantir software left the city in 2018. But other city-owned software and programs, including Motorola CommandCentral Predictive and Mayor LaToya Cantrell’s high-risk resident ID program, have had similar capabilities. It’s unclear whether either of those falls under the ordinance’s definition, or whether any other pieces of software could violate the new law.
Cell-site simulators, or stingrays, are devices that masquerade as cell towers that can intercept and mine data from all phone calls in the area. The city claimed this summer that it didn’t use the technology.
Facial recognition was the main topic of discussion on Thursday, likely a result of the recent revelation that the NOPD has been accessing facial recognition through partnerships with the FBI and the Louisiana State Police, even though the city has denied using the technology for years.
Just this week, the NOPD released dozens of pages of emails related to facial recognition in response to a public records request from the ACLU of Louisiana. The city had originally responded to the request in early November by saying “The Police Department does not employ facial recognition software.”
But then, after it was revealed that the city did in fact have access to the technology — not on its own but through its state and federal partners — the ACLU resubmitted its request. This time, the city said the request was overly burdensome and would be too difficult to compile. It did, however, release some limited records showing NOPD detectives making requests to the state to identify people with facial recognition.
The emails showed a formalized system for NOPD detectives to send requests to the state to utilize facial recognition and other surveillance techniques.
On Thursday, NOPD Superintendent Shaun Ferguson reiterated that the department didn’t have any existing policy for the use of facial recognition, but that it was working on one. He also indicated that he too was caught off guard by the facial recognition news.
“I had no idea we even had access to this, so it’s nothing we ever tried to hide at any point in time,” he said. “I have since learned that our department did not have a policy with regards to using this tool.”
Ferguson said that facial recognition was simply “a tool in the toolbox” that wasn’t used as the sole justification to make an arrest. He likened it to when the police issue a photo to local media or to Crimestoppers, arguing that the same level of error and bias in the facial recognition software could be found among eyewitness accounts.
“We have had instances where individuals, our citizens, have looked at photographs, like still prints that have been placed in the media, or have been disseminated through Crimestoppers, and it turned out they identified an individual who had absolutely nothing to do with the crime. So I really do not see the difference in using this particular tool.”
Ferguson, along with some council members, said they wanted to defer the ordinance to give the NOPD time to put together a departmental facial recognition policy. But others, including Williams and Councilwoman Helena Moreno, pushed back, saying that if anything, the technology should be banned for now, either until a comprehensive policy can be created or until the technology proves more reliable and less biased.
Marvin Arnold, an organizer with Eye on Surveillance, argued that there are inherent problems with facial recognition that are more deeply ingrained than simple technical bugs that can be worked out in a year.
“If you look at government sources, at independent think tanks, we know that this technology is unreliable and doesn’t work. And it’s not an esoteric technical reason. It’s deep-rooted systemic inequities that create these technologies,” he said. “So why can’t facial recognition identify dark-skinned people as well as light-skinned people? Well, if all the computer programmers who develop the software are white and light-skinned, they’re going to pick a bunch of light-skinned data. And these kinds of subtle implementation details are not something that’s ever going to be fixed as long as racism is around. So talking about waiting until this is fixed is just missing the point.”
He also argued that the focus shouldn’t just be on the flaws of the technology, but on the lack of transparency and accountability the city has displayed.
“What the conversation today should be about is how the NOPD has been lying to us for years,” Arnold said. “The city lied to the council months ago. We’ve been under the impression there’s been no facial recognition, and that’s been a lie. That’s the story.”
Williams saw similar problems.
“Right now we’re talking about using a flawed technology on our private citizens that we’ve learned is already in place in the City of New Orleans and used against citizens,” he said. “When all the details were discovered, it highlights the point that because we the council have not put up guard rails, there are no checks. There is nothing to control how it’s used or how we are informed of its use.”
The ordinance was passed with a 6-1 vote, with Councilman Jared Brossett as the sole dissenting voice. The law will go into effect at the end of the month. How the law will be enforced, and how it will affect current city operations, remains to be seen. But it appears that the law should immediately stop the practice of the NOPD accessing facial recognition services from state and federal partners.
On the four pieces of banned technology, the ordinance says that no city official or entity can “obtain, retain, possess, access, sell, or use any prohibited surveillance technology or information derived from a prohibited surveillance technology.”
Later in the ordinance, however, it says that the city can still utilize evidence derived from facial recognition or characteristic tracking software “so long as such evidence was not generated by, with the knowledge of, or at the request of the City or any City official.”
The ordinance also mandates that any department that uses surveillance technology must designate a “Data Protection Officer” to ensure compliance with the new rules.
Some parts of the ordinance are completely untested. One section regarding “automated decision systems” and artificial intelligence says that “wherever decisions are made based on the identity of an individual, rather than based on patterns on the general population, such as air traffic control, individuals must have the option to opt out of automated decisions.”
It’s unclear what automated decisions, or decisions based on artificial intelligence, the city is making. It’s also unclear how an individual would go about trying to opt out of them.
While the ordinance isn’t as comprehensive as the original, and while some enforcement questions remain outstanding, Williams lauded it as a first step toward the city regulating a surveillance system that already exists and continues to grow by the day.
“It’s here,” Williams said. “The technology is here before there’s laws guiding the technology. And I think it’s a very dangerous position for communities to be in.”