Earlier this month, Mayor LaToya Cantrell’s administration revealed some details of a plan to expand the city’s video surveillance system with SafeCam Platinum — a program that will give the city access to live footage from potentially thousands of privately owned cameras.
But the power of the city’s surveillance network doesn’t just come from the sheer number of cameras. Its effectiveness is largely derived from the potent software behind the footage.
In August 2017, three months before then-Mayor Mitch Landrieu announced the opening of the city’s $5 million Real Time Crime Monitoring Center, the city spent $2.8 million on a suite of software from Motorola Solutions that uses artificial intelligence and object detection to help law enforcement sift through the thousands of hours of footage recorded every day.
The package includes software called BriefCam, CommandCentral Aware, CommandCentral Analytics, and CommandCentral Predictive.
“My initial reaction is, holy cow, this is not just big brother. This is colossal brother,” said Bruce Hamilton, a staff attorney at the ACLU of Louisiana. “When you take a step back, you really get the sense that the surveillance state is rapidly expanding here in New Orleans.”
According to the administrator of the monitoring center, Ross Bourgeois, the two pieces of software used most on a day-to-day basis are BriefCam and CommandCentral Aware. BriefCam, a Massachusetts firm founded a decade ago in Israel, got the attention of American law enforcement in 2013 when its software was credited with helping to identify the Boston Marathon bombers.
A BriefCam press release in June announced its partnership with New Orleans’ office of Homeland Security, which operates the city’s monitoring center, and described the software:
“BriefCam’s breakthrough technology detects, tracks, extracts and identifies people and objects from video, including; men, women, children, clothing, bags, vehicles, animals, size, color, speed, path, direction, dwell time, and more.”
Bourgeois told The Lens that the process is akin to “putting tabs in a textbook.” But it’s a bit more advanced than that. As live footage filters through the software, moving objects are separated from the background using “artificial intelligence” and tracked for as long as they remain in view of the camera.
Objects are then classified based on various characteristics to “identify men, women, children and vehicles with speed and precision, using 25 classes and attributes, face recognition, appearance similarity, color, size, speed, path, direction, and dwell time,” according to BriefCam’s website.
BriefCam added facial recognition capabilities in November, a year after the city had already purchased the software. Bourgeois said that due to a lack of funds, the city hasn’t purchased the updated version that includes facial recognition. The city has stated numerous times that it does not use facial recognition.
Still, with the software the city does have, the NOPD could, for example, pull up every piece of footage that includes a man in a red shirt on a blue bike riding north on Esplanade Avenue. Or it could pull up every person with a backpack who appeared on the corner of Broad Street and Orleans Avenue, displaying them all simultaneously on one screen with timestamps over their heads to indicate when they were there.
“My question is, what does it mean to say you’re not using facial recognition technology when you’re using something that’s equally powerful?” Hamilton said. “So for instance, the city may not be able to identify me by my face, but if the city knows what I’m wearing, or what I drive, I mean there’s any number of factors they can use to track you from place to place.”
BriefCam can also produce “heat maps” of dwell time, object interaction, and common paths. In a YouTube video from March, the head of the Hartford, Connecticut, crime center illustrated how he used the software to track the movement paths of everyone who walked into the frame of one of the city’s cameras to see which homes they were coming in and out of. He claimed that the analytics helped him locate a drug operation.
“How long did that take me to do?” he said in the video. “That took me a minute.”
Motorola’s CommandCentral Aware is the other key component of the New Orleans system. According to Bourgeois, the software acts as a bridge between the monitoring center and the city’s 911 call center.
“Every time the police, fire, or EMS do anything, an incident is created, whether it’s a call for service or a self-initiated activity,” he said. “And every incident has a location.”
It also automatically compiles relevant data and information for each call, including video footage and past incident reports, and sends it directly to responders in the field.
‘I don’t find that very reassuring’
The city also purchased CommandCentral Analytics and CommandCentral Predictive as part of the software package, but Bourgeois said they aren’t being used.
“I have Microsoft Access on my computer because it came with Microsoft Office,” he said. “I’ve never used it, but I have it if I wanted to.”
When asked whether the city had the option to buy only CommandCentral Aware and BriefCam, and whether that would have been cheaper, he said he didn’t know. The city has not responded to follow-up questions from The Lens.
CommandCentral Predictive claims to use an “advanced algorithm” to accurately predict 30 percent of the next day’s crime. The software creates “prediction boxes” that can be as small as 500 square feet.
“Only CommandCentral Predictive tells officers where to look, and also what and who to look for,” says a brochure from Motorola.
Predictive policing programs are often criticized for relying on historical police data, which critics say perpetuates biases within the justice system.
“Predictive policing software doesn’t predict crime, it predicts policing,” Hamilton said. “I really hope that the city does not use that tool, but I’m not particularly assured by the fact that they say they’re not using it. Especially since I didn’t know, I don’t know if anyone knew, that the city had this tool to begin with.”
Real-Time Crime Monitoring Center employees are required to sign a city policy document that says the system can be used only for legitimate law enforcement and public safety purposes. It prohibits surveillance based on “perceived race, color, religion or creed, national origin,” and other categories. And it says that facial recognition “is not utilized.”
At the time of the surveillance system’s implementation, some residents and community organizations, as well as the city’s Independent Police Monitor, feared that it would disproportionately target communities of color and put the city’s non-citizen immigrant population at risk.
Bourgeois described that policy as “pretty ironclad.”
The policy dictates that disciplinary action “may be taken” against employees who violate the policy. But it doesn’t include a public or internal reporting mechanism for violations or create any independent oversight for the system.
Hamilton said he is unsure what protections the policy actually provides.
“Frankly, I don’t think the policy has any real legal effect,” he said. “Who’s going to find these violations and enforce this policy? To me, that’s the city saying, ‘well, we’ll be careful.’ And I don’t find that very reassuring.”
Despite protests, surveillance system continued to grow
In November 2017, Landrieu appeared with then-Councilwoman LaToya Cantrell at the opening of the Real Time Crime Monitoring Center. In front of a wall of screens projecting scenes from across New Orleans, Landrieu announced that the city had already installed 80 cameras, with 250 more coming in 2018, along with 112 new license plate readers.
He also announced his intention to pass an ordinance that would require every institution with a liquor license to install public-facing cameras that would feed live footage into the monitoring center. The ACLU quickly condemned the plan as “surveillance on steroids.” Public outrage sparked a fierce local debate that reached the pages of The New York Times.
On March 20, the ordinance was withdrawn by Councilwoman Stacy Head, who told The Lens at the time that she didn’t believe “there was an appetite for it.”
But the surveillance network didn’t stop growing.
Over the course of 2018, the Landrieu and Cantrell administrations launched a number of quieter, incremental expansions. In August, for example, the City Council, at the request of the mayor, transferred a $100,000 state grant meant for economic development in Gentilly to the office of Homeland Security to install eight new crime cameras in the neighborhood.
In September, Cantrell used $70,000 from the million-dollar CleanUpNOLA Initiative — designed to remove litter and blight — to purchase 10 more crime cameras. Bars that get in hot water with the city’s Alcoholic Beverage Control Board have started signing consent agreements that require them to install city-connected cameras. And Cantrell’s 2019 budget calls for 71 new crime cameras along the Lafitte Greenway.
“It’s very upsetting that the city is building on this surveillance network with more and more powerful tools and is not at all getting community input or having any oversight or transparency involved,” Hamilton said. “They’re only telling us after they’ve deployed them and they’re already being used. So there’s really no way for us to push back on that or say no, I don’t want you to use that.”
The city has a history of using powerful law enforcement software without informing the public. In February, an article from The Verge revealed that Palantir, a CIA-funded software company, had been under contract with the city of New Orleans for six years with little public knowledge or media attention. The software was used by the city to identify people who were supposedly at high risk of committing a gun crime. Some of those identified were targeted for “enhanced prosecution” by state and federal law enforcement.
Landrieu didn’t renew the Palantir contract when it expired in February and Mayor-elect Cantrell said she wouldn’t revive it. But in October, The Lens reported that the city was secretly building a new analytic tool to identify residents at “high risk” of being involved in a gun crime. Records showed that city officials hoped the new system would be focused on social services, rather than law enforcement. But details have still not been publicly released.
Hamilton said the lack of community engagement shows that the city isn’t taking seriously the sheer power of the network it’s building.
“These are tools that are obviously very powerful, and you’re going to have success stories,” he said. “But they’re also making mistakes and those are things the NOPD are not going to tell you about.”
Hamilton referred to a prior report by The Lens about a drug bust in June. The police used surveillance cameras to watch Clint Carter, who they believed was selling drugs. When three police SUVs rolled up on Carter, officers were unable to find any drugs in the area. They arrested him anyway after searching him and allegedly finding brass knuckles. His lawyers said the search was unfounded and unconstitutional.
Carter was found not guilty on three charges in November, but because the new arrest was a violation of his parole from an earlier conviction, he was sent back to prison for three years, until February 2022.
“We’re talking about the potential for horrific abuse,” Hamilton said. “And maybe we haven’t seen that here yet — although I think the bust of Mr. Carter was a pretty good example — but I think we’ll see more significant, egregious abuses in the future.”
‘Black Letter Constitutional’
“If you’re in public, you do not have an expectation of privacy,” Landrieu said at the monitoring center’s grand opening in 2017. “I think that’s just the new day and age that we’re in, and people should conduct themselves accordingly.”
He also called the new system “black letter constitutional” — meaning it’s legal and well-established by precedent — a sentiment that has been repeated by city officials, including New Orleans Police Department Chief Michael Harrison, ever since.
“That is just not true,” said Hamilton.
Hamilton argued that although the U.S. Supreme Court has never directly ruled on the constitutionality of video surveillance, other rulings have established that people do, in fact, have at least some expectation of privacy in public. As an example, he pointed to a case from 2018 that set limitations on police access to phone data.
“A person does not surrender all Fourth Amendment protection by venturing into the public sphere,” wrote Chief Justice John Roberts in the majority opinion. “A central aim of the Framers was ‘to place obstacles in the way of a too permeating police surveillance.’”
In his opinion, Roberts cited a list of Supreme Court rulings that have established privacy protections in public spaces. Those judgments are largely built on the foundation of a 1967 ruling that established the principle that “the Fourth Amendment protects people, not places.”
There are other indications that the constitutionality of this type of advanced surveillance software is at least unclear. A list of best practices for video surveillance created by the U.S. Department of Homeland Security, for example, concedes that there may be constitutional violations that come with advanced surveillance technology.
“Technological features like magnification, night vision, infrared detection, and automatic identification and tracking, which pose significant dangers to privacy and other constitutional rights and liberties, should be used only where they are needed,” it says.
And Hamilton pointed out that the Louisiana Constitution actually has more robust privacy protections than the federal constitution because it explicitly lists a “right to privacy.” Ultimately, he said that the technology being added to the system “takes it beyond what’s constitutional.”
“When technology is used to enhance human perception beyond its normal bounds and capabilities, you’re getting into an area that requires a warrant,” he said. “Technology speeds way ahead of the law. And it takes a while for the courts to catch up and deal with these novel questions.”
But the major challenge with these privacy protections, Hamilton said, is semantic: it all depends on how one defines a “reasonable expectation of privacy.” And if the public isn’t being informed of surveillance practices before they’re implemented, he said, there is no real way to gauge what the public finds unreasonable.
“Part of the problem with privacy law is that it’s governed by societal expectations,” he said. “Absolutely, the community has a right to know what surveillance tools the city is using. And it should have some kind of say in what kind of technology is used and how it’s used.”