I’m more skeptical than not, but there is a way to make me less so.
Two years in, Houston’s ShotSpotter program has resulted in 5,450 alerts, 99 arrests and the seizure of 107 guns, but no real consensus on its value as a crime-fighting tool or even how to measure its success.
Critics say the numbers — just 19 percent of the gunfire alerts in the last 25 months even led to an offense report — do not justify the $3.5 million cost of the controversial tool. In the remaining cases, officers were dispatched based on the alerts but did not find any evidence, such as shell casings.
Authorities filed 126 charges related to ShotSpotter alerts, including one capital murder charge, according to Houston Police Department Assistant Chief Milton Martin, who presented an update to a City Council committee last week. Half of those charges involved misdemeanor offenses, most commonly the illegal discharge of a firearm in the city.
Not directly reflected in those statistics, Martin said, is the intelligence that HPD was able to gather from ShotSpotter data. Because residents do not always call 911 to report every gunshot they hear, the tool has allowed the department to map out the areas where gunfire problems are most severe and deploy resources accordingly, he said.
“Just in the first year of operation, over 200 shell casings that we collected were linked to firearms that were used in other crimes in other parts of the city,” Martin said. “While that’s not an automatic ‘Oh, now we know who to arrest,’ it’s information that investigators did not have before.”
Some advocates, however, say the numbers do not justify the cost of the program: $3.5 million for a five-year contract from 2022 to 2027 at an annual price of approximately $74,000 per square mile.
“Only 20 percent of alerts result in an offense report, meaning that 80 percent of responses are a waste of public resources,” Christopher Rivera, outreach coordinator at the Texas Civil Rights Project, said. “I believe that we can use the $3.5 million…and put it into programs that actually reduce gun violence, like housing and health care and debt relief.”
[…]
Meanwhile, critics and studies of the system in other cities raise questions about the accuracy and efficacy of the gunfire detection tool.
Little consensus exists even among officials who have adopted the technology. In Texas, San Antonio canceled its contract in 2017, after just one year of operation, saying that ShotSpotter simply was not worth the money. Harris County officials, however, have called ShotSpotter a “godsend” for the Aldine area.
Chicago’s former Inspector General Joe Ferguson said Houston’s statistics so far are “in the same universe” as those in other parts of the country that have been subjected to criticism by experts.
Ferguson authored a 2021 report by Chicago’s Office of Inspector General that found ShotSpotter alerts rarely led to evidence of a gun-related crime and could result in biased policing behaviors. In an interview with the Chronicle, he cautioned Houston officials against drawing premature conclusions from ShotSpotter data.
“What was found in Chicago and has been found in other places is the false positive rate is over 50 percent,” Ferguson said. “And people don’t understand that. People assume things are worse than they are. That spawns fear, and fear spawns overreaction, both as a political matter and in terms of response in the field and on the street.”
At the same time, Ferguson applauded Houston’s incremental approach to implementing the program.
“The way that Houston is going about it is the way that these things should be approached. It started with a pilot program, it is focused, it generates the data, and the data is subjected to analysis and made publicly available,” he said. “But the results that they’ve gotten so far aren’t significantly better than what has been reported nationally.”
I thought I had written about ShotSpotter before, but my archives say otherwise. This article does a pretty good job of telling you what you need to know, and there was a CityCast Houston podcast episode from last January that also discussed it, if you want to know more. My sense of this is similar to how I feel about security cameras: it sounds like it could be beneficial, and may have value in certain specific circumstances, but we need to be very rigorous about the data we have for it and make decisions based on that data. Basically, does the data say this thing works as its proponents claim it does, which is to say that it reduces or helps solve crime at a certain level, or does it not? What even is a reasonable expectation given our investment, the context in which we are using it (e.g., in a high-crime area versus someplace where the locals are loudly clamoring for it regardless of need), and the experiences of other cities? We need to know that going in, and we need to be willing to turn it off if it’s not working as hoped. If we have all that in place, then I’m willing to give it a try. If not, then surely there are better uses of the money.
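For what it’s worth, here’s a quick back-of-the-envelope on the figures quoted above. The inputs all come from the article; the derived numbers are my own arithmetic, and I’m assuming the contract bills evenly across its five years, which the article doesn’t actually say:

```python
# Back-of-the-envelope on the ShotSpotter figures quoted in the article.
# Inputs are from the story; derived values assume even annual billing.

alerts = 5_450               # ShotSpotter alerts in the first 25 months
arrests = 99
guns_seized = 107
offense_report_rate = 0.19   # share of alerts that led to an offense report

contract_total = 3_500_000   # dollars, five-year contract (2022-2027)
contract_years = 5
cost_per_sq_mile_year = 74_000

offense_reports = alerts * offense_report_rate
annual_cost = contract_total / contract_years
coverage_sq_miles = annual_cost / cost_per_sq_mile_year
spend_to_date = annual_cost * (25 / 12)   # 25 months in, assuming even billing

print(f"offense reports:  ~{offense_reports:.0f}")
print(f"arrest rate:      {arrests / alerts:.1%} of alerts")
print(f"gun-seizure rate: {guns_seized / alerts:.1%} of alerts")
print(f"coverage area:    ~{coverage_sq_miles:.1f} square miles")
print(f"cost per alert:   ~${spend_to_date / alerts:,.0f} to date")
```

By that arithmetic the system covers a bit under ten square miles and has cost on the order of $270 per alert so far. Those are the kinds of baselines I’d want any “is it working” analysis to start from, alongside the comparisons to other cities.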
Thanks for mentioning the podcast!
One key data point is missing: how many ShotSpotter alerts were also reported to 911?