Rewiring city’s technology ties following ShotSpotter saga

Whether you think ShotSpotter is a crime-fighting tool or an expensive failure, one thing is certain: Research and analysis about the technology enriched public conversation.

Much of the research on what ShotSpotter gets right and what it misses exists because of a mistake.

In 2023, freelance journalist Matthew Chapman submitted a Freedom of Information Act request for data Chicago police had provided to economists studying the gunshot-detection system’s effects on police response times.

To Chapman’s surprise, CPD denied his request and forwarded a letter from ShotSpotter’s parent company, SoundThinking, stating that CPD should not have shared the data in response to any public records request: “At its core, the real issue is that Chicago should not have created and released the various documents for any of the noted requests — nor did SoundThinking know Chicago was doing so. The requestor never should have received the ‘many years’ worth of alerts that he apparently believes are, and should be, the norm.”

Through that FOIA mistake, CPD released granular ShotSpotter alert data that enabled researchers to test, and ultimately question, ShotSpotter’s claims that its product improved officer response times to shootings. It’s an unlikely but remarkable story: A company that insists its data is a trade secret inadvertently sparked a more sophisticated public-policy debate about surveillance technology.

Chicago’s contract with ShotSpotter, which expired in September 2024, explicitly states: “City Data is the property of the City and Gunfire Data is the property of the Contractor.” The contract also underscores that the contractor and its licensors “retain all ownership of all intellectual-property rights in and to all Gunfire Data,” reinforcing that the information generated about gunfire in city neighborhoods sits under the vendor’s control, not the city’s.

Companies now trying to sell artificial intelligence-driven analytics to governments for predictive policing, facial recognition, social-media monitoring, automated license-plate readers and more seek similar contracts that privatize the technology’s gains and shift its risks to local governments.

We already know from U.S. Immigration and Customs Enforcement’s recent actions in Chicago that technologies meant for local policing can be used for political repression and persecution. At a moment when an authoritarian president is seeking every tool possible to target his perceived political enemies, there is no better time for Chicago to rethink how it does business with technology firms.

If Chicago cannot see, study or share the data private firms collect within the city, then it cannot evaluate whether the technologies work or whether they harm communities. The ShotSpotter saga shows that meaningful debate and reform hinge on independent scrutiny. Without access, there’s no accountability.

If Chicago wants to avoid paying the price of AI technologies that may do harm, it needs to fix how it buys technology. The city’s procurement system was built for buying trucks and traffic lights, not AI. As the American Civil Liberties Union of Illinois argues, we need a “smart procurement” process that treats data and algorithms as matters of public governance — not private property.

Procurement reform can start with the following steps:

  1. Every contract for surveillance or AI technology should make clear that data collected in Chicago belongs to the city and its residents, not the vendor. The city should have the right to analyze and release it just as it does with other public records so researchers and watchdogs can assess whether these systems work as promised. Without public ownership, there is no transparency.
  2. Chicago should require an algorithmic impact assessment before any purchase. Just as an environmental review precedes a major construction project, an AI impact review should precede any new surveillance or analytics tool. This assessment would ask basic questions: What problem is this system supposed to solve? What are the potential biases? What happens to the data? And what safeguards are in place if things go wrong? These answers should be made public before any contract is signed.
  3. Chicago should end contractual arrangements that force taxpayers to pay for technologies’ mistakes. The ShotSpotter contract shielded the company from liability for how its alerts were used, while offering the city no reciprocal indemnification. Police misconduct settlements already cost taxpayers millions of dollars. Imagine how much higher those settlements could climb once police begin using unevaluated and unregulated AI technologies. Technology firms profiting from public contracts should share liability for the consequences of their tools.

The ShotSpotter debate gave Chicago a glimpse of what an evidence-informed policy debate can look like. But it took a data-sharing error to spark much of that discussion.

The way cities regulate private technology for public purposes is flawed. As tech firms try to integrate their AI products into public streets, public schools, police departments and courtrooms, Chicago should lead the nation by adopting procurement rules that make transparency the norm, not an accident.

Robert Vargas is a professor of sociology and director of the Justice Project at the University of Chicago.

Send letters to letters@suntimes.com.