Flock's cloud wasn't breached — its cameras, accounts, and governance model failed exactly the way the system allowed them to.

Police departments don't buy Flock cameras because they're secure. They buy them because they're easy. A turnkey, cloud-managed license-plate network that requires no IT staff, no procurement friction, and no public debate. One purchase order and a city gets real-time vehicle tracking across every major road, streamed straight into a private company's servers.
That convenience is the product. The surveillance is the feature. And the scale is the point: more than 80,000 cameras across 49 states, stitched together into a single system that thousands of agencies can search with a few keystrokes. Flock didn't build a tool — it built an infrastructure layer for policing. [1]
What it didn't build was guardrails. The cameras exposed themselves. The accounts were compromised. The searches ballooned into the millions. And the system kept expanding anyway, because nothing in its design required anyone to stop.
Flock Safety was founded in 2017 in Atlanta by Garrett Langley, and by April 2026 had reached an $8.4 billion valuation on roughly $300 million in annual recurring revenue. It describes itself as "the world's first proactive safety company" and claims to work with more than 6,000 communities, 5,000 law enforcement agencies, and 1,000 businesses. Investors include Andreessen Horowitz, Founders Fund, Kleiner Perkins, and Y Combinator. [2]
Senator Ron Wyden and Representative Raja Krishnamoorthi described it more plainly in a November 2025 letter to the Federal Trade Commission: Flock operates "the largest network of surveillance cameras in the United States." [3]

Flock's network isn't just a collection of cameras — it's a shared investigative platform. Any agency that buys into the system can search license-plate scans across its own cameras and, by default, across the cameras of thousands of other departments. Access is granted through simple account creation, and once an officer is inside, the system treats them as a trusted participant. There is no technical barrier between jurisdictions, no meaningful friction, and no independent verification of who is searching for what. [4]
Every scan becomes a searchable record: plate number, timestamp, GPS coordinates, and a high-resolution image of the vehicle. Those records flow into Flock's cloud, where they can be queried by any authorized user. Searches are logged, but the logs are internal — visible to Flock and the agency, not to the public, not to oversight bodies, and not to the people being tracked. The system assumes good faith from every user, even when the evidence shows that assumption is misplaced.
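The data model just described can be sketched as a toy in a few lines. Everything below is illustrative — the field names, the `search` function, and the sample rows are assumptions for exposition, not Flock's actual schema — but it captures the governance point: the requesting agency is written to an internal log and never used to restrict results.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative model of one scan as the article describes it:
# plate, timestamp, GPS coordinates, image. Names are assumptions.
@dataclass
class PlateRead:
    plate: str
    seen_at: datetime
    lat: float
    lon: float
    owner_agency: str   # agency whose camera made the scan
    image_url: str

def search(pool: list[PlateRead], plate: str, requesting_agency: str,
           audit_log: list[str]) -> list[PlateRead]:
    """Cross-jurisdiction search as described: the requester is logged
    internally, but nothing filters results by jurisdiction."""
    audit_log.append(f"{requesting_agency} searched {plate}")
    return [r for r in pool if r.plate == plate]

# A small-town department querying the whole shared pool:
pool = [
    PlateRead("ABC1234", datetime(2025, 6, 14, 9, 5, tzinfo=timezone.utc),
              33.77, -84.39, "Atlanta PD", "https://example.invalid/1.jpg"),
    PlateRead("ABC1234", datetime(2025, 6, 14, 17, 40, tzinfo=timezone.utc),
              35.37, -119.02, "Bakersfield PD", "https://example.invalid/2.jpg"),
]
log: list[str] = []
hits = search(pool, "ABC1234", "Smallville PD", log)
print(len(hits))  # both agencies' scans come back; the log stays internal
```

Note what the sketch makes visible: the audit log is append-only and internal, and the `requesting_agency` parameter influences nothing but that log. That is the "trust the user" model in miniature.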
Sharing is the default. Oversight is optional. Agencies can link their systems to neighboring cities, regional task forces, or federal partners with a single toggle. Once connected, data moves freely. A small-town police department can access scans from a major metro area. A federal agency can pull from local cameras without a warrant. And because Flock is a private company, the entire network sits outside the transparency rules that govern public infrastructure. [3]
This is the governance model: trust the user, trust the agency, trust the network. No mandatory multi-factor authentication. No independent audits. No external reporting. A system built for maximum access and minimum friction — and one that fails in exactly the ways that design guarantees.
The National Lookup Tool is the clearest expression of what Flock actually built. An officer in one city can run a plate through the cameras of thousands of departments in other states. Flock disclosed to Congress in August 2025 that roughly 75 percent of its law enforcement customers had opted into this feature. Opting out did not protect the remaining 25 percent: their cameras could still be searched by agencies that had opted in. [3]

In December 2025, journalist Jason Koebler at 404 Media discovered something that should not have been possible: Flock's Condor PTZ cameras were streaming live to the open internet with no password required. He watched himself walking on a greenway in Brookhaven, Georgia, via an unauthenticated public feed. He found other feeds: a woman walking a dog in suburban Atlanta, a man in a Bakersfield parking lot, children swinging on a playground. No login. No authorization. Just a URL and a browser. [5]
Independent security researcher Jon Gaines, publishing under the name GainSec, independently verified and expanded the finding. Scanning for Flock's specific network signature on port 8900, he located 67 exposed cameras across 19 cities in 15 states. Cities included Alpharetta, Georgia; Rocklin, California; Charlotte, North Carolina; and Vista, California. The cameras shared a single ISP, Verizon Business, with no firewall between the device and the internet. The admin portal required no authentication. The RTSP video stream required no authentication. And there was a delete-via-GET endpoint — a URL you could simply load in a browser to permanently destroy archived footage, no credentials needed. [6]
Musician and technologist Benn Jordan had already documented the hardware-level vulnerabilities in November 2025. In a video that accumulated 1.66 million views, he demonstrated a specific button sequence on the back of the camera that spawns an open Wi-Fi access point. From there, an attacker within wireless range could enable Android Debug Bridge over the network, gaining shell access to the device. He found exposed USB ports, hardcoded wireless credentials, unencrypted image storage, and an operating system, Android Things 8, that Google discontinued in 2021 and no longer updates. Joshua Michael at Nexanet AI separately found 53 API endpoints with hardcoded authentication keys. His assessment: "I've seen high school projects with better security." [7] [8]
GainSec's full research documented 51 findings across six Flock camera models, with 22 assigned CVEs and 8 more pending. CVE-2025-59403, logged in the National Vulnerability Database, describes the Collins application exposing administrative endpoints including /reboot, /logs, /crashpack, and /adb/enable without authentication, allowing denial of service, information disclosure, and remote code execution from the local network. [9]
Jon Gaines lost his job within 48 hours of the research going public. Benn Jordan reported being visited by individuals he believed were private investigators photographing his home.

The camera hardware problem and the account security problem are separate failures. Both exist simultaneously.
Cybersecurity firm Hudson Rock maintains a public database of credentials compromised by infostealer malware. A search of that database in late 2025 showed at least 35 sets of Flock customer credentials had been stolen. By April 2026, that number had grown to 45. Congressional staff ran the search themselves and confirmed the findings. Benn Jordan separately provided Wyden and Krishnamoorthi's offices with a screenshot from a Russian-language cybercrime forum showing Flock accounts being offered for sale. [3] [10]
The reason those credentials matter is what they unlock. A stolen law enforcement login grants access to Flock's full search interface, including cross-jurisdiction queries through the National Lookup Tool. There is no second factor standing between a stolen password and 80,000 cameras' worth of vehicle movement data.
On November 3, 2025, Senator Wyden and Representative Krishnamoorthi sent a letter to FTC Chair Andrew Ferguson demanding an investigation. The letter was direct: "Flock's failure to require MFA is likely an unfair business practice prohibited by Section 5 of the FTC Act." It cited four prior FTC enforcement actions, against Uber, Chegg, Drizly, and Blackbaud, where failure to require multi-factor authentication had been found to constitute an unfair business practice. [3]
Flock confirmed to Congress that it had not required MFA as a condition of access, and that it still does not support phishing-resistant MFA, which the Cybersecurity and Infrastructure Security Agency calls "the gold standard method." What Flock does support is SMS-based MFA, which Wyden's letter notes is "vulnerable to interception and phishing." A DEA officer had been found using a Palos Heights detective's Flock account, an account that did not have MFA enabled until after the password sharing was first identified by a reporter. [3]

For much of 2025, Flock's public position was simple: it had no relationship with the Department of Homeland Security. Local police agencies were routinely told this when they asked whether Flock data could be accessed by federal immigration enforcement. The answer they received was no.
The answer was wrong.
In an October 16, 2025 letter to Flock CEO Garrett Langley, Senator Wyden wrote: "Flock has assured its state and local law enforcement customers that the company does not provide access to the Department of Homeland Security (DHS). With this representation about DHS access to Flock data, Flock deceived its law enforcement customers." Flock itself later confirmed the deception, telling Wyden's office that "due to internal miscommunication, customers were inaccurately informed that Flock did not have any relationship with DHS, while pilot programs with sub-agencies of DHS were briefly active." [11]
The pilots Flock quietly ran while denying their existence: Customs and Border Protection conducted approximately 200 searches. Homeland Security Investigations conducted approximately 175 searches. The Naval Criminal Investigative Service and the Secret Service also had access. Flock announced an end to federal pilots in August 2025, after the denial had already become public. [11]
The University of Washington Center for Human Rights documented a parallel problem in Washington state. At least eight local agencies had explicitly authorized 1:1 sharing with Border Patrol. But ten additional agencies had their data accessed by Border Patrol without having authorized it at all, through what the report described as a "back door." Those agencies were never told. Yakima County Sheriff's logs showed two searches with the reason field logged simply as "ice." [12]
In Mountain View, California, the city discovered that Flock had enabled a "statewide" search setting on 29 of its 30 cameras without authorization, allowing more than 250 California agencies to search Mountain View's data. During the period that setting was active, federal agencies including ATF, Air Force offices in two states, and the U.S. GSA Office of Inspector General accessed the feed. When city officials pressed Flock about how the unauthorized setting had been activated, Flock told them it "no longer had records for how the system was turned on or how it was turned off." [13]
The Mountain View City Council voted unanimously to terminate their Flock contract on February 24, 2026. Police Chief Mike Canfield, who ordered the cameras turned off three weeks earlier: "I personally no longer have confidence in this particular vendor." [13]

Between December 2024 and October 2025, the Electronic Frontier Foundation obtained records representing more than 12 million searches logged by more than 3,900 agencies. The EFF's analysis of those records documented a surveillance network that had expanded far beyond its stated purpose of solving property crimes. [14]
More than 50 federal, state, and local agencies ran hundreds of searches through Flock's national network in connection with protest activity. The searches tracked vehicles at the February 50501 deportation-raid protests, March rallies for Mahmoud Khalil, April Hands Off protests, and June and October No Kings demonstrations. Named agencies include the Tulsa Police Department, which ran 38 protest-related searches; the Spokane County Sheriff, which ran searches across 95 networks for "no kings"; and the Beaumont Police Department in Texas, which ran six "KINGS DAY PROTEST" searches across 1,774 networks. [14]
The EFF also documented systematic discriminatory searches against Romani people. More than 80 law enforcement agencies used language perpetuating harmful stereotypes in their search reason fields: "g*psy vehicle," "roma traveler," "possible g*psy," "g*psy ruse." Fairfax County Police Department in Virginia logged more than 150 such searches. Sacramento PD ran six. The searches often had no cited crime, targeting traveling communities on the basis of ethnicity alone. EFF's framing was precise: "Flock's network didn't create racial profiling; it industrialized it, turning deeply encoded and vague language into scalable surveillance that can search thousands of cameras across state lines." [15]
In Johnson County, Texas, a sheriff's deputy ran two searches through Flock's network for a woman described in department records as having "had an abortion." The first search probed 1,295 Flock networks containing 17,684 cameras. The second expanded to a full month of data across 6,809 networks, covering 83,345 cameras. Court records obtained by EFF showed deputies had initiated a death investigation, logged evidence of a self-managed abortion, and consulted prosecutors about possible charges. The Johnson County DA found no statutory basis to charge. Sheriff Adam King subsequently faced indictment on unrelated felony sexual harassment, whistleblower retaliation, and aggravated perjury charges. [16]
Flock called EFF's reporting "purposefully misleading" and "clickbait." CEO Garrett Langley told Forbes the situation was "everything working as it should be." EFF's response was four words: "The truth is worse." [16]
The EFF's broader audit of the 11.4 million search dataset found that more than 14 percent of searches listed only the word "investigation" in the reason field, with no case number, no details, no accountability. The search log is a paper trail that circles back to itself.
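The kind of audit EFF ran can be reproduced against any agency's exported search log. The sketch below uses made-up column names and toy rows — not EFF's dataset or Flock's real log schema — to show the test itself: a reason field containing only the bare word "investigation" with no case number attached.

```python
import csv
import io

# Toy export; column names are assumptions, not Flock's actual schema.
SAMPLE_LOG = """agency,reason,case_number
Dept A,investigation,
Dept B,stolen vehicle,24-1187
Dept C,Investigation,
Dept D,AMBER alert,24-0042
"""

def vague_share(log_csv: str) -> float:
    """Fraction of searches justified only by the bare word
    'investigation' with no case number attached."""
    rows = list(csv.DictReader(io.StringIO(log_csv)))
    vague = [
        r for r in rows
        if r["reason"].strip().lower() == "investigation"
        and not r["case_number"].strip()
    ]
    return len(vague) / len(rows)

print(vague_share(SAMPLE_LOG))  # 0.5 on the toy rows above
```

The point of the exercise is how little it takes: the accountability check EFF performed on millions of records is a one-line filter, which is exactly why a reason field that circles back to itself provides no accountability at all.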

If you read Investigation 04 on this site, you know how San Diego lost control of its smart streetlight surveillance network to a vendor that refused to turn it off. This is the sequel. The same poles. The same governance failure. A new company bolted onto the hardware.
In November 2023, the San Diego City Council approved a five-year contract worth approximately $12 million with Ubicquia as the prime contractor, with Flock Safety as the ALPR subcontractor integrated into Ubicquia's UbiHub streetlight platform. The contract was signed without competitive bidding. Mayor Todd Gloria signed it into law November 22, 2023. By December 2024, additional Flock services had added nearly $5 million to the disclosed cost. [17]
What the 2024 Annual Surveillance Report submitted to City Council did not disclose: during a three-week window around the launch, 12,914 searches were conducted by other California law enforcement agencies through Flock's network without San Diego's authorization. The breach was discovered internally on January 17, 2024. It was not disclosed to the Privacy Advisory Board until June 13, 2025, seventeen months later. The report that was submitted to Council in February 2025 stated, directly, that "there were no data breaches or unauthorized access." That statement was false. [18]
At the December 9, 2025 City Council vote on ALPR use policy, the No votes came from Councilmembers Sean Elo-Rivera, Henry Foster III, and Vivian Moreno. Elo-Rivera: "Flock is a proven bad actor who is unwilling or unable to stop their technology from being abused. I can't look San Diegans in the eye and say that I am confident that their rights and privacy will not be violated by this company." Foster raised the Jordon revolving door directly from the dais: "I think that is totally inappropriate." [19]
The Privacy Advisory Board, which the TRUST Ordinance established to provide oversight of exactly this kind of deployment, had issued a formal recommendation in November 2025 to cease use of Flock entirely, citing the concealed breach and identifying 31 needed changes. The Public Safety Committee voted unanimously to override the recommendation. It was the second time the PAB had been overridden on Flock. The ordinance that created the board made its recommendations advisory. [19]
The structural problem Investigation 04 documented, a city that cannot unilaterally disable hardware it nominally owns, applies here too. The Flock system is subcontracted through Ubicquia. The ALPR cannot be disabled independently of the streetlight platform. Turning off Flock requires renegotiating with Ubicquia. At the December 9 meeting, Councilmember Foster discovered in real time that shutting off Flock would not save the city money because of how the contract is bundled. The city had built a second vendor dependency on top of the first one.

More than 30 cities and counties across the United States have terminated or suspended Flock Safety contracts since early 2025. The terminations span Oregon, Washington, Wisconsin, California, Virginia, Illinois, New York, Iowa, Texas, and Arizona. The reasons they gave, in their own words, are more useful than any summary.
The Staunton, Virginia police chief, Jim Williams, terminated the contract and received a copy of Flock CEO Garrett Langley's mass email to law enforcement customers framing the terminations as a "coordinated attack" by activist groups. Williams replied directly. His response, now public record: "As far as your assertion that we are currently under attack, I do not believe that this is so. What we are seeing here is a group of local citizens who are raising concerns. In short, it is democracy in action." [20]
In Bend, Oregon, Councilor Mike Riley voted to not renew: "I don't feel like it has strong enough guarantees about how they can use that data. Right now with Oregon being a sanctuary state and all the heightened concerns about ICE arrests and deportation, I just didn't feel like we had good protections for our community." The cameras went off at 3:15 p.m. on January 8, 2026. [21]
In Oshkosh, Wisconsin, the council voted 5-2 to renew the contract, then 7-0 to rescind the renewal within 24 hours after the police chief confirmed that Flock's CISO had misrepresented a camera capability during the vote. Deputy Mayor Karl Buelow: "I'm deeply embarrassed and sorry." [22]
In Verona, Wisconsin, a data audit of Flock search records turned up 974 federal-tagged searches and 1,628 searches by organizations self-identifying as ICE, plus 5,739 additional image searches. The mayor's response: "I think it was deliberate. Because I think that they want to keep the cameras up, whether they have permission or not." [22]
The Ring partnership — announced October 16, 2025, which would have allowed Flock to request footage from Amazon Ring doorbell cameras — was canceled February 12, 2026. Ring's statement: "The integration never launched, so no Ring customer videos were ever sent to Flock Safety." The cancellation came after the Wyden letter and, per Ring, "intense backlash." [23]

On December 8, 2025, Garrett Langley sent a mass email to Flock's law enforcement customers. Subject line: "Fact Check: No Hack. We will never stop fighting for you."
The email's central claim: "Flock has never been hacked. Ever." Its central framing: "Let's call this what it is: Flock, and the law enforcement agencies we partner with, are under coordinated attack." It described security researchers, journalists, and community activists as unified agents working to "defund the police, weaken public safety, and normalize lawlessness." It called Benn Jordan's documented hardware vulnerabilities "misleading headlines." [20]
The "never been hacked" claim is technically narrow and strategically misleading. It refers specifically to Flock's cloud platform, which was not compromised by an external attacker. It does not address 67 cameras that streamed without authentication. It does not address 45 sets of customer credentials stolen by infostealer malware. It does not address CVE-2025-59403, formally registered in the National Vulnerability Database, allowing remote code execution via an unauthenticated endpoint. It does not address credentials for sale on Russian cybercrime forums. It does not address the CBP and HSI federal pilots Flock denied before admitting. It does not address the 12,914 unauthorized searches in San Diego or the Mountain View "statewide" setting Flock could not explain. [6] [9]
Flock's CISO, Chris Castaldo, published a direct rebuttal of Benn Jordan's research. He described Jordan as a "YouTuber" and called the claim of compromising 80,000 cameras "totally false and misleading," noting that such a compromise would constitute a federal crime. But Castaldo's response appeared to conflate two distinct findings: Jordan's device-level hardware research, which documented vulnerabilities in individual camera units, and a separate discovery by Joshua Michael at Nexanet AI of a misconfigured Flock demo exposing a map of approximately 83,000 camera locations via an ArcGIS leak. Combining them misrepresented both. [24]
Flock called the organizations pushing back against the cameras "activist groups who want to defund the police." The Staunton police chief called it democracy. The Mountain View police chief called it a loss of confidence. Thirty city councils voted with their feet. The Charlottesville police chief called the Langley email "unprofessional" and "pouting." [20]
Flock's valuation went from $7.5 billion in March 2025 to $8.4 billion in April 2026. The cameras are still going up.

If the streetlight story was about losing control of the hardware, this one is about losing control of what the hardware does. San Diego couldn't turn its cameras off. Flock doesn't need to be turned off — it keeps working as long as someone, somewhere, keeps searching. The danger isn't a breach or a hack or a rogue actor. The danger is a system that performs exactly as designed, at national scale, with no one accountable for the consequences. Your city doesn't need smart streetlights to lose control. It only needs to join the network. [1]
The questions worth asking in your city: Who owns the platform? Does the vendor retain technical control over the cameras? Can your city independently disable the system? What federal agencies have accessed the data, and under what authorization? What DHS grant money paid for the hardware? What data-sharing agreements are in place, and who signed them?
San Diego asked these questions after the fact. Mountain View asked them after a breach. Verona asked them after an audit. Oshkosh asked them after a vote they had to reverse the next morning. In every case, the answers were worse than expected.
Flock Safety built the largest surveillance camera network in the United States by making it easy to buy, hard to audit, and nearly impossible to leave. Its cameras exposed themselves to the open internet. Its customer accounts were compromised and offered for sale. Its governance model enabled the surveillance of protesters, Romani families, and abortion patients at national scale. Its federal pilots were real while its public statements denied them. And when cities tried to leave, they found themselves bound to contracts that made exit expensive and, in some cases, technically impossible.
None of that was a hack. None of that was a breach. None of that was a rogue actor exploiting an otherwise secure system.
It worked exactly as designed. That is the problem.