Maine Communities and the AI Surveillance Question: When Small Towns Face Big Tech Decisions

By: Dr. David Hatami, Ed.D., Founder & Managing Director, EduPolicy.ai

Something is happening in Maine that deserves our attention. While the state slogan "The Way Life Should Be" is befitting for one of the most gorgeous states in our union, this matter should interest everyone on the national stage. Not because it's dramatic or unprecedented, but precisely because it's so ordinary. Police departments in Sanford, Augusta, Lewiston, Waterville, and York have quietly begun leasing AI-enabled camera systems from a company called Flock Safety (Flock Safety, 2025).

This isn’t a story about technology gone wrong or a cautionary tale about surveillance states. Rather, it’s about what happens when small communities face complex technology decisions without clear frameworks for making them. Having lived in Maine and served as an Associate Academic Dean at Eastern Maine Community College, I have ties to the community that make this a more personal story for me.

Let me start with what I know from working with government organizations on AI implementation: the technology usually arrives before the policy. Let me say that again: the tech usually comes before the policy. This should concern everyone, because policy written after deployment tends to ratify whatever is already in place rather than shape it.

That’s exactly what we’re seeing in Maine. These Flock camera systems can read license plates, track vehicle movements, and create searchable databases of who goes where and when. This is not new, per se; other states, such as Florida, already have similar systems in place. The police departments see them as crime-fighting tools. Privacy advocates see them as surveillance infrastructure. Both perspectives hold merit.

When I work with organizations on AI policy development, I always ask them to start with a simple question: What problem are we trying to solve? In Maine’s case, the answer varies by community. Sanford might be dealing with property crime. Augusta might want better traffic enforcement. Waterville might be trying to solve hit-and-run cases. Each town has legitimate public safety concerns, and the Flock system promises solutions. But technology solutions always come with trade-offs that aren’t always immediately visible.

The Flock system works by creating what’s essentially a digital dragnet. Every vehicle that passes a camera gets logged. The AI doesn’t just read license plates; it builds patterns, identifies anomalies, and creates a comprehensive record of community movement. This capability can (and does) genuinely help solve crimes. When someone’s car gets stolen or there’s a hit-and-run accident, having this data can make the difference between a solved case and a cold one. But we need to acknowledge what else we’re creating in the process.

Governor Janet Mills recognized this tension when she established Maine’s AI Task Force through executive order in December 2024 (Executive Office of the Governor of Maine, 2024). The 21-member group has until October 31, 2025, to deliver recommendations on AI policy for state and municipal deployments. That’s a tight timeline for complex questions, but at least it’s a formal acknowledgment that these issues need structured consideration. The Maine Office of Information Technology has already published a Generative AI Acceptable Use Policy for the executive branch, requiring vendor disclosure, risk assessments, and restrictions on certain types of input (Maine Office of Information Technology, 2025). These are good first steps, but they don’t directly address the surveillance question that communities are grappling with right now.

What really bothers me about this situation is that small communities often lack the resources to properly evaluate these technologies before deployment. A police chief in a town of 20,000 people doesn’t have a team of AI ethicists or privacy lawyers. They have a public safety problem and a vendor offering a solution. The vendor’s sales materials emphasize success stories and crime reduction statistics. What they don’t emphasize are the long-term implications of normalizing pervasive surveillance or the precedents being set for data collection and retention.

The Maine Legislature has started addressing some of these concerns through bills like LD 1944 and LD 1727, which deal with AI deepfakes and chatbot transparency, respectively (Maine Legislature, 2025a, 2025b). There’s also LD 872 pending, which would require the state’s Office of Information Technology to maintain a roster of approved AI software and ensure that AI-assisted decisions remain appealable (Maine Legislature, 2025c). These protective measures, however, don’t directly address the immediate question of surveillance cameras already being deployed in Maine communities.

We’re witnessing a fundamental shift in how communities think about public safety, privacy, and governance, but we’re doing it without having the necessary conversations beforehand. The technology is moving faster than our ability to develop thoughtful policy around how to manage the social infrastructure that inevitably develops. I hate writing that paragraph because it sounds like something every technology skeptic has said for the past thirty years, but I think it’s true in this specific case.

The challenge for Maine communities isn’t whether to use AI-enabled surveillance—that horse has already left the barn in several towns. The challenge is developing governance frameworks that balance legitimate public safety needs with privacy concerns. This requires genuine community engagement and transparent decision-making processes.

From my perspective, three things are needed. First, there must be clear public disclosure about what data is being collected, how long it’s retained, who has access to it, and under what circumstances it can be shared. Second, there should be regular audits of how the system is being used, including analysis of whether it’s achieving its stated goals without creating unintended consequences. Third, there needs to be meaningful community input into these decisions, not just notification after the fact.

I keep thinking about something that happened in my consulting work with a small city government last year. They were considering similar surveillance technology, and the police chief made a comment that stuck with me: “We need this technology to keep up with the criminals who are already using technology against us.” This is a compelling argument, and not entirely wrong. But another official in the same meeting asked, “What kind of community do we become when everyone’s movements are tracked and recorded?” That’s the question Maine communities need to wrestle with. It has even larger implications from a national perspective: unlike in the UK, where CCTV surveillance cameras are the norm, Americans have a very different sentiment about privacy.

The vendors of these systems—Flock Safety and others—aren’t villains in this story. They’re businesses providing tools that many law enforcement agencies genuinely find valuable. But their business model depends on expanding deployment, creating network effects where more cameras make the system more effective, which incentivizes broader adoption. This creates momentum toward surveillance expansion that’s difficult to reverse once it begins.

What Maine needs now is a framework for municipal AI governance that addresses these surveillance technologies specifically. This framework should include mandatory public hearings before deployment, clear data governance policies, regular effectiveness audits, and sunset provisions that require deliberate renewal rather than indefinite operation. Communities should also consider creating citizen oversight committees with real authority to review how these systems are being used.

The work ahead for Maine communities isn’t just about managing technology; it’s about defining what kind of society they want to be. Do they prioritize security over privacy? Efficiency over transparency? Different communities might reasonably reach different conclusions, but these questions need to be asked and answered through democratic processes, not through vendor contracts and administrative decisions.

As Maine’s AI Task Force works toward its October deadline, what happens in places like Sanford and Waterville matters because it’s setting precedents for how American communities integrate AI surveillance into daily life. The choices being made now, often without much fanfare or public attention, will shape the relationship between citizens and government for years to come.

The path forward requires honest acknowledgment that AI surveillance technology offers both genuine benefits and real risks. Once surveillance infrastructure is in place, it rarely gets removed. The decisions Maine communities make today about AI-enabled cameras will likely define their relationship with surveillance technology for a generation. They deserve careful consideration, robust public debate, and governance frameworks that protect both public safety and civil liberties. Anything less is a disservice to the democratic values these communities claim to uphold.

References

Executive Office of the Governor of Maine. (2024, December 20). Executive order establishing artificial intelligence task force. State of Maine.

Flock Safety. (2025). Flock Safety camera systems deployment in Maine municipalities [Municipal deployment records]. Sanford, Augusta, Lewiston, Waterville, and York Police Departments.

Maine Legislature. (2025a). LD 1727: An act to require disclosure of artificial intelligence use in consumer interactions [Enacted legislation]. 131st Maine Legislature.

Maine Legislature. (2025b). LD 1944: An act to expand protections against non-consensual dissemination of certain images to include artificial intelligence deepfakes [Enacted legislation]. 131st Maine Legislature.

Maine Legislature. (2025c). LD 872: An act regarding artificial intelligence procurement and control [Proposed legislation]. 131st Maine Legislature.

Maine Office of Information Technology. (2025). Generative AI acceptable use policy for executive branch. State of Maine.
