Public interest in digital rights usually follows scandal cycles: an NSA leak here, a Facebook outrage there, then back to business as usual. The leadership change at the Electronic Frontier Foundation (EFF) lands in a very different moment. Government data grabs are no longer abstract fears but daily reality for communities targeted by US immigration enforcement—and the same infrastructure is rapidly being super‑charged by AI. A veteran litigator is stepping aside; a movement strategist is stepping in. The question for the next decade is blunt: can EFF scale from courtroom victories to a mass movement fast enough to keep up with algorithmic state power?
The news in brief
According to Ars Technica, the Electronic Frontier Foundation is preparing a significant leadership transition. Cindy Cohn, who has been at EFF for 26 years and served as executive director since 2015, is stepping down. She was one of EFF’s earliest litigators, playing a central role in landmark encryption and surveillance cases dating back to the 1990s.
Her successor, Nicole Ozer, will officially take over on 1 June. Ozer comes from the ACLU of Northern California, where she founded and led the Technology and Civil Liberties Program. Over the past two decades she has frequently partnered with EFF on impact litigation, legislative campaigns, and technical tools for privacy protection.
Ars Technica reports that this handover comes amid a spike in public concern over technology used by US Immigration and Customs Enforcement (ICE) and the Department of Homeland Security (DHS)—from automated license‑plate readers and camera networks to social‑media unmasking efforts—and growing anxiety about how AI will intensify government surveillance. Ozer’s stated priority is to “level up” EFF by expanding its base of supporters and building a broader social movement around digital rights.
Why this matters
Leadership changes at NGOs usually don’t move markets, but this one matters well beyond the nonprofit world. EFF is effectively the Supreme Court bar of the Internet: when a foundational digital rights case appears, chances are EFF is somewhere on the docket. The organisation’s priorities and tactics shape the legal environment for platforms, startups, and public institutions globally.
Cohn embodied the first era of digital rights: the “crypto wars,” early battles over wiretapping, and the fight to transplant constitutional principles into cyberspace. That era was courtroom‑centred and expert‑driven. Ozer represents something different: a movement strategist who sees litigation as one tool among many, alongside legislative campaigns, coalition‑building and public narrative work. In an AI‑driven surveillance context, that shift is not cosmetic—it is existential.
The power imbalance has changed. In the 1990s, the government needed telecoms and a handful of big vendors to surveil. Today, it taps into a vast commercial ecosystem of data brokers, ad‑tech profiles, smart cameras and cloud AI services. When ICE can buy location data instead of getting a warrant, or plug plate‑reader feeds into machine‑learning models, purely legalistic defences are too slow and too narrow.
Ozer’s focus on bringing “unconventional voices” into courtrooms is a recognition that technical correctness is no longer enough. Judges and lawmakers respond to stories and visible constituencies, not just expert declarations. Communities on the sharp end of surveillance—immigrants, protesters, racial minorities—need to be seen as protagonists, not footnotes. If EFF successfully pivots from a guild of experts to the legal arm of a broader social movement, that will raise the political cost of expanding AI‑driven surveillance—and that, in turn, will influence how aggressively governments and vendors push these tools.
The bigger picture
This transition sits where three broader trends converge.
First, AI has moved from hype to deployment in law enforcement and migration control. Predictive policing systems, risk‑scoring tools for visa applicants, real‑time facial recognition in public spaces, and cross‑border data‑fusion platforms are no longer pilot projects. They are being operationalised—often through public‑private partnerships involving US cloud giants and specialised analytics firms. The same pattern seen with Palantir’s work for US and European agencies is now repeating with a new generation of AI vendors.
Second, the surveillance business model of Big Tech has become structurally entangled with state power. As Cohn has long argued, and Ars Technica reiterates, there was never a clean line between “government surveillance” and “corporate surveillance.” The Trump administration’s second term merely stripped away the polite fiction. When the state can ask Meta for user identities, Apple and Google for app removals, or data brokers for historical location trails, legal constraints aimed at direct state collection look increasingly outdated.
Third, civil‑society organisations themselves are professionalising and scaling. The early EFF was a small band of hackers and lawyers. Today we see a network of specialised groups: algorithmic accountability labs, migrant‑rights tech collectives, litigation boutiques focused on strategic cases. Organisations like ACLU, EDRi, NOYB, and the UK’s Open Rights Group coordinate across borders. Ozer’s background in “movement lawyering” is aligned with this shift: the job is less about finding the perfect constitutional argument, and more about orchestrating campaigns that combine lawsuits, regulation, technical tooling and grassroots pressure.
Set against corporate AI rhetoric, dominated by “responsible AI” teams inside the very firms building surveillance tools, EFF plays a distinct role: it is largely independent of government and industry funding, and it is willing to litigate against both. How it chooses to confront AI‑enabled ICE operations, biometric mass surveillance, and data‑broker ecosystems will influence the space available for more moderate actors: corporate ethics teams, standards bodies, and academic researchers.
The European / regional angle
From a European perspective, this US leadership change is not a distant internal matter. US digital rights litigation has repeatedly triggered consequences in EU law and practice. The revelations that EFF and others helped surface in the NSA era ultimately fed into the Court of Justice of the EU’s Schrems I and II rulings, which invalidated successive EU–US data‑transfer frameworks and forced Brussels and Washington back to the negotiating table.
Today, Europe is attempting to regulate where US courts have largely hesitated. The GDPR, the Digital Services Act (DSA), the Digital Markets Act (DMA) and the newly adopted EU AI Act collectively establish a dense rulebook around data processing, platform responsibility and high‑risk AI systems—especially in law enforcement and migration. Yet regulation on paper is only as good as its enforcement. Here, US organisations like EFF and European groups like EDRi, the German GFF or Austria’s NOYB are de facto allies, testing laws in court and exposing cross‑border surveillance schemes.
AI‑enabled migration control is explicitly on the EU agenda: from Eurodac upgrades to Frontex’s expanding surveillance toolkit. The risk is that, in the name of “interoperability” with US systems and cooperation on security, EU agencies will quietly import US‑style practices or tools. Stronger EFF scrutiny of ICE, DHS and US data brokers could indirectly benefit European residents by surfacing abuses that implicate shared vendors or databases.
For European startups and cloud providers, there is another angle: a more assertive EFF raises the regulatory and reputational risks of building tools that can be repurposed for mass surveillance. That may slow some lucrative government contracts—but it also opens space for privacy‑preserving alternatives, from on‑device analytics to minimised data‑collection architectures, where European companies often have a competitive story to tell.
Looking ahead
What does this mean for the next five years?
Expect EFF to put far more emphasis on public‑sector uses of AI, not just on commercial products. That could mean challenges to real‑time biometric surveillance in cities, litigation around automated decision systems in immigration proceedings, and pressure on courts to treat data‑broker purchases as constitutional “searches” requiring warrants. We are likely to see more technically sophisticated amicus briefs dissecting AI systems, and more community‑driven cases that feature directly affected plaintiffs.
Watch for closer coordination between US litigators and EU regulators around AI vendors. If an analytics platform is sold simultaneously to ICE and to European border agencies, a successful challenge in one jurisdiction will reverberate in the other. The EU AI Act’s prohibitions on certain biometric and predictive policing tools may become a benchmark that US advocates can point to in their own arguments.
Politically, much hinges on how the US Congress and the courts evolve after the next electoral cycles. The dream of a comprehensive US federal privacy law keeps reappearing and then stalling. Ozer has already sketched ideas for giving the public more direct influence over such legislation in the AI age, but that will require an unusually broad coalition: civil‑rights groups, technologists, small businesses, even segments of industry tired of the current patchwork.
The risks are clear. If movement‑building fails, AI‑driven surveillance will entrench itself faster than courts can react. Data‑rich incumbents—both tech giants and established contractors—will lock in government relationships, making later reform painful and costly. On the other hand, if EFF and its allies manage to turn today’s outrage over ICE abuses into durable institutions—legal precedents, regulatory norms, technical safeguards—the AI age could yet be steered toward enhancing, rather than eroding, civil liberties.
The bottom line
EFF’s leadership reset is more than a personnel change; it marks a strategic pivot from expert‑centric litigation to movement‑driven resistance against an AI‑powered surveillance state. In a world where governments can weaponise commercial data and machine learning at scale, narrow courtroom wins are no longer enough. The open question is whether EFF under Nicole Ozer can mobilise a broad enough coalition—on both sides of the Atlantic—to redraw the boundaries of acceptable state power before AI makes those boundaries meaningless. Where do you want that line to be drawn, and what are you willing to do to defend it?