Analysis  ·  10 May 2026

The Regulator’s Dilemma


The European Commission ordered Google to share its search data. Google’s own scientist showed it could re-identify users in two hours. The Commission is now in a bind of its own making.
By Alan Wright  ·  The Haunted Lighthouse Limited  ·  Peel, Isle of Man

In January 2026, the European Commission opened proceedings under Article 6(11) of the Digital Markets Act, requiring Google to grant third-party search engines access to anonymised ranking, query, click, and view data on fair, reasonable, and non-discriminatory terms. The deadline for a final decision is 27 July 2026.

The stated purpose is competition. The unintended consequence is a privacy problem the Commission did not anticipate, or at least did not anticipate publicly.

The dataset at issue is not a lightweight summary. It includes full search queries, timestamps, approximate location data, language preferences, device information, and detailed user interaction signals -- every click, every scroll, every decision about which result to ignore. The Commission’s proposed anonymisation mechanism relies on an allowlist model: individual query components are approved for sharing if they have been used by at least fifty signed-in users over a thirteen-month window.

It sounds robust. According to the people who tested it, it is not.
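In outline, the allowlist mechanism behaves like a k-anonymity filter over query components. The Commission's proposal does not specify tokenisation, data structures, or function names, so everything in the sketch below is an illustrative assumption rather than the actual implementation:

```python
from collections import defaultdict

K_USERS = 50  # threshold from the proposal: at least 50 distinct signed-in users

def build_allowlist(query_log, k=K_USERS):
    """query_log: iterable of (user_id, query) pairs drawn from a
    13-month window. A query component (here, a whole-word token) is
    approved only if at least k distinct signed-in users have used it."""
    users_per_term = defaultdict(set)
    for user_id, query in query_log:
        for term in query.lower().split():
            users_per_term[term].add(user_id)
    return {term for term, users in users_per_term.items() if len(users) >= k}

def share_query(query, allowlist):
    """Drop any component not on the allowlist before the query is shared;
    return None if nothing survives."""
    kept = [t for t in query.lower().split() if t in allowlist]
    return " ".join(kept) if kept else None
```

The critiques described below suggest the weakness is not this counting step itself but everything that ships alongside the surviving components: timestamps, location, device, and interaction signals.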


Sabotage Is a Strong Word

Aurélien Mähl, Senior Public Policy Manager at DuckDuckGo, did not mince words when speaking to TechRadar on 8 May 2026. Google, he said, is “not collaborative and not in the spirit of complying with this regulation.” Then he went further: it is, in his view, “in the spirit of sabotaging it.”

Sabotage is a strong word. It is also, in this context, a specific accusation. DuckDuckGo is a direct competitor to Google in search. Mähl is not a disinterested observer. But that does not make the claim wrong, and he is not alone in making it.

Tasos Stampelos, Director of EU Public Policy at Mozilla, has argued on the record that Google is using a “binary argument” -- framing the choice as either full data sharing or full privacy protection -- to escape the DMA’s obligations. Stampelos’s position is that the debate is not “black and white,” which is a diplomatic way of saying Google is presenting a false dilemma.

The Commission, for its part, has not publicly disputed the framing.


The Two-Hour Problem

The most technically significant evidence in this proceeding did not come from a competitor. It came from inside Google itself.

Sergei Vassilvitskii is a Distinguished Scientist at Google who has worked on differential privacy since 2012. In a formal letter sent to the European Commission and first reported by Reuters and The Next Web on 6 May 2026, Vassilvitskii disclosed that his red team -- working specifically on the anonymisation method proposed by Brussels -- was able to re-identify individual users within two hours.

This is not a theoretical objection. This is an empirical result produced by Google’s own privacy engineers, working on Google’s own proposed implementation, under conditions that are presumably more controlled than those that would exist in the wild once the data is in circulation across multiple recipients.
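The letter, as reported, does not describe the red team's method. But the general shape of a linkage attack is well documented: strip the queries entirely and the surrounding metadata can still form a near-unique fingerprint that an attacker with auxiliary knowledge can match to a person. A toy illustration with invented data -- nothing here reflects the actual dataset:

```python
# Toy shared records: (hour_of_day, city, device) per event.
# Even coarse metadata tuples are often unique to a single user.
shared = [
    ("u1", (7, "Leuven", "android")),
    ("u2", (7, "Leuven", "android")),   # two users share this fingerprint
    ("u3", (7, "Leuven", "ios")),
    ("u4", (22, "Tallinn", "android")),
]

def unique_fingerprints(records):
    """Return the quasi-identifier tuples that map to exactly one user.
    Each such tuple is a re-identification handle for anyone who can
    link it to outside knowledge (a public post, a billing record)."""
    users_per_fp = {}
    for user, fp in records:
        users_per_fp.setdefault(fp, set()).add(user)
    return {fp for fp, users in users_per_fp.items() if len(users) == 1}
```

In this toy set, three of the four users are singled out by metadata alone; only the two who happen to share a fingerprint get any cover.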

Vassilvitskii’s letter proposed alternative anonymisation guardrails that would, in his assessment, meet the DMA’s competitive intent without producing the re-identification risk his team demonstrated. The Commission has not publicly responded to the substance of that proposal.

To be clear about the incentive structure here: Google has every commercial reason to resist data sharing. Vassilvitskii is a Google employee. The letter serves Google’s interests. None of that changes the technical finding, and it is worth sitting with the implications: if Google’s own privacy team cannot make the proposed anonymisation hold for two hours, it is reasonable to ask who can.


Independent Confirmation

Lukasz Olejnik is an independent cybersecurity and privacy researcher. His affiliations include the International Committee of the Red Cross, CERN, and University College London. He has no disclosed relationship with Google, no commercial stake in the outcome of the DMA proceedings, and no obvious incentive to take Google’s side.

He published his own analysis of the re-identification risk in the proposed data sharing framework ten days before the Reuters story broke. His conclusion aligned with Vassilvitskii’s: the anonymisation model as proposed does not provide adequate protection against re-identification of individual users.

Josep Pujol, Chief of Search at Brave Software, made the same point directly: “The data this proposal would put into circulation is not, in our view, provably anonymous, and its release would impose severe privacy risk on hundreds of millions of European citizens that we do not believe is justifiable.”

Brave is Google’s privacy-focused nemesis in search. Pujol is not a neutral voice. But he is a technically credible one, and his critique is structural rather than rhetorical.

That makes three separate lines of evidence -- Vassilvitskii’s red-team result, Olejnik’s prior analysis, and Brave’s technical critique -- all landing on the same finding: the proposed anonymisation is not provably anonymous.


The Conspicuous Absence

The intended beneficiaries of this data sharing arrangement include not just rival search engines but AI companies -- OpenAI, Anthropic, Perplexity, Mistral -- all of whom would substantially improve their retrieval-augmented generation capabilities with access to Google’s search index and interaction data.

None of them have filed on-record submissions to the Commission’s consultation. The Next Web’s reporting from 6 May 2026 noted that these firms have largely avoided on-record statements to maintain their “privacy-respecting” reputations. They want the data. They do not want to be seen wanting the data. So they are quiet, and they are waiting.

This is not a scandal. It is a rational strategic calculation. But it is worth naming, because the Commission’s decision will affect their competitive positions substantially, and the absence of their voices from the public record is itself informative about the political dynamics at play.


The Structural Bind

The Commission now has a genuine problem. It has opened proceedings that require Google to share data. The data cannot be made demonstrably safe to share under the proposed mechanism. If the Commission proceeds on current terms, it risks mandating a privacy breach at continental scale -- the very outcome its own regulatory framework, the General Data Protection Regulation, exists to prevent. If it accepts Google’s proposed alternative guardrails, it risks producing a data sharing arrangement so limited in scope that it provides no meaningful competitive benefit to the companies who were supposed to receive it.

There is a version of this where the Commission finds a workable middle path before the 27 July deadline. There is another version where the deadline slips, the proceedings drag, and Google continues operating its search monopoly unmolested while the institution that regulates it tries to resolve a contradiction it wrote into its own rulebook.

The Digital Markets Act and the General Data Protection Regulation both began as Commission proposals, and both are European law. In this proceeding, they are pulling in opposite directions, and the Commission is holding both ends of the rope.


A Note on Sources

One submission worth handling carefully is that of the Information Technology and Innovation Foundation, which filed a response arguing that AI chatbots should not be eligible data recipients and that the Commission’s proposed measures are disproportionate. That position aligns with Google’s. ITIF discloses Alphabet -- Google’s parent company -- as a supporter contributing over ten thousand dollars. That does not make their analysis wrong. It does mean it should be read with the funding relationship in mind.

We have not cited them as a neutral data point. We have mentioned them so you know they exist and can draw your own conclusions.


The Deadline

27 July 2026. That is when the Commission expects to adopt its final decision. Between now and then, it must resolve a question that has no clean answer: how do you enforce competition law that requires sharing data that cannot be safely anonymised under any method that has yet been demonstrated to work?

The Commission wrote the rules. The Commission opened the proceedings. The Commission is now in a bind of its own making.

That is the regulator’s dilemma.


Sources: TechRadar, 8 May 2026 (Mähl and Pujol interviews); Reuters, 6 May 2026 (Vassilvitskii letter); The Next Web, 6 May 2026 (AI lab silence); Lukasz Olejnik independent analysis, published April 2026; ITIF consultation submission, 1 May 2026 (funding: Alphabet disclosed).

The Sovereign Auditor covers digital sovereignty, cybersecurity governance, and data protection policy -- with a particular focus on Isle of Man jurisdiction and Crown Dependency issues.
