Google Chrome Quietly Downloads 4GB AI Model Without User Permission

© A. Krivonosov

A fresh controversy is brewing around Google Chrome: the browser may be quietly downloading a roughly 4GB local AI model onto users' devices without their knowledge. Security researcher Alexander Hanff tested the browser and concluded that the process happens without any notification or explicit user consent.

At the center of the issue is a file called 'weights.bin,' associated with Google's Gemini Nano AI model. According to Hanff, Chrome checks whether the device meets the hardware requirements and then triggers the download automatically. In his test, the model downloaded in the background over about 14 minutes while he was simply browsing websites, without using any AI features. What's more, deleting the file manually doesn't help: it can reappear unless you disable certain experimental settings or remove the browser entirely.
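For readers who want to check whether the file is present on their own machine, here is a minimal sketch. It assumes only the well-known default Chrome user-data locations on Windows, macOS, and Linux, and the 'weights.bin' filename reported by Hanff; the exact subdirectory Chrome uses for the model is not specified in his report, so the script simply scans the whole profile tree.

```python
import os
import platform
from pathlib import Path

def chrome_data_dirs():
    """Default Chrome user-data locations per OS (standard, documented paths)."""
    home = Path.home()
    system = platform.system()
    if system == "Windows":
        base = os.environ.get("LOCALAPPDATA", str(home / "AppData" / "Local"))
        return [Path(base) / "Google" / "Chrome" / "User Data"]
    if system == "Darwin":
        return [home / "Library" / "Application Support" / "Google" / "Chrome"]
    return [home / ".config" / "google-chrome"]

def find_weights(dirs=None):
    """Recursively look for 'weights.bin' and return (path, size_in_bytes) pairs."""
    results = []
    for root in (dirs or chrome_data_dirs()):
        if not root.is_dir():
            continue  # Chrome not installed, or a non-default profile location
        for path in root.rglob("weights.bin"):
            try:
                results.append((path, path.stat().st_size))
            except OSError:
                pass  # file vanished or is locked by the browser
    return results

if __name__ == "__main__":
    hits = find_weights()
    if not hits:
        print("No weights.bin found in default Chrome directories.")
    for path, size in hits:
        print(f"{path}  ({size / 1e9:.2f} GB)")
```

Note that an empty result only means the model is absent from the default locations; a multi-gigabyte hit would be consistent with the behavior Hanff describes.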

This practice raises questions not only from a technical standpoint but also from a legal one. Hanff believes such actions could violate European norms, including requirements for transparency and user consent. That's especially relevant for EU countries, where data processing regulations remain among the strictest in the world.

Then there's the resource issue. For users on limited data plans, a hidden 4GB download could mean real costs, and on weaker devices it adds extra strain on storage and system performance. Even with unlimited internet, doing this without the owner's knowledge feels questionable.

Hanff also highlights the potential scale of the consequences. If such models are mass-deployed onto millions of devices, the energy footprint adds up. He estimates the total carbon emissions could reach levels comparable to those of tens of thousands of cars, though exact numbers depend on many factors.

He links this to a broader industry trend: big tech companies increasingly enable AI features by default without making their operation fully transparent. He cites Claude Desktop as another example where similar hidden system changes have been reported. In his view, devices are increasingly treated as platforms for deploying technology rather than spaces fully controlled by users.

So far, Google has not officially commented. In theory, the company could argue that such downloads are meant to enable local data processing and enhance privacy. But that doesn't address the core question: is it acceptable to install multi-gigabyte components on a device without directly obtaining the owner's consent?