Google API Keys Weren't Secrets. But then Gemini Changed the Rules.

For years, Google’s documentation told developers that certain API keys were not secrets. If you were embedding Google Maps in a website or wiring up Firebase in a front-end app, you were explicitly shown how to drop an API key directly into your HTML or JavaScript. That was the norm. It was documented. It was encouraged.
That guidance shaped developer behavior for over a decade. Teams built apps, launched products, experimented with side projects, and scattered API keys across public-facing codebases. In many cases, they did exactly what the documentation told them to do.
Then Gemini entered the picture.
Truffle Security published research showing that those same API keys can now be used to access Gemini services inside a Google Cloud project. If the Generative Language API is enabled, an old, unrestricted API key may be able to interact with Gemini. That includes accessing uploaded files, cached prompts, and other data sent to the model.

This represents a systemic shift in how those keys function. Developers created them under one threat model; the platform evolved, and the privileges changed with it.
Retroactive Privilege Expansion in the Wild
Truffle calls it retroactive privilege expansion. Imagine you generated an API key years ago to power a Google Maps integration. At the time, it simply identified your project to a specific service. It was not meant to unlock sensitive data. You embedded it in client-side code because that was considered safe for that use case.
Fast forward to the present: your Google Cloud project evolves, you experiment with Gemini and enable the Generative Language API, and you start uploading documents for summarization or embedding AI features into your product.
That same API key, if left unrestricted, may now have access to Gemini endpoints.
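One quick way to see whether a key falls into this category is to probe the Generative Language API directly. This is a sketch, not an official test procedure; `GOOGLE_API_KEY` is a placeholder for a key you control, and it requires the key's project to have the API enabled.

```shell
# Ask the Generative Language API to list models using only the key.
# A successful JSON model list means the key can reach Gemini endpoints;
# a PERMISSION_DENIED error means it is blocked or restricted.
curl -s "https://generativelanguage.googleapis.com/v1beta/models?key=${GOOGLE_API_KEY}"
```

Run this only against keys in projects you own; probing someone else's exposed key is unauthorized access.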
Truffle scanned the internet and found thousands of publicly exposed API keys that could access Gemini. In some cases, researchers were able to retrieve data from Gemini environments using nothing more than a publicly visible key and a simple request. Even more striking, they reportedly identified similar exposure patterns within Google’s own projects.
The risk is not limited to data leakage. Gemini usage costs money. Many teams are already running into token limits or high AI spend. A public key with Gemini access can be abused for free model usage, image generation, or code generation, with the bill landing on the project owner.
Developers did not necessarily make a mistake. Many followed the documentation exactly. The problem is that platform capabilities expanded while old credentials remained in place.
What You Need to Do Right Now
Google has outlined plans to improve defaults and notify affected users. They intend to restrict default API key behavior, block discovered exposed keys, and alert project owners. Those are positive steps, but they do not solve the immediate exposure.
If you have ever generated API keys in Google Cloud, especially from the APIs and Services credentials page, you need to audit your environment.
Start by checking every project in your organization. In the Google Cloud Console, review enabled APIs and look for the Generative Language API. If it is not enabled, you are not affected by this specific Gemini pivot. Even so, it is still worth reviewing key restrictions.
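The enabled-API check above can be scripted with the gcloud CLI rather than clicked through in the console. A minimal sketch, assuming you are authenticated and `PROJECT_ID` is a placeholder for each project you audit:

```shell
# List enabled services in a project, filtering for the
# Generative Language API. Empty output means it is not enabled.
gcloud services list --enabled \
  --project="${PROJECT_ID}" \
  --filter="config.name=generativelanguage.googleapis.com"
```

Looping this over `gcloud projects list` output gives you organization-wide coverage in one pass.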
If it is enabled, move to the credentials section and examine each API key. Look for keys marked as unrestricted. That is the default configuration. Also check whether the Generative Language API appears in the list of allowed services for any key.
Unrestricted keys or keys explicitly allowed to access generative services are your priority.
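The key inspection can also be done from the CLI. A sketch, assuming `PROJECT_ID` and `KEY_ID` are placeholders for your own project and a key ID taken from the list output:

```shell
# Enumerate the API keys in a project.
gcloud services api-keys list --project="${PROJECT_ID}"

# Inspect one key's configuration. If the output shows no
# "restrictions" block, the key is unrestricted (the default).
gcloud services api-keys describe "${KEY_ID}" --project="${PROJECT_ID}"
```

Keys whose restrictions are absent, or whose API targets include the Generative Language API, go to the top of your list.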
Next, determine whether any of those keys have ever been exposed publicly. Check client-side JavaScript, public GitHub repositories, documentation snippets, old test environments, and archived projects. Start with your oldest keys. Those were most likely created under the assumption that they were harmless.
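For the exposure check, TruffleHog's open source scanner can sweep a repository's full history, not just the current tree. A sketch, with the repository URL as a placeholder:

```shell
# Scan an entire git history for leaked credentials, reporting
# only findings TruffleHog could verify against the live service.
trufflehog git https://github.com/your-org/your-repo --only-verified
```

Scanning history matters here: a key deleted from `main` years ago is still retrievable from old commits.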
If you find a key that was exposed and now has Gemini access, rotate it immediately. Then scope the new key tightly so it can only access the specific service it was intended for, such as Maps or Firebase, and nothing else.
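Rotation and scoping can be done in one pass with gcloud. A sketch for a Maps-only web key; the service name, referrer, and IDs are illustrative placeholders to adapt to your own setup:

```shell
# Create a replacement key locked to a single API (Maps JavaScript API)
# and to requests from your own site.
gcloud services api-keys create \
  --project="${PROJECT_ID}" \
  --display-name="maps-web-key" \
  --api-target=service=maps-backend.googleapis.com \
  --allowed-referrers="https://example.com/*"

# After updating your application, delete the exposed key.
gcloud services api-keys delete "${OLD_KEY_ID}" --project="${PROJECT_ID}"
```

Deploy the new key and confirm traffic before deleting the old one, so you do not break production mid-rotation.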
Truffle offers open source tools to scan for exposed secrets, and enterprise-grade options as well. Regardless of tooling, the critical step is visibility. You cannot protect what you have not inventoried.
The bigger lesson is architectural. Cloud platforms evolve and services expand. Permissions that were once narrow can become broad. API keys that seemed low risk can quietly inherit new power.
If you treat every credential as temporary and every default as suspect, you are less likely to be surprised when the rules change.
