lamhieu posted an update 11 days ago
Copilot, Cursor, and the wave of new AI-driven extensions are rising fast: they can read your codebase, optimize logic, and even automate whole workflows. But here's the flip side: once these tools start peeking into .env files, nodes, and source code, the risk of leaking critical keys or sensitive information becomes very real.

So yeah, we’re coding faster than ever, but are we also opening the door to a whole new wave of vulnerabilities?

👉 Is this the hidden threat waiting to hit the next era of “vibe-coding”?
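One concrete mitigation worth pairing with that question: keep secret-bearing files out of the assistant's context in the first place, and verify that they actually are. Below is a minimal illustrative sketch in Python, not any tool's official API. It assumes Cursor honors a .gitignore-style .cursorignore file, and the pattern lists are placeholders you'd adapt to your own setup.

```python
import pathlib

# Files that commonly hold secrets; extend for your stack.
SENSITIVE_PATTERNS = [".env", ".env.*", "*.pem", "*credentials*"]

# Exclusion files an AI extension may honor. .cursorignore is Cursor's
# .gitignore-style exclusion file; .gitignore is listed because some
# assistants skip git-ignored paths. Adjust for the tools you actually use.
IGNORE_FILES = [".cursorignore", ".gitignore"]

def excluded_patterns(root: pathlib.Path) -> set[str]:
    """Collect non-comment lines from whatever ignore files exist."""
    patterns: set[str] = set()
    for name in IGNORE_FILES:
        path = root / name
        if path.is_file():
            for line in path.read_text().splitlines():
                line = line.strip()
                if line and not line.startswith("#"):
                    patterns.add(line)
    return patterns

def main() -> None:
    root = pathlib.Path(".")
    patterns = excluded_patterns(root)
    for pattern in SENSITIVE_PATTERNS:
        for hit in root.rglob(pattern):
            # Crude name-level check: a real implementation would apply
            # full .gitignore matching semantics (e.g. via pathspec).
            if hit.name not in patterns and pattern not in patterns:
                print(f"WARNING: {hit} is not excluded and may be "
                      f"visible to AI extensions")

if __name__ == "__main__":
    main()
```

Run it from the repo root before enabling workspace indexing; anything it flags should be added to the relevant exclusion file or moved out of the workspace entirely.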


People who handle such sensitive material should do the due diligence of paying for a model that fits their privacy needs.
Using some shady free API to build that kind of thing is a skill issue.


At the company where I work, we always use paid tools such as Copilot, the OpenAI APIs, Gemini, and Vertex to build features and supporting tooling. But with Copilot, for example, it's very difficult to verify that they don't use enterprise data for training. Of course, we always see those "terms" stated, but you know, the data passes through so many layers.
