AI companies are afraid of data leaks, but this time one of them leaked first.

The incident currently going viral on X and GitHub reads almost like a pre-written script, perfectly showcasing the fragility of the entire AI industry.

It went like this:

— Anthropic accidentally published an update to npm
— The package contained a roughly 60MB debug file
— Inside was a snippet of Claude’s internal code
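This kind of leak is easy to reproduce in general: by default, `npm publish` bundles nearly everything in the project directory into the tarball, including stray debug artifacts. A minimal sketch of how to catch it, using standard npm commands (this illustrates general npm behavior, not Anthropic's actual build setup, which is not public):

```shell
# Preview exactly which files `npm publish` would ship,
# without actually publishing anything.
npm pack --dry-run

# To prevent accidental inclusion of debug dumps, sourcemaps, etc.,
# declare an explicit allowlist in package.json:
#
#   "files": ["dist/", "README.md"]
#
# Anything outside the allowlist is excluded from the published tarball.
```

The `files` allowlist is generally safer than a `.npmignore` denylist, because a denylist silently ships any new artifact nobody thought to exclude.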

Then the plot thickened.

Twenty-three minutes later, a researcher (Chaofan Shou) found it, downloaded it, and made it public.
Within a few hours: millions of views.

By the time the Anthropic team reacted, this code had already:
— gone viral on GitHub
— been forked tens of thousands of times
— started to develop a life of its own

A classic Web2-world mistake…
Happening in an era where information spreads like a virus.

Now pay attention to the second part.

A developer appeared: Sigrid Jin.

He:
— analyzed the leaked code
— rewrote it into a Python version in 8 hours
— garnered tens of thousands of stars on GitHub
— and then rewrote it again in Rust

Done.
The product effectively got an open-source clone.

But here’s the truly important part.

In the code, they found a system called “Undercover Mode.”

Its job:
preventing the model from leaking internal data.

In other words:

A company that makes leak-prevention protection…
leaked its own code through a debug file.

This isn’t just satire.
It’s a signal.

What does that mean?

1. AI companies aren’t as “closed” as they look
One small mistake → the inner workings are exposed immediately.
2. The moat around AI is far more fragile than investors think
If a product can be copied quickly → value shifts:
— from models → to data
— from data → to distribution
3. The speed of copying = a new reality
It used to take years,
but now it only takes a few hours.

What are the consequences?

— Profit margins for AI companies get squeezed
— The open-source movement accelerates
— “secret technologies” get devalued

And most importantly:

The market is starting to realize that value isn’t in the code; it’s in the ecosystem and user reach.

We’re moving toward a world like this:

— Models become commodity goods
— The winner isn’t the one with the “strongest AI”
— but the one with the strongest distribution, products, and funding

Today’s AI brings to mind the early days of the internet.
Back then, everyone thought the technology itself would win.

But in the end, the ones who win are
those who control attention.