Through a packaging error, Anthropic accidentally published roughly 512,000 lines of internal source code for Claude Code, its AI coding assistant. Within hours, the code was scraped, mirrored, and shared more than 100,000 times.
To be clear, Anthropic did not accidentally open-source Claude itself. The leaked code included neither Claude's model weights nor its training data. Instead, it exposed something arguably more valuable to competitors: the product layer (also known as the wrapper or harness). This is the part that turns Claude, the foundation model, into everyone's favorite coding assistant: workflow orchestration, tool integration, memory handling, and context management.
This wasn’t a security breach or a hack. It was an unforced error: a file accidentally shipped in the public release let anyone reconstruct the underlying codebase. Anthropic fixed the issue quickly and released a new version, but the damage was done.
Security researcher Chaofan Shou first flagged the exposure publicly. The most prominent fork, instructkr/claw-code, has already accumulated more than 99,200 stars and more than 91,000 forks. By the time Anthropic started issuing takedown notices, copies had spread everywhere, including to decentralized sites where, in practice, the code can never be taken down.
Embarrassingly, the exposed code revealed 44 unreleased features, including references to a “KAIROS” background agent capability and internal developer comments about engineering tradeoffs. Competitors building similar products just got a free design review of what is arguably the best AI coding assistant in the world, plus a look at Anthropic’s roadmap.
For Anthropic, this incident is particularly awkward given its positioning as a safety-focused AI company. It’s not existential, but it’s another lesson on the road to AGI.
Every company needs a Claw strategy. Do you have one?
Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.