AI Cloned Her Voice, Then Claimed Her Songs

Murphy Campbell

Murphy Campbell is a folk singer from the mountains of western North Carolina. She plays banjo and dulcimer, records old Appalachian ballads (some written by distant relatives), and posts videos of herself performing in the woods. She has about 7,800 monthly listeners on Spotify. She is exactly the kind of artist the music industry’s copyright infrastructure was designed to protect.

A few weeks ago, her fans started messaging her about two new AI-generated tracks that appeared on her streaming pages: synthetic versions of “The Four Marys” and “Cuba” (songs she had previously recorded). The AI had been trained on her YouTube videos. The dulcimer sounded like what she described as a “warbled, metallic mess,” and her voice had been deepened and Auto-Tuned into something she called a “bro-country singer.” The songs were bad AI imitations. But, as far as Spotify was concerned, they were officially hers.

Then it got worse.

According to Murphy’s Instagram reel, Vydia, the distributor used to upload the AI-generated tracks, filed copyright claims against Campbell’s original YouTube videos, the very videos the AI had been trained on. Because YouTube does not have humans review initial claims like these, Campbell stopped making money on her actual content. Vydia was now earning revenue from her original videos of her playing music in her own backyard. Someone had cloned her voice, uploaded the fakes through a legitimate distributor, and then used the music industry’s own copyright enforcement system to claim ownership of the originals.

Campbell described herself as being “in this weird limbo where I’m telling robots to take down music robots made.” After her story went viral on Instagram, Vydia withdrew every single one of their copyright claims. The entity that uploaded the fake tracks, called Timeless Sounds IR, has never been identified.

For full transparency: I reached out to Murphy Campbell via DM on Instagram; she did not respond. While I was unable to speak directly to Ms. Campbell, I did speak to two other YouTubers (off the record) who have had similar experiences.

I wanted additional sources because this reverse copyright scam not only works; it is more common than I would have believed. The scam works because the systems built to protect artists operate on the assumption that the first entity to register a song is the rightful owner. That assumption held when the bottleneck was human creativity. It breaks completely when AI can generate a synthetic version of any song, by any artist, in seconds.

Campbell is one of several folk and Americana artists targeted this way. Rolling Stone’s Jon Blistein documented cases involving Paul Bender, Veronica Swift, and Grace Mitchell, among others. Someone uploaded a fake AI track called “Together” to Blaze Foley’s Spotify page. The country-folk legend was murdered in 1989. He’s been dead for 37 years.

The Scale Problem

The Campbell story is about impersonation. The Michael Smith story is about scale.

Smith is a 54-year-old producer from Cornelius, North Carolina, who made his money fixing Y2K bugs in the 1990s, built and sold three medical practice chains, and co-founded a record label that worked with Snoop Dogg and DJ Khaled. Starting in 2017, he created roughly 10,000 bot accounts across Spotify, Apple Music, Amazon Music, and YouTube Music. He partnered with the CEO of Boomy, an AI music generation platform, to produce thousands of songs per week. The songs had names like “Zygotic Washstands” and were credited to fictional artists called “Callous Post,” “Calorie Screams,” and “Calvinistic Dust.”

At peak, Smith was generating 661,440 fake streams per day and earning about $110,000 a month. He collected more than $8 million in fraudulent royalties before being arrested in September 2024. On March 19, 2026, he pleaded guilty to conspiracy to commit wire fraud, the first criminal prosecution for AI-assisted streaming fraud in the United States. He has agreed to forfeit $8,091,843.64. Sentencing is scheduled for sometime in July.
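The reported figures roughly check out. Here is a quick sanity check in Python; note that the per-stream royalty rate is my assumption (commonly cited estimates for major platforms fall around $0.003 to $0.006 per stream), not a number from the court filing:

```python
# Sanity check: does 661,440 fake streams/day plausibly yield ~$110,000/month?
STREAMS_PER_DAY = 661_440      # figure reported in the case, per the article
PER_STREAM_RATE = 0.0055       # assumed blended royalty rate in USD (my estimate)
DAYS_PER_MONTH = 30

monthly_revenue = STREAMS_PER_DAY * PER_STREAM_RATE * DAYS_PER_MONTH
print(f"~${monthly_revenue:,.0f} per month")  # lands near the reported $110,000
```

At that assumed rate the math works out to roughly $109,000 a month, which lines up with the reported figure; even at the low end of the rate range, the scheme clears tens of thousands of dollars monthly.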

Smith’s co-conspirators, the Boomy CEO and the founder of a promotional platform called Indiehitmaker, have not been charged.

Two Failure Modes, One System

These two cases illustrate different failure modes in the same system. Campbell’s case shows that copyright enforcement tools can be weaponized against the people they’re supposed to protect. Smith’s case shows that streaming platforms built for human-scale music production cannot distinguish between real listening and automated fraud when AI removes the production bottleneck entirely.

The Response

The music industry is responding. Sony has removed more than 135,000 deepfake tracks. The Music Fights Fraud alliance now includes 28 organizations (Spotify, Amazon, YouTube, and, notably, Vydia). The NO FAKES Act has bipartisan support in Congress. Deezer, which reports 60,000 AI-generated songs uploaded to its platform every day, estimates that up to 85 percent of streams on those tracks are fraudulent. The FBI’s Nashville office has issued a public warning about criminal activity targeting the industry.

These are important steps, but they address symptoms. The structural problem is that copyright registration, Content ID, and streaming royalty distribution all assume a world where creating music requires meaningful human effort, where uploading music implies some claim of authorship, and where a stream represents a person choosing to listen. AI has invalidated all three assumptions simultaneously.

I have spent my entire career at the intersection of technology and music production. I watched the music industry fight digital distribution for a decade and lose. I watched every legacy system resist the transition, adapt reluctantly, and eventually rebuild from first principles. Every time, the initial response was to enforce the old rules harder. Every time, the answer was to build new systems that accounted for the new reality.

A New System for a New World

The copyright system we have was brilliant for the world it was designed for. We now need a copyright system designed for the world we actually live in. Murphy Campbell, sitting in the Appalachian woods with her banjo, should not have to explain to a robot that she is the person who wrote her own songs. The fact that she does says a lot about where we are in this process.

Every company needs a Claw strategy. Do you have one?

Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.

About Shelly Palmer

Shelly Palmer is the Professor of Advanced Media in Residence at Syracuse University’s S.I. Newhouse School of Public Communications and CEO of The Palmer Group, a consulting practice that helps Fortune 500 companies with technology, media and marketing. Named LinkedIn’s “Top Voice in Technology,” he covers tech and business for Good Day New York, is a regular commentator on CNN and writes a popular daily business blog. He's a bestselling author, and the creator of the popular, free online course, Generative AI for Execs. Follow @shellypalmer or visit shellypalmer.com.

