AI Voice Cloning Danger

Generative AI has presented the music industry with new challenges. You’ve probably heard about the AI-generated song featuring cloned vocals of Drake and The Weeknd. It went viral, but was taken down after Universal Music Group (UMG) claimed copyright violations. (Whether copyright law actually applies here is an open question.)

This morning on Good Day New York, I cited some other examples and also talked about the potential dark side of voice cloning.

I’m going to dive deeper into voice cloning and generative AI music production tomorrow at 12:15 p.m. ET on YouTube Live. I’m also going to talk about Senator Markey’s proposed “Block Nuclear Launch by Autonomous Artificial Intelligence Act” legislation and the concept of “default to distrust.” We’ll also take a tour of our new, free AI resource page. Get notified here.

P.S. We’ve added two new sections to Generative AI for Executives, our free online course: ChatGPT Security & Privacy Concerns, and Autonomous Agents. Enroll today.

Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it.

About Shelly Palmer

Shelly Palmer is the Professor of Advanced Media in Residence at Syracuse University’s S.I. Newhouse School of Public Communications and CEO of The Palmer Group, a consulting practice that helps Fortune 500 companies with technology, media and marketing. Named LinkedIn’s “Top Voice in Technology,” he covers tech and business for Good Day New York, is a regular commentator on CNN and writes a popular daily business blog. He's a bestselling author, and the creator of the popular, free online course, Generative AI for Execs. Follow @shellypalmer or visit


Get Briefed Every Day!

Subscribe to my daily newsletter featuring current events and the top stories in technology, media, and marketing.