Universal Music’s recent lawsuit against AI startup Anthropic underscores a pressing issue at the intersection of AI and copyright law. The suit alleges that Anthropic’s AI chatbot Claude reproduces copyrighted song lyrics almost verbatim, infringing on the rights of music publishers like Concord and Universal.
This isn’t just about song lyrics; it’s emblematic of a broader challenge. As AI models become more sophisticated, their ability to generate content that mirrors original work intensifies. Claude, for instance, can summarize documents of roughly 75,000 words in a single prompt, dwarfing the capacity of contemporaries like OpenAI’s ChatGPT.
The lawsuit draws parallels between AI and past technological advancements, from the printing press to web crawlers, emphasizing that each innovation brought with it a responsibility to respect existing laws. The message is clear: AI’s potential shouldn’t exempt it from legal and ethical considerations.
I just typed “‘I Will Survive’ lyrics” into the Google search bar and got the lyrics perfectly displayed. Google isn’t paying anyone for surfacing the lyrics, and it was proud to tell me that it found “About 1,670,000 results (0.35 seconds),” almost all of which were links to sites displaying the lyrics perfectly. Considering what you can find with a simple search, AI being able to surface a close approximation of song lyrics would seem to be the least of Universal’s copyright issues… but that’s not their point.
Recorded music companies look for revenue wherever it can be found, and they are quite litigious. It was the RIAA’s lack of vision and technological ineptitude that kept the music industry out of online revenue for 20 years. Let’s hope history does not repeat itself.
Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various AI models, including but not limited to: GPT-4, Bard, Claude, Midjourney, Stable Diffusion, and others.