Musician Alleges AI Company Is Cloning Her Music and Filing Copyright Claims Against Her
When AI Steals Your Voice — Then Claims It Owns You
A musician has publicly accused an AI company of cloning her musical style and then using automated systems to file copyright claims against her own original work. The case highlights a growing legal and ethical crisis at the intersection of AI-generated content and intellectual property law.
The Allegation
The musician, who goes by @unlimited_ls on social media, claims that an unnamed AI company:
- Trained models on her music without authorization
- Generated AI tracks that closely mimic her style
- Filed copyright claims against HER original compositions, claiming the AI versions came first
- Created a legal nightmare where she must prove her own music predates the AI copies
Why This Is Especially Dangerous
This scenario represents a new and particularly insidious form of AI abuse:
- Reverse plagiarism: Instead of copying a human's work, the AI generates similar content and claims priority
- Automated legal weaponization: AI-generated copyright claims could be filed at scale, overwhelming individual creators
- Burden of proof inversion: Creators must now prove they created their own work before AI copied them
Industry Response
The music industry has been among the sectors most affected by generative AI, with several high-profile developments in 2025-2026:
- Major labels suing AI training platforms
- Voice cloning lawsuits from artists like Drake and Billie Eilish
- Streaming platforms implementing AI detection systems
Legal Framework Gap
Current copyright law was not designed with AI-generated content in mind. Key unresolved questions include:
- Can AI-generated works be copyrighted?
- Who owns the output of an AI trained on copyrighted material?
- How do creators establish priority when AI can generate thousands of variations instantly?
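On the priority question, one defensive practice available to creators today (not mentioned in the source, but commonly suggested in provenance discussions) is to record a cryptographic fingerprint of each finished work as soon as it exists, so its existence at a given date can later be demonstrated. Below is a minimal Python sketch of the idea; the file contents and record fields are illustrative assumptions, and in practice the hash would be anchored with a third-party timestamping service or registry rather than kept locally.

```python
import hashlib
from datetime import datetime, timezone

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest uniquely identifying this exact recording.

    Any change to the audio bytes produces a completely different digest,
    so a matching digest is strong evidence of the same file.
    """
    return hashlib.sha256(data).hexdigest()

# Stand-in for the bytes of a finished master recording (assumption for demo).
master = b"placeholder bytes standing in for a finished master recording"

digest = fingerprint(master)

# A provenance record the creator could publish or submit to a timestamp
# service the day the track is finished; the fields here are illustrative.
record = {
    "work": "untitled_master.wav",  # hypothetical filename
    "sha256": digest,
    "noted_at": datetime.now(timezone.utc).isoformat(),
}
print(record)
```

The digest alone proves nothing about the date; the evidentiary value comes from lodging it somewhere whose timestamps a court would trust, which is exactly the infrastructure gap the article describes.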
This case could set important precedents for how the legal system handles the collision between AI content generation and human creativity.
Source: @unlimited_ls on X