Spotify's new rules target AI voice clones, music spam
Spotify just rolled out new rules to fight back against AI misuse in music.
The focus is on stopping unauthorized AI voice clones and music spam, plus making it easier for artists to flag impersonation and unauthorized uploads before songs go live.
Major artist distributors are now in the loop
Spotify's working closely with major artist distributors to catch profile mismatches and unauthorized uploads early.
They're also adopting a new industry standard for disclosing how AI was used in a track, whether for vocals or instrumentation.
A smarter spam filter is coming soon too, aimed at blocking mass uploads and shady SEO tricks.
Over the past year, Spotify says it has already removed 75 million spam tracks, a sign it's serious about keeping things fair for real artists.