AI-generated Songs are Being Uploaded to the Official Profiles of Deceased Musicians Without Permission
Spotify is under fire after AI-produced songs appeared on the official accounts of deceased artists such as Blaze Foley and Guy Clark. These tracks weren’t simply created by artificial intelligence; they were posted as if the artists had written them, without permission from their families, estates, or labels.
One example that caused an uproar was a track called “Together,” which appeared on Blaze Foley’s Spotify account in July 2025. Foley, a country artist who died in 1989, had nothing to do with the song. The singing was mediocre, the songwriting wasn’t in his style, and the AI-generated cover art bore no resemblance to him. Craig McDonald, who helps oversee Foley’s music catalog, called the song “a sloppy bot job” and urged Spotify to implement stronger safeguards to ensure this doesn’t happen again.
The Lack of Checks on Music Platforms is Allowing Fake Content to Slip Through
The fiasco has triggered wider debates about how we handle musical legacies in the digital era. Fans familiar with Foley’s voice and style could immediately tell that the song wasn’t authentic. But newer listeners may not realize they’re hearing something artificial, and that confusion can damage an artist’s reputation and legacy.
What’s even more troubling is that this wasn’t a simple upload error. The song was released through TikTok’s SoundOn platform, a tool that helps creators distribute music to services such as Spotify. When listeners reported it, Spotify took down the track, stating that it violated the platform’s policies on deceptive content. Yet the fact that it got through in the first place points to a serious loophole in the process.
Those who maintain these artist pages are now demanding a policy under which any new track posted to a deceased artist’s page must first be reviewed and approved by the estate. It’s a modest measure, but one that could go a long way toward protecting creative work from misuse.
Musicians and Industry Experts are Calling for Stricter Rules to Protect Artist Legacies from AI Misuse
Platforms such as SoundOn have made it simple for anyone to share songs, including songs generated entirely by artificial intelligence. While that has opened new opportunities for artists, it also carries risks. SoundOn’s terms have reportedly allowed music to be used for AI training, and there are few safeguards to keep fake or deceptive songs from being pushed onto major platforms.
Some solo artists and bands are fighting back in court. Artists such as Anthony Justice are suing AI music tools like Suno and Udio for using their material without permission. They contend that such tools scrape tens of millions of real songs to train AI to imitate them, often targeting lesser-known artists who lack major labels to defend them.
Music promoters are now cautioning musicians to read distribution agreements closely before signing up. Phrases like “machine learning” or “data analysis” in the fine print can be a warning that your songs will be used to train the very tools that could displace you.