A North Carolina folk musician discovered AI-generated songs on her Spotify profile that she never recorded, then faced copyright claims, filed using those same cloned recordings, against her own YouTube videos. The case exposes critical gaps in how streaming platforms verify content and protect independent artists from AI-powered fraud.

What Happened

Murphy Campbell, a folk musician specializing in Appalachian banjo and dulcimer traditions, found two unauthorized tracks on her Spotify artist profile in January 2026. Fans alerted her to the songs, which she had never performed or recorded.

Someone had scraped her performances from YouTube, processed them through commercially available AI voice-cloning tools, and uploaded the synthetic covers to streaming platforms under her actual name. Two AI detection tools flagged one track, "Four Marys," as likely machine-generated.

The situation escalated in March, when unauthorized videos appeared on YouTube under the name "Murphy Rider," distributed through Vydia. Those cloned recordings were then used to file Content ID copyright claims against Campbell's legitimate YouTube performances, threatening her ability to earn from her own work.

Why It Matters

This case illustrates a two-pronged attack that any independent musician could face. First, AI voice clones flood streaming platforms under an artist's real name. Second, those cloned recordings become weapons for copyright claims against the original creator's content.

The problem runs deeper than one artist. AI-driven scams targeting musicians are accelerating across the industry. Sony Music has targeted 135,000 deepfake tracks for removal. Spotify pulled 75 million spammy tracks in September 2025. Yet the platforms still lack front-end verification to confirm that someone claiming an artist's name actually is that artist.

Campbell described the frustration: "I'm in this weird limbo where I'm telling robots to take down music robots made."

Key Details

  • AI-generated tracks were uploaded to Spotify, YouTube Music, and Apple Music under Campbell's name without consent
  • Tracks were eventually removed from YouTube Music and Apple Music, but at least one remained on a duplicate Spotify profile
  • Vydia founder Roy LaManna (now CPO at Gamma) attributed the exploitation to gaps in audio content recognition databases, stating Campbell's recordings were not registered in protective fingerprinting systems
  • Spotify is testing tools to let artists approve songs before they appear on official profiles, but those protections are not yet widely available

What to Do Next

Independent musicians should register their recordings in audio fingerprinting databases (like Content ID and Audible Magic) to create a verification baseline. Artists who discover unauthorized AI clones should report them through each platform's takedown process and document everything for potential legal action.
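To make the "verification baseline" idea concrete, the sketch below shows a simplified, Shazam-style landmark fingerprint: pick the strongest spectral peaks in each time frame, then hash pairs of nearby peaks into compact tokens that can be matched against a suspect upload. Content ID and Audible Magic use their own proprietary systems, so this is only a conceptual illustration, and the file names and parameters are hypothetical.

    # Conceptual sketch of landmark audio fingerprinting (not Content ID's
    # or Audible Magic's actual algorithm). Requires numpy and scipy.
    import hashlib
    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import spectrogram

    def fingerprint(path, peaks_per_frame=2, fan_out=5):
        """Return a set of short hashes derived from spectral peak pairs."""
        rate, samples = wavfile.read(path)        # WAV input, mono or stereo
        if samples.ndim > 1:
            samples = samples.mean(axis=1)        # mix down to mono
        freqs, times, spec = spectrogram(samples, fs=rate, nperseg=2048)

        # Keep the strongest frequency bins in each time frame as landmarks.
        landmarks = []
        for t_idx in range(spec.shape[1]):
            top = np.argsort(spec[:, t_idx])[-peaks_per_frame:]
            landmarks.extend((t_idx, f_idx) for f_idx in top)

        # Hash pairs of nearby landmarks (anchor freq, target freq, time gap)
        # into compact tokens suitable for database lookup.
        hashes = set()
        for i, (t1, f1) in enumerate(landmarks):
            for t2, f2 in landmarks[i + 1 : i + 1 + fan_out]:
                token = f"{f1}|{f2}|{t2 - t1}".encode()
                hashes.add(hashlib.sha1(token).hexdigest()[:16])
        return hashes

    # Hypothetical usage: compare an original recording to a suspect upload.
    # original = fingerprint("four_marys_original.wav")
    # suspect = fingerprint("suspect_upload.wav")
    # overlap = len(original & suspect) / max(len(original), 1)
    # print(f"fingerprint overlap: {overlap:.1%}")

Real fingerprinting services are far more robust to re-encoding, pitch shifts, and noise, but the principle is the same: a recording registered first establishes the reference that later uploads are matched against, which is why early registration gives independent artists leverage.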

The broader AI music industry continues to fracture between platforms that embrace AI generation and those building protections against it. For creators caught in the middle, proactive registration remains the best defense against an attack that exploits the gap between what AI can generate and what platforms can verify.