In May 2025, musician and AI ethicist Ed Newton-Rex sparked a wave of concern among artists when he highlighted a quietly updated section in SoundCloud’s Terms of Use. The clause states that uploaded content “may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services.”
The language is broad – and alarming for artists worried about their music being repurposed without consent. Even more concerning: there’s currently no explicit opt-out mechanism, and no binding commitment that content won’t be used for AI training in the future.
🧾 SoundCloud’s Response
In response to the criticism, SoundCloud provided clarifications to Pitchfork, stating:
- It does not currently use artist-uploaded music to train AI models.
- It does not allow third parties to scrape or use content from the platform for AI model training.
- The clause was added to clarify how user content may interact with internal AI-driven tools, such as fraud detection and music recommendation – not to enable the creation of generative AI music.
SoundCloud also told Pitchfork that if it ever considers using content for generative AI training, it would be “designed to support human artists” and would include a “clear opt-out mechanism.”
But for many artists, the issue isn’t just about future assurances – it’s about terms that already give SoundCloud broad latitude today.
⚖️ The Bigger Picture: Inputs, Consent, and Control
The SoundCloud episode underscores a wider problem in today’s digital economy: platforms are redefining the boundaries of user consent through vague or preemptive AI licensing terms. Creators upload music to share, connect, and monetize – but they’re also feeding complex AI systems with valuable inputs.
With no easy opt-out, no upfront notification, and no granular control over how their works are used, artists are placed in a vulnerable position.
💡 Why This Matters
As generative AI continues to evolve, these kinds of terms set a precedent. They raise serious questions about:
- Transparency in platform governance
- Consent in data and content usage
- Fair compensation for creative labor
- Ownership rights in a machine-learning-driven world
Musicians – and all creators – shouldn’t have to read legal fine print to protect their work from being fed into opaque AI systems.
📢 Contact a Galkin Law attorney to discuss your AI issues – www.galkinlaw.com
#SoundCloudPolicy #AIandMusic #MusicianRights #GenerativeAI #ContentGovernance