The music sharing platform SoundCloud says it has “never used artists’ content to train AI models” and that it is “making a formal commitment that any use of AI on SoundCloud will be based on consent, transparency, and artist control.” The update comes days after artists reported that changes introduced last year to the terms of use may have meant the company reserved the right to use their music and other content to train generative AI tools.
“In the absence of a separate agreement that states otherwise, you explicitly agree that your content may be used to inform, train, develop, or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services.”
But Seton says that “in the coming weeks” this line will be replaced with:
We will not use your content to train generative AI models that aim to replicate or synthesize your voice, music, or likeness without your explicit consent, which would have to be affirmed through an opt-in mechanism.
Seton reiterates that SoundCloud has never used members’ content to train artificial intelligence, including large language models, to create music or to mimic or replace members’ work. And echoing what a SoundCloud spokesperson told The Verge in an email over the weekend, Seton says that if the company does use generative AI, it “may make this opportunity available to our human artists with their explicit consent, via an opt-in mechanism.”
Ed Newton-Rex, a technology ethicist who first spotted the change, is not satisfied with the revisions. In a post on X, he says the improved language could still allow “models trained on your work that might not directly replicate your style but still compete with you in the market.” According to Newton-Rex: “If they actually want to address people’s concerns, the change required is simple. It should just read: ‘We will not use your content to train generative AI models without your explicit consent.’”
SoundCloud did not immediately respond to The Verge’s request for comment.