Starting today, creators participating in the YouTube Partner Program gain access to a new AI detection feature that allows them to find and report unauthorized videos that use their likeness. Once their identity is verified, creators can view flagged videos in the Content Detection tab in YouTube Studio. If a video appears to be unauthorized AI-generated content, they can submit a takedown request.
The first wave of eligible creators was notified via email this morning, and the feature will roll out to more creators over the next few months. In its guide to the feature, YouTube warned early users that, at its current stage of development, “it can display videos of your real face, rather than altered or synthetic versions,” such as clips containing the creator’s own content. The tool works similarly to Content ID, which YouTube uses to detect copyrighted audio and video content.
YouTube originally announced the feature last year and began testing it in December through a pilot program with talent represented by Creative Artists Agency (CAA). A YouTube blog post at the time said: “Through this collaboration, some of the world’s most influential figures will have access to early-stage technology designed to identify and manage AI-generated content on YouTube at scale.”
YouTube and Google are among a number of tech companies offering AI-powered video generation and editing tools, and the likeness detection tool isn’t the only feature they’re developing to handle AI-generated content on the platform. Last March, YouTube began requiring creators to label videos containing content generated or altered with AI, and it announced strict rules on AI-generated music “that mimics the unique voice of a singing or rapping artist.”
