Image Credits: Matthias Balk/picture alliance / Getty Images
8:02 AM PDT · April 21, 2026
YouTube is expanding its new “likeness detection” technology, which identifies AI-generated content such as deepfakes, to people within the entertainment industry, the company announced on Tuesday.
The technology works similarly to YouTube’s existing Content ID system, which detects copyright-protected material in users’ uploaded videos, allowing rights owners to request removal or to share in the video’s revenue.
Likeness detection does the same, but for simulated faces. The feature is meant to protect creators and other public figures from having their identity used without their permission, something that’s often a problem for celebrities who find their likeness has been hijacked for scam advertisements.
The technology was first made available to a subset of YouTube creators in a pilot program last year before expanding more broadly, including to politicians, government officials, and journalists this spring.
Now, YouTube says the technology is being made available to those in the entertainment industry, including talent agencies, management companies, and the celebrities they represent. The company has support from major agencies like CAA, UTA, WME, and Untitled Management, which offered feedback on the new tool.
Use of the likeness detection tool does not require the entertainer to have their own YouTube channel.
Instead, the feature scans AI-generated content to detect any visual matches of the enrolled participant’s face. Participants can then choose to request removal of the video for privacy policy violations, submit a copyright removal request, or do nothing. YouTube notes that it won’t remove all content, as it permits parody and satire content under its rules.
Further down the road, the technology will support audio as well, the company says.
Relatedly, YouTube has also been advocating for similar protections at a federal level, with its support for the NO FAKES Act in Washington, D.C. The bill would regulate the use of AI to create unauthorized recreations of an individual’s voice and visual likeness.
The company hasn’t yet said how many removals of AI deepfakes have been managed by the tool so far, but noted in March that the number of removals was still “very small.”
Sarah has worked as a reporter for TechCrunch since August 2011. She joined the company after having previously spent over three years at ReadWriteWeb. Prior to her work as a reporter, Sarah worked in I.T. across a number of industries, including banking, retail, and software.
You can contact or verify outreach from Sarah by emailing sarahp@techcrunch.com or via encrypted message at sarahperez.01 on Signal.