California Approves Bill Requiring Consent for AI Replicas of Deceased Actors

Key Takeaways

  • California's Senate approved AB 1836, which requires consent for the use of deceased performers' likenesses in AI-generated media.
  • If signed by Governor Newsom, the bill will require permission from the performer's estate.
  • SAG-AFTRA backs the bill, linking it to recent contract agreements for TV and film.

California’s Senate has approved a bill requiring consent from deceased performers’ estates before their likenesses can be used in AI-generated media.

AB 1836 passed on August 31. If Governor Gavin Newsom signs it into law, anyone wanting to use a deceased performer’s likeness in media will have to obtain permission from their estate.

This bill follows California’s earlier approval of AB 2602, which sets similar consent requirements for living actors.

SAG-AFTRA, which represents about 160,000 entertainment and media professionals, strongly supported the legislation. The union praised the Senate’s approval of AB 1836 and hopes Governor Newsom will sign the bill, noting that these measures align with the protections established in its TV and film contract with major studios after last year’s four-month strike.

Lawmakers Grapple With Unauthorized Voice and Likeness Replicas

Recently, SAG-AFTRA and AI startup Narrativ agreed to create a marketplace where actors can license their voices for AI-generated ads, setting their own fees and controlling how the recordings are used. The deal has sparked debate within the industry. In July, ElevenLabs secured agreements with the estates of icons such as James Dean and Judy Garland to use their voices in its new Reader App.

On July 26, SAG-AFTRA also launched a strike over the use of actors’ images and voices in video games, citing concerns that AI could replace performers and cost them jobs.

The rise of deepfakes, which surged tenfold from 2022 to 2023 with over 2 million identity fraud attempts reported, also underscores the need for regulation. In response, U.S. senators introduced the NO FAKES Act on July 31, which would hold people and companies liable for producing unauthorized digital replicas of individuals’ voices and likenesses. OpenAI supports the legislation, which would allow individuals to seek damages for unauthorized AI reproductions.