Senate Bill Would Guard Artists’ Work Against AI Training

Key Takeaways

  • A Senate bill would block companies from training AI models on content without permission.
  • AI creators would have to make tools that protect content against training.
  • NIST would also have to set standards for the approach.

A newly introduced, bipartisan US Senate bill could make it illegal to train AI models on artists’ work without their permission.

The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) would give companies two years to develop tools that let creators attach content provenance and watermark data to their work. Training AI on content carrying that data would then violate the law.

The bill would give artists, journalists, musicians, and other creatives the ability both to protect against unwanted AI uses and to sue those who ignore the rules. The National Institute of Standards and Technology (NIST) would develop standards for the content protection measures.

The legislation is the work of Senate Commerce Committee chair Maria Cantwell (D-WA), fellow committee member Marsha Blackburn (R-TN), and AI Working Group member Martin Heinrich (D-NM). Numerous industry organizations and publications endorse the measure, including Hollywood’s SAG-AFTRA, the Recording Industry Association of America (RIAA), and the News/Media Alliance.

This isn’t the first bill meant to curb abuse of AI in the creative space. Senator Ted Cruz (R-TX) put forward the Take It Down Act in June to require that social media companies remove deepfake porn. Politicians at both the federal and state levels have introduced other AI-related bills or called for stricter regulation.

President Biden also ordered the development of AI safety and security standards in October. That order requires AI platform developers to share key details, such as test results, before launching their technology.

However, the COPIED Act is notable as the first federal legislation that would restrict how AI producers train their models. Developers might be limited to smaller datasets consisting of unwatermarked and public domain material. While this could affect the quality of AI-generated media, it could also prevent the easy theft of copyrighted work.