Bennet, Colleagues Urge FCC to Require AI-Generated Content Disclosure in Political Ads on Radio and TV

Washington, D.C. — Colorado U.S. Senator Michael Bennet, a member of the Senate Committee on Rules and Administration with oversight over federal elections, alongside U.S. Senator Ben Ray Luján (D-N.M.) and six of their Senate colleagues urged the Federal Communications Commission (FCC) to adopt a proposed rule requiring the disclosure of AI-generated content in political ads on radio and TV.

“We recognize that the use of AI-generated content has many benefits. But like any new technology, AI poses risks to society, risks that are even more pronounced in the context of elections. The use of AI-generated content has the potential to amplify mis- and disinformation, incite political violence, and suppress voter participation,” wrote the senators.

“[F]oreign actors may use deceptive AI to sow discord and undermine our democracy and faith in elections,” continued the senators. “[A]s AI-generated content becomes more and more advanced, voters may find it difficult to recognize video, images, audio, and text as fake. For this reason, we believe it is imperative that robust transparency and disclosure requirements are in place as soon as possible.”

In 2022, Bennet was the first senator to propose creating an expert federal body to regulate digital platforms with his Digital Platform Commission Act. In June 2023, Bennet called on major technology companies to identify and label AI-generated content, and introduced the Global Technology Leadership Act to bolster the government’s ability to assess U.S. capacity in emerging technologies relative to other countries. Bennet also introduced the Oversee Emerging Technology Act and the ASSESS AI Act to ensure government use of AI complies with fundamental rights, and joined his colleagues to introduce the REAL Political Ads Act to require a disclaimer on political ads for federal campaigns that use content generated by AI.

In addition to Bennet and Luján, U.S. Senators Angus King (I-Maine), Amy Klobuchar (D-Minn.), Chris Van Hollen (D-Md.), Raphael Warnock (D-Ga.), Peter Welch (D-Vt.), and Cory Booker (D-N.J.) also signed the letter.

The text of the letter is available HERE and below.

We write to express our support for the Federal Communications Commission’s (FCC) proposal to require disclosure of the use of AI-generated content in political ads on radio and TV. While more must be done to address the risks that AI poses to our elections, we urge the FCC to adopt these rules as the 2024 presidential election is less than two months away and, in some states, voters can begin casting ballots as early as this month.

We recognize that the use of AI-generated content has many benefits. But like any new technology, AI poses risks to society, risks that are even more pronounced in the context of elections. The use of AI-generated content has the potential to amplify mis- and disinformation, incite political violence, and suppress voter participation. In addition, foreign actors may use deceptive AI to sow discord and undermine our democracy and faith in elections. Lastly, as AI-generated content becomes more and more advanced, voters may find it difficult to recognize video, images, audio, and text as fake. For this reason, we believe it is imperative that robust transparency and disclosure requirements are in place as soon as possible.

In addition, we support the following specific provisions of the proposed rules. First, we support on-air and written disclosure requirements. Such requirements are the most straightforward way to ensure that the public is notified of the use of AI-generated content in the advertisement they are viewing and/or hearing. Second, we support the application of transparency and disclosure requirements to both candidate and issue advertisements. This will ensure that both types of political ads are subject to the same standards. Next, we support applying the transparency and disclosure requirements to broadcasters as well as other entities under the FCC’s jurisdiction. Again, this will ensure a more level playing field across mediums. Additionally, we urge the FCC to include an updated definition of “AI-generated content” to clarify that long-standing, basic editing tools are not considered covered content. This will ensure that basic audio and video accessibility and editing tools are not negatively impacted by this necessary rulemaking on artificial intelligence. Lastly, we support a requirement that these rules take effect 90 days prior to an election as well as during the election certification process.

We urge the Commission to finalize and implement these rules as soon as possible. Thank you in advance for your attention to this important issue.