'Spineless': Democracy Watchdog Condemns FEC Refusal to Act on Election Deepfakes
"We need a clear FEC rule in place to deter fast-proliferating political deepfakes, which threaten electoral integrity and people's basic faith that what they see and hear is real—but the agency has utterly failed to deliver."
The Federal Election Commission on Thursday voted to forgo new rulemaking on the use of artificial intelligence in U.S. political campaign advertising, drawing sharp criticism from a watchdog group that said deepfakes threaten electoral integrity.
Public Citizen, the watchdog group, had last year petitioned the FEC to issue regulations clarifying that the use of deepfakes in political ads is illegal. The commission on Thursday formally declined to do so and instead voted in favor of an anodyne "compromise" rule stating that artificial intelligence is subject to existing regulations.
Robert Weissman, co-president of Public Citizen, said "compromise" was a "misnomer" and the FEC's position was in fact "compromised."
"We need a clear FEC rule in place to deter fast-proliferating political deepfakes, which threaten electoral integrity and people's basic faith that what they see and hear is real—but the agency has utterly failed to deliver," he said in a statement.
Fellow co-president Lisa Gilbert agreed, saying that "the threat of deepfakes is staring us in the face and unfortunately our elections agency has chosen to look the other way," and calling the decision "spineless and shameful."
The FEC has six commissioners, three from each major party, with a rotating chairmanship. Democrats have criticized the structure in recent years, arguing that Republican commissioners block meaningful regulations (four votes are needed to pass any rule) and have rendered the FEC toothless. They argue that a strong FEC is more necessary than ever given the massive increase in spending on U.S. elections since the Citizens United ruling in 2010.
For Public Citizen's petition, however, the problem was not just the Republican commissioners. Two Democratic commissioners, Dara Lindenbaum and Shana Broussard, declined to support the petition and instead helped craft the interpretive rule.
Democratic Commissioner Ellen Weintraub, the current vice chair, supported the petition and has commended an ongoing effort by the Federal Communications Commission to regulate AI use in political advertising. The FCC has proposed requiring a disclosure when AI has been used in the making of an ad, drawing praise from watchdog groups such as Public Citizen. The two federal agencies have sparred over the FCC's proposal.
Republican FEC members spoke out strongly against the Public Citizen petition at an open meeting Thursday, arguing that the commission had neither the authority nor the expertise to regulate an emerging technology. Current Chair Sean Cooksey published an op-ed in The Wall Street Journal last month titled "The FEC Has No Business Regulating AI." He issued a 10-page statement on his opposition to the petition on Thursday.
Whether Congress has granted the FEC such authority is a matter of interpreting a law that dates to the 1970s. Even if it has not, Congress could do so, as there is some level of bipartisan support for legislation on deepfakes. Multiple bipartisan bills have been introduced to prevent the use of AI in political ads, including one brought forth this week by Reps. Adam Schiff (D-Calif.) and Brian Fitzpatrick (R-Pa.), among others.
Schiff told The Associated Press the bill was "modest" and "really probably the lowest hanging fruit there is" in addressing AI misuse in politics. He and Fitzpatrick acknowledged their bill was a long shot but said they would try to attach it to must-pass legislation later in the year.