A new public service marketing campaign featuring the actor Rosario Dawson and other Hollywood stars aims to warn Americans not to be duped by AI-generated deepfakes designed to mislead them about when, where and how to vote on Election Day.
“If something seems off, it probably is,” Dawson warns in the video spot, shared exclusively with NBC News.
Other celebrities featured in the video include Chris Rock, Laura Dern, Michael Douglas, Amy Schumer and Jonathan Scott, delivering the message that Americans should rely on state secretaries of state for information about voting in the 2024 election and not fall for unverified claims about alleged changes at polling stations.
The celebrities say Americans may receive a fake message claiming voting has been extended, or that a polling location has closed or moved due to an emergency, or that new documentation is required to vote. “These are all scams designed to trick you into not voting. Don’t fall for it,” the celebrities say.
At the end, the video reveals that some of the Hollywood stars are themselves deepfakes, with their voices and images superimposed on other actors.
The public service announcement, organized by the nonpartisan group SignifyUs and set to appear on YouTube, comes amid growing concern that artificial intelligence technology could be used to confuse Americans about the time, place or manner of voting at their local polling locations.
False information and other dirty tricks aimed at discouraging people from going to the polls are nothing new. But increasingly advanced AI tools could make it easier to confuse and deceive voters with video and audio that look and sound believable, experts say.
“We’re not going to stop this from coming into existence,” said Miles Taylor, one of the organizers of the campaign. “But what we can do is make people aware that this is the new spam, that this is going to be the kind of thing they see all the time online that tries to deceive them, and to make sure that they don’t fall for that deception, especially in a critical period of democratic transition.”
In January, AI-generated deepfake robocalls mimicking President Joe Biden’s voice urged voters to stay home and not take part in the New Hampshire Democratic primary. And last month, a deepfake caller posing as Ukraine’s former foreign minister held a Zoom meeting with the chairman of the Senate Foreign Relations Committee, Democratic Sen. Ben Cardin of Maryland.
If the technology “can be deployed against sitting U.S. senators successfully, then your average voter could be a potential target,” said Taylor, a former senior official in the Department of Homeland Security during the Trump administration who resigned in 2019 and publicly criticized the former president.
Taylor and other organizers said that as AI technology improves at a rapid pace, raising public awareness will be essential to inoculating Americans against attempts to spread false information, especially during an election year.
Joshua Graham Lynn, CEO and founder of SignifyUs, said that taking a lighthearted approach with celebrities offered a way to alert Americans to the issue without causing panic.
“It was really important in this particular scenario to get the point across, to not freak people out, but to get them thinking about it,” Lynn said.
All the celebrities involved “were passionate about doing it because they want to get the message in front of voters,” Lynn said.
Instead of mimicking a nationally known figure, an effort to mislead voters could use a deepfake to persuade a voter that they are hearing from a local election official or a church leader, experts and former election officials say.
“You can create a lot of havoc just by hitting lots of precincts across the country, and because it’s not a known person, it may be a little harder to verify quickly,” said Kathy Boockvar, the former secretary of state in Pennsylvania.
Organizers of the campaign ran simulations over the past year to try to anticipate what might happen in this year’s election with AI-powered tools. “The most alarming scenarios were the ones where deepfake technologies were used to target local voters and to try to deceive them about their right to vote,” Taylor said.

The campaign doesn’t attempt to tackle or fact-check the flood of false information circulating this election cycle, whether from candidates, commentators, deepfakes or other sources. Instead, it focuses on verifiable, concrete details about when, where and how Americans can cast ballots on Election Day, Lynn said.
“No one should come between an American and their vote,” Dawson said in a statement. “Unfortunately, it’s safe to say people are going to try.”
To help understaffed state and local election offices handle the onslaught of false information, including deepfakes, a nonpartisan coalition of more than 70 nonprofits has organized to help election authorities identify and debunk false information about voting before it goes viral.
Efforts to mislead Americans about their ability to vote may have already begun, according to rights groups.
Last week in Wisconsin, voting rights advocates asked state and federal authorities to investigate anonymous text messages that appeared aimed at intimidating college students out of voting.
College students in Wisconsin are permitted to register to vote at either their home or their school address.
In the 2020 election, U.S. authorities accused Iran of sending emails to Democratic voters in several battleground states aimed at intimidating them into voting for then-President Donald Trump. The emails falsely claimed to be from the far-right group Proud Boys and warned recipients that “we will come after you” if they didn’t vote for Trump. It’s unclear to what degree the effort had any effect. Iran has denied trying to interfere in U.S. elections.