Deepfake fears surface ahead of 2024 election

  • Deepfakes can be nearly indistinguishable from real videos
  • Experts fear deepfakes can undermine trust in the democratic process
  • Deepfake robocalls have already been used in the election

 


(NewsNation) — In the lead-up to the 2024 presidential election, a recent picture of former President Donald Trump with what appear to be supporters has drawn significant pushback.

The reason? It’s a deepfake. Obtained by the BBC, the photo was generated by artificial intelligence, and there is no evidence linking the photo to the actual Trump campaign.

But the image is concerning because deepfakes are only likely to improve, becoming harder to separate from reality as the November election approaches.

Deepfakes are relatively cheap and in some cases free to make, but they can cause real damage during a political campaign. Voice cloning can be used to mimic a person in ads or robocalls. AI can also generate fake news reports.

These videos can trick an unsuspecting person if they don’t know what to look for: things like lips not quite syncing up with speech, unnatural speech cadence, skin that seems too smooth to be real and sometimes small glitches around boundaries, like between a person’s face and the background.

While some AI-generated content seems harmless, like fun face filters, the fear is that in the wrong hands, the technology could become effectively undetectable.

By the November election, it’s entirely possible it will be extremely difficult to tell what’s real and what’s not.

Cybersecurity expert Nicole Tisdale said the biggest issue with deepfakes is the way they can undermine trust in the democratic process.

“People start to fear that they can’t believe what they are seeing and what they are hearing,” Tisdale told NewsNation’s Nichole Berlie. “So when we get to a place where you can’t believe your eyes, which is where we are, and you also can’t believe your ears when you’re talking about audio, it can really make people kind of question their decision to vote or to participate in a democratic process at all.”

Deepfake robocalls have already gone out to voters trying to trick them into not voting. Even for experts, Tisdale said, it can be hard to identify deepfakes. There are even more convincing examples that aren’t available to the public, she explained, showing just how realistic they can be.

“One of the best examples is actually a music video by Kendrick Lamar that has over 45 million views on YouTube,” Tisdale said.

The video, which she specifically gave as a nonpolitical example, includes deepfakes of Kanye West and Kobe Bryant, among others.

“Viewers have to understand, you’re not going to be able to detect this on your own,” Tisdale said. “You have to have technology, very advanced technology, to spot a really good deepfake.”


Copyright 2024 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
