Foreign adversaries using artificial intelligence to meddle in U.S. election, FBI says

The FBI is warning that Russia and other adversarial nations are using artificial intelligence to confuse voters and meddle in our elections. The technology they can use to deceive us is getting far more advanced.

"It's already happening," noted President Biden. "AI devices are being used to deceive people."

For example, a robocall before the New Hampshire Primary imitated President Biden urging Democrats not to vote. The fake voice said, "Your vote makes a difference in November, not this Tuesday."

This and other deep fake audio and video recordings are making people question what is and isn't real.

"You’ll see messed up hands and fingers you’ll see words on signs that don't quite publish as crisp as they should. That’s being perfected though," said cybersecurity expert David Derigiotis.  "It’s getting harder to tell even those giveaways."

That can have a very real impact on the next U.S. election, and the people (and foreign governments) trying to deceive us all know it.

"Absolutely, you could change enough minds to swing elections especially as we see a lot of close polling in a lot of these races," said Cal State Professor Nolan Higdon.

The technology has already advanced beyond the point at which detection tools can easily flag it. And as Derigiotis said, the minor glitches you may still be able to see for now are rapidly going away as the technology improves.

"I can tell you it’s not that far away. Some clips we are making right now are indistinguishable in my opinion," said Yonatan Dor who co-founded the Dor Brothers, a production company that specializes in producing high-end videos through the use of AI.

In fact, the Dor Brothers produced images through an AI generator that current AI detection programs did not identify as AI-generated. They released some videos to show how it can make politicians look as if they're using weapons, robbing people, and getting arrested for it. The situations are deliberately absurd, so people would not confuse them with reality but would still get the gist of what's coming.

"It shows everyone the potential of the technology without hurting anybody," Dor said.

He stressed there are others who can use this same technology to cause a lot of harm, and we need to be prepared.

"A malicious person creates the same type of material we did with malicious intent that could potentially start wars or arguments or rig elections or whatever it is you have in mind," he warned.

Dropping a deep fake video just before an election can spread across the nation before fact-checkers have time to flag it.

"We're going to see the big wave of these deep fake ads coming out right before an election, before candidates really have a chance to try to say I never did that," said Public Citizen’s Craig Holman.

On the flip side, if this technology had existed in President Nixon's time, he could have claimed the tapes that implicated him in scandal were fake, as could any politician today, creating a world in which people don't know what to believe.

"The sheer volume of impersonations and false images we're exposed to on social media leads us to no longer recognize reality when it's staring us right in the face," said U.S. Rep. Nancy Mace, (R) South Carolina.

Florida Polytechnic University hosts an artificial intelligence research lab. Professors Bayarit Karaman and Feng-Yen Yang are experts in producing and detecting deep fakes.

"The technology is increasing exponentially," said Dr. Karaman. "But sometimes lighting and shadow are not consistent."

That's one sign. Computers can have trouble understanding and replicating the angle of the sun and the shadows it casts. Other signs can often be found in the background of photos.

"AI is focusing more on the front part and the background can make some mistakes," Dr. Karaman added.

Dr. Feng-Yen Yang demonstrated how you can also find clues in the audio. He recorded his voice and visual mannerisms to train an AI program, then showed how his digital clone said whatever he typed on his keyboard, though at times with unusual inflection. That shows how computers may have trouble determining how someone's inflection would change with their emotions.

People can also use Google's reverse image search to upload pictures they're not sure about. It won't say whether an image is fake, but it can show other sites where the image has been used, providing more context on where it came from, who made it, and why.
