Deepfakes: what are they and how to spot them


(NewsNation) – Technology that can learn and adapt is growing more sophisticated, producing doctored photos and videos that are increasingly difficult to tell apart from reality.

The advent of deepfakes, a type of so-called synthetic media, allows creators to make anyone, including world leaders, appear to be saying or doing anything. The FBI’s Cyber Division issued a statement last March warning that “malicious actors will almost certainly leverage synthetic content for cyber and foreign influence operations over the next 12-18 months.” Already, the technology has been used to produce fake videos of Ukrainian President Volodymyr Zelenskyy and Russian President Vladimir Putin amid their nations’ ongoing war.

Deepfakes are created using artificial intelligence algorithms that learn from photos or audio clips to produce something similar, but artificial.

“So, for example, if we want to create realistic human faces or human voices, what we do is we give the neural network tons and tons of images of real faces, videos or human voices, audio signals,” said Siwei Lyu, SUNY Empire Innovation Professor in the Department of Computer Science and Engineering at the University at Buffalo. “So the model will learn, improve over time.”
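Lyu is describing the adversarial training loop behind generative models such as GANs, in which a generator network improves by trying to fool a discriminator network. The toy PyTorch sketch below illustrates that feedback loop in outline only; the layer sizes are arbitrary, and random tensors stand in for the huge face datasets real systems train on.

```python
# Toy GAN training loop in PyTorch, illustrating the adversarial
# "learn, improve over time" dynamic Lyu describes. Random tensors
# stand in for real face images so the script runs as-is.
import torch
import torch.nn as nn

IMG_DIM, NOISE_DIM, BATCH = 64, 16, 32   # toy sizes, not real resolutions

generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 128), nn.ReLU(),
    nn.Linear(128, IMG_DIM), nn.Tanh(),   # fake "image" scaled to [-1, 1]
)
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1),                    # real-vs-fake logit
)

g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):
    real = torch.rand(BATCH, IMG_DIM) * 2 - 1   # stand-in for real faces
    fake = generator(torch.randn(BATCH, NOISE_DIM))

    # The discriminator learns to tell real samples from generated ones.
    d_loss = (loss_fn(discriminator(real), torch.ones(BATCH, 1)) +
              loss_fn(discriminator(fake.detach()), torch.zeros(BATCH, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # The generator learns to fool the discriminator; each side's
    # progress pushes the other to improve.
    g_loss = loss_fn(discriminator(fake), torch.ones(BATCH, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```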

A recent example was aimed at the Ukrainian president. The deepfake video showed what appeared to be Zelenskyy urging Ukrainians to surrender in their fight against invading Russian troops. In the video, the Ukrainian leader’s body appears stiff, his head too big and his face a different color from his neck.

Other deepfakes are strictly entertainment, though no less sophisticated. TikTok parody account @deeptomcruise posts hyper-realistic videos of what appears to be Tom Cruise going about his day. The effect holds even when the actor in the video obscures his face by putting on and taking off sunglasses or a hat. In one on-the-nose video, he performs a magic trick before looking at the camera and repeating, “it’s all true.”

The possibilities of doctored media, however, go beyond deepfakes.

Recently, a hoaxer posing as Ukrainian Prime Minister Denys Shmyhal got through to British Secretary of State for Defense Ben Wallace during a March 17 video call.

“Today an impostor claiming to be the Prime Minister of Ukraine attempted to talk to me. He asked several misleading questions and after becoming suspicious I ended the call,” Wallace said on Twitter after the incident.

The British government said Russia was behind the hoax and claimed that excerpts of the call, shared on YouTube, had been edited to twist and distort the truth.

In another video, posted in 2020, House Speaker Nancy Pelosi’s voice was slowed down so her speech sounded slurred. The pitch of the audio was also adjusted to match Pelosi’s usual voice, making the fake harder to detect.

The most effective way to identify a deepfake is to cross-check the media against other sources, Lyu said. If a quick search turns up wildly conflicting information, it’s more likely the media is fake, he said.

Human eyes and ears are equally useful tools. Irregularities such as unusual blinking patterns, speech that is out of sync with mouth movements, and inconsistencies in the reflection of the subject’s eyes are all red flags, Lyu said.

“My favorite part to look at is the teeth,” Lyu said. “The generation algorithms, they usually struggle to create realistic-looking teeth. You can’t identify individual teeth; you always see something that looks like a speck of white.”

These clues are not irrefutable proof, but they signal a higher probability that a piece of media is fake, he said.
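One of those red flags, unusual blinking, can also be checked programmatically. The sketch below uses the eye aspect ratio (EAR), a common blink-detection heuristic that is not necessarily Lyu’s own method; it assumes per-frame eye landmarks have already been extracted upstream with a library such as dlib or mediapipe, and the threshold is a conventional rule of thumb.

```python
# Blink-rate check via the eye aspect ratio (EAR), a common heuristic
# for the "unusual blinking patterns" red flag. Landmark extraction is
# assumed to happen upstream (e.g., dlib's 68-point face model).
import math

def eye_aspect_ratio(eye):
    """EAR over six (x, y) eye landmarks:
    (|p2-p6| + |p3-p5|) / (2 * |p1-p4|). Drops sharply when the eye closes."""
    return (math.dist(eye[1], eye[5]) + math.dist(eye[2], eye[4])) / (
        2 * math.dist(eye[0], eye[3]))

def blinks_per_minute(ear_series, fps, threshold=0.21):
    """Count downward EAR crossings of the threshold, scaled to a rate."""
    blinks, closed = 0, False
    for ear in ear_series:
        if ear < threshold and not closed:
            blinks, closed = blinks + 1, True
        elif ear >= threshold:
            closed = False
    return blinks * 60 * fps / max(len(ear_series), 1)

# Humans blink roughly 15-20 times per minute; a face that never blinks
# across a long clip is worth a closer look, though, as Lyu notes,
# no single cue is proof on its own.
```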

Some deepfakes fall into the uncanny valley, where it can be harder to pinpoint what’s wrong.

In these situations, tools similar to those used to create deepfakes can help detect other synthetic media. Lyu was part of a team that helped create DeepFake-o-meter, an online platform that allows users to upload content and tells them if it’s likely to be fake. Similar tools are also used to help interpret the speech of stroke victims, Lyu said.

For better or worse, technology is here to stay, he said.

“It’s like nuclear technology. We can use it to generate electricity for homes and for people, but also to create atomic bombs to destroy the world,” Lyu said. “So it depends on who is using the technology.”

The long-term effect of deepfakes and doctored media, however, could be to jeopardize the public’s trust in the information it consumes.

“My personal view on this is, I think generally speaking, the very existence of deepfakes, or any type of manipulated media, is the fundamental erosion of our trust in the media that we see,” Lyu said.

At the same time, people who aren’t as tech-savvy are likely at greater risk of falling prey to manipulated media, he said.

“When they are bombarded with this type of information, they will be more easily influenced by this information and it can also have consequences,” Lyu said.
