
Slovakia’s Election Deepfakes Show AI Is a Danger to Democracy

After a few hours, they were ready to confirm that they believed the recording had been altered. Their label, which still appears on Slovak-language Facebook when visitors come across the post, says: “Independent fact-checkers say that the photo or image has been edited in a way that could mislead people.” Facebook users can then choose whether they want to see the video anyway.

Both the beer and vote-rigging audios remain visible on Facebook, with the fact-check label. “When content is fact-checked, we label it and down-rank it in feed, so fewer people see it—as has happened with both of these examples,” says Ben Walter, a spokesperson for Meta. “Our Community Standards apply to all content, regardless of whether it is created by AI or a person, and we will take action against content that violates these policies.”

This election was one of the first consequential votes to take place after the EU’s Digital Services Act was introduced in August. The act, designed to better protect human rights online, introduced new rules that were supposed to force platforms to be more proactive and transparent in their efforts to moderate disinformation.

“Slovakia was a test case to see what works and where some improvements are needed,” says Richard Kuchta, analyst at Reset, a research group that focuses on technology’s impact on democracy. “In my view, [the new law] put pressure on platforms to increase the capacities in content moderation or fact-checking. We know that Meta hired more fact-checkers for the Slovak election, but we will see if that was enough.”

Alongside the two deepfake audio recordings, Kuchta also saw two other videos featuring AI audio impersonations posted on social media by the far-right party Republika. One impersonated Michal Šimečka, and the other the president, Zuzana Čaputová. These audios did include disclaimers that the voices were fake: “These voices are fictitious and their resemblance to real people is purely coincidental.” However, that statement does not appear until 15 seconds into the 20-second video, says Kuchta, in what he felt was an attempt to trick listeners.

The Slovakian election was being watched closely in Poland. “Of course, AI-generated disinformation is something we are very scared of, because it’s very hard to react to it fast,” says Jakub Śliż, president of Polish fact-checking group the Pravda Association. Śliż says he is also worried by the trend in Slovakia for disinformation to be packaged into audio recordings, as opposed to video or images, because voice cloning is so difficult to identify.

Like Hincová Frankovská in Slovakia, Śliż also lacks tools to reliably help him identify what’s been created or manipulated using AI. “Tools that are available, they give you a probability score,” he says. But these tools suffer from a black box problem. He doesn’t know how they decide a post is likely to be fake. “If I have a tool that uses another AI to somehow magically tell me this is 87 percent AI generated, how am I supposed to convey this message to my audience?” he says.

There has not been a lot of AI-generated content circulating in Poland yet, says Śliż. “But people are using the fact that something can be AI generated to discredit real sources.” In two weeks, Polish voters will decide whether the ruling conservative Law and Justice party should stay in government for an unprecedented third term. This weekend, a giant crowd gathered in Warsaw in support of the opposition, with the opposition-controlled city government estimating the crowd reached 1 million people at its peak. But on X, formerly known as Twitter, users suggested videos of the march had been doctored using AI to make the crowd look bigger.

Śliż believes this type of content is easy to fact-check, by cross-referencing different sources. But if AI-generated audio recordings start circulating in Poland in the last hours before the vote, as they did in Slovakia, that would be much harder. “As a fact-checking organization, we don’t have a concrete plan of how to deal with it,” he says. “So if something like this happens, it’s going to be painful.”

