Video games, AI fuel wartime disinformation
Israel's Iron Dome anti-missile system intercepts rockets launched from the Gaza Strip, as seen from Ashkelon, in southern Israel, October 20, 2023. REUTERS/Amir Cohen
What’s the context?
AI could provide tools to create more realistic fake content that is hard to tell from the real thing, tech experts warn
- Video game used to spread disinformation after Hamas attack
- Malicious actors lower video quality to deceive viewers
- Challenges in detecting and combating fake content
LONDON – Something was wrong with the video of the Iron Beam, Israel's latest laser air defence system, but many among the millions who watched it and shared it on social media did not realise.
The film appeared to show dozens of Hamas rockets darting up into the night sky, followed by the mid-air flash of explosions as Israel's new, not-yet-deployed laser weapon struck them down.
It would be a week, and millions of views, before fact-checkers revealed the truth. Facebook and X then added labels to the posts.
"This is a video game," the note on X read.
Since the Oct. 7 Hamas attack on Israel, the military simulation game Arma 3 has become the source of numerous videos used to spread disinformation via social media accounts with hundreds of thousands of followers.
Tech and gaming experts warn that while game developers are using AI to make their products more realistic, some modders - people who modify those games - are using the same technology to produce fakes that are harder to detect, helping to spread false narratives about unfolding global events and conflicts.
"AI technology is moving fast. Game dev studios are implementing it in their creative processes and so are modders," said Michael Cook, who specialises in AI in game design and development at King's College London.
This is a challenge for social media companies whose sites are used to publish the fake videos and spread false narratives.
"Even today we don't often notice the mistakes in AI-generated images when we quickly glance at them on our Facebook feed. It's going to get worse before it gets better," Cook told Context.
"Sadly, I think the result might be that people simply stop trusting what they see on the Internet altogether," he warned.
Arma 3 lets users extensively edit, or 'mod', the game to include specific maps, weapons and vehicles without advanced technical skills. That makes it an easy tool for producing footage used to disseminate fake news.
"Finding and debunking these fake news videos is mainly a crowdsourced effort; one that involves our devs, Arma players and their related communities, and specialised fact-checking organisations," said Pavel Křižka, a spokesperson for Bohemia Interactive, which developed Arma 3.
But, he said, "even with the help of these professionals, there is no way to eliminate the problem."
In 2022, Arma was used to make a 36-minute video characterised as breaking news of the Russian invasion of Ukraine; a year before that, it was used to create a film of what was said to be a Pakistani air strike on Afghanistan.
Křižka said the company had been dealing with these types of videos since 2009. Moreover, developments in both game engines - the software that developers use to make games - and AI mean the problem is likely to get worse, he said.
How to spot a human
The global video game industry is worth approximately $187.7 billion, a figure that covers spending on consoles such as Sony's PlayStation and Microsoft's Xbox as well as on the games themselves.
Currently, video games have obvious flaws that reveal their origins, Křižka explained.
Rendering natural-looking humans in motion is still very difficult, even for the most modern games, as is simulating how particles of fire, smoke and dust move after an explosion.
To hide these flaws, malicious actors pixelate footage taken from video games or film the screen with a smartphone - exaggerating the camera shake - to suggest the footage has been captured on real combat front lines.
Fact-checkers can still trace such videos back to their source, said Pippa Allen-Kinross, news and online editor at Full Fact, a London-based fact-checking organisation, but the degraded quality may make it easier for the disinformation to spread online at first.
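One rough signal analysts can use is how much fine detail survives in a clip. The minimal Python sketch below, using OpenCV, computes the variance of the Laplacian, a common sharpness heuristic that drops sharply once footage has been pixelated or re-filmed from a screen. The file names are hypothetical, and this is an illustration rather than any fact-checking organisation's actual method:

```python
import cv2

def sharpness_score(path: str) -> float:
    # Variance of the Laplacian: a standard, rough measure of how much
    # fine detail (edges, texture) remains in an image.
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

# Hypothetical file names: a clean game render versus a re-filmed copy.
print(sharpness_score("original_render.png"))   # high score: detail intact
print(sharpness_score("refilmed_screen.png"))   # low score: detail destroyed
```

A low score does not prove manipulation on its own, but it flags clips whose quality has been deliberately degraded for closer human review.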
However, as the technology used to build video games improves, those giveaways may be consigned to history as companies chase ever greater realism in their next-generation games.
Stopping the spread
Meanwhile, social media companies have come under fire for what is seen as their failure to adequately tackle disinformation, an issue magnified by global conflicts.
Social media companies use AI to moderate content on their platforms, but the realism of video games has challenged their systems.
In 2019, when a mass shooting in Christchurch, New Zealand, was broadcast via Facebook's livestreaming service, Meta said its automatic detection system was not triggered because of a lack of user reports and the footage's similarity to video game content.
"Another challenge is to automatically discern this content from visually similar, innocuous content – for example if thousands of videos from live-streamed video games are flagged by our systems, our reviewers could miss the important real-world videos where we could alert first responders to get help on the ground," Guy Rosen, Meta's then-vice president of product management, wrote in a blog post at the time.
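One common approach to the near-duplicate problem Rosen describes is perceptual hashing, which matches new uploads against frames of already-identified game footage. Below is a minimal Python sketch assuming the imagehash and Pillow libraries; the file names and distance threshold are illustrative, and this is not a description of Meta's actual system:

```python
import imagehash
from PIL import Image

# Hypothetical reference set: perceptual hashes of frames taken from
# game clips previously identified by fact-checkers or moderators.
KNOWN_GAME_HASHES = [
    imagehash.phash(Image.open("known_arma3_frame.png")),
]

def resembles_game_footage(path: str, max_distance: int = 8) -> bool:
    # Subtracting two imagehash values yields the Hamming distance
    # between them; a small distance means near-duplicate frames.
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - ref <= max_distance for ref in KNOWN_GAME_HASHES)

print(resembles_game_footage("uploaded_frame.png"))
```

Hash-based matching catches near-duplicates cheaply, but heavy pixelation and re-filming - the very tricks described above - are precisely the transformations designed to defeat it.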
"As video games develop and become more realistic it will become more difficult to distinguish them from real events," Allen-Kinross said.
Developments in video games and AI will continue to lower the barrier to producing disinformation, but the greater underlying issue is social rather than technological, experts say.
"What makes mis- and disinformation effective is when it speaks to people's concerns, values and identities," said Sacha Altay, a research fellow at the Digital Democracy Lab at the University of Zurich.
"To do that you don't really need sophisticated technology, you just need to understand the current demand for specific narratives. That's what populist leaders do, they can just take the mic and scream."
(Reporting by Adam Smith; Editing by Jon Hemming.)
Context is powered by the Thomson Reuters Foundation Newsroom.