An online video game asking Internet users to create and distribute propaganda helps them better detect misinformation after only 15 minutes of play, a new study found.
Bad News, a browser game developed last year by a research team at Cambridge University, lets players slip into the role of the “bad guy” by creating and sharing misleading information online. Participants can, among other things, create a fake copy of a politician’s social media account or generate conspiracy theories.
In the three months following the game’s launch, the researchers offered players the option to also take part in a study asking them to assign reliability scores to various news headlines. Some of the headlines came from reliable news sources, while others were misinformation.
After playing the game for 15 minutes, players rated misinformation headlines on average 21% lower in reliability than they had before playing, according to the results of the study published last week in Palgrave Communications, a peer-reviewed academic journal. Ratings of real news headlines were not affected by the game session.
“We have shown moderate effects – I would not call them huge – but they are still moderate, very robust, persistent and statistically significant,” said Sander van der Linden, co-author of the study and director of the Cambridge Social Decision-Making Lab. “In fact, it rather surprised me, and I found it encouraging.”
Real tactics
In Bad News, a game jointly designed by the Dutch media collective DROG and the design agency Gusmanson, players set up an online propaganda empire.
They use different tactics, such as stoking Internet users’ fears or polarizing opinion, to establish their credibility and attract more followers, which earns them badges. The researchers say they chose these tactics based on examples of real strategies used by misinformation networks.
Creating content that exploits readers’ emotions is one example of a tactic that earns a badge in the game. It is a method that Jestin Coler, who once ran misinformation sites for the purpose of making money, claims to have used.
“Stories try to trigger an emotional response to get people to share them,” Coler wrote last year. “This emotional response may be one of hope, inspiration, anger, fear, etc., but the goal is sharing. Reaching a reader is nice, but reaching that reader and his hundreds of contacts is even better.”
By exposing players to these tactics as part of a game, researchers hoped to be able to protect them from misinformation in everyday life, and the results of the study suggest that the bet worked.
Like a magic show
The study still has some limitations. First, the participants were not selected by the scientists; they simply stumbled upon the game while browsing online.
Second, they were aware of the purpose of the game, which mentally prepared them for misinformation.
“If people know they’re going to have to spot deception, they’re going to pay much more attention [to content] in an environment like this game than they would in general,” said Jon Roozenbeek, co-author of the study and a researcher at Cambridge.
Despite this, according to Roozenbeek, the difference measured in the study shows that the game still has an effect that carries over beyond the game itself and can help people more easily detect disinformation in the real world.
The researchers have previously compared the game to a magic show: once the magician reveals how the trick works, spectators can no longer be fooled.