The spread of online misinformation poses serious challenges to societies worldwide. In a novel attempt to address this issue, we designed a psychological intervention in the form of an online browser game. In the game, players take on the role of a fake news producer and learn to master six documented techniques commonly used in the production of misinformation: polarisation, invoking emotions, spreading conspiracy theories, trolling people online, deflecting blame, and impersonating fake accounts. The game draws on an inoculation metaphor, where preemptively exposing, warning, and familiarising people with the strategies used in the production of fake news helps confer cognitive immunity when they are exposed to real misinformation. We conducted a large-scale evaluation of the game with N = 15,000 participants in a pre-post gameplay design. We provide initial evidence that people’s ability to spot and resist misinformation improves after gameplay, irrespective of education, age, political ideology, and cognitive style.

The rapid spread of “fake news” and online misinformation is a growing threat to the democratic process (Lewandowsky, Ecker, and Cook, 2017; van der Linden et al., 2017a; Iyengar and Massey, 2018), and can have serious consequences for evidence-based decision making on a variety of societal issues, ranging from climate change and vaccinations to international relations (Poland and Spier, 2010; van der Linden, 2017; van der Linden et al., 2017b; Lazer et al., 2018). In some countries, the rapid spread of online misinformation is posing an additional, physical danger, sometimes leading to injury and even death. For example, false kidnapping rumours on WhatsApp have led to mob lynchings in India (BBC News, 2018a; Phartiyal, Patnaik, and Ingram, 2018).

Social media platforms have proven to be a particularly fertile breeding ground for online misinformation. For example, recent estimates suggest that about 47 million Twitter accounts (~15%) are bots (Varol et al., 2017). Some of these bots are used to spread political misinformation, especially during election campaigns. Recent examples of influential misinformation campaigns include the MacronLeaks during the French presidential elections in 2017 (Ferrara, 2017), the PizzaGate controversy during the 2016 U.S. Presidential elections, and rumours circulating in Sweden about the country’s cooperation with NATO (Kragh and Åsberg, 2017).

A broad array of solutions have been proposed, ranging from making digital media literacy part of school curricula (Council of Europe, 2017; Select Committee on Communications, 2017), to the automated verification of rumours using machine learning algorithms (Vosoughi, Mohsenvand, and Roy, 2017), to conducting fact-checks in real time (Bode and Vraga, 2015; Sethi, 2017). However, decades of research on human cognition finds that misinformation is not easily corrected. In particular, the continued influence effect of misinformation suggests that corrections are often ineffective, as people continue to rely on debunked falsehoods (Nyhan and Reifler, 2010; Lewandowsky et al., 2012). Importantly, recent scholarship suggests that false news spreads faster and deeper than true information (Vosoughi, Roy, and Aral, 2018).