
Author: Serena Coppolino Perfumi

The debate has become particularly heated in recent times, but misinformation, fake news, and alternative facts have always existed. And we are still paying the price for the consequences of some of them.
The topic of fake news exploded when scholars pointed out the impact it had on recent political events such as the 2016 presidential election in the United States (Allcott & Gentzkow, 2017) and the Brexit referendum in the United Kingdom (DiFranzo & Gloria-Garcia, 2017). In both cases, fake news, often dealing with polarizing topics, is held responsible for leading voters to reinforce their political views or, when undecided, to make a radical choice.
At this very moment, fake news concerning immigrants is fueling xenophobia, and support for far-right populist parties in Europe is growing. Even if the veracity of such news is later discredited, the effects are long-lasting, as studies show that mere exposure is enough to increase its perceived credibility (Pennycook & Rand, 2017).
However, looking back in history, we can see that misinformation is not a new phenomenon, especially in times of crisis.
Under the broad construct of misinformation we find the newly coined term "alternative facts." Alt-facts analyze, describe, or justify reality on the basis of false and biased assumptions (Berghel, 2017).
An example is scientific racism, which rooted its assumptions in pseudo-scientific theories that categorized human beings into "inferior" and "superior" races. This conception was also reinforced over time by scientific studies built with culturally biased tools (Metress, 1975). Endorsing these theories nowadays, after they have been largely discredited by research, can be considered "relying on alt-facts."
Fake news, by contrast, consists of invented stories fabricated and shared with a specific purpose (Berghel, 2017), either ideological or profit-related (Subramanian, 2017).
A historical example of fake news is a gruesome 1917 anti-German hoax spread by the British press, claiming that the Germans had opened a "factory" in which human bodies were turned into products. The same story resurfaced during the Second World War, but this time it was used as a shield by the Nazis to silence the accusations that had started circulating about the ongoing genocide: they claimed that they were being targeted once again by anti-German propaganda.
Many scholars consider this hoax partly responsible for an underestimation of the gravity of the situation in Germany, and for a delay in the decision to take action against the Nazi regime (Neander & Marlin, 2010).
We can already see some recognizable patterns in these examples. Both alt-facts and fake news have historically been used either to justify some kind of structure, custom, or attitude, or to push and defend political ideologies, especially in critical times.
Nowadays, with the advent of the Internet, users are exposed to far more information, making the selection of trustworthy sources more difficult. Evidence from cognitive psychology shows that when human beings face too many alternatives, they are likely to employ mental shortcuts known as heuristics in order to save time and effort. However, these shortcuts can be biased by specific mechanisms affecting the information flow on social media platforms.
Despite what one might think, social media platforms can be rather closed environments, in which users interact mainly with friends and acquaintances, with whom they tend to share opinions, tastes, and values.
This insularity turns these platforms into "echo chambers," in which only one opinion resonates over and over, reinforcing beliefs (Quattrociocchi, Scala & Sunstein, 2016) and creating a filter-bubble effect that limits exposure to diverse standpoints (Abisheva, Garcia & Schweitzer, 2016).
Scholars, computer scientists, and institutions are now working on possible solutions to this problem.
For the general elections on March 4th, Italian institutions worked with Facebook to actively fight the spread of political fake news through a fact-checking system. Sweden, which will hold elections in September, is working on educational programs aimed at boosting users' resilience and their ability to analyze sources.
As in the past, at this historical moment we can see all the ingredients that allow fake news to cause significant damage to public opinion, attitudes, and political choices. But now it can travel faster.
Those who are most endangered by alt-facts and fake news, and who can in the end make the actual difference, are not those who already hold deeply rooted radical ideas; they still represent a minority. It is rather the large mass of undecided, unengaged users who are unsure about their political stance and values, and who can therefore shift easily from one side to the other.
The Internet can be a powerful democratic tool. One of its main accomplishments, I believe, has been this very feature: the ability to give anybody access to endless inputs and facts, tearing down borders and giving historically underrepresented voices the chance to be heard. But this is, of course, also a double-edged sword. Fact-checking tools can be useful for flagging articles containing erroneous facts, but in my opinion, long-lasting results are likely to be attained only by working on users' critical thinking and their awareness of the peculiar dynamics that can occur in online environments. This work should be preventive and implemented on a large scale in schools.
Another critical group is the non-digital-native population, and those who started using the Internet later in life. They should be the target of an awareness program on the structure of online environments and on how algorithms and search engines work, and they should be made conscious of the social-influence dynamics that can take place on social media platforms.
References
Abisheva, A., Garcia, D., & Schweitzer, F. (2016, May). When the filter bubble bursts: collective evaluation dynamics in online communities. In Proceedings of the 8th ACM Conference on Web Science (pp. 307-308). ACM.
Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-36.
Berghel, H. (2017). Alt-News and Post-Truths in the "Fake News" Era. Computer, 50(4), 110-114.
DiFranzo, D., & Gloria-Garcia, K. (2017). Filter bubbles and fake news. XRDS: Crossroads, The ACM Magazine for Students, 23(3), 32-35.
Metress, J. (1975). The Scientific Misuse of the Biological Concept of Race. The Social Studies, 66(3), 114-116.
Neander, J., & Marlin, R. (2010). Media and Propaganda: The Northcliffe Press and the Corpse Factory Story of World War I. Global Media Journal, 3(2), 67.
Pennycook, G., & Rand, D. G. (2017). Who falls for fake news? The roles of analytic thinking, motivated reasoning, political ideology, and bullshit receptivity.
Quattrociocchi, W., Scala, A., & Sunstein, C. R. (2016). Echo chambers on Facebook.
Subramanian, S. (2017). Meet Macedonian Teens Who Mastered Fake News and Corrupted the US Election. Wired.com, 15.