Social media: how to pop the bubble?


How self-evident is the proper functioning of our democracy in 2020? We live in a digital world where social media is ever-present, polarization of society looms, and algorithms subtly nudge us. This article discusses questions such as: to what extent is your opinion distorted by social media algorithms? Does this undermine the functioning of democracy? To what extent can you resist it? Are there legal means to diminish these negative effects of social media?

“Democracy cannot succeed unless those who express their choice are prepared to choose wisely. The real safeguard of democracy, therefore, is education. To prepare each citizen to choose wisely and to enable him to choose freely are paramount functions of the schools in a democracy.” – Franklin D. Roosevelt.

Democracy is one of the core values of the European Union.1 But what is left of this religion we call democracy, which we hold so sacred? And where is it headed? This article considers the extent to which the core values of democracy are affected by the growing degree to which opinions are formed through information gathered via social networks.


What rights inherent to a democracy are at stake?
First, I will establish which aspects of democracy are at stake concerning the use of social media. In a democracy, there is the right to self-determination and freedom of speech and information. These rights imply the freedom to hold opinions and to receive and impart information and ideas.2 As a millennial, I hold these rights to be self-evident. But how self-evident are they really? Is it still your own opinion when algorithms determine what you read?

Social media’s effect on decision-making: algorithms rule the waves!
Social media companies make a profit for every second you spend on their platform.3 At the start, this was an innocent way to make money, but it has since grown into platforms to which people become addicted. A study by the University of Chicago even concluded that social media addiction can be stronger than addiction to cigarettes or alcohol.4 In this regard, it is concerning – which may be a euphemism – that these platforms have designed advanced algorithms that use the user's prior conduct, and thus their consciously or unconsciously expressed preferences, interests and biases, to offer a personalized newsfeed.5 This way, our decision-making does not occur independently: our judgments and decisions are predetermined by algorithms.6 This is what is called the filter bubble.7
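The mechanism can be sketched in a few lines. The following is a minimal, hypothetical illustration – not any platform's actual code – of a feed that ranks articles purely by how often the user has clicked each topic before, so unfamiliar topics never surface:

```python
from collections import Counter

def personalized_feed(articles, click_history, k=3):
    """Rank articles by how often the user clicked each topic before.

    Topics the user never engaged with score zero and sink to the
    bottom, so the feed narrows toward past preferences: the filter
    bubble. Purely illustrative; real recommenders are far richer.
    """
    topic_counts = Counter(a["topic"] for a in click_history)
    ranked = sorted(articles, key=lambda a: topic_counts[a["topic"]],
                    reverse=True)
    return ranked[:k]

# A user who clicked sports five times and politics once:
history = [{"topic": "sports"}] * 5 + [{"topic": "politics"}]
inventory = [
    {"title": "Match report", "topic": "sports"},
    {"title": "Budget debate", "topic": "politics"},
    {"title": "Art review", "topic": "culture"},
]
feed = personalized_feed(inventory, history, k=2)
```

Even in this toy version, the culture article can never reach the top two slots: the user's past behaviour fully determines what they see next.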

How does this affect a democracy?
Then why is having a filter bubble a bad thing? Because it creates so-called 'echo chambers': we assume that everyone thinks like us, and we forget that other perspectives exist. In an echo chamber, you only come across the kind of information that you have already looked up yourself and want to come across. This encourages a polarized society and detracts from social cohesion. The consequences of the filter bubble become more problematic as the number of people who gather information from social media increases. About 95% of people up to the age of 55 in the Netherlands use social media (figure 1), which increases the chance that they gather news or other information from social media as well. About two-thirds of American adults (68%) say they at least occasionally get news on social media.8

Figure 1: social media use in the Netherlands9


The gathering of information through personalized feeds could be seen as a prison for our thinking.10 Helbing et al. state that "a centralized system of technocratic behavioral and social control using a super-intelligent information system would result in a new form of dictatorship."11 After all, the definition of a dictatorship is that one person or one group possesses absolute power without effective constitutional limitations.12 During elections, the group that controls this super-intelligent technology can win by nudging itself to power.13 When we project this scenario onto the 2020 US elections, we notice that it fuels social polarization, resulting in groups that would rather 'cancel' the other party than have a constructive conversation about the matters at hand.14 This happens so slowly and unobtrusively that little attention is paid to it.

It’s easy to ignore this problem and think it’s not too bad, precisely because it happens so unobtrusively. However, we must remain alert to this. To quote Schimmelpenninck, “If you are still naive about the woes of social media, you should watch the Netflix documentary The Social Dilemma.”15


Can we curb these problems with legal means?
Democracy is based on collective intelligence, and collective intelligence requires a high degree of diversity. That diversity, however, is being reduced by today's personalized information systems, which reinforce existing trends.16 These algorithms operate for the wrong purpose and are managed too centrally to have positive effects on a democracy. We stand at the cradle of a digital revolution – at a crossroads, so to speak. Which way we are headed is up to us.

We need the legislators' help to implement guidelines on which this new digital democracy can be built. Technologies that can affect our opinion must have built-in guarantees that facilitate self-determination, not only in theory but also in practice. Sophisticated systems could help by weighing multiple criteria to ensure the quality of the information on which we base our decisions. The applicable algorithms should therefore be selectable and configurable by the user. This opens up a diversity of perspectives and makes us less prone to manipulation by distorted information.17 Such a system can only work if, in the absence of this built-in guarantee, a complaint procedure is available, followed by possible sanctions in the event of a breach.
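What "selectable and configurable by the user" might look like can be sketched as follows. All names and the scoring formula are illustrative assumptions, not an existing system: a single user-controlled parameter blends familiarity with novelty, so the same feed can be tuned from a pure filter bubble to a diversity-first view.

```python
def configurable_feed(articles, topic_counts, diversity_weight=0.5, k=3):
    """Rank articles by a user-tunable blend of familiarity and novelty.

    diversity_weight = 0.0 reproduces a pure filter bubble;
    diversity_weight = 1.0 favours topics the user rarely sees.
    Hypothetical sketch of a user-configurable ranking guarantee.
    """
    max_count = max(topic_counts.values()) if topic_counts else 1

    def score(article):
        familiarity = topic_counts.get(article["topic"], 0) / max_count
        novelty = 1.0 - familiarity
        return (1 - diversity_weight) * familiarity + diversity_weight * novelty

    return sorted(articles, key=score, reverse=True)[:k]

counts = {"sports": 5, "politics": 1}  # the user's click history by topic
inventory = [
    {"title": "Match report", "topic": "sports"},
    {"title": "Budget debate", "topic": "politics"},
    {"title": "Art review", "topic": "culture"},
]
bubble = configurable_feed(inventory, counts, diversity_weight=0.0, k=1)
diverse = configurable_feed(inventory, counts, diversity_weight=1.0, k=1)
```

The point of the sketch is regulatory rather than technical: once the trade-off is a visible, user-set parameter instead of a hidden platform default, self-determination becomes something a supervisor can audit.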

Conclusion: keep faith in journalism
Is there still freedom to choose – as Roosevelt put it and as we guarantee in the European Union – or is this becoming an empty shell due to social media? Immanuel Kant's theory of Enlightenment is relevant here: "the right of individual self-development can only be exercised by those who have control over their lives, which presupposes informational self-determination."18 Whether that is still the case can be disputed.

From a legislative perspective, there must be more safeguards to ensure that information diversity remains high and that people can make effective choices in the information they consume.

From a more individual perspective, we must look for information outside the channels offered by social media algorithms. That said, this is not a plea to be suspicious of all the news you read. Just be aware that such algorithms exist and try to find facts in newspapers and news sites written from different perspectives. Be active in gathering information, be critical: but keep faith in journalism! Only in this way can we maintain the democracy on which our foundational values are based.


Marijn Geurts

Footnotes

1. ‘Europa en Democratie’, www.europa-nu.nl/europa_en_democratie.

2. Article 10 ECHR.

3. P. Eavis, 'How You're Making Facebook a Money Machine', New York Times, 29 April 2016; J.N. Sheth, Social Media Marketing: Emerging Concepts and Applications, p. 3-5 & 13-14.

4. W. Hofmann et al., 'Desire and Desire Regulation', 2015, p. 64-65; compare: W. Hofmann, K.D. Vohs & R.F. Baumeister, 'What People Desire, Feel Conflicted About, and Try to Resist in Everyday Life', Psychological Science 2012, 23(6), p. 582-588, https://doi.org/10.1177/0956797612437426; compare: T. Ayeni, 'Social Media Addiction: Symptoms And Way Forward', The American Journal of Interdisciplinary Innovations and Research 2019, volume 1, issue 4, p. 19-22.

5. Filter Bubble and human rights, p. p-6; Filter Bubbles and Targeted Advertising, New York Times Editorial Staff (ed.), Rosen Publishing Group 2019, p. 107. These algorithms also grow more advanced because they are self-learning.

6. D. Helbing e.a., ‘Will Democracy Survive Big Data and Artificial Intelligence?’ Scientific American, 25 February 2017.

7. E. Pariser, The filter bubble: what the internet is hiding from you, Viking, London: 2011.

8. 'News Use Across Social Media Platforms 2018', Pew Research Center, 10 September 2018.

9. CBS, ‘Steeds meer ouderen maken gebruik van sociale media’, 20 January 2020.

10. D. Helbing e.a., ‘Will Democracy Survive Big Data and Artificial Intelligence?’ Scientific American, 25 February 2017.

11. D. Helbing e.a., ‘Will Democracy Survive Big Data and Artificial Intelligence?’ Scientific American, 25 February 2017.

12. The Editors of Encyclopaedia Britannica, ‘Dictatorship’, Encyclopædia Britannica, 21 May 2020.

13. D. Helbing e.a., 'Will Democracy Survive Big Data and Artificial Intelligence?', Scientific American, 25 February 2017; J. Lepore, 'The Hacking of America', New York Times, 20 September 2018.

14. Filter Bubbles and Targeted Advertising, New York Times Editorial Staff (ed.), Rosen Publishing Group 2019, p. 98 & 108.

15. S. Schimmelpenninck, 'Wie nog steeds naïef is over de ellende van sociale media: kijk de Netflix-documentaire The social dilemma', de Volkskrant, 13 September 2020.

16. D. Helbing e.a., 'Will Democracy Survive Big Data and Artificial Intelligence?', Scientific American, 25 February 2017; Filter Bubbles and Targeted Advertising, New York Times Editorial Staff (ed.), Rosen Publishing Group 2019, p. 110.

17. D. Helbing e.a., ‘Will Democracy Survive Big Data and Artificial Intelligence?’ Scientific American, 25 February 2017.

18. Immanuel Kant, ‘Beantwortung der Frage: Was ist Aufklärung?’ (1784) 12 Berlinische Monatsschrift 481, Translated by Mary C. Smith.
