
Social media and online behaviour

Interview with Walter Quattrociocchi, Professor of Computer Science at Sapienza University, Rome.

I first came across Walter Quattrociocchi’s name in 2016, through an article published by Le Scienze describing recent results of his research on social media, in particular the process by which misinformation spreads on the web. It showed that users cluster into ‘echo chambers’: groups that tend to select and share content related to a specific kind of news, confirming their own world view, in line with so-called confirmation bias. Quattrociocchi, 40, is now Professor of Computer Science at Sapienza University in Rome, where he runs the Center for Data Science and Complexity for Society. His research focuses on the data-driven analysis of complex systems. His research group has a Facebook page where news of scientific studies and meetings appears amidst cooking tips, with recipes for melanzane alla parmigiana (aubergines with Parmesan) and pasta alla carbonara: an apparently bizarre combination, but one that says a lot about our reactions to the content we expect to find in a specific container. His work on the dissemination of information has informed the World Economic Forum’s Global Risk Report and has been used by high-ranking regulatory and institutional bodies. He has recently served on several international and national technical panels, including the Hate Speech group of experts, and, at the request of the Ministry of Innovation, coordinated the Social and Economic Impact Group of the COVID Data Task Force. He is regularly invited as a keynote speaker at international conferences and institutions. We contacted him to talk about social media, its interactions with the world of politics, and the results of his latest studies on online behaviour.

In a recent interview, President Macron described social media as an element that “removes hierarchical structure from any subject” and is therefore opposed to any type of authority forming the basis for a social organisation such as democracy. Does studying social media, therefore, mean studying the state of health of our society?

The topic of the relationship between social media and democracy was already very hot in 2018, when the Facebook newsroom mentioned one of our studies in order to talk specifically about the effect of ‘echo chambers’ on democracy. I am not at all convinced that any society is based on authority. I think that authority is something that an individual accepts, and that it is not a prerogative. But I can certainly say that within our society, social media has made the information ecosystem uncontrollable, and this has caused many problems. There have been many interpretations of the phenomenon, but often without benefitting from the appropriate tools and skills. Studying social media means examining the greatest change that has taken place in our society over the last millennium or more. What our studies have highlighted is that people, in other words all of us, seek out, and interact with, information that best fits our world view; we ignore conflicting information and we tend to create tribes around shared narratives.

In the world of information and communications, the term ‘disintermediation’ is used, meaning the absence of traditional media that verify information according to specific professional rules. Today, we also refer to infodemics, namely the spread of disinformation, comparing it to a pandemic. In one of your recent papers published in Scientific Reports, you looked at what happened on some social platforms regarding COVID-19 in the initial period of the emergency, from 1 January to 14 February 2020. Can you tell us what you observed?

The information ecosystem is out of control, because it is highly entropic. Access to information is disintermediated, and the platforms themselves now act as gatekeepers, precisely because they sift through the ocean of information and select what is most congenial to us, the users of social media. The infodemic, according to the WHO, is precisely the overabundance of information that occurs during a pandemic. We have all seen how difficult it was to understand what was going on, especially in the early stages of COVID-19. Uncertain information, rebounding from one publication to another, was accompanied by the difficulty of understanding which experts it was important and advisable to pay attention to. For our research team, it was a perfect storm, ideal for studying the spread of information. A 2018 study claimed that fake news circulated faster than real information, but we were not convinced by this idea. In fact, what we found was that reliable and unreliable information circulate in the same way, even on different platforms. Another question we tried to answer was whether pandemic models were a good approximation of infodemic models, and the answer was no. The explanation for this is simple: you can choose information, while a virus does not offer this choice. It is a small difference that completely changes the dynamics of the two processes (and the possible solutions).
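The difference Quattrociocchi describes can be illustrated with a toy simulation. This is not the model from the paper; the population size, contact rate, transmission probability, and the ‘leaning’ threshold below are all illustrative assumptions. The only point it demonstrates is that adding a choice (a confirmation-bias gate on whether a contact accepts the content) can push the same process from epidemic spread to rapid die-out.

```python
import random

def simulate(selective, n=2000, contacts=5, beta=0.4, steps=30, seed=0):
    """Toy spreading process in a randomly mixed population.

    selective=False -> virus-like: any exposed contact can be infected.
    selective=True  -> information-like: a contact adopts the content
    only if it fits their leaning (a crude confirmation-bias gate).
    """
    rng = random.Random(seed)
    # each person has a leaning in [-1, 1]; the content leans positive
    leaning = [rng.uniform(-1, 1) for _ in range(n)]
    infected = set(range(10))   # a few initial spreaders
    frontier = set(infected)
    for _ in range(steps):
        new = set()
        for _spreader in frontier:
            for _ in range(contacts):
                j = rng.randrange(n)
                if j in infected or rng.random() > beta:
                    continue
                if selective and leaning[j] <= 0.2:
                    continue  # conflicting view: the content is ignored
                new.add(j)
        infected |= new
        frontier = new
        if not frontier:
            break
    return len(infected) / n

viral = simulate(selective=False)
info = simulate(selective=True)
print(f"virus-like reach: {viral:.2f}, choice-gated reach: {info:.2f}")
```

With these (arbitrary) parameters the virus-like process saturates most of the population, while the choice-gated one fizzles out: the gate shrinks the effective reproduction number below one, which is the dynamical difference the interview points to.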

In your work you also suggest that the algorithms chosen by social media for interaction with users are the prime cause for the increasing spread of certain types of unverified content. In this case, is there an ‘original sin’ that we should always take into consideration when using social media?

The central focus of our work on social media is the business model that the platforms have introduced. They were created to sell advertising directed at users who spend time on the channels. So the aim of these tools is to maximise the time spent on them and to sell advertising that is as fine-grained as possible, even though techniques for targeting users are still often rather coarse-grained. No one could have foreseen that these platforms would become the main channel for consuming and sharing information, but we are all aware of how that came about. It is no coincidence that the image with the most likes on Instagram is an egg and not a quote by an author.

You have published a paper in PNAS (Proceedings of the National Academy of Sciences) showing that Facebook and Twitter are the channels on which polarisation, the so-called echo chambers, tends to form the most, so that people remain trapped within specific groups sharing the same view of the world and are therefore shielded from information that conflicts with their own positions. Why should we consider this important?

I have been part of a number of government commissions in which I have had the opportunity to discuss the subject of echo chambers with other people, and I have seen that it caused a lot of debate; some considered the concept ‘outdated’, while others, especially in the world of sociology, were enthusiastic about it. In order to find a more convincing answer, my group and I, together with other colleagues, assembled a large body of data to carry out in-depth analyses and compare the ‘echo chamber’ effect across various platforms. The results confirmed our hypotheses. In fact, the interesting aspect, something that we didn’t expect, is that this polarisation seems to be greater on platforms that make extensive use of feed algorithms, in other words algorithms that tend to privilege content in line with our own habits and those of our friends. In other words, platforms influence our behaviour and our diet of information.
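One way to make the echo-chamber effect concrete is to correlate each user’s leaning with the average leaning of the users they interact with: the stronger the correlation, the more segregated the platform. The sketch below is only in the spirit of such comparative analyses, not the method of the PNAS paper; the two-camp network generator, the homophily parameter, and all numbers are illustrative assumptions.

```python
import random

def neighborhood_correlation(leaning, edges):
    """Pearson correlation between each user's leaning and the
    mean leaning of the users they interact with."""
    neigh = {u: [] for u in leaning}
    for u, v in edges:
        neigh[u].append(leaning[v])
        neigh[v].append(leaning[u])
    xs, ys = [], []
    for u, vals in neigh.items():
        if vals:  # skip isolated users
            xs.append(leaning[u])
            ys.append(sum(vals) / len(vals))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

rng = random.Random(1)
users = list(range(400))
# two camps: users 0-199 lean negative, users 200-399 lean positive
leaning = {u: (-1 if u < 200 else 1) * rng.random() for u in users}

def make_edges(homophily, m=3000):
    """Wire m interactions; with probability `homophily` the link stays
    within one camp, otherwise it joins two users at random."""
    edges = []
    for _ in range(m):
        if rng.random() < homophily:
            side = rng.choice([0, 200])
            edges.append((side + rng.randrange(200),
                          side + rng.randrange(200)))
        else:
            edges.append((rng.choice(users), rng.choice(users)))
    return edges

segregated = neighborhood_correlation(leaning, make_edges(0.9))
mixed = neighborhood_correlation(leaning, make_edges(0.1))
print(f"homophilous wiring: {segregated:.2f}, mixed wiring: {mixed:.2f}")
```

A strongly homophilous wiring yields a high user–neighbourhood correlation (an echo-chamber signature), while the mixed wiring yields a weak one; comparing such a statistic across platforms is the kind of measurement the interview alludes to.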

Polarisation is an element that we also find in politics, leading to the emergence of new extremist groups and transforming the mechanism that makes the powerful opposition between the new and the establishment into the founding factor for populist movements. Could social media have contributed to the emergence of this polarisation in political and cultural life, or do they simply reflect what is going on in our society?

I think that it mirrors what is happening in society. Things are moving faster, and inertia means that new things struggle to take hold. Social media makes this discrepancy even more obvious. Facebook and Twitter were seen as good things in the days of the Arab Spring, but as bad at the time of the Brexit vote and Trump’s election. They became good again when Trump was banned. In short, to return to a theme close to my heart, we interpret information in the way that we prefer. The important thing is that it must support our own narrative. This process makes everything far more volatile, and politics often struggles to keep up with the sudden changes in such a vibrant and chaotic social system.

 
