Split Screen is an interesting experiment by The Markup that visualizes the radical difference in the news that US citizens who voted for Joe Biden and those who voted for Donald Trump see on Facebook, and how the political environment on social networks polarizes people and reinforces the echo chamber effect.
The phenomenon is sometimes particularly evident: anyone who belongs to different groups on social networks, whether family, school friends, special interests or hobbies, will probably have sensed that each is a parallel reality, one where information and references come from diametrically opposed spheres, or where levels of usage differ greatly.
At this point, my impression is that trying to discuss the role of social networks in polarizing people around issues makes little sense, because every social network, by definition, tries to maximize its stickiness, meaning the permanence and level of engagement of its users, because its revenues depend on it. This addiction-like quest for engagement at all costs is achieved in two ways: first, through algorithms that evaluate our past behavior so as to serve more of the type of content that has previously made us react, at any level from viewing to clicking or commenting; and second, by taking into account the behavior of our network of contacts, promoting, on the principle that birds of a feather flock together, the same content that has made our contacts react.
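The two-way logic described above, weighting our own past reactions plus those of our contacts, can be illustrated with a minimal sketch. Everything here is an illustrative assumption (the class names, the reaction weights, the 0.5 social multiplier); no platform's real ranking code looks like this, but the incentive structure is the same.

```python
# Hypothetical sketch of an engagement-maximizing feed ranker.
# All names and weights are illustrative assumptions, not any platform's real code.
from dataclasses import dataclass


@dataclass
class Post:
    topic: str


@dataclass
class Profile:
    # topic -> list of past reaction types ("view", "click", "comment")
    past_reactions: dict


# Commenting signals far more engagement than merely viewing.
WEIGHTS = {"view": 0.1, "click": 0.3, "comment": 1.0}


def engagement_score(post, user, contacts):
    # First component: how strongly this user has reacted to this topic before.
    own = sum(WEIGHTS[r] for r in user.past_reactions.get(post.topic, []))
    # Second component: "birds of a feather" -- reactions from the user's contacts.
    social = sum(
        WEIGHTS[r]
        for contact in contacts
        for r in contact.past_reactions.get(post.topic, [])
    )
    return own + 0.5 * social


def rank_feed(posts, user, contacts):
    # Show first whatever is predicted to provoke the strongest reaction.
    return sorted(posts, key=lambda p: engagement_score(p, user, contacts), reverse=True)
```

The feedback loop is visible in the structure: every reaction raises the future score of similar content, for the user and for everyone connected to them, which is exactly the self-reinforcing dynamic the article describes.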
This all generates social dynamics whereby the most radical member of the community tends to gain prestige and visibility, to become a reference. Sharing is no longer about what you believe: we share because it reinforces our beliefs, because it justifies our own radicalization; sharing becomes just another element in a supposed battle, an argument in a hypothetical discussion.
The result is what we all know: we are increasingly attracted to content based on certain interpretations, and we keep commenting on that content and confirming our biases with our network, until we have become radicalized, cheered on by our friends in peer-pressure mode, and end up buying plane tickets to storm the Capitol, intending to take pictures and videos that further boost our prestige within the group. Sure, those people were not exactly innocents forced to commit a crime, but to doubt that their social environment influenced their behavior is to deny reality. Facebook is to all intents and purposes the best-selling newspaper in some countries, and its editorial line is an amorphous set of beliefs that each of us can choose from and that eventually radicalizes people. The Markup experiment is clear proof of this.
The result of these recommendation systems is, on the one hand, to reinforce our biases and, on the other, to foster the belief that those biases are representative of wider society, or of a very significant part of it, which is perceived as standing in opposition to another part that is "the enemy". If that isn't spreading division in society, I don't know what is. The evidence is clear: the massive adoption of social networks has run in parallel with growing polarization in society, to the point that this polarization is now being widely exploited by any number of organizations and interests.
As long as we pretend that this increased polarization of society and these echo chambers are mere byproducts for which social networks bear no responsibility, rather than something they have quite consciously provoked and from which they obviously benefit, we won't be able to do anything about correcting it.
The thing is, it doesn’t have to be this way: polarization and echo chambers have always existed, but we have never, as a society, dedicated ourselves to fostering and encouraging them in such a way. And we have known for quite some time now that this level of division can bring absolutely no good.