Guest Post: What's Not to Like? Social Media and the Capitol Hill Riots

The recent riots on Capitol Hill are the culmination of years of our obsession with social media and the free ride that it offers. How have we come to a place where a country is so divided against itself, with each side convinced that it holds the truth? It's not just America that has the problem; communities around the world are increasingly divided, their views polarised by multiple versions of the 'truth'.

Social media promises to be our friend, to connect us to the world – what's not to like? Yet it all too easily ends up sucking us into a virtual world that separates us from reality. How is that possible? How can a civilised society be so gullible? It's all down to the algorithms – the software programs, which some like to call AI, that learn from your every word and click online. They are designed to draw you in by nudging, suggesting and filtering your news feed, all to the end of increasing what the marketeers call 'engagement'. If Capitol Hill teaches us one thing, it is that these platforms are not really our friends! Neither, probably, is the large circle of virtual 'friends' that we have ended up connecting with.

On the surface the free connectivity that social media provides seems benign, but there is no such thing as a free lunch. The underlying business model that pays for your connecting to others is advertising. This might seem irritating rather than corrupting, but the truth is that profits stem from increased user engagement, and the algorithms used to achieve it don't care if you are viewing fake news or being radicalised in the process. This is the dark side of AI.

Based upon an understanding of human psychology, these algorithms learn from our online activity how to nudge us in directions that are likely to achieve greater engagement – leading us down rabbit holes, sucking us into someone else's view of the truth. The more we engage, the more we are drawn into a group with similar views, isolating us from other perspectives – ultimately, cutting us off from objective reality. This social bubble, and the news feed that it generates, becomes our reality.
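The feedback loop described here can be made concrete with a toy sketch – this is purely illustrative, not any platform's actual code, and the post names, topics and functions are invented for the example. A feed ranker scores posts by overlap with what the user has already engaged with, and each click reinforces those topics, so the feed narrows towards one viewpoint over time:

```python
# Illustrative only: a toy engagement-driven feed ranker showing how each
# click narrows the feed towards the user's existing interests.
from collections import Counter

posts = [
    {"id": 1, "topics": {"politics", "conspiracy"}},
    {"id": 2, "topics": {"gardening"}},
    {"id": 3, "topics": {"politics"}},
    {"id": 4, "topics": {"sport"}},
]

def rank_feed(posts, profile):
    """Order posts by overlap with the user's engagement history."""
    return sorted(posts, key=lambda p: -sum(profile[t] for t in p["topics"]))

def record_click(profile, post):
    """Clicking reinforces the topics just shown - the 'rabbit hole' step."""
    for topic in post["topics"]:
        profile[topic] += 1

profile = Counter()
record_click(profile, posts[0])   # the user clicks one political post...
feed = rank_feed(posts, profile)  # ...and political posts now rank first
print([p["id"] for p in feed])    # → [1, 3, 2, 4]
```

After a single click, both political posts rise to the top of the feed; every further click on them widens the gap, which is the self-reinforcing bubble the paragraph above describes.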

 

On 6th January 2021 the result was violence and death on Capitol Hill and a fundamental challenge to the democratic process – all because one group had come to believe, through social media, that the election was stolen from them, yet with no real evidence to substantiate that belief. The actions of the CEOs, bowing to public pressure and turning off accounts, do nothing to stem this tide; rather they victimise the group who believe that free speech is being denied them, fuelling the polarisation and the conspiracy theories.

Many are calling for the social media giants to take responsibility for what they publish, but this is to miss the point. These platforms are not traditional online publishing houses; they are advertising platforms, offering an alluring and free connection to the world simply to sell us other people's products. They are agnostic about what people say or believe. All that matters is that users spend more time online and see enough targeted adverts to justify the funding provided by advertisers to the platform vendor. It's not the CEOs who control these platforms, even though the grandstanding gestures of shutting down Donald Trump's accounts in the aftermath of Capitol Hill might lead us to believe that they do.


It's the algorithms that are in control, determining what we will see in our news feeds, nudging, suggesting and manipulating us to spend more time connecting with others. The very nature of this virtual world has other negative impacts on our humanity: we compete for the attention of others, leading to a constant desire to shape our posts in order to be 'liked' and to garner more friends. In younger people this pressure can lead to isolation, depression and despair.

Social media platforms are too easily exploited by people who have a malign intent, or those who want to gather a crowd around their own ideology. Whether it's helping people to commit suicide, fomenting terrorist acts or influencing elections, these individuals or groups are simply using the platforms in the knowledge that the platforms will do much of the work for them. Nothing has been hacked; the platforms themselves provide the tools, designed to manipulate and exploit our human nature by appealing to our vulnerabilities and desires. It's a short step to the creation of ever-growing clusters of users who participate in the propagation of ideologies, conspiracy theories and fake news.

Amongst all these users there are, of course, many innocent and genuine exchanges, whether between family members and friends or even much larger groups of 'friends' who clearly can't all know each other. Many aware users will seek to avoid being manipulated and sucked down rabbit holes, or posting unfair or even damning comments about others in or outside their group.

Anyone's voice can be heard and amplified through the clustering of adherents on social media, so how do we discern where the truth lies? The recent US election illustrates the problem all too clearly, with claims and counter-claims about election fraud and irregularities, all supported and amplified through social media. The danger is that we become polarised, not by the facts, but by the perception that one particular group has the truth – often because their viewpoint plays into a discontent or a desire for another outcome. Social media propagates and amplifies crowd-speak: I saw it on social media, so it must be true. This trend has not been helped by the shift that has occurred in mainstream news media, from simply reporting verified facts to a greater emphasis on analysis and the opinions of pundits and celebrities.

The riots on Capitol Hill are likely to increase the desire of governments around the world to determine what views are legitimate and what can be said publicly – even within our homes. We will hear of increased pressure on companies, even legislation from governments, to force social media companies to censor the content of their platforms. It's a dangerous precedent to allow CEOs or governments to control what we post, what we read, or who can have an account, because it will be the death of free speech. These platforms cannot operate in the same way that an online news agency might, selecting its journalists and editing its news feeds. There has to be another way.

As a society we are losing the ability to hear and debate different views and to discern fact from fiction – or fake news from genuine news. Part of this is down to the virtual nature of social media, where people feel insulated from the real person and entitled to express views about them, often rudely and aggressively – views that they would be reluctant to express were they standing in front of them. There is no face-to-face challenge, no opportunity for rebuttal or examination of the facts. Witness the countless Twitter storms that ensue, often from innocent comments taken out of context, that de-platform speakers and cause people to be fired or pressurised to resign.

We have lost the art of debate and negotiation, which can only genuinely take place when we are physically together, in ones and twos or in small groups. The virtual world has taken this away whilst deluding us into thinking that we are more connected to each other than ever before – it's a lie!

Social media insulates us from the patience and commitment that are required of real and authentic relationships. It insulates us from the messiness that is a natural part of genuine relationships, forged over time, with the inevitable disagreements and misunderstandings that are a result of our fallen nature. We are called to love our neighbour as ourselves, to prefer others, rather than putting ourselves and our version of the truth on a pedestal at the expense of others. Social media feeds our egos and exploits the darker side of us, amplifying our fallen nature rather than encouraging the putting off of self and our transformation by the renewal of our minds. These virtual worlds that we inhabit are gradually destroying our soul – the real me that lurks inside.

So, what are we to do? I believe that the fundamental problem lies in the business model, which intentionally uses manipulative algorithms to drive the advertising profits that underwrite our free use of the platform. Now that half the world is on social media and uptake is plateauing, engagement is the thing. This requires the modelling of user data by algorithms that are largely agnostic to the content but, as we have already pointed out, are fine-tuned to exploit your vulnerabilities. Of course, vendors claim that algorithms can be trained to spot fake news or dangerous content, and to a degree this is possible, but it's not the solution – not least because, for the reasons outlined earlier, who will decide what can be spoken of?

Fundamentally, the platform and its algorithms are the problem. Social media provides a free and simple platform for anyone to connect and grow a community, using the well-established techniques of nudging and engagement – whether they realise it or not. Perhaps little harm results where these groups are truly benign, but the Capitol Hill riots demonstrate the radicalising, destabilising and dangerous impact on society at large that can occur because of the very existence of social media platforms.

Removing the algorithms themselves might seem to offer a solution, but it is unlikely to be accepted by advertisers wanting ever more precise targeting, nor by the platforms' owners wanting to increase profits. Breaking up the large companies won't solve the problem either, if the business model and the algorithms remain intact. Making the platforms responsible for their content is impractical and simply creates another problem for free speech.

Perhaps the only safe way forward is to move to a paid-for model of social media, where data is kept private to the user and there are no algorithms nudging us and feeding us with whatever drives our engagement and purchases online. Of course, we could all just turn off our accounts and engage with people face to face! [Safely, depending on the COVID regulations in your area!]



Jeremy Peckham's new book, Masters or Slaves? AI and the Future of Humanity is one of IVP's January 2021 Releases. This guest blog post is Jeremy's own, and is shared here with permission.