Five Stars? Saving Our Souls on Social Media

The following is a transcript of a talk given by Gerry Lynch, Director of Communications for the Diocese of Salisbury, and strategic communications consultant to Sarum College, as part of the ‘Aspects of Spirituality’ series of talks at Wimborne Minster, 15th November 2017.

It’s hard to avoid the reality that social media is changing the way we relate to one another, changing how we experience and process the news, and changing our culture.

Just this week, the news has been dominated by allegations of Russian interference in the elections that produced the two greatest political shocks in the Western world for decades: the referendum on EU membership here in the UK and the election of Donald Trump as President of the United States.

I will look at how social media might be affecting our political culture. But I’ll spend more time examining whether or not social media is changing who we are. Is it making us more narcissistic, less patient, and less tolerant of other opinions, as is often claimed? Is social media corroding our souls?

As well as looking at present issues, I’ll put them in a longer historical context, looking at previous revolutions in communications and the changes they’ve unleashed. But first let’s turn to fiction – often the deliverer of deeper truths than cold hard facts, and that is probably even more the case in this era of paid-for fake news.

For I Have Seen the Future: On Netflix

Black Mirror’s Nosedive: (C) Netflix. Image used for criticism or review.

Imagine a near future world where everyone, you included, can rate one another instantly out of 5 stars using a mobile phone. Imagine, more than that, that everyone’s aggregate scores are instantly visible using special contact lenses. Everyone you meet can give you a rating, and you can return the favour – or the insult.

Imagine further that these scores are legally recognised, essential to daily life, and that real world rights and privileges are governed by those ratings: the best neighbourhoods and fanciest golf clubs are open only to the highly rated. Those with low ratings are widely regarded as anti-social, and can even fall foul of the law.

This was the premise brilliantly satirised in Nosedive, a 2016 episode of Charlie Brooker’s dystopian Black Mirror drama series. Set in a near-future world imprisoned in a social media panopticon, the episode follows Lacie, a young woman on the make, a terrible social climber desperate to boost her rating from its current 4.2 to the 4.5 necessary for upper-middle class status.

In what the Daily Telegraph’s reviewer described as a “pastel-coloured nightmare”, people fixate on their social media presence to the point of insanity. Overtly aggressive behaviour is almost unknown; so are racism, sexism and homophobia. All come with instant and debilitating social penalties.

Instead, passive aggression rules the day. Afraid of showing the slightest sign of unsocial behaviour, ‘respectable’ people in public are obsequiously nice and utterly false. The well-heeled spend their lives bitchily manœuvring to one-up one another, and most other people desperately struggle to join them.

Only those who refuse to join the rat race, whatever the social disadvantages they might incur in the process, are truly happy, and for many, liberation only comes after a brutal crash to zero rating: like an alcoholic hitting rock bottom.

Like all science-fiction dystopias, Nosedive draws its power precisely from working on two levels. It presents a frightening picture of what our future might look like, but it also satirises the present and, indeed, the enduringly flawed nature of the human condition.

Social climbing is nothing new, nor is the tendency of elite groups to keep outsiders at bay through arcane codes of social behaviour. We’ve long been used to measuring our status numerically, through money, or with other over-simplified concepts, like class.

Yet, some fundamental aspects of human relationships have changed in the decade that we’ve been living with social media. There are genuine grounds for concern about how this technology is eroding the boundaries between public and private life, and between work and leisure. In this regard, it isn’t social media on its own that is worrying, but the explosion of mobile phones and small, light, tablets that make it perhaps too easy for us to carry our social media lives with us everywhere – even to bed and to the loo.

There are also legitimate concerns about how it might make it far too easy for an authoritarian régime to spy on us. Other worries include that it is shortening our attention spans and making us less capable of complex thought or, as I have already mentioned, that it is eroding democracy.

Is social media a threat to our souls? It certainly could be. But is it something that is necessarily negative? We’ll come to that. But first, let’s set the current revolution in a longer historical context.

Communications Revolutions Drive Other Revolutions

Johannes Gutenberg in a 16th-century copper engraving

It’s appropriate that this talk is taking place within a few weeks of the 500th anniversary of the start of the Reformation. For it was a revolution in communications technology – Gutenberg’s introduction of movable type into Europe in the middle of the 15th Century – that was a major driver of that great disruption in what had been a seemingly imperturbable Western Catholic order.

There are significant parallels with the emergence of the internet, which might be worth keeping in mind this evening.

Firstly, the Printing Revolution, as it is often called, broke up what had been concentrations of power over the dissemination of information. Until that point, manuscripts were the only way of disseminating large amounts of information over distance. Manuscripts required intense, lengthy, labour by highly skilled scribes, and therefore were extraordinarily expensive.

The Church was one of the few institutions capable of producing manuscripts on a large scale; monasteries were perhaps the biggest source of scribes.

The end of that monopoly was a driver – and only one driver – of the Reformation. Especially on these islands, we have tended to absorb a rather whiggish view of this process: that once materially able to do so, people read the Bible for themselves, discovered the Church had been lying about the contents, and ushered in a new era of free inquiry, real faith, progressive enlightenment and, ultimately, secularism.

In reality, things were a lot more complex than that: scientific enquiry had flourished at deeply ecclesiastical scholastic universities from Oxford to Kraków to Bologna long before the Reformation. Much was lost as well as gained at the Reformation, among Catholics and Protestants alike.

Rather than get into that tonight, let us simply note that this breaking apart of a narrow oligarchy with enormous control over communications ultimately destabilised the most powerful institutions of the day, and led to a new political order and new ways of seeing the world. Not just among Protestants: the Roman Catholicism that emerged after the Council of Trent was profoundly different to that which had existed before Luther.

A second major change is less well celebrated. In the days of manuscripts, every book was unique. Even when it was of the same text and produced by the same scribe, there were always variations from one work to the next. The printing process ensured, in contrast, that the same information fell on the same pages. So, for the first time, page numbering, tables of contents, and indices became common, though they had not been entirely unknown before.

Indexing led to more powerful ways of scientific and philosophical enquiry, allowing different authors writing on the same topic to be cross-referenced easily. True reference works and new syntheses of ideas became possible.

The third key change I want to highlight from the Printing Revolution, and this is one that is very counterintuitive, was the decline of Latin as the language of most published works.

Manuscripts were so expensive that they were only ever going to be read by a tiny intellectual élite: and knowledge of Latin was a prerequisite for membership of this élite. Although on one level, this was exclusive, it also meant a common intellectual and ecclesiastical culture permeated Western Christendom. A literate monk up the road in Cerne Abbas could easily communicate with counterparts from Portugal to Finland.

Printed books were cheaper – not exactly cheap, but more within the reach of the burgeoning urban educated classes of medieval Europe. So books came to be produced in the vernacular language of each area. New Bible translations meant some languages were written down for the first time, like Finnish and Slovene. At the same time, dialects within a particular country began to be standardised, especially in Protestant countries, through the impact of standardised Bible versions. Think of the influence of the Authorised Version of 1611 in this country or Luther’s Bible of 1545 in what would become Germany.

Indeed, some languages were differentiated from one another clearly for the first time only as a result of the impact of printing, and especially of Bible translations: German and Dutch are the most obvious examples.

So, this universal technology broke up an old universal order, differentiated countries from one another, but at the same time began to homogenise cultures within the territories where each language was spoken.

Ultimately, printing drove wars too. As we all know, as the old order died, impassioned debate gave way to power struggles and then to some of the most ghastly conflicts in human history, which are still used to argue against Christianity to this day.

Let’s fast forward four centuries, and look at what happened when the cinema took the world by storm in the early part of the 20th Century. For the first time, moving images could be recorded and transported to every town and village in the world. And all of these moving images were, at that point, carefully curated and edited for a reason: film was too expensive and lighting too difficult for any other possibility.

Eisenstein’s masterpiece, Battleship Potëmkin (USSR, 1925).

As a propaganda tool it was unmatched, and it seemed to be the most authoritarian régimes that saw its potential most quickly. It’s not credible to argue that cinema played a role in destroying the monarchical anciens régimes that were swept away by the First World War; the role that it played in sustaining the fascist and communist régimes that mostly replaced them is, however, very clear.

Genius directors like Leni Riefenstahl and Sergei Eisenstein hailed their governments as bringers of order, liberators of the poor, and overthrowers of tyrants.

Radio, too, played an important part in spreading ideas in a new form that people found compelling. Here, perhaps the liberal democracies were quickest to the mark, with the BBC establishing a global reputation and the fireside chats of Franklin D Roosevelt marking him out as the great communicator of his time.

Hollywood would eventually turn cinema into a vehicle for spreading American values worldwide, but only after World War Two, another conflagration that would set new standards of human brutality.

Cinema and radio would, after that war, finally be a clear driver of the collapse of an old empire – in this case the old European empires in the developing world and rulers of independent states who were too close to them. Movies like The Battle of Algiers celebrated and spread from country to country a vision of the implacable force, courage, and unwillingness to compromise of Third World liberation movements. Little clandestine radio stations from Cuba to Vietnam argued for liberation from foreign oppressors and the domestic ruling class alike. The world of 1964 was practically a different universe from that of 1914.

What is interesting about both these examples is that the initial disruption quite rapidly died down into a new and more stable order. While many institutions were destroyed and those that survived had to change radically in order to do so, out of the initial chaos new and often enduring institutions emerged, whether the Anabaptists or a united Communist Vietnam.

I’ll posit a tentative theory here: when a new communications technology emerges, the first generations that encounter it find it difficult to process. In particular, the new way of presenting information is so compelling and authoritative that people find it difficult to use their critical faculties well. They find it hard to tell propaganda and exaggeration apart from fact.

As people gain experience of the new technology, especially once a new generation comes along who have never known anything else, they become much cannier about understanding who is attempting to manipulate them and how. Of course, propagandists and advertisers develop too, but a technology never quite has the same power to move people as it does in its first generation or two. And then, inevitably, a new order beds down into stability.

The Internet Revolution

And so we come to the communications revolution of our own time, which began a few months after the first man landed on the moon, in October 1969, when two university computer labs in California were connected by a live telephone link.

Connections developed slowly, at first only in the United States and for a long time with much penetration only among students and academics. There were some remarkable experiments in stand-alone networks for private customers, notably in 1980s France, but the internet as we know it started to emerge in the mid-1990s.

In 1994, in the summer holiday between my Sixth Form years, a friend who was a PhD student sneaked me into a computer lab in the library of the Queen’s University of Belfast, and I had my first encounter with the internet. At that point, it was mostly text only, and connections were so slow that the few low-resolution photos that were used would load line-by-line.

Yet, by the end of that decade, the internet had become ubiquitous.

The change we are concerned with today, social media, really emerged in the mid-2000s. Before that, even the internet had been a relatively top-down place. Sure, a few nerdy, mostly very young, people had their own websites on which they wrote about their favourite hobbies or their political views – I know, because I was one. But people needed to know at least a little computer code to do that effectively. Most traffic on the internet was still concentrated on a relatively small number of professionally produced sites, often owned by big corporations like the BBC or Time Warner Entertainment.

When platforms like Facebook emerged in the mid-2000s, they did two things that made it possible for people to create and share their own content easily. Firstly, they provided everyone with a common, easy to use, and relatively attractive platform with which to share their words, photos and movies.

Secondly, they made it easy for people who knew one another from other spheres to connect their social media pages on the internet – that’s why the word ‘social’ is used.

In not much more than a decade, we have reached the point where a majority of the world’s adults connect to some social media network or other on any given day.

Facebook is by far the biggest network, with just over a billion individual users active daily, including 30 million in the UK. Twitter comes next; 20 million people in the UK tweet at least once per month, and 328 million worldwide. Instagram is catching up fast, with a heavily female user base and 14 million accounts in the UK active daily. And then there’s Pinterest, Snapchat, and the rest of them. I even know someone who got a job through LinkedIn, once.

And in China, behind the Great Firewall, they have QQ and Weibo; in Russia, they can and do use the same networks we do, but VKontakte takes pride of place.

Despite what you read in the papers sometimes, there is no sign yet of a great abandonment of social media by young people. Facebook is used almost universally, although young people are cautious about what they say on a network where they are expected to use their real names, and on which they’re probably connected with their Mum and, these days, their granny. Semi-anonymous networks with time-expiring posts are what young people use to share those photos they’d rather not let any adult see!

And as I hinted at earlier, we cannot detach the explosive growth of social media from the emergence of smartphones, and to some extent tablets and light laptops, which allowed people to take the internet with them everywhere.

What do young couples in the 2010s do in bed at the end of a long day? Well, he watches science fiction movies on his tablet, while she’s on her phone, looking at food photos on Pinterest.

A Threat to Our Souls?

Let me go back to my original bold question – is social media, turbo-charged by smartphones, a threat to our souls?

For individuals, it certainly could be. For humanity collectively, there are real risks. It isn’t all doom and gloom, however, and I want to make that abundantly clear. In recent years, Facebook is where people have wept with me at times of great pain and loss, and rejoiced with me when I had something to celebrate. I have made good friends, around the world, whom I would never have known otherwise. It connects grandparents with their families in Australia, and opens up the world to millions of bedbound and physically isolated people.

It cannot ever be a replacement for face-to-face contact. But it can enhance it.

Yet, let’s look at the most serious of those dangers that threaten our souls in the internet era. These are anger, mobs, addiction, status obsession, control and the panopticon.

I wonder if the Churches should be teaching the faithful how to spot the warning signs of these tendencies in themselves and how to pass that information on to others?

(C) xkcd.com. This work is licensed under a Creative Commons Attribution-NonCommercial 2.5 License.

Anger seems to be set off by the tiniest things online. If a story I put on my Facebook page is widely shared, especially a political one, I’m constantly fascinated by the willingness of some people I don’t even know to swear a lot on my page using their real name. The worst offenders are usually people who are agreeing with me. It’s the cyber equivalent of a stranger coming into my front garden and shouting through my front door that they agree with me. With lots of added ‘f’ bombs.

Social media seems to have crystallised an odd aspect of human psychology – feeling self-righteous anger seems to give us a chemical hit that is intoxicating. As online marketing professionals say, “Angry people click”. With newspapers and other traditional media outlets in deep trouble, it is too tempting for sub-editors to turn their headlines into misleading but effective drivers of traffic, and therefore of advertising revenue.

Connected with anger is the tendency of people on social media to act in mobs. Cyberbullying is, as we all know, a sad reality of children’s lives that emerged along with the internet. With technology following us everywhere, it can be hard for children to escape what would once have been confined to the playground or classroom. Nor is cyberbullying something we necessarily grow out of.

Even before the emergence of social media, the great computer pioneer Jaron Lanier, effectively the inventor of virtual reality, had already noticed the tendency of people in online spaces to bully holders of minority opinions and fight other strong mobs in verbal pitched battles.

“I wonder if some aspect of human nature evolved in the context of competing packs”, he wrote in 2006, before social media really took off, “We might be genetically wired to be vulnerable to the lure of the mob. What’s to stop an online mass of anonymous but connected people from suddenly turning into a mean mob, just like masses of people have time and time again in the history of every human culture?”

Thinking people of all shades of opinion are disturbed by this cybermob phenomenon precisely because most of us have been part of it from time to time.

Addiction is another problem. The average smartphone user checks their phone a whopping 80 times per day. I found that hard to believe, until I started trying to tally how many times I checked it.

Some significant social media features have been designed specifically to be addictive. For example, the use of the colour red with a number to show how many new alerts one has to check; the ‘Like’ button on Facebook; and the pull-to-refresh gesture that Twitter uses on smartphones. All seem to deliver a quick hit of dopamine to our brains.

Recently, a spate of programmers heavily involved in the development of Facebook and Twitter have begun voicing concerns about the way these and many other social media apps are intentionally being made more addictive. Most famously, Justin Rosenstein, who created the Facebook ‘Like’ button in 2007, recently deleted the app from his mobile phone.

There’s also evidence that even the potential distraction of working with a smartphone present makes it more difficult for people to concentrate and to assimilate complex thoughts.

Social media can be a wonderful boon, but surely it is time for consumer pressure on companies to rein in the way their software is increasingly made to be deliberately distracting and addictive.

(C) Kjyrstenolson under CC BY-SA 4.0.

Status obsession is hardly new. Social media turbo-charges it by giving everyone numerical ratings – follower counts and likes – that are easy to compare instantly. And it can affect real world outcomes – I’m convinced that having a relatively high number of followers on Twitter helped land me this job!

We should have a good idea of the negative ways this can affect people: money has, of course, long been for many people a numerical measure of their worth and comparative status. And as St Paul taught us, the love of money is the root of many kinds of evil.

But it’s more than just status envy. Some people’s online ratings now have significant control over their lives. Those who make their money from two of the most pioneering companies of the social media era, AirBnB and Uber, can have their livelihoods ruined by a few bad ratings. And both can be rated out of 5 stars at the end of every encounter.

An AirBnB host might invest thousands of pounds in furnishing a guest room, installing a second bathroom, etc., yet can see it all wasted if an aggressive guest takes a dislike to them. Similarly, if Uber drivers have their ratings fall below a certain threshold, they are subject to retraining or, if Uber has too many drivers in that market, they are simply dismissed.

The ratings game works both ways, which people don’t always realise. This year I managed to take two of the best holidays I’ve ever had: three weeks in Southern Africa and three weeks in Eastern Europe. I was able to do this by cutting my costs to the minimum, taking the cheapest AirBnB option with a private sleeping space available. Uber was a life saver in South Africa, much better and cheaper than official taxis in a country where wandering around strange cities isn’t a smart idea.

I didn’t realise for some time that Uber drivers also rate their passengers. And my rating after a few weeks in South Africa wasn’t bad, but it wasn’t wonderful either. So every taxi trip became a trial of politeness and judging the driver’s personality, aiming for 5 stars, which rather undid the relaxed holiday feel.

A final big worry is the panopticon effect. The panopticon was a type of prison devised by the philosopher Jeremy Bentham in the late 18th Century, where every prisoner in every cell was visible from a central control pod with minimum effort. Are we making an online panopticon for ourselves where we are our own gaolers?

Few people realise just how much revealing information they put online, even if they don’t use social media. Unless you’ve tinkered with your settings, your phone is probably constantly relaying your current location to Google, allegedly so it can provide you with the maps and other services you need. This data is stored for a few weeks at least, and looking at your own movement maps can be very revealing.

Similarly, a lot of people’s social media activity is a great way of working out their sleep patterns and whether they skive at work!

The panopticon principle for prisons was abandoned when it turned out that it drove a lot of prisoners mad.

A Threat to Democracy?

The question of the moment, however, is whether smartphones are a threat to democracy. I reiterate that I think the immediate problems will pass as people become accustomed to information arriving this way. At present, too many people are inclined to see their phone or their computer as some sort of oracle: an attitude brilliantly incarnated in David Walliams’ superb Little Britain character and her catchphrase, ‘Computer says no’.

I think the more subtle threat is some of the ideology that has become popular with the emergence of computer networks.

The idea that leaderless networks inevitably come to right decisions and discover new and better ideas has become ubiquitous. It dominates majority thinking in economics and science alike. It is an idea that has been recycled and concentrated ever more powerfully by each generation in recent decades: starting with the 1960s counterculture, passing from them to the free market capitalists of the 1980s, and on to today’s techno-utopians.

It promises us progress without sacrifice and, in saying that the network ultimately makes the right decision if it has complete information, it removes the virtue of moral courage. It is also rather narcissistic. Nothing encapsulated the ideology of leaderless networks better than the 2006 Time magazine Person of the Year – ‘You’.

The mid-2000s was the apogee of a sort of cult of the leaderless network, and as an interesting side point, also the high point of new atheism. From 2008 onwards, a series of economic and political crises shook our faith in any inevitable era of progress through a harmonious network of technologically connected humanity. In particular, in the early 2010s, the limitations of leaderless networks were exposed in a series of political crises.

Human Microphone at Occupy Wall Street. (C) David Shankbone under CC BY 3.0.

The Occupy Movement and the 2013 anti-corruption demonstrations in Brazil were organised by social media and eschewed leaders. Occupy Wall Street even used a system of ‘human microphones’ to relay messages democratically among the crowd, so hostile was it to the very idea of platform parties with speakers, and the hierarchy they implied.

The Arab Spring was the most significant and telling of these early 2010s social media-driven movements. Facebook and Twitter were the tools that young people used to overthrow authoritarian régimes in Egypt and Tunisia, desiring the liberty that they could see – not least thanks to the internet – was the norm just a few hundred miles across the Mediterranean. At first it looked like the dawn of a new, genuinely global, democratic order.

In Bahrain, a similar series of protests was brutally crushed, with the government making effective use of sectarian tensions between Sunnis and Shi’ites to sap support for change. In Syria, the protests set off one of the worst wars of the 21st Century, which still continues more than six years later. In Egypt, the Muslim Brotherhood later won a Presidential election by a wafer-thin margin and rapidly turned to implementing an authoritarian Islamism. In many cases, the same educated young people who overthrew Mubarak came back out on to the streets to ask the military to take over. Only in Tunisia did the social media revolution deliver real change.

In Brazil, the anti-corruption protests did empower the police to launch the biggest graft probe in the country’s history, but this is currently mired in stale political squabbling. The old political parties are still in place precisely where they were.

In Western countries the Occupy movement failed, going out with a whimper rather than a bang. Without leaders, it had no-one to give it direction, and simply disintegrated without fanfare. Arguably, it bore fruit years later as a generation seemed to turn to the left, symbolised by the emergence of Sanders and Corbyn as political leaders. But across the Western world, in the social media age, it has been politicians of the nationalist, authoritarian right who have made the running.

Across the world, far from a common global humanity, the politics of division – white versus black, Sunni versus Shi’a, native versus immigrant, have versus have not – have become dominant, from the United States to the Arab world to the Philippines.

None of these outcomes would have surprised a keen student of previous revolutions. As we noted, previous communications revolutions have tended to crystallise some divisions between people even as they transcended others. People seem to have a need to be part of in-groups and out-groups. The danger is that when people feel their group is threatened by others, they form mobs, as we have discussed. This is not new. The internet certainly did not create this part of human group psychology, but it may allow it to manifest itself in new and newly dangerous ways.

The first leaderless networks to emerge in the modern era also decayed rapidly in the face of human dynamics. These were the hippie communes that emerged in the United States in the late 1960s. Despite an aversion to any formal leadership structures, the strongest personalities rapidly came to dominate supposedly non-hierarchical decision-making structures. If there were no strong leaders, however, the commune soon died anyway.

Without agreed norms, these communes often developed deep problems with predatory sexual behaviour towards women. Creating a new society wasn’t possible by going into the wilderness and cutting oneself adrift. It is only possible through painstaking, patient, work with what already exists.

The Black Mirror

(C) Qurren under CC BY-SA 3.0.

Let us conclude with the Charlie Brooker series with which we started tonight’s talk. Black Mirror is named for the ubiquitous screens we live with and increasingly carry everywhere: our TV and computer screens, phones and laptops. Dark and reflective, they have reached into our souls with extraordinary power over the past decade and allowed us to reveal our nature, as individuals and a species, in ways rarely seen in the past.

There is plenty of good in that picture – sympathy for the lonely, families and friends kept together across long distances, old friends reunited after decades, countless beautiful photographs that uplift someone’s day, or wise sayings, or practical advice.

There is, however, a dark side, and perhaps that should return us to traditional Christian doctrines that became unfashionable. The doctrine of The Fall became terribly passé for several generations, seen as the last refuge of bitter fundamentalists who wanted to see God as their personal hitman.

But, as the old saying goes, original sin is the only Christian doctrine provable by concrete evidence. Online, our dark side is all too apparent – creatures of beauty and love, we are disfigured by self-righteousness, fear of the other, anger, avarice and obsession with status. The future will not, ultimately, be very different from the past: it never is. Human beings are still human beings; the technology at our disposal merely increases our power, for good and for ill.

So, perhaps we should return to the Good Book for the final thought of this evening, for it is as central to our calling as Christians as when St Paul wrote it to a group of bickering, flawed, Christians in the great sin city of the Eastern Empire during troubled and changing times: “Keep alert, stand firm in your faith, be courageous, be strong. Let all that you do be done in love.” (1 Cor 16:13-14) Good advice online and offline.
