
There is an instinctive, basic trust within all of us, something that is essential to the ease with which we move through our daily lives. Think for a moment about what you trust. I trust that the alarm will wake me on time each day. Making my morning coffee, I trust that the utility companies continue to provide electricity, that what is labeled as coffee is genuinely coffee, and that the cup will not break. Think of the sense of utter betrayal when one of those conditions is not fulfilled! I still remember rushing to get up when a power failure reset the alarm, breaking my favorite mug on my 18th birthday, and running out of the good coffee on the morning of a big meeting.

As a U.K. citizen, I also have an instinctive trust in free speech. Freedom of speech is enshrined in law worldwide under the United Nations’ Universal Declaration of Human Rights, and upheld in varying degrees nation by nation, from the First Amendment in the United States to individual enactments of human rights legislation. Because free speech is such a basic right in the U.K. and the United States in particular, we assume that people exercise this right in order to express genuine fact and opinion. It’s rare to question free speech. Perhaps it’s time we started.

Leaving the house each morning, I place my trust in the manufacturer of my car to fulfill their promise of comfortable, safe, and legally compliant vehicles. Until very recently, there was a general consensus among car buyers that Volkswagen and their partners fulfilled and even exceeded their duty of care towards our environment – certainly the messages put out by the manufacturers, and by apparently independent free speech sources across all channels, reinforced this impression. However, as reported across the globe on Sept. 18, 2015, including in the New York Times article “VW Is Said to Cheat on Diesel Emissions,” we were misled. This came as a huge shock to consumers: trust had been misplaced. The trust in the brands involved was based on a perception fed by clever marketing, promotion, and perpetuation of an image across social channels. Perhaps discovering that all was not as it seemed should have raised questions within all of us about the veracity and independence of that font of all knowledge, the internet. However, our need to function day to day within a trust framework means that questioning free speech is not a reflexive response.

A balanced view of the world?

How do we come to trust? From babyhood, we look at the evidence around us, and learn from our own experiences. We observe the reaction of our peers to situations, and follow their lead. We have an instinct for self-preservation, which helps us to place more emphasis on evidence that seems to be balanced and fact-based, naturally fearing overt coercion. Ultimately, we trust free speech, and the birth of the World Wide Web gave us access to reams of freely given information for decision making in our daily lives. However, when you step back, can you really say that all the memes, clickbait, selective reporting, and freely given opinion are truly balanced, factual, and evidence-based? We can’t always trust free speech on the World Wide Web.

As the Web’s creator, Tim Berners-Lee, said when writing about the Web at 25: “When we link information in the Web, we enable ourselves to discover facts, create ideas, buy and sell things, and forge new relationships at a speed and scale that was unimaginable in the analogue era. These connections transform presidential elections, overturn authoritarian regimes, power huge businesses and enrich our social networks.” This explosion of information, coupled with the ability of every internet user to become an armchair philosopher, scientist, politician, or sports coach, starts to ring alarm bells if you step back and succeed in suspending your instinctive trust in free speech. Our natural leanings towards trusting our peers can backfire in all kinds of ways, in all areas of life.

The Escher Group in North East England conducted a detailed study last year into the habits of small businesses when they seek advice and support. As these micro enterprises are the backbone of the U.K.’s economic revival, and there are hundreds of public sector-led initiatives in place to help them, it’s important that they access those resources to survive and thrive. However, Escher’s results showed that 98 percent of respondents do not trust the public sector to help them with their business; the first point of reference is usually their peers. Think about your own first port of call: it’s a natural human reaction to ask the people you think will empathize with your problems. Unfortunately, this means that solid, verifiable business advice isn’t always filtering down to the people who need it. The noise added by personal opinion, anecdotal evidence, and the online publication of unverified documents full of inaccuracies is a problem that needs to be addressed. Ultimately, free speech is trusted over ‘official’ information because of a perceived lack of empathy, to the detriment of all.

Sharer beware!

Although Tim Berners-Lee goes on to say that “social networks are interesting … they give us a custom view, a manageable and trusted slice,” it is precisely this trust that lies at the root of the perpetuation of internet hoaxes. I regularly find myself pointing friends and family to references on Snopes, ThatsNonsense, and other sites when they unthinkingly share a dramatic but uncorroborated meme that seems to align with their own views. (For other useful ways to clean up your friends’ social feeds, check out Pete Brown’s comprehensive guide published in Australia’s The Conversation, “Six easy ways to tell if a viral story is a hoax.”)

They are exercising their freedom of speech rights by sharing hoax memes: that is to say, they are expressing what they believe, and they have the right to do so whether the reader finds it distasteful or not. However, it’s the detail behind what they share that is of concern. Our freedom of expression may be compromised by the dubious veracity of the content we share.

The regulation dilemma: who polices free speech?

The Arab Spring demonstrated the powerful, positive use of the Web to spread messages and “overturn authoritarian regimes,” as Berners-Lee describes. However, we are now seeing the powerful, negative use of the Web as the Daesh movement (ISIS) overturns democracy in favor of its own brutal, authoritarian regime. The Brookings Institution published a study earlier this year of Daesh social media activity, identifying at least 46,000 Twitter accounts firing out around 100,000 tweets a day. There is fighting both on the ground and in the digital space, as the propaganda war is waged alongside real bloodshed. Even the argument over the movement’s name is telling – there is a push to drop the pseudo-authoritative title of ‘Islamic State’ used in the West in favor of Daesh, the name by which it is known in the rest of the world.

The internet is the home of free speech, but there are conflicting views and reports: who do you trust? Sharing propaganda is a valid expression of free speech (subject to laws against inciting hatred, of course) and trust in it rests with our individual judgment of the source’s alignment with our values and beliefs. In the U.K. we have seen the decision making that comes from misplaced trust, with families crossing the Turkish border to Syria, while refugees pour out across the same border towards Europe.

Much of the Daesh publicity is sent out as heavy bursts of tweets to build trends, with interaction between the supporting accounts but very little outside them. However, evidence is growing of a far more complex manipulation of free speech online, from an experienced propaganda machine. The recent infiltration of a Russian ‘troll factory’ by investigative journalist Lyudmila Savchuk has exposed a more intricate and far-reaching web of subtle coercion. This activity is in addition to the now-familiar ‘Twitter bot’ strategies: internet researcher Lawrence Alexander’s study identified 17,650 Russian accounts operating in a similar way to the Daesh machine.

Savchuk’s article in the Telegraph talks of not only phony social media accounts but also blogs, forum participation, and responses to online journalism. She was part of a special unit of “people pretending to be individual bloggers – a fortune teller, a soldier, a Ukrainian man – [who] had to, between posts about daily life or interesting facts, insert political reflections.” Developing a fake source to this level of detail and constructing believable back-stories reinforces the impression that propaganda is in fact the free expression of independent peer opinion, and strengthens misplaced trust.

The troll factory activity was not restricted to Russia. The Guardian had long held suspicions that its online comments section was being trolled. Its moderators, who deal with 40,000 comments a day, believed there was an orchestrated pro-Kremlin campaign. Once again, this campaign played on our trust by apparently expressing independent reaction to media reports.

Restricting freedom of expression is in the realm of dictatorships and censorship – but does the corruption of freedom of expression merit a system of regulation? The techniques and the intensity of online propaganda are such a concern that in 2013 the European Union set aside $3 million to tackle Eurosceptic trolling in the run-up to the European elections. It’s a never-ending battle to present a balanced view; free speech is compromised at every turn. So who decides what is ‘positive’ free speech, and what is ‘negative’? The Brookings study neatly summarizes the problem: “Regulating ISIS per se presents very few ethical dilemmas, given its extreme violence and deliberate manipulation of social media techniques. However, the decision to limit the reach of one organization in this manner creates a precedent, and in future cases, the lines will almost certainly be less clear and bright.”

The selection dilemma: where do we place our trust?

In the face of such manipulation, where do we place our trust? Research scientist Peter Gloor’s theory of Collaborative Innovation Networks (COINs) talks about the ‘Crowd,’ the ‘Experts,’ and the ‘Swarm.’ Following the experts or the crowd may not give you the right result, but narrowing your sample to reflect your situation, identifying the right ‘Swarm,’ can do so. Similarly, Tim Berners-Lee’s comment that we are comfortable with “a manageable and trusted slice” of information underlines the necessity of finding the right ‘swarm’ to reach an appropriate consensus.

Do we therefore retreat into our chosen communities to reduce the noise? This presents its own dilemma. It’s possible for a population of non-mathematicians to reach a consensus that 2+2=5; that consensus diverges from mathematical principles because it has been corrupted by false assumptions. Likewise, the consensus reached by families travelling towards Syria diverges from the reality that is causing millions to flee; it has been corrupted by online propaganda. Philosopher Jürgen Habermas, a proponent of consensus reality as a model of truth, refers to the ‘ideal speech’ situation, in which there are no external or coercive influences. Can you restrict your community to eliminate those influences and reach a reliable consensus among knowledgeable peers – or does the very act of selection introduce its own corruption? Free speech may be unreliable in a self-selecting community.

Regulation, selection – or skepticism?

So where does the answer lie? Regulation comes with extraordinary volumes of ethical baggage: who watches the watchmen? Selection has its place for expert consensus, but who decides the makeup of the community? Ultimately, I believe we all have a responsibility to champion our rights to free speech, while suspending our instinctive trust and exercising a healthy level of skepticism.

Kate Baucherel

Kate Baucherel BA(Hons) FCMA is a digital strategist specialising in emerging tech, particularly blockchain and distributed ledger technology. She is COO of City Web Consultants, working on the application of blockchain, AR/VR and machine learning for blue chip clients in the UK and overseas. Kate’s first job was with an IBM business partner in Denver, back when the AS/400 was a really cool piece of hardware and the World Wide Web didn’t exist. She has held senior technical and financial roles in businesses across multiple sectors and is a published author of non-fiction and sci-fi. Find out more at www.katebaucherel.com.
