Combatting misinformation

By Erin Kernohan-Berning

Throughout the pandemic we have been made to think about something we normally don't think much about – how things like viruses spread. It can feel strange to consider how something invisible to the human eye might sit on a door handle, transfer to your hand, and end up in your eye when you unconsciously rub at it. You probably never thought about how much we share – tiny droplets and viral particles – just by talking to one another. A university professor once told me that two people speaking in a car for 20 minutes deposit enough of these droplets on the dashboard to yield a DNA profile. Like finding out about the mites that live in your eyebrows, or how much of your mattress is dead skin cells by the end of its life, learning about the unseen world around us can be both amazing and unsettling.

Another thing we share is information. Humans are naturally social creatures. Relying on one another has helped us survive as a species. We’re social in a number of ways, but almost all of those ways involve sharing information. We share information with words (written and spoken), objects, art of all kinds, how we dress, our body language, and even how we smell. Sometimes this information is very direct in its delivery, and sometimes it is more abstract. Information is everywhere and as humans we are constantly taking it in and processing it.

With many of our social interactions now facilitated by technology of some kind, we need to acknowledge how that technology affects the information we take in. People of a certain age may remember the game "telephone". Everyone sits in a circle and someone starts by whispering a phrase in their neighbour's ear. The phrase is then whispered, in turn, by each person in the circle until it returns to the original sender. The game ends with the sender comparing what they originally said to the message that came back. More often than not, the message has changed significantly. Sometimes the change is unintentional, because someone misheard or misunderstood what was whispered to them – this is analogous to misinformation, the unintentional spread of inaccurate information. Sometimes someone in the circle changes the message deliberately, to be funny or mean – this is analogous to disinformation, the intentional spread of inaccurate information to manipulate others.

Social media platforms are not neutral grounds for information sharing. Companies like Facebook (which owns Instagram), Twitter, YouTube, and others offer their services for free and make money through advertising revenue. The major selling feature for advertisers is how many eyes will be on a particular ad, whether that ad gets clicked, and whether that click ultimately converts to a sale. This roughly translates into more views or clicks = more money. To generate these coveted views and clicks, social media platforms elevate popular topics to keep people engaged. But as we all know, popular does not mean good, and this elevation of the popular rather than the accurate can have damaging effects.

Casey Newton, Silicon Valley editor for The Verge, recently argued that Twitter's trending topics feature is primed for the spread of conspiracy theories, and that it has indeed been used to spread disinformation. Twitter has done some work on this issue, including using human curators to add context to some trending topics. However, increased chatter about long-debunked conspiracy theories still gets pushed to the surface regularly, where without this algorithmic amplification such disinformation might die out on its own.

According to the Wall Street Journal, Facebook executives were warned by their own team as far back as 2018 that the platform was engineered to "exploit the human brain's attraction to divisiveness" and would automatically serve up more and more divisive content to keep people's attention. That warning went largely unheeded. While developers built tools to curb the spread of hyper-partisan content, Facebook executives actively resisted those efforts. In 2016, Facebook's internal research found that "64 per cent of all extremist group joins" were due to its own recommendation tools. While Facebook did ban certain disinformation groups from the platform over this past year, many critics argue its efforts fall short of what is needed.

An important part of combatting misinformation and disinformation is thinking critically about the information you receive – whether from social media, traditional news media, books, the radio, or your neighbour. Thinking critically is not the same thing as always taking the opposing viewpoint – that's contrarianism. Rather, thinking critically involves evaluating information for accuracy, pausing to consider your own biases, reading beyond the headline or the single article, and looking for other sources that may or may not support the information you are interrogating.

For example, on Twitter I saw someone reply to a news article about COVID-19 with a pre-print paper that purportedly supported a long-debunked piece of misinformation. The tweet also quoted the paper's author using some fairly sensational language that I wouldn't expect from a dispassionate academic. I read the paper, and the quote was nowhere to be found. Then I searched for the quote itself. This is called reading laterally – searching elsewhere for information about what you are reading. Eventually I found that there were factual errors in the paper, that there was an entire context to the issue the author had left unexplored, that the author was not a seasoned expert on the topic he was writing about, and that he had been quoted by online media outlets known for spreading conspiracy theories. Throughout this process I had to honestly acknowledge any biases I held while searching for supporting information, which is an energy-consuming exercise. Critical thinking is important, but it is also difficult. In this year of information and emotional overload, it's no wonder we find it hard to do what we need to do to be good digital citizens.

As the COVID-19 pandemic drags on, our patience is wearing thin. This makes us more susceptible to the promises that misinformation and disinformation make to us – that there are easy answers, that we should all just get on with it and get back to "normal," that the problems in our world aren't the boring, complex, hard-to-understand ones we've actually been burdened with. All the more reason to guard ourselves against misinformation and disinformation. According to the UN's Verified initiative, a study by social scientists including researchers from the University of New South Wales identified "2,311 reports of rumours, stigma, and conspiracy theories in 25 languages and 87 countries," some with deadly consequences. The only way to combat this global problem is for us to practice good information sharing and consumption habits, and to talk to our loved ones about doing the same.

Erin Kernohan-Berning is branch services librarian at the Haliburton County Public Library.