We’ve all been exposed to quacks, hacks, and fonts of bullshit before. But not like this. We’re at peak misinformation now, and it’s hurting our health.
In a recent Economist/YouGov poll, 20 percent of U.S. citizens surveyed said they believe that Covid vaccines contain a microchip. Think about that. A fifth of respondents subscribe to a theory that has its roots in the idea that Bill Gates wants to track your activities. The survey also found that only 46 percent of Americans were willing to say that the microchip claim is definitely false, even though there's no plausible way it could be true.
These stats are troubling. But given our frenetic information environment, they are also, well, understandable. It is becoming harder and harder to tease out the real from the unreal. Sense from nonsense. Magical thinking from microchips.
Not long ago, I was shocked by a headline about “Covid parties”—people allegedly gathering to intentionally infect themselves and others. Infuriated and without pausing to reflect (or to do sufficient fact-checking), I immediately took to social media to rage about how irresponsible this was. Reality: Covid parties are mostly an urban legend. I was just adding to the noise and our collective angst.
I study misinformation. This is my job at the University of Alberta, where I am a professor of law and public health and specialize in health policy and the public representations of science. I should have known better. But the story played to my values, emotions, interests, and professional passions. Cringe.
This is truly the golden era of misinformation. We are, as the World Health Organization declared in early 2020, in the middle of an "infodemic"—a time when harmful misinformation is spreading like an unstoppable infectious disease.
Part of the problem is that we have normalized nonsense in some very subtle and some very obvious ways. Heck, there are a host of (very) successful wellness gurus who have embraced pseudoscience as a core brand strategy. And thanks to people like Andrew Wakefield—the disgraced former physician who started the vile "vaccines cause autism" fallacy in a paper published in and later retracted by The Lancet—misinformation about vaccine safety has continued to spread and find new audiences.
A sad truth: Misinformation and men are an especially bad combo, and it's hurting our health. Research from the University of Delaware tells us that men are more likely to believe Covid conspiracy theories, and other research suggests they may be less concerned about the harmful effects of misinformation.
Men are also less likely to get the Covid vaccine. While there are myriad reasons for this hesitancy, the male inclination to accept and be influenced by Covid conspiracy theories is a key part of the story. So it’s especially important right now for men to use the strategies here to ingest a healthy diet of information and wash it down with some skepticism.
Our information environment has become a chaotic, confusing, exploitative shitstorm that is destroying our health and well-being. A variety of forces are making it increasingly difficult for us to avoid, or even recognize, the harmful hogwash and polarizing pandering. And all this is happening at the exact moment in history when we crave and so desperately need facts and a bit of clarity.
The infodemic has helped foster an erosion of confidence in scientific institutions, as those who spread misinformation frequently seek to promote doubt and distrust. The scientific community deserves some blame, too, with occasional bad research and poorly communicated results creating confusion. (Masks don’t work./Yes, they do.) But that’s how science works; evidence evolves and recommendations change, and being transparent about those changes is essential. Just be aware that alternate and often science-free voices will try to be definitive when actual scientists don’t have the data or facts to get there quite yet. You’re better off waiting until they do.
But there is a way forward! By using a few critical-thinking tools and being aware of the tactics used to push misinformation, we can cut through the noise.
So How Did We Get Here?
There’s no single reason that half-truths, deliberate untruths, and simple misunderstandings are undermining the acceptance and sharing of science-backed information. It’s a complex tangle of many factors. But if I’m forced to pick the one that has done the most to supercharge this era of bad info, the choice is obvious: social media. In July, President Joe Biden went so far as to say that misinformation on social media is “killing people,” a lament that is both alarming and supported by a growing body of evidence.
If you get your news from social media, you are more likely to believe and spread misinformation, according to a 2020 study from McGill University. An analysis by Pew Research Center came to a similar conclusion. Other research has traced the origins of Covid misinformation circulating in popular culture to specific platforms. For example, a 2020 Press Gazette analysis of more than 7,000 misleading claims about Covid on the global Poynter Coronavirus Facts Database found that more than half had originated on Facebook.
We know that misinformation can spread fast and far. In August, Facebook released a report on its most widely viewed content from January through March 2021. The winner? The post seen more times than any other was a misleading article implying the Covid vaccine had killed someone. This nugget of misinformation was viewed nearly 54 million times by Facebook users in the U.S. in that three-month period and has been leveraged by countless anti-vaccine advocates, compounding its impact.
This kind of noise has, as noted by Biden, done great harm, leading to deaths and hospitalizations, increasing stigma and discrimination, and skewing health and science policy. One study from early in the pandemic linked more than 800 deaths and thousands of hospitalizations to a rumor, spread primarily through social media, about the use of methanol as a cure for Covid. A study published this year by Heidi Larson and her colleagues at the Vaccine Confidence Project at the London School of Hygiene and Tropical Medicine found that the spread of online misinformation about Covid vaccines has had a significant impact on hesitancy, jeopardizing our ability to reach herd immunity.
There are many reasons why this happens. Our current information ecosystem is a frantic space that doesn’t really invite a careful consideration of the facts, especially if the headline plays to our emotions. We react quickly to the impressions that content creates. We know, for example, that humans are evolutionarily predisposed to remember and respond to negative and scary information. This negativity bias is universal. Media experiments have found, for instance, that negative headlines outperform positive ones.
There is also a growing body of evidence that exposure to social media may be stressing us out. And when we're stressed out, we may be more likely to believe and spread misinformation—thus creating an accelerating cycle of angst, dread, and bunk. There is some ironic truth in the term "doomscrolling."
Adding to the gravitational pull of this vortex is the reality that the algorithms used by social-media platforms to decide what we see continue to ensure that harmful—and often fearmongering—misinformation floods our feeds. This can pull people into microchip-infused, anti-vax, 5G-caused-Covid rabbit holes that are specifically designed to play to our interests and values. The impact of this personalised media curation can be staggering. A 2020 analysis by the activist group Avaaz estimated that the algorithm Facebook uses generated 3.8 billion views of health misinformation in just one year.
Another problem: Lies, fake news, and pseudoscience can be made more compelling (microchips in the vaccines!) than the boring old truth (safe, clinical-trial-tested, actual vaccine ingredients). Indeed, research has found that, yep, as the saying goes, “a lie can travel halfway around the world while the truth is putting on its shoes.”
Misinformation and conspiracy theories can also draw us in because they may provide a complete narrative as to why things are happening. They can offer answers to questions that, from a scientific perspective, remain unresolved. During the pandemic, for example, much was—and still is—unknown. This can feel discombobulating. A story that gives answers, even a seemingly zany conspiracy theory, can be comforting—especially if that story reflects our preexisting values and beliefs. (Ah, it was that evil Bill Gates and his microchips!)
And this brings us to ideology.
The spread of misinformation has always had an element of ideological spin. Crafting a message that fits with a particular worldview is a surefire way to make misinformation more appealing, at least for those who subscribe to that worldview. In addition to leveraging our confirmation bias—that is, the strong psychological tendency to see, process, and remember information that confirms our preexisting beliefs—using ideology as the hook allows those who are pushing misinformation to sidestep the actual science. The message becomes about an ideological position, not what the science says or doesn't say. I want to be clear that I'm not judging anyone's political leanings. The ideological spin of science happens across the belief spectrum. The point is that when it comes to accepting and sharing misinformation, ideology matters. We all—right, left, center—need to be aware of that.
David Rand, Ph.D., a professor at MIT’s Sloan School of Management, has conducted a boatload of studies on the connection between ideology and misinformation. “We have found that when deciding what to share on social media, people are much more likely to share content that aligns with their political partisanship—even if it’s false,” he says.
Experts have long recognized the double-edged nature of social media. It can bring us together, connecting us to communities, friends, and families. But it also can drive us apart—especially around ideology. "Social-media [platforms] are amplifiers of political polarization," says Kate Starbird, Ph.D. She is an associate professor at the University of Washington and an expert on the spread of misinformation. "They make polarization worse and allow for that polarization to be leveraged in new ways by those seeking to exploit our differences for their gain." Public discussions about Covid became politically polarized almost as soon as the pandemic was declared. And an analysis from the University of Cincinnati that examined social-media interactions from the beginning of the pandemic found that some of the most influential voices were politically motivated.
Increasingly, our information environment is dominated by social media and fueled by a toxic stew of fear, distrust, uncertainty, and political polarization. Recognizing the forces that drive misinformation is an important step in stopping its spread. Indeed, Starbird told me that her top recommendation for spotting misinformation is to tune in to your emotions.
“Whenever some piece of content makes me feel politically self-righteous—like I’m about to spike a political football—that’s when I know I need to be extra careful about sharing,” she says. “Because there’s likely a misinformation flag somewhere on the field.” You can do even more than that to stop the spread of misinformation. The rest of this series helps you figure out whether or not you should believe the information you’re hearing.