This is a first-of-its-kind report because, unlike survey-based studies, it goes into fine-grained qualitative and interpretive detail about the communication dynamics of personal messaging. Our method allowed ordinary people to open up about their experiences of seeing vaccine myths and conspiracy theories posted in their WhatsApp and Facebook Messenger chats.
This report is based on the Everyday Misinformation Project’s first phase—nine months of intensive fieldwork. We used a detailed qualitative and interpretive method based on in-depth semi-structured interviews (n=102) with the public in three regions: London, the East Midlands, and the North East of England. We recruited participants using Opinium Research’s national panel of over 40,000 people. Those taking part roughly reflect the diversity of the UK population on age, gender, ethnicity, educational attainment, and a basic indicator of digital literacy.
Online personal messaging platforms have grown rapidly in recent years. In the UK, WhatsApp has 31.4 million users aged 18 and over—about 60% of the entire adult population—and is more widely and frequently used than any of the public social media platforms.
The report is free and open access. Download your copy here.
The Report’s Key Findings
- Online personal messaging platforms encourage what we call hybrid public-interpersonal communication. We explain how this has distinctive implications for how misinformation spreads.
- Discussion of vaccines mostly happens in small messaging groups among family, friends, and work colleagues, where people know each other well and tend to trust each other.
- Paradoxically, this can increase the likelihood that misinformation goes unchallenged, because on personal messaging people follow a norm of conflict avoidance. Importantly, some people find conflict avoidance easier to perform on personal messaging than during in-person communication.
- When people encounter vaccine misinformation in larger personal messaging groups, for example among school parents or work colleagues, they fear that trying to correct it will be seen as undermining group cohesion by provoking conflict, and they worry about their command of facts about the safety of Covid vaccines. People perceive these risks as greater in the more “public” or “semi-public” context of a larger messaging group, even though such groups are still technically private.
- Some people draw boundaries between what they see as the world of public and political communication, where they think the norm is that it is legitimate to challenge misinformation, and the interpersonal world of personal messaging, where the norm is that misinformation should go unchallenged because it is not appropriate to call it out.
- Seeing misinformation leads some people to disengage from vaccine talk on personal messaging. This presents a further paradox: they know the content of the misinformation posts but do not speak up, even if they disagree with it. These signals of tacit acceptance in a family, friendship, or school group can enhance the legitimacy of misinformation and contribute to its further spread.
- Some people try to find routes around the norm of conflict avoidance, for example by sharing criticisms of vaccine misinformation in encounters they perceive to be less risky. Some engage in what we term scaling, moving up and down between different groups both large and small, or they gauge others’ experiences and opinions in smaller one-to-one chats.
- But conflict avoidance casts a long shadow. Scaling and gauging may help build solidarity among those positive about vaccination, but they also reduce opportunities to address misinformation in the contexts where it appears.
- Challenging vaccine misinformation overtly can backfire and lead people to exit dialogue. Vaccination talk is then deemed off-limits, leaving personal messaging to continue, but only on the basis of “safer,” less conflictual topics.
- Based on these findings, we outline some broad principles for public health communication to slow the spread of Covid vaccine misinformation on personal messaging:
- Person-focused, not content-focused, anti-misinformation interventions are more likely to work.
- Interventions should balance people’s desire to maintain healthy relationships with friends, family members, and the other communities to which they belong with the need to foster healthy relationships with public health information.
- Interventions should encourage people to scale up from high-trust, one-to-one, and small-group interactions to larger groups, where people can work together to support each other in dialogue-based challenges to misinformation, avoiding the risk of standing out as lone individuals.
- Interventions should also encourage people to scale down, discussing how to deal with misinformation in groups and then taking the lessons learned to one-to-one exchanges.
- Interventions should encourage not antagonism but an empathetic, dialogical orientation toward others.
About the Everyday Misinformation Project
Based in the Online Civic Culture Centre (O3C) and the Centre for Research in Communication and Culture at Loughborough University, the Everyday Misinformation Project is a three-year study funded by the Leverhulme Trust.
The project’s aim is to develop a better-contextualised understanding of why people share and correct misinformation online. It has a unique focus on personal messaging, or what are sometimes called private social media or encrypted messaging apps. These services, particularly WhatsApp and Facebook Messenger, are hugely popular in the UK, but their role in the spread of misinformation is not well understood. This is partly because these services are, by design, difficult to research: unlike public social media, they have no public online archives and they feature end-to-end encryption.
Visit the project website.