Misinformation and disinformation increasingly shape behaviors on a large scale. Accordingly, policymakers have begun working to address the proliferation of inaccurate information. But despite a growing body of research indicating that identity is central to addressing the spread and influence of misinformation, actions and policy discussions continue to focus on content removal and better educating the public. In a week-long workshop held by the Yale International Leadership Center, or ILC, we worked together to find new ways to tackle misinformation. We want to share some insights from those discussions.

A belief may become part of our identity over a relatively short time: We may have argued for our belief, facing stern opposition — or just forwarded it on a messaging app. We may have carried our belief on signs in protests facing armed police — or shared it on social media. We may have angered people close to us by making these beliefs part of who we are.

The more integral a belief is to our identity, the harder it is for us to let it go.

Thus, we doubt counterclaims telling us those beliefs are wrong. We are downright incredulous when such claims come from people and institutions that criticize our own group. Sometimes, we will even take that criticism as proof that we are in the right. When claims similar to ours are censored, we will often see it as an attempt to silence us, as individuals or as a group. That censorship thus backfires. Instead of changing our minds, it leads us to find other ways to get our message out.

Deciding what is true — or even deciding who gets to decide what is true — is a thorny normative problem. Should governments decide? Technology companies? The courts?

We don’t have a good enough answer to these questions right now, and it will likely be a long time until we do. But we cannot wait — tackling misinformation is urgent. Instead of censoring, why not turn up the volume, increasing the reach and influence of the right speakers?

We can make sure the right messenger carries the message. We can crunch the numbers. We can better understand the makeup of groups and belief clusters, and identify who is more likely to have an impact — an impact that crosses divides. Effective messengers are insiders, and some of them are in-betweeners: people who share a number of identities and can talk to different groups. They may hold opposing views, but they share our identity. They don’t label or mock us, because they are one of us. Instead of trying to decide whose speech should be demoted, we can give in-betweeners more power. The identity of these in-betweeners could be determined by an algorithm, based on positive engagement with their content from both sides of divisive issues.
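To make the idea concrete, here is a minimal sketch of how such an algorithm might rank in-betweeners. Everything in it is a hypothetical illustration, not a proposal for a specific platform: the group labels, engagement counts, and the choice of a harmonic mean as the scoring rule are all assumptions. The harmonic mean rewards accounts whose positive engagement is balanced across both sides of a divide, while an account popular with only one side scores low.

```python
from statistics import harmonic_mean

# Hypothetical data: positive interactions (likes, approving replies)
# each account receives from members of two opposing groups
# on some divisive issue. Real systems would need to infer the
# groups themselves, which is a hard problem in its own right.
engagement = {
    "account_a": {"group_1": 120, "group_2": 115},  # valued by both sides
    "account_b": {"group_1": 300, "group_2": 4},    # popular with one side only
    "account_c": {"group_1": 80,  "group_2": 60},
}

def inbetweener_score(counts):
    """Score an account by the harmonic mean of its positive
    engagement from the two groups; lopsided accounts score low."""
    g1, g2 = counts["group_1"], counts["group_2"]
    if g1 == 0 or g2 == 0:
        return 0.0
    return harmonic_mean([g1, g2])

# Rank accounts from most to least "in-between".
ranked = sorted(engagement,
                key=lambda a: inbetweener_score(engagement[a]),
                reverse=True)
```

With these made-up numbers, the balanced accounts rank above the one-sided one, even though the one-sided account has the most total engagement — which is the point of amplifying bridges rather than raw popularity.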

This is not a perfect solution. It does not address, for instance, misinformation presented on cable news. But it can be implemented more quickly than other suggested policies, like breaking up social platforms. It does not entail the legal complexities of taking content down, and it poses less of a normative problem. And in some models, it can also be a sound business decision: some advertisers — including government clients — might be more inclined to spend dollars on less divisive platforms.

Policymakers and thought leaders can help with this effort. 

Business and management researchers can look for business models that would make amplifying in-betweener voices a profitable endeavor. Policy teams within technology companies could be part of the discussion and the research, including normative academic research.

While this solution is promising, it is not the first step. 

The first step is reframing the challenge of misinformation and disinformation. It has to do with identity, not with truth; with empowering, not labeling; with messengers, not messages; with amplifying voices, not silencing them.

ORR HIRSCHAUGE is a 2021 Yale World Fellow. Contact him at or.hirschauge@yale.edu. SRIKUMAR MISRA is a 2021 Yale World Fellow. Contact him at srikumar.misra@yale.edu. ELINDA LABROPOULOU is a 2021 Yale World Fellow. Contact her at eleni.lampropoulou@yale.edu.