Disinformation is a global challenge. It’s inflicting damage in many spheres of our lives. Perhaps most alarming is that it’s threatening democracy worldwide. People need access to reliable, authentic information to exercise their democratic rights and responsibilities. Good information is the oxygen of a thriving democracy. It is key to transparent government, and vital for citizens’ ability to hold governments to account.
We use the term “disinformation,” but that’s really shorthand for the broader problem of information manipulation. That problem includes disinformation proper, meaning false information knowingly shared to cause harm; misinformation, meaning information that’s wrong but shared without intent to deceive; and information that’s accurate but presented in a misleading context or under false pretenses, or artificially amplified. It also includes hate speech. It’s important to take the whole package of manipulation into account when looking at the problem.
Given current events, it may be helpful to think of disinformation as a pandemic. What we’ve learned about the coronavirus is that it’s particularly dangerous for people and communities with underlying vulnerabilities. We know that people with preexisting health conditions are more at risk. But there are other kinds of risk as well: not having health insurance, paid medical leave, or access to decent health care; having a job that’s considered “essential”; being required to work indoors in cramped conditions; being confined in a prison or an eldercare facility. To fight this pandemic and prevent future ones, it’s not enough to come up with a single medical or technical solution. We also have to address all of these underlying conditions. A society that’s healthy -- in all meanings of that term -- will be far more resilient to any future viruses. The same is true for inoculating against disinformation.
Just like the coronavirus, disinformation is opportunistic. It’s often deliberately targeted at groups of people who are vulnerable, perhaps because they’re predisposed to believe certain things, are already marginalized from society, or have deep-seated grievances. The aim is to amplify those stress points. In other cases it’s targeted specifically at social divides -- such as racial, ethnic or political conflicts -- with the aim of amplifying those tensions. Disinformation thrives on insecurity and division. Like a virus, it can spread exponentially under the right conditions. It’s no coincidence that disinformation escalates around elections, because elections are when democratic institutions come under tremendous pressure. The political stakes are high, there’s a premium on public trust, and everyone involved needs good information to produce a legitimate outcome. This fragile ecosystem is irresistible to disinformation.
And unfortunately, unlike a virus, where we can at least hope for a vaccine, with disinformation there’s no “silver injection” that will inoculate all of us. That’s why we need to take a holistic approach. It’s not enough to fight byte for byte with disinformation attacks as they emerge. If that’s our strategy, we’ll never keep up. We also need to build up the integrity of the underlying information space so it’s resilient to the disinformation that will inevitably break through.
We need to focus on what I call the “Who, What, How and Whom” of information manipulation.
Who is producing and distributing the disinformation? What are the sources?
What is the content? What are the narratives and themes?
How is it being disseminated? Through what channels and behaviors?
To Whom is it being targeted and, more importantly, who is consuming the disinformation and who is most vulnerable to believing or acting on it?
That’s why NDI’s INFO/tegrity program takes a multi-faceted approach. The idea is to help partners fight the “disease” from as many angles as possible: disrupting the sources, neutralizing the false or inflammatory content, regulating the channels it flows through, building up the population’s resistance to malign influences, and ultimately, safeguarding the integrity of the information environment.
Among other INFO/tegrity initiatives, NDI conducts research on disinformation vulnerability and resilience, monitors disinformation and computational propaganda in elections, strengthens political party commitments to information integrity, helps social media platforms and tech firms “design for democracy,” shares tools to detect and disrupt disinformation, and rebuilds trust in institutions and processes through democratic innovation. NDI is collaborating with governments, legislatures, civil society, the media, technology platforms, academia, business, and the creative sector to leverage their diverse strengths and resources against the many aspects of disinformation. The near-term objective of these efforts is to “flatten the curve” so disinformation doesn’t overwhelm democratic institutions. Over time, and done right, they’ll push the “infection rate” closer to zero.
It’s only through these comprehensive strategies that we’ll restore integrity to our information and health to our democracies.
Laura Jewett is Senior Director for Eurasia programming at NDI.