Advice for Scientists

Using science to combat misinformation and disinformation

What is Advice for Scientists?

A common desire of scientists involved in public engagement, including speaking with the media, is to debunk misinformation. Refuting false claims effectively is an uphill battle, but it can be incredibly valuable. While research is mixed on the best practices, here are tips to get you started:

  • Understand why false information is tough to combat.
    • Misinformation and disinformation both refer to false or misleading information; the difference is in intention. Disinformation is designed to mislead, while misinformation, though still harmful, is spread accidentally and typically without malice.
    • Simply stating facts isn’t enough to change people’s minds if they believe a piece of misinformation or disinformation.
    • Psychological drivers such as confirmation bias, familiarity bias, and motivated reasoning may make false information hard to dislodge. Disinformation is particularly difficult to challenge when it’s backed by large, well-funded efforts.
    • There’s some evidence that debunking may backfire and reinforce falsehoods in some circumstances, though whether that effect has been exaggerated is the subject of ongoing debate. Either way, if a false claim isn’t widespread, simply ignoring it may be a better approach.
    • No single strategy will reach every skeptic, and even the more effective approaches have small impacts. But those small impacts accumulate, so they are still worth the effort.
    • Only a small percentage of the population is fully invested in false information, so focus your efforts on the vast majority who aren’t convinced one way or the other.
  • It’s easier to stop misinformation or disinformation before it starts than once it’s circulating. “Prebunking” or “inoculation” strategies can help audiences recognize and refute misinformation on their own.
    • Set the stage by forewarning your audience about bad-faith efforts to manipulate them in your field of study.
    • Highlight telltale patterns and language that tend to be used in false claims.
    • Focus on common tactics, such as emotional manipulation, misattribution, misleading or selective framing, and false balance; or on topics ripe for misconceptions, such as vaccines or electric vehicles.
      • For tactic-based prebunking, you can explain a strategy used to spread false information, why it works, and what might motivate those originating or sharing the information.
      • For topic-based prebunking, research the range of possible falsehoods that might spread about those issues, and choose one to use as an example. As you address that claim, start by saying what’s true, then warn your audience that what’s coming next is false before stating the false claim. Explain why that information is false or how we know, and then follow it up with another true claim.
    • Make it easy for your prebunking to be shared. Practice concise soundbites that reporters might quote directly, make social media posts shareable, and use relevant examples that will stick in people’s minds.
    • The effects of prebunking fade over time, so you may need to engage the same audience repeatedly over several months.
  • For claims that are already circulating, consider these debunking strategies:
    • Have conversations with people: approach them with curiosity to figure out what about a piece of false information resonates with them, and use that insight to inform your approach.
    • Tailor your talking points to what your audiences care most about. In combating vaccine misinformation, for example, different audiences might respond to different strategies:
      • For people who are particularly community-oriented, focusing on the collective benefits of vaccination may be more effective than focusing on the risks of contracting a disease.
      • On the other hand, focusing on the risks of illness may be more effective for people who value bodily purity and sanctity, like some religious denominations or those who adhere to wellness culture.
      • In some marginalized communities, belief in misinformation about vaccines can stem from historical medical mistreatment, so addressing false claims may require a focus on building trust in the medical system.
    • Avoid framing things with a myth vs. fact structure, which some studies suggest isn’t effective.
    • Public opinion is powerful, so emphasize consensus when you can. If a large percentage of scientists agree on something, share that percentage. If only a small percentage of the general population believes a claim, say that.
    • Highlight how we know what we know. Explain how science works, what it does, what it doesn’t do, and how it’s self-correcting.
    • Audiences tend to pay more attention to sources’ trustworthiness than to their expertise when weighing evidence. Focus on communicating with communities you’re a part of, and don’t be afraid to lean on your personal identities to connect. Also consider whether you can partner with, or recommend speaking with, trusted community members such as clergy and nurses.
    • As you debunk (or prebunk) claims, being transparent about what you do and don’t know builds trust.
    • Don’t overwhelm your audience with a deluge of debunking. Instead of attempting a comprehensive takedown, focus on just one or two points: the strongest piece of evidence, a flaw in the claim’s logic, or a strategy being used to perpetuate the false information.

Further reading: Prebunking misinformation and a guide for countering misinformation