This fall, I had the opportunity to tune into the Climate Center’s first Research to Reality session, “How to Refute Climate Change Misinformation,” where USC professor Gale Sinatra spoke about her research on climate science education, theories of learning and the role of emotion in teaching controversial topics.

Sinatra started by breaking down some climate change terms that we use in conversation, such as denial, doubt and resistance. 

According to her, denial is a belief-based stance that rejects evidence: examples include claims like “the earth is flat,” “global warming is a hoax” or “vaccines cause autism.” A more common version is “cafeteria denial,” where people cherry-pick what to accept and what to reject within a given concept.

She said people usually show doubt and resistance when facts don’t fit with their personal beliefs or political identities, or when the facts require deeper analysis than people are willing to do.

But it is also important to remember that doubt can be systematically manufactured. Sinatra said the longer we stay in our own bubbles and echo chambers, the more susceptible we become to systemic doubt. In fact, research shows that trust in expert sources has eroded to the point that we trust articles posted by our connections on Facebook more than scientific research published in reputable places.

During this talk, Sinatra provided us with a list of five main challenges that science communicators face. 

  1. Complexity: Because science is complex, communicators need to learn how to simplify it without “dumbing it down.” Metaphors can be effective bridges to clear communication, especially for tougher or more complex topics.
  2. Controversy: While reporters generally strive for “balanced” reporting on controversial issues, this can give disproportionate visibility to those who deny the science and can exploit people who are vulnerable to systemic doubt. For example, the #DebateMeBro culture can sometimes hurt widespread science literacy. Sinatra said it is better when scientists explain how they know what they know, as well as what is still unknown, to give the audience a deeper understanding of the knowledge systems involved.
  3. Motivated Reasoning: Motivated reasoning is a phenomenon where individuals reach conclusions based on their preferences and not accuracy. These preferences can include political or social identities. Understanding these motivations, Sinatra said, allows for tailored communication with a lower chance of creating misconceptions.
  4. Misconceptions: Misconceptions are personal thoughts and ideas that do not align with the widely accepted scientific understanding of an issue. This is often explained by the “deficit hypothesis”: when people lack sufficient, accurate information about the science surrounding an issue, misconceptions tend to fill the gap. But simply providing more information and expecting people to overcome the misconceptions on their own doesn’t work, and neither does simply telling them they are wrong. Sinatra said the two approaches have to work together: first challenge the misconception by directly refuting it, then provide the person with the correct scientific information.
  5. Emotions and Attitudes: Misconceptions often come bundled with negative attitudes and emotions about a topic. By correcting these misconceptions, we can encourage more positive attitudes toward the issue.

Sinatra also provided three tips that science communicators should keep in mind when fighting misconceptions:

  1. The way to successfully debunk a myth is by stating the myth, explaining why it is misleading and then filling in the gap by providing the correct scientifically valid position on the issue.
  2. Sometimes this strategy can backfire, but do not let that deter you. Sinatra said that when it comes to scientific topics, it is very rare for a debunking attempt to strengthen the original misconception. You should refute a misconception when you have strong, clear evidence to back up your argument and when holding the misconception could cause real harm.
  3. However, to reduce the risk of backfire, avoid arguing with people in cases where the science is genuinely unclear or unsettled.

For more information on this topic, check out Sinatra’s book Science Denial: Why It Happens and What to Do About It.

Sierra Decker works as a Research Assistant at the Center for Climate Journalism and Communication.