Because debunking misinformation increases people's exposure to it, it can cause them to remember the misinformation better, and even to remember it as true, despite the corrective information that was supposed to debunk it.
This phenomenon is called the familiarity backfire effect, and it plays a key role in various domains, particularly the refutation of pseudoscientific theories and political conspiracy theories.
For example, when someone is shown evidence that disproves a certain pseudoscientific or political myth, a week later they might forget the corrective evidence but remember the myth itself and believe that it's true.
One study found that warning people that certain health-related claims are false does help them recognize those claims as false in the short term, but sometimes also increases the likelihood that they'll remember those claims as true once a few days have passed.
Another study found that when people are given health-related warnings about what not to do, after some time has passed they sometimes remember those warnings as instructions about what they should do.
This phenomenon can be attributed to increased familiarity: each time people hear or read the statement that's being debunked, they become more familiar with it, which makes it easier for them to process (by increasing its processing fluency). Since people prefer to accept explanations that are cognitively easy to process and access, this increased familiarity makes it more likely that they will believe the statement is true.
It can also be attributed to the reiteration effect, whereby repeating a statement increases the degree to which people believe it.
On the other hand, another study found no evidence of the familiarity backfire effect when correcting misinformation; however, it still suggests avoiding unnecessary repetition of the misinformation:
“While we have demonstrated that corrections do not backfire when it comes to specific beliefs about a proposition, one needs to differentiate this from the over-arching framing that is achieved by stating something that is false… For example, a government official stating that there are ‘no plans for a carbon tax’ may achieve a reduction in the specific belief that a carbon tax rollout is being prepared, but at the same time using the word ‘tax’ may make people who oppose new taxes for ideological or pragmatic reasons think about climate change as a threat rather than an opportunity… Therefore, communicators should perhaps focus their considerations more on the framing of their corrections, as repeating the misinformation frame might do more damage than repetition of the misinformation itself.”
HOW TO AVOID THE FAMILIARITY BACKFIRE EFFECT
There are several things that you can do to avoid causing the familiarity backfire effect when you debunk misinformation.
First, you should keep in mind two main guidelines whenever you engage in debunking:
- Focus on the facts that you’re presenting, rather than on the misinformation that you’re trying to debunk.
- Avoid repeating the misinformation unnecessarily.
Second, there are several additional guidelines that you can also use to guide your debunking:
- Start with the facts: When you begin the debunking attempt, open with the facts, and only then introduce the misinformation.
- Before presenting misinformation, identify it as such: When you’re about to present misinformation, explicitly warn people that the information that they’re about to see is false, and potentially also explain why this information can be misleading and why people promote it in the first place.
- Follow misinformation with corrective information: After presenting misinformation, follow it up immediately with corrective information, to put the focus on the facts and help them stick in people’s minds.
Finally, to further reduce the likelihood of the familiarity backfire effect, make the facts clear, easy to understand, and easy to remember, and consider using general debiasing techniques, such as increasing people's involvement in the reasoning process.