There is a long tradition of attempting to test whether the truth changes people’s perceptions, in both academic and campaigning work, but the results remain mixed and inconclusive. Some studies show no impact at all on perceptions when we are told the correct figures, while others show some impact on certain beliefs but not others. And some show more marked changes. In one more hopeful, recent example from a study in thirteen countries, the researchers split the group of respondents in two. They told one half some facts about actual immigration levels, and said nothing to the other half. Those armed with the correct information were less likely to say there were too many immigrants. On the other hand, they did not change their policy preferences: they were no more likely to support facilitating legal immigration. When the researchers went back to the same group four weeks later, the information had stuck for most – although so had the policy preferences. This fits with long-identified theories that facts struggle to cut through our partisan beliefs, or our ‘perceptual screen’, as Angus Campbell and colleagues outlined in their classic book, The American Voter, back in 1960.
Excerpt from: The Perils of Perception: Why We’re Wrong About Nearly Everything by Bobby Duffy