You only see what you want to see; you only hear what you want to hear. People are stubborn, especially when it comes to preexisting beliefs. Try convincing a Madonna fan that Lady Gaga is original, and you’ll see what I mean.
It gets even worse when it comes to politics. The information that researchers gather from survey polls can be inaccurate because people tend to respond according to their predispositions, regardless of whether their beliefs are based on misperceptions. A 2011 study by Brendan Nyhan and Jason Reifler focuses on this issue – on why people hold on to misinformed opinions and on the most effective way of providing correct information.
Nyhan and Reifler conducted three experiments to understand how people react to the different ways information is presented to them. They found that people tend to reject claims that “call into question their beliefs and attitudes.” In other words, people whose preexisting ideas have helped shape their identity will likely refuse to acknowledge correct information.
I think this goes hand-in-hand with Stephen Colbert’s concept of truthiness. “I doubt that many people in American politics are acting on the facts,” Colbert says in a New York Times article. “Everybody on both sides is acting on the things that move them emotionally the most.”
So, as a data journalist, how do you avoid inaccuracy when trying to elicit responses from a politically passionate crowd?
Nyhan and Reifler seem to suggest that there are two ways of handling skewed answers. First, they look to previous studies to figure out how “self-affirmation” or “self-worth” can help people accept information that contradicts their beliefs. I wish Nyhan and Reifler were less jargony when they talk about self-affirmation, because I’m not sure I understood exactly what they meant. How do you affirm someone’s self-worth? Through their literature review, Nyhan and Reifler indicate that affirmation makes people less defensive about their preexisting beliefs, and therefore more willing to accept information or concepts that go against what they normally believe. But they don’t explain what those studies specifically did to avoid receiving skewed information.
And how do we know that the data collected from surveys and polls are necessarily skewed? Should researchers and journalists assume that when it comes to a controversial topic like climate change, people will generally hold skewed beliefs? If so, should they then phrase or format their questions so that the information is perceived in a way that prevents people from falling back on their preexisting beliefs?
Nyhan and Reifler indicate that the second way of handling misperceptions is to present corrective information in the form of graphs and tables. Sometimes, they point out, fact-checking can make non-believers more resistant to the correction. But if the corrected information is presented in a graph or table, people may be less defensive when they see it visualized.
But can everything be formatted as graphs and tables? I think such a method may be too limited, applicable only to certain types of data.
Overall, Nyhan and Reifler’s study is useful in showing us one of the trickiest elements of data visualization: the psychological, political factor. It’s difficult to detect and control without making assumptions about the people being interviewed, or about oneself. In that sense, I think Nyhan and Reifler failed to consider the biases that data journalists themselves may have, which could influence the content and visual design of their projects.
But it’s helpful to know what kind of design works best when a data journalist is working on a project that involves a controversial subject, like climate change or, for this year, the 2012 elections.