Sunday, May 23, 2010

How Do We Correct Misinformation in Public Policy Debates?

"Passion and prejudice govern the world; only under the name of reason." --John Wesley

Upton Sinclair famously said that it is "difficult to get a man to understand something when his salary depends on his not understanding it." It is just as difficult to get someone to change a mistaken belief when their deeply held preconceptions and worldview depend on it. Facts, as various studies have shown, tend to bounce off when they don't fit the preconceptions (or mental "frame") people bring to an issue.
In a recent piece in The Forum, political scientist Brendan Nyhan compares how this process played out in the debates over both the Clinton and Obama health care reform proposals. Various misconceptions about the proposals, such as the "death panel" myth, were held most strongly by ideological opponents who believed themselves to be particularly well informed:
... beliefs about the Clinton and Obama reform plans represented misperceptions rather than simple ignorance—a distinction that is emphasized by Kuklinski et al. (2000: 792). The difference between the two concepts is that members of the public who are uninformed typically know that they lack information about a given issue, while those who hold misperceptions are paradoxically more likely to believe that they are well-informed. For instance, Kuklinski et al. (2000) found that Illinois residents who held misperceptions about welfare benefit levels and the beneficiary population were the most confident in the accuracy of their beliefs. Using survey data from 1993 and 2009, we observe a similar dynamic in misperceptions about the Clinton and Obama health care plans among opposing partisans (i.e., Republicans). As noted above, the confidence with which these beliefs are held is one reason they are so difficult to correct.

Nyhan's review of research on why it is so difficult to correct misinformation—and how fact-checks or rebuttals may even backfire by reinforcing the misinformation—is particularly useful and applicable to policy debates generally:
... the same factors that lead to acceptance of myths and misperceptions also make them very difficult to correct. The increasing array of media choices means that individuals are less likely to encounter information that would correct misperceptions (Sunstein 2001). In addition, people's tendency to process information with a bias toward their pre-existing views means that those who are most susceptible to misinformation may reject the corrections that they receive. Nyhan and Reifler (N.d.) find that more realistic corrections embedded in mock news articles often fail to reduce misperceptions among the targeted ideological group and sometimes even increase them—a phenomenon called a "backfire effect." These results suggest media fact-checks are often ineffective and may sometimes make misperceptions worse. Other psychological factors also increase the likelihood that corrections will fail to undo the effects of misperceptions. Research by Mayo, Schul, and Burnstein (2004) shows that negations (i.e., "I am not a crook") often reinforce the perception they are intended to counter. (See Nyhan et al. 2009 for an application of this finding to the myth that Barack Obama is a Muslim.) In addition, even if people initially accept corrections debunking a false statement, they may eventually fall victim to an "illusion of truth" effect in which people misremember false statements as true over time (Skurnik et al. 2005; Schwarz et al. 2007). Finally, Bullock (2007) used the belief perseverance paradigm (c.f. Ross and Lepper 1980) to show that misleading statements about politics continue to influence subjects' beliefs even after they have been discredited.

So, what should we do when faced with false information in a policy debate? Nyhan argues we need to "raise the reputational costs of promoting misinformation" and that the best approach may be "for concerned scholars, citizens, and journalists to (a) create negative publicity for the elites who are promoting misinformation, increasing the costs of making false claims in the public sphere, and (b) pressure the media to stop providing coverage to serial dissemblers." This isn't, of course, at odds with providing information that debunks a false claim, but it does suggest that the primary audience for debunking is less the members of the public who hold the mistaken belief than the media and other elites who are transmitting the false information.
Where does this leave us, then, when we're responding to misinformation in the public sphere, or even in non-elite private settings like discussions with friends and family? Instead of simply supplying the fact or data point that rebuts the misinformation in a narrow or technical sense, it's important to identify the frame or mental preconceptions that likely underlie or reinforce it, and then craft an argument or response that undermines or replaces those broader preconceptions as much as it does the specific piece of misinformation.