Scientific issues can be vulnerable to misinformation campaigns. Plenty of people still believe that vaccines cause autism and that human-caused climate change is a hoax. Science has thoroughly debunked these myths, but the misinformation persists in the face of overwhelming evidence.
Straightforward efforts to combat the lies may backfire as well. A paper published on September 18 in Psychological Science in the Public Interest (PSPI) says that efforts to fight the problem frequently have the opposite effect.
"You have to be careful when you correct misinformation that you don't inadvertently strengthen it. If the issues go to the heart of people's deeply held world views, they become more entrenched in their opinions if you try to update their thinking." says Stephan Lewandowsky, a psychologist at the University of Western Australia in Perth and one of the paper's authors.
Psychologists call this reaction belief perseverance: maintaining your original opinions in the face of overwhelming data that contradicts your beliefs. Everyone does it, but we are especially vulnerable when invalidated beliefs form a key part of how we narrate our lives. Researchers have found that stereotypes, religious faiths and even our self-concept are especially vulnerable to belief perseverance.
A 2008 study in the Journal of Experimental Social Psychology found that people are more likely to continue believing incorrect information if it makes them look good (enhances self-image). For example, if an individual has become known in her community for purporting that vaccines cause autism, she might build her self-identity as someone who helps prevent autism by helping other parents avoid vaccination. Admitting that the original study linking autism to the MMR (measles–mumps–rubella) vaccine was ultimately deemed fraudulent would make her look bad (diminish her self-concept). In this circumstance, it is easier to continue believing that autism and vaccines are linked, according to Dartmouth College political science researcher Brendan Nyhan. "It's threatening to admit that you're wrong," he says. "It's threatening to your self-concept and your worldview."
That is why, Nyhan says, so many examples of misinformation involve issues that dramatically affect our lives and how we live. Ironically, these issues are also the hardest to counteract. Part of the problem, researchers have found, is how people determine whether a particular statement is true. We are more likely to believe a statement if it confirms our preexisting beliefs, a phenomenon known as confirmation bias. Accepting a statement also requires less cognitive effort than rejecting it. Even superficial traits can affect acceptance: studies have found that the way a statement is printed or voiced (or even the speaker's accent) can make it more believable.
Correcting misinformation, however, isn't as simple as presenting people with true facts. When people read views from the other side, they generate counterarguments that support their initial viewpoint, bolstering their belief in the misinformation. Retracting information does not appear to be very effective either. Lewandowsky and colleagues published two papers in 2011 showing that a retraction, at best, halved the number of individuals who believed misinformation.
Despite countless findings to the contrary, a large portion of the population doesn't believe that scientists agree on the existence of human-caused climate change, which affects their willingness to seek a solution to the problem, according to a 2011 study in Nature Climate Change. Although virtually all climate scientists agree that human actions are changing the climate and that immediate action must be taken, roughly 60 percent of Americans believe that no scientific consensus on climate change exists. "This is not a random event," says Edward Maibach, director of the Center for Climate Change Communication at George Mason University. Rather, it is the result of a concerted effort by a small number of politicians and industry leaders to instill doubt in the public. They repeat the message that climate scientists don't agree that global warming is real, is caused by people or is harmful. Thus, the message concludes, it would be premature for the government to take action and increase regulations.
To counter this effort, Maibach and others are using the same strategies employed by climate change deniers. They are gathering a group of trusted climate experts and encouraging them to repeat simple, basic messages. This is difficult for many scientists, who feel that such simple explanations dumb down the science or portray it inaccurately. And researchers have been trained to focus on the newest research, Maibach notes, which can make it difficult to get them to restate older information. Another way to combat misinformation is to create a compelling narrative that incorporates the correct information and focuses on the facts rather than dispelling myths—a technique called "de-biasing."
Although campaigns to counteract misinformation can be difficult to execute, they can be remarkably effective if done correctly. A 2009 study found that an anti-prejudice campaign in Rwanda aired on the country's radio stations successfully altered people's perceptions of social norms and behaviors in the aftermath of the 1994 tribally based genocide of an estimated 800,000 minority Tutsi. Perhaps the most successful de-biasing campaign, Maibach notes, is the current near-universal agreement that tobacco smoking is addictive and can cause cancer. In the 1950s smoking was considered a largely safe lifestyle choice—so safe that it was allowed almost everywhere and physicians appeared in ads to promote it. The tobacco industry carried out a misinformation campaign for decades, reassuring smokers that it was okay to light up. Over time opinions began to shift as overwhelming evidence of ill effects was made public by more and more scientists and health administrators.
The most effective way to fight misinformation, ultimately, is to focus on people's behaviors, Lewandowsky says. Changing behaviors will foster new attitudes and beliefs.
http://www.businessinsider.com/how-confirmation-bias-turns-conspiracy-theories-into-facts-2012-10