What makes people so gullible?
Credulity is closely related to gullibility: a willingness to believe unlikely propositions with no evidence behind them. When a colleague tells you the boss wants to see you immediately, your first, automatic reaction is to believe them.
Once we realise it is April 1, a more critical mindset raises our threshold of acceptance and triggers more thorough processing. Rejection is then likely unless there is strong corroborating evidence. So gullibility and credulity have to do with how we think, and with the level of proof we need before accepting information as valid. Of course, our own thinking is not the whole story: others often want to manipulate us for their own purposes.
When the information is personally rewarding, we actually want to be gullible. We tend to prefer dubious information that supports our pre-existing attitudes, and are more inclined to reject valid information that challenges our beliefs. A similar bias appears when we pass on doubtful information to others: we tend to reshape rumour and gossip in ways that support our pre-existing stereotypes and expectations, and inconsistent details, even if true, are often changed or omitted.
Gullibility and credulity have become important issues as a deluge of raw, unverified information is readily available online.
Consider how fake news during the US presidential election influenced voters. Stories that generate fear and promote a narrative of corrupt politicians and media can be particularly effective. Again, the result is that we tend to accept such stories as the truth.
This is particularly true if a myth easily fits with our expectations. A slick presentation will instantly boost the cognitive fluency of a claim, while raising its believability. In one recent study, Newman presented participants with an article falsely saying that a well-known rock singer was dead. The subjects were more likely to believe the claim if the article was presented next to a picture of him, simply because it became easier to bring the singer to mind, boosting the cognitive fluency of the statement.
In light of these discoveries, you can begin to understand why the fear of the flesh-eating bananas was so infectious. For one thing, the chain emails were coming from people you inherently trust — your friends — increasing the credibility of the claim, and making it appear more popular.
The concept itself was vivid and easy to picture — it had high cognitive fluency. If you happened to distrust the FDA and the government, the thought of a cover-up would have fitted neatly into your worldview.
It's true: we would rather hide our heads in the sand than listen to evidence questioning our beliefs, even if the facts are solid. This cognitive miserliness can also help explain why attempts to correct a myth have backfired so spectacularly, as the CDC found to its cost.
The problem, Newman says, emerges from our deeply flawed memories. Rather than uprooting the myth, the well-intentioned correction only pushes it deeper. A debunked myth may also leave an uncomfortable gap in the mind.
Fortunately, there are more effective ways to set people straight and make the truth stick. For a start, you should avoid repeating the original story where possible, and try to offer a complete alternative account to patch up the tear in their mental model.
Andrew Wakefield falsified elements of research that wrongly linked autism to MMR vaccines, leading to him being struck off the medical register.
Whatever story you choose, you need to increase its cognitive fluency with clear language, pictures, and good presentation.
If Voltaire was right that those who can make us believe absurdities can make us commit atrocities, this would be quite terrifying, as people do believe in a number of absurdities. For example, a few years ago, a sizable minority of Americans seemed to believe that the basement of the Comet Ping Pong pizzeria, in the suburbs of Washington DC, was used by Democratic operatives to abuse children: the conspiracy theory known as pizzagate.
One guy did commit, if not an atrocity, at least something extremely stupid: storming the place, gun blazing, demanding that the children be freed. Voltaire was right about him, but not about the rest. Most popular false beliefs (rumors, urban legends, conspiracy theories) are like pizzagate: people say they believe them, and they do, but the beliefs do not influence the rest of their thoughts or their behavior.
This helps explain why people accept such beliefs: because the beliefs are largely inconsequential, at least for those who hold them, the stakes are low and vigilance is relaxed. People will loudly share views while not behaving in line with them at all, which makes no sense if the beliefs were genuinely held: when you live with an actually powerful and nefarious intelligence agency in your country, you shut up about it, or you end up dead.
If people share such beliefs, it is likely because doing so serves some goal. Indeed, sharing beliefs, even false beliefs, can serve a variety of social ends. We can justify a common course of action, as when people share rumors of atrocities before an ethnic riot. We can entertain our audience with thrilling urban legends or salacious rumors. We can even commit to a fringe group by saying things so apparently absurd or evil that almost everyone is sure to reject us, thus proving to this fringe group that we really are in with them (think flat earthers).
Mass persuasion, from authoritarian propaganda to advertising, fails massively. Does this mean, then, that people are merely pigheaded? No. People are not pigheaded; they are rationally skeptical. But when good reasons are offered, people do change their minds.