It can happen, but often you can predict when someone will be utterly unwilling to change their mind, despite mountains of evidence.
If it’s something that someone doesn’t really have a stake in, they’re likely to follow the evidence.
But it’s different when something is a big part of someone’s identity. Take an American gun nut: someone who spends a lot of free time on gun-related forums, sometimes goes shooting with buddies, listens to podcasts about guns, and has a gun safe filled with favourites. That’s the kind of person who is never going to be swayed by rational arguments about guns.
Too much of their self-identity and too many of their social connections are gun-related. Changing their mind wouldn’t just mean adopting a new set of facts, it would mean potential conflicts with all their friends. It would mean leaving a social group where they spend a lot of their free time. They’d not only have to accept that they’re wrong, but that all their friends are wrong too.
Of course, there are ways to change the minds of people in a situation like that. Unfortunately, it mostly happens through tragedy. A gun nut will change their mind, but only when a family member kills themselves with a gun, either on purpose or accidentally. That new and incredibly personal data point is enough to outweigh all the social difficulties of changing their mind.
The backfire effect, as presented by The Oatmeal:
https://theoatmeal.com/comics/believe
Except that may have been a fluke:
https://link.springer.com/article/10.1007/s11109-019-09528-x
yeah well I still think it works
Edit: in seriousness, I agree with you. But I just can’t help feeling that if somebody is able to change their mind with evidence then it’s my duty to try.
I try too, but it’s frustrating. I just wish I knew of a good technique that didn’t involve out-and-out lying. It’s hard to compete when someone is being spoon-fed misinformation and disinformation carefully crafted to bypass all their filters, while you have to fight for the truth by being honest and using facts.