Why Facts Don’t Change Our Minds (2017)

newyorker.com

116 points by skm 6 years ago · 125 comments

im3w1l 6 years ago

I really dislike this narrative that facts don't change our minds and we are all irrational.

The thing is, truth does exert a pull on our beliefs. It's a slow force. It may take years for people to come around to it. Sometimes it even happens on a generational scale. But we are approaching the truth. Everything in history, and everything in our daily experience tells us this. A couple of experiments where researchers manage to fool the people in their studies does not disprove this overall trend.

What scares me about this narrative is that people are using it to discredit democracy. "Look how stupid people are! We have to spoon-feed them the cherry-picked facts that lead them to the right beliefs. We have to decide everything for them."

  • simonh 6 years ago

    Whether you like it or not has no bearing on whether it's true. In fact, that's confirmation bias right there. Presented with evidence of something you find unpalatable you simply reject it outright purely on the basis you don't like it. Personally, I'd rather know.

    The point isn't that evidence has no power, it's that it has dramatically less power than most people think. However there are strategies for getting us out of the personal bias quagmire, such as the scientific method and the approach described in the article of providing an account or explanation of your position and the reasons for it. The debating rule of first explaining your opponent's position in your own words, but in a form they accept as being accurate, before trying to rebut it is also hugely powerful. These do seem to work and help lead us to better outcomes, so this is valuable and actually useful work.

    • im3w1l 6 years ago

      > Presented with evidence of something you find unpalatable you simply reject it outright purely on the basis you don't like it.

      I presented a case for why I think it is wrong. And I presented a case for why people have reason to push it despite it being wrong: it gives them more power. Considering people's motivations is important, and when someone stands to gain, we should be suspicious and go over everything extra carefully.

      • simonh 6 years ago

        I suppose I was a bit antagonistic in my post, sorry, but if anything your observation supports the article's position. Why should facts exert only a slow, painstaking generational force on belief if people are actually rational? Surely it should have immediate effect?

        I really don't see what these researchers or journalists have to gain, beyond what they would gain from doing any research or journalism. I'm just not seeing any credible counter-arguments.

        • im3w1l 6 years ago

          > Why should facts exert only a slow, painstaking generational force on belief if people are actually rational? Surely it should have immediate effect?

          If you look at rational as a binary, then people aren't rational. But people are a little bit rational. Sufficiently rational, to eventually find the truth.

          I like the parallel with machine learning. Many, many bright minds have tried to formalize our intuitions into automatic systems. Gradually they make progress. But it's plain to see that it's not as easy as "just incorporate the new fact". We have systems that can deal with facts, and systems that can learn from experience. But systems that do both, that can learn from experience and express that in terms of facts, or use facts to guide their exploration, that's an open problem.

          As for who stands to gain, I do think journalists, and editors, and newspaper owners have something to gain. Their role transforms from giving people "just the facts" to manipulating people into the right beliefs. And what are the right beliefs? That's for the journalists and their benefactors to decide.

          • michaelmrose 6 years ago

            A small minority of people are rational enough to forge new truths in the face of conflicting and complicated information, usually in a small, narrowly focused way. A much larger minority is capable of digesting and making productive use of the work product of the former group, again within a broader but still narrow scope.

            The majority is too stupid to make up their own minds and needs to be educated at a young age to accept the work product of prior generations' experts, because they are just too unintelligent to evaluate it for themselves. This is literally most people.

            The fact that this is unpleasant doesn't make it untrue.

          • simonh 6 years ago

            >If you look at rational as a binary, then people aren't rational. But people are a little bit rational. Sufficiently rational, to eventually find the truth.

            Right, which is pretty much exactly what the article says. It shows what forms the flaws in our rationality take, and suggests procedural methods we can use to help the process of rational analysis along.

        • Nasrudith 6 years ago

          Why should facts exert a slow pull? In addition to any unknown constraints of neural mechanisms, an overly high learning rate in artificial neural networks has been noted as resembling schizophrenia in some ways. Plus whatever physiological constraints limit how fast minds can change.
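
          The learning-rate point can be sketched numerically (a toy illustration, not a claim about biology; `gradient_descent` is a made-up helper): plain gradient descent on f(x) = x^2 converges when the step size is small, but with too large a step every update overshoots the minimum and the estimate oscillates with growing amplitude.

```python
# Gradient descent on f(x) = x^2, whose gradient is 2x.
# A modest learning rate converges; an overly large one diverges,
# because each step overcorrects past the minimum at x = 0.

def gradient_descent(lr, steps=50, x=1.0):
    for _ in range(steps):
        x -= lr * 2 * x  # step against the gradient of x^2
    return x

print(abs(gradient_descent(0.1)))  # small: settles near the minimum
print(abs(gradient_descent(1.1)))  # large: oscillates outward and diverges
```

          In update-rule terms, x becomes (1 - 2*lr)*x each step, so the estimate shrinks when |1 - 2*lr| < 1 and blows up otherwise; "mental inertia" corresponds to keeping the effective learning rate low.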

          I suspect it effectively functions as a minor anti-mind-hacking measure, as bizarre and dumb as it sounds. It would protect against adversarial input.

          If there were no mental inertia, then "false facts" which check out under all verification measures could prove quite dangerous: input that outweighs all past knowledge could easily be exploited by bad actors to change what people "know", and to work from there.

          • wizzwizz4 6 years ago

            An Untrollable Mathematician, Illustrated: https://www.lesswrong.com/posts/CvKnhXTu9BPcdKE4W/an-untroll...

            To summarise: for reasons, it's possible for somebody to endlessly drive up and down a computationally-limited agent's credence estimate for mathematical conjecture X (so long as the agent hasn't found a proof yet) just by stating implications of the conjecture, if the agent is trying to use ideal Bayesian reasoning. It's possible to make this "trolling" impossible, by basically turning your "mental inertia" up to 11 – but that proves it's possible to be untrollable.

        • RcouF1uZ4gsC 6 years ago

          > Why should facts exert only a slow, painstaking generational force on belief if people are actually rational? Surely it should have immediate effect?

          Kind of like Bayesian inference. You have a prior, and the different pieces of evidence move the probability.

          And it works a lot. For example, if you see a magician levitating, the evidence of your eyes seems to suggest that they are actually levitating. However, your priors tell you that the chance of that is still pretty low even with the new evidence. So this way, you don't run across a street magician and suddenly believe humans can levitate.
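
          A toy Bayes'-rule calculation makes this concrete (the numbers are made up for illustration, and `posterior` is a hypothetical helper): a strong prior absorbs one striking observation.

```python
# Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E), with
# P(E) = P(E | H) * P(H) + P(E | not H) * (1 - P(H)).

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

prior = 1e-9          # prior belief that humans can levitate (assumed)
p_see_if_real = 1.0   # you'd certainly see it if it were real
p_see_if_trick = 0.1  # stage magic also produces the sight (assumed)

updated = posterior(prior, p_see_if_real, p_see_if_trick)
print(updated)  # ~1e-8: ten times the prior, still essentially zero
```

          The evidence multiplies the odds by the likelihood ratio (here 10x), but against a prior of one in a billion that still leaves "it's a trick" as the overwhelming conclusion.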

      • bluntfang 6 years ago

        Sounds like it's time to submit some peer reviewed scientific papers to refute the current popular science opinion!

        Oh you don't have the credentials to run experiments and talk about your findings? Oh well that's too bad, I guess we'll all live in the dark and you'll have your truth.

    • ARandomerDude 6 years ago

      You have the OP's causality chain backwards.

      OP: I dislike the claim because I reject the claim.

      Your version of OP: I reject the claim because I dislike the claim.

      • simonh 6 years ago

        That's not a rejection of the claim at all though, even though OP dislikes it. Why should objective proof exert a 'slow force' on our reason? Surely it should be decisive. Saying it can take a generation is tantamount to admitting that objective proof may have to wait for the people who reject it to literally die off before it gets accepted.

        So what OP seems to be actually rejecting is an extreme version of the claim, along the lines that humans aren't rational at all, which isn't there in the article.

        • waterhouse 6 years ago

          There are cognitive metastrategies that can look like irrationality in the small, but are pretty sensible for people with limited information and research time.

          Consider: "a stranger makes a sophisticated argument, which seems convincing, whose conclusion is that you're wrong about something important". If you always respond to this by doing whatever the stranger advocates, you're likely to end up getting scammed or getting eaten up by a political movement or otherwise doing something you later regret.

          Even if the argument is true as far as it goes, sometimes true facts are presented in a misleading context. One favorite tactic to discredit a group seems to be to find some of its worst members and do truthful reporting on them; one can also try to suggest "policy X is working/not working" by choosing the statistical measures that paint it in the best/worst light, and failing to mention the other measures that might portray it more accurately.

          A general strategy of "remember new information, but don't let it affect your actions until you've had time to reflect / consult with those wiser than yourself / do further research" is useful in a wide range of situations. (And if you don't bother to do further research for years, it follows that either the new information sits in abeyance for years, or you take the risk of acting on it without having validated it.)

          • simonh 6 years ago

            >If you always respond to this by doing whatever the stranger advocates, you're likely to end up getting scammed or getting eaten up by a political movement or otherwise doing something you later regret.

            That's amazing, it's an excellent summary of the theory the article proposes for why we instinctively discount evidence against our preconceived opinions. Basically it's to stop us getting scammed, but in the hunter-gatherer context in which we evolved, not a modern society with robust systems for validating evidence.

            The pre-conceived opinions the studies test aren't always even things the test subjects actually care about though. They can be opinions about things they were only just exposed to and wouldn't be expected to have any personal investment in, such as opinions about fictional characters that only exist in the test. We're not talking about proving to conservatives that liberalism is right, or vice versa; some of the tests literally concern beliefs about issues that only exist in the test. It doesn't matter: as soon as an opinion is formed it's incredibly hard to change, no matter how strong the counter-evidence, and even if it's shown conclusively that the initial opinion was based on false data.

        • michaelmrose 6 years ago

          It takes generations because it takes generations to indoctrinate new generations in the truths that a small minority of experts in various fields have gleaned. This isn't a strong indication that people are rational. It's more an indication that societies can magnify the gains of a minority of rational individuals to improve the position of all, including the majority not smart enough to have made those gains.

      • santoshalper 6 years ago

        He literally starts out describing it as a "narrative", and the first statement he makes is that he dislikes it. It is clear that he is not interested in finding out whether it is true; he simply doesn't like what it implies and is suspicious of the messenger. The actual argument against it is weak and really just reinforces it (it takes entire generations to come around on an issue - we know, dude, that's what we're talking about!)

        We are all doing this shit all the time. Denying it just gives it more power. Admitting you have biases that color every interaction at least gives you a fighting chance to examine them.

        • ARandomerDude 6 years ago

          I think you've misunderstood. I didn't say I agree (or disagree) with the OP. My point is that simonh got the cause-effect relationship of im3w1l's argument backwards. The cause is not necessarily stated before the effect.

          For example: "I don't like eating brisket. It always gives me heartburn." In that case, the heartburn causes the dislike, even though it was stated after the effect (non-enjoyment).

          In this case, im3w1l described an emotional state (effect), followed by his reasons (cause). You're free to disagree with his reasons, but it's important to understand the argument or you're responding to a straw-man.

    • btmoney06 6 years ago

      Academics and the New Yorker staff: "Everybody is biased, except for me of course."

  • jbotz 6 years ago

    > I really dislike this narrative that facts don't change our minds and we are all irrational.

    I don't think that is the real "narrative" here, although reading this article may make it seem like that. The "narrative", or rather the modern scientific understanding which this article tries to present to a lay audience, is that real rational thinking is not our default, even though it seems to us that way.

    But we can think more rationally. It just takes a lot more work. We can do all of the following...

    * Subject our thinking to a rigorous framework such as the scientific method, in which we have to declare what evidence would falsify our argument (hypothesis) as we make it.

    * Study cognitive biases to become more aware of their effects on our thinking and hopefully "immunize" our mind against some of their effects.

    * Train our capacity for meta-cognition with mindfulness practices to become more aware of why we think what we think as we think it.

    As for using the limitations of human rationality as an argument against democracy... I don't think that's a logical conclusion at all since leaders and lawmakers are subject to these limitations no matter how they come into power. But it is an argument that we still need to improve the processes by which policy is decided and that we need to watch out for and guard against those who would abuse the specific ways that humans can be tricked because of these factors (such as the Cambridge Analytica crowd).

    • bumby 6 years ago

      > Study cognitive biases to become more aware of their effects on our thinking and hopefully "immunize" our mind against some of their effects.

      I’m not sure this works. In fact, Daniel Kahneman has said in the intro to “Thinking, Fast and Slow” that all his study of cognitive biases has led him to believe he’s powerless to stop them in himself, and is still only able to recognize them in others.

      • jbotz 6 years ago

        That's where my third bullet comes in... you need a high degree of metacognitive awareness (mindfulness) to be able to recognize your own biases. Together they do make a difference. If you're well versed in cognitive biases, then as you increase your mindfulness you will begin to recognize them in yourself, at first with a delay, upon reflection, and later --- when you're practically a Zen master ;-) --- you may recognize them in real time and be able to correct them immediately.

        • bumby 6 years ago

          I think what Kahneman was getting at is that it's hopelessly futile to try to recognize your own biases in a meaningful way. So while I understand your statement, I'm skeptical that it's possible if someone at the forefront of cognitive science admits he can't do it. I'm still enough of an optimist to try to use mindfulness myself to accomplish that, though.

          Of course, maybe he's just really bad at being mindful ;-)

  • MaxBarraclough 6 years ago

    > I really dislike this narrative that facts don't change our minds and we are all irrational.

    To mirror simonh's comment, it's rather ironic that you're responding to a claim of fact, with an opinion. Whether it's true or not is a question of science, not wishful thinking, and you've not given a solid reason to reject the findings of this research.

    It's like the way the theory of evolution remains true whether or not some nasty elements of the far right try to use it to justify an atrocious ideology like 'social Darwinism'.

    > people are using it to discredit democracy

    Who does this? I don't see researchers like Dan Ariely [0] lurching to the far right when they make discoveries about our psychology. (It's odd that neither Ariely nor the field of behavioural economics [1] are mentioned in the article.)

    Nothing about this research indicates that non-democratic systems of government are the best way to run things after all.

    > truth does exert a pull on our beliefs

    Broadly speaking mankind seems to get less ignorant over time, but sometimes the pull on our beliefs can act in the opposite direction. [2]

    [0] https://en.wikipedia.org/wiki/Dan_Ariely

    [1] https://en.wikipedia.org/wiki/Behavioral_economics

    [2] https://en.wikipedia.org/wiki/Confirmation_bias#backfire_eff...

    • rbecker 6 years ago

      > To mirror simonh's comment, it's rather ironic that you're responding to a claim of fact, with an opinion.

      Are they? "But we are approaching the truth. Everything in history, and everything in our daily experience tells us this." sounds like a claim of fact. Do you disagree with it? Have we not, collectively, changed our minds on a great many things?

      Evolution, heliocentrism, the importance of doctors washing their hands, the non-determinism of quantum physics - all of these are a result of people changing their minds when presented with new facts. Even the importance of car safety belts and harmfulness of smoking. You are literally surrounded by evidence of people changing their minds when presented with new facts, but choose to instead focus on a few experiments where some people didn't change their minds when presented with some evidence on certain topics.

      What did the article call it... confirmation bias?

      • zzzcpan 6 years ago

        This is just a logical fallacy. Some people changed their minds when presented with new facts, sure, but only some; some didn't, and most people likely changed their minds when bombarded with opinions, propaganda, etc., not facts.

      • simonh 6 years ago

        Nobody has suggested humans are not capable of rationality, that's an absurd exaggeration. Clearly we are. The article even points out the scientific method as a valuable procedural tool we can use to help overcome the effects of biases, and suggests a method of self-reflection that has proved in studies to help obtain useful results. The better we understand those biases the better we are able to develop new approaches and procedures to mitigate them. But the first step is to understand ourselves and our limitations. Without that, we're a flailing bunch of tribal apes screaming at each other.

        • rbecker 6 years ago

          > Nobody has suggested humans are not capable of rationality, that's an absurd exaggeration.

          It's the title of the article.

          • simonh 6 years ago

            If articles could be distilled entirely down to titles, we wouldn't have articles. The authors are clearly not saying humans are incapable of rational thought, we're just not perfectly rational and in fact rational thinking can be surprisingly difficult for us, that's all. You know that, everybody who has read the article knows that whether they agree with the article or not, so why say this?

            It's exactly this sort of hit and run straw man argument the article describes as being behind a lot of fallacious thinking. Scoring a 'hit' on an opponent, no matter how absurd or irrelevant, or how much it distorts the opponent's actual position, grants an immediate dopamine shot. It feels fantastic.

            That's a really crucial part of the puzzle. It's why asking participants in a debate to first state the position of their opponent in their own words, but in terms their opponent accepts is accurate, before arguing against them is such a useful tool. It eliminates retorts based on knowing misrepresentations like this, which are a serious impediment to productive discourse.

            • rbecker 6 years ago

              The 'often' is missing not just from the title, but from the entire article. It fails to draw any attention to the weaknesses or limited applicability of the experiments, and implies they're more universal than they are, overstating their results.

              I believe this is deliberate - "humans sometimes don't change their minds on some topics when given some types of new evidence" is uncontroversial, it generates few clicks, little argument. But "facts don't change our minds" (and nothing in the article text about the limitations of that statement), gets you a flame war with one side eager to embrace science, while the other struggles with the contradiction between the article and their own experience.

              • simonh 6 years ago

                The article is about studies, not individuals. At the study level, they are pretty much universal. The effect is so strong that at this point demonstrating it is a routine entry level task for first year psychology students (my mother did a degree in child psychology when I was a teenager so none of this is new to me).

                I'm surprised at your last statement, my experience is very much in line with the article. Presumably you think you have rational, logical evidence based reasons for many of your opinions right? So how come so many people with opposing views are completely intractable to your arguments? You must have noticed this. So either the subset of humanity that agrees with you on any given topic is all purely rational and objectively correct and all the rest are either up to no good or crazy, or there's something else going on. That's all this article is actually pointing out.

                • im3w1l 6 years ago

                  We all agree that Newton's laws are approximately correct. But this isn't interesting, it's not newsworthy. We focus on the conflicts.

                • rbecker 6 years ago

                  > Presumably you think you have rational, logical evidence based reasons for many of your opinions right? So how come so many people with opposing views are completely intractable to your arguments? You must have noticed this.

                  I have noticed this. I've also noticed many areas where I agree with the vast majority (e.g. "smoking is unhealthy"). If you'll look carefully, you've committed a subtle form of selection bias. The "many people with opposing views" limits the opinions to controversial ones, that people tend to hold due to various "irrational" reasons (such as group belonging or ethics).

                  I may not have been clear, and I think we mostly agree - I don't think people choose their opinions through pure data, logic, and reason. But they do play a bigger part than the article implies, and the article greatly overstates the strength of those experiments. If you have time, let's look at them closely, starting with the one about the death penalty:

                  > Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

                  There are so many confounding factors it's hard to choose where to begin:

                  1: The topic should immediately raise suspicion. It's highly emotionally charged, instead of something boring like the optimal tire pressure for road safety, meaning we probably can't generalize to more boring topics. But it's exactly what I'd pick if I wanted a surprising result.

                  2: Despite what the sentence carefully implies but does not state, deterrence is probably not even close to the main reason why someone would favor or oppose capital punishment, so someone is unlikely to change their opinion based on that. But intellectual laziness can result in someone not examining why they truly hold their opinion, and picking an easy reason instead, as long as they think the data supports it. This perhaps supports "we hold some opinions for irrational reasons" or "we're not honest with ourselves about why we hold some opinions", but has little bearing on whether we change our minds in the face of new evidence.

                  3: The students didn't enter the study as blank slates, perfectly naive and willing to believe whatever some study told them. They could easily have been exposed to many prior studies and word of mouth claiming capital punishment does/doesn't work as a deterrent. And like the perfectly rational agents versed in Bayesian statistics that they are, they examined this new study in light of their prior data, and accepted it or discarded it as an outlier. If the study had claimed regular baths in bleach improve skin health, would we expect them to start bathing in bleach? Then why are we surprised when they are equally skeptical regarding studies on capital punishment? Yes, it's confirmation bias, but so is dismissing bleach-bath studies. And after all, confirmation bias requires there to be some facts that confirm our beliefs, and anything after that is the difficult task of judging who is credible.

                  The study shows motivated reasoning in an instance where we hold a belief for different or irrational reasons, but not the implied immunity to facts.

                  The firefighter study is also interesting. This time, instead of choosing a controversial topic, they gave the participants barely any data to work with. It would have been so much simpler if they were first shown study A, which says risk-taking firefighters save 50% more lives, and then told: that was made up, study B is the real one, and it says risk-taking firefighters save 50% fewer lives. But that's not what they did - instead, they gave a single data point, Frank, who was or was not put "on report" for unspecified reasons, and it's also unclear if being "on report" even means he's less successful, or if he's like the stereotypical detective who has to turn in his gun and badge because the commissioner is upset he's digging into powerful people. Then, after they've had time to think up some plausible reason why risk taking is/isn't good, even that single data point is taken away. The participants, left with no data, simply kept their old beliefs. Facts didn't change their minds because they had no facts. Yes, the correct thing to do would be to revert to "I don't know", but 1) we don't know if that was even an option in the study, and 2) aversion to agnosticism falls very short of "facts don't change our minds". Let me also mention this sneaky wording:

                  > Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,”

                  But the evidence hasn't been "refuted". No new, more credible evidence, from which we would draw the opposite conclusion, was presented. Instead, the evidence was simply removed. Just one of the many ways in which the article tries to overstate its case.

                  The suicide note study is very similar. There's an elaborate song and dance, but in the end, the students are again left with no data, and asked to make some guess about that data.

                  The last study, with the reasoning problem, is also overstated. Let's start with "fewer than fifteen per cent changed their minds in step two." - let's go with 14%, despite the author trying to imply it was lower. "About half the participants realized what was going on. Among the other half, suddenly people became a lot more critical. Nearly sixty per cent now rejected the responses that they’d earlier been satisfied with." - so, "about half" of "nearly sixty per cent", let's go with 58/2 = 29%. In other words, 71% either reasoned so consistently they could identify the deception, or came to the same conclusion again despite being told their past self came to the opposite conclusion. This study does show people will scrutinize someone else's argument more closely than their own, but doesn't show how big this effect is, other than the coarse limit of 29%-58% of people switching sides in some apparently somewhat ambiguous 'reasoning' task.
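
                  The arithmetic above can be checked directly (a sketch using the percentages as quoted, treated as exact; the variable names are mine, not the study's):

```python
# "About half" noticed the trick; of the other half, "nearly sixty
# per cent" rejected their own earlier answers when shown them as
# someone else's.

noticed_trick = 0.50
rejected_if_fooled = 0.58

# Fraction of ALL participants who both missed the trick and then
# rejected their own earlier reasoning:
switched = (1 - noticed_trick) * rejected_if_fooled
consistent_or_aware = 1 - switched

print(round(switched * 100))             # 29
print(round(consistent_or_aware * 100))  # 71
```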

                  Let's recap. The capital punishment study shows people rationalizing opinions probably based on ethics. The firefighter and suicide studies show people keeping old opinions despite not having any concurring or opposing evidence. The last study is hard to draw any straightforward conclusions from, but yes, it shows people will do motivated reasoning, though only at most half the time; at least half the time they will be consistent. And all the studies except the last had to throw lots of emotion and ambiguity into the mix to be able to squeeze out an irrational result. I honestly think not pointing this out borders on deception.

                  So I don't "disagree" with any of the studies, or think that people run on pure logic. But the studies are far more limited than the article wants to admit, and even "why facts often don't change our minds" is way overstating it.

                  • simonh 6 years ago

                    Thanks for the considered response, it was a good read. As you say we're not that far apart really.

                    I agree the studies are very narrow, but that's really a requirement, in a study like this you have to narrow things down very tightly to eliminate extraneous factors. I know it's hard to believe now, but these findings in the 70s really were shocking, many psychologists and philosophers thought the results were impossible. I think it's hard for people nowadays to put themselves back into that context.

                    On the reasoning problem, I don't think it's fair to include the people that spotted the trick in the final analysis. The fact that 60% of people that didn't know they were contradicting themselves did so perfectly happily just in order to feel consistent despite evidence to the contrary is pretty shocking.

                    Anyway, thanks for the discussion.

                    • rbecker 6 years ago

                      > I know it's hard to believe now, but these findings in the 70s really were shocking, many psychologists and philosophers thought the results were impossible.

                      I completely forgot to consider that/am only vaguely familiar with the beliefs at the time. It casts the experimenters in a completely different light, and makes me think I should not have been so negative of them. They must have tried many different increasingly sophisticated setups before they were able to tease out these effects, that they only suspected existed. It's always fascinating to see someone intuit some truth long before they have experimental data to confirm it. I imagine reading about their early failures and how they slowly arrived at their findings would be very interesting.

  • Lendal 6 years ago

    The article doesn't say facts never change minds. Clearly, facts do change minds. Just not always. The article simply focuses on the latter case, because that's a more interesting thing to read about.

    It's like how boring headlines don't get voted up on Hacker News. It has to be something interesting. Facts change people's minds. Facts don't change people's minds. Both are true. But only one is interesting.

    • aeternum 6 years ago

      It also makes sense for humans to have some amount of hysteresis. Evidence or 'facts' can be misleading, data can be cherry-picked. It takes some time to sort it out.

      I do think too many people get blinded by that hysteresis and the desire to be 'right'.

    • gmadsen 6 years ago

      It's not just that it's boring; it doesn't provide any insight. We have all experienced changing our minds due to facts, so there is nothing to write about that statement. What is interesting is that despite thinking that we are very logical about what we believe, study after study shows that, on average, this is not the case. Confirmation bias is pervasive.

  • WhompingWindows 6 years ago

    Only certain truth exerts a pull on our beliefs: the truth that we use to justify our beliefs post-hoc. It's well established in psychology; books I've read like "Thinking, Fast and Slow" and "The Righteous Mind" point to people building their beliefs first, THEN finding facts to justify them.

    I do agree that, over generations, the correct and truthful views tend to gain the upper hand. This arises from each generation absorbing a new set of facts in school, when they are young and their belief systems haven't formed yet. However, if we allowed all children to attend school at their place of worship from 5 to 18, we'd find college students remarkably unwilling to learn many more facts.

    So, more broadly, why does it bother you that facts don't change our minds and we're all irrational? We are Homo sapiens, a mammalian primate who made the jump from the jungle to the savannah and learned to work together to gather food and hunt game. We haven't left behind our animal software; it is still active in, and exploited by, our modern society.

  • majormajor 6 years ago

    > But we are approaching the truth. Everything in history, and everything in our daily experience tells us this.

    I think this needs substantiation. You present this like a fact, but it looks entirely like an opinion: your interpretation of history.

    It presents a sense of inevitability that I find extremely dangerous.

    The only thing that keeps us from losing what we have today - as many civilizations have done in the past - is our actions. Presenting it as historical inevitability not only cheapens the meaning of our actions but also discourages people from seeing just how important active effort is.

  • loudtieblahblah 6 years ago

    Truth on some things is simple.

    Truth on others, like medicine, psychiatry, nutrition, etc., is really, really conflicting. And people build cultures and tribes around their truths, each backed by science. Get some keto people, vegetarians, and run-of-the-mill nutrition experts to sit around debating and your head will spin.

    There's such complexity there: conflicting studies, poorly done studies. Finding a "truth" about how we should eat, how often, etc., is near impossible.

    And that's just that subject.

  • mc32 6 years ago

    Maybe part of it is expectation of instant gratification. But given how things change over time, slow adoption may not be worse than quick disruptive adoption that needs recalibration.

    • BariumBlue 6 years ago

      Agreed. One mental exercise I use: imagine someone brought you proof of a ghost. Maybe it's testimony from someone who is rational and would never lie; maybe it's some unexplainable photograph or phenomenon.

      If you didn't believe in ghosts before and then suddenly switched to believing in them, then your entire worldview changes - the soul exists and can exert itself into reality, the afterlife is real, maybe emotions exert some real phenomenon too, and so maybe wishing makes things happen too!

      That, in and of itself, would be irrational: to change your entire worldview based on single pieces of evidence, even if they do appear to be true.

  • brightball 6 years ago

    It also assumes a few other critical things.

    1. The person learning the fact trusts the source

    2. The fact can be easily proven if the person doesn’t trust the source

    3. There are no other facts which provide context that are missing

    These are all critical in how the general public receives “facts”.

    • quacked 6 years ago

      Yeah, those points are crucial. I always cringe when I see anyone say anything along the lines of "we've got the facts". How do we know?

      You can't trust any major media publication, because they'll play both sides of a story. For instance, a major newspaper reports that someone is predicting a recession. If there's a recession, the newspaper will crow about how smart they are, but if there's not a recession, the newspaper will run reports about why the recession predictions were wrong, and then crow about how smart they are. What were the facts?

      You can't trust any major political figure, because they have an agenda by default of existing in the political system. If there exists a fact that damages their agenda, they would lose their career if they admitted it.

      A lot of the time, the "facts" people are angriest about are actually either predictions of the future (ex: it's a 'fact' that global warming will lead to +2 deg C by 2100) or summaries of statistical models (ex: it's a 'fact' that X subgroup is n% more or less likely to earn less/more).

      Even things we know are "facts" don't necessarily lend real understanding to the person believing them. Every high school physics student knows the fact that light is a particle and a wave. Does that mean they actually understand light?

      How many "facts" were known to citizens in the past that we now laugh at?

  • eagsalazar2 6 years ago

    I think there is a more holistic POV here that resolves the tension you are worried about. Simply accept that the beliefs people openly espouse are not what they actually believe: there is outward cognitive dissonance, but inwardly people are making sense of the world in a fairly rational way, one consistent with their own goals and needs rather than with objective, mathematically consistent reality. It only seems like irrational denial of facts because you incorrectly assume (a) that you understand their needs and goals, and (b) that what they say they believe is about a position being objectively true or false, rather than a model of reality that works for them. If you want to "change people's minds", you need to understand their needs and goals, and how the most brutal version of "the truth" does or doesn't support those needs. Then "convince people" by creating coherent worldviews that are both "true" and allow them psychological coherence and safety (aka, +empathy and diplomacy).

  • btmoney06 6 years ago

    I tend to agree. The problem is that the elites weaponize stuff like this, because they think it applies only to the masses--not the highly educated. Instead of asking why somebody might have different views (and instead of actually learning what the different viewpoints are), they just assume that other, less educated people fail to actually analyze the data.

  • santoshalper 6 years ago

    And even as the facts indicate that it is true, you will continue not to believe them. As you said, you dislike the narrative - whether it is true or not never really mattered to you, you just didn't like the implications.

    You are doing a remarkably effective job at demonstrating this phenomenon.

  • michaelmrose 6 years ago

    Generational shifts are literally driven by indoctrinating children in increasingly less stupid sets of beliefs in childhood as understood by minorities of experts in each field.

  • desipis 6 years ago

    > The thing is, truth does exert a pull on our beliefs. It's a slow force.

    What is the nature of this force? Where does it come from and how does it influence our minds?

  • commandlinefan 6 years ago

    > truth does exert a pull on our beliefs

    I'm more cynical than that. I believe that facts do change people's minds, but most people harbor hidden agendas that they try to adjust convenient facts to while ignoring inconvenient ones.

  • fullshark 6 years ago

    I don't know if it's to discredit democracy so much as humanism. A lot of people right now seem to be fantasizing about a collectivist democracy, which can be as oppressive and destructive as an autocracy.

SuoDuanDao 6 years ago

I'm reminded of Tetlock and Gardner's excellent book 'Superforecasting', which was essentially a study of people who consistently score at the top of prediction markets. One key thing that these 'superforecasters' had in common was that any new information caused them to update their model of the world, but none caused them to update it very much - typical people making predictions either didn't update their model or updated it too much in response to new facts.

I think it makes a lot of sense, when one is trying to identify patterns in information, that it's easy to over- or undervalue novel information. We don't necessarily know what a new fact means, so ignoring it is one common error while paying too much attention to it is another.

  • erichocean 6 years ago

    > We don't necessarily know what a new fact means

    We also rarely even know if a "new fact" is actually true. So many studies don't replicate that it makes sense to hold off on updating core beliefs whenever "new facts" seem unlikely or in contradiction with previously known (and reliable) facts.

    SSC had a nice article (now gone) that discussed this for a scientific theory that had literally hundreds of confirming studies done for it. All wrong. The "new facts" were bullshit. So even with tons of studies, it's reasonable to be skeptical in some situations.

    It's also great that, eventually, science was able to figure out the "new facts" were bullshit. Yay, science. But it also means that people aren't being irrational when they don't immediately alter their fundamental beliefs while the ink is still dry, especially for "new facts" that seem in contradiction with everything else we know…

  • roter 6 years ago

    Thanks for the recommendation.

    I guess we just need to tune our relaxation factors [0] or, perhaps better, recalibrate our Kalman filters.

    [0] https://en.wikipedia.org/wiki/Successive_over-relaxation
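    To make the analogy concrete: a relaxation factor is just a step size on belief updates, and the superforecaster pattern (always update, never all the way) corresponds to a value strictly between 0 and 1. A minimal sketch, with made-up numbers for illustration:

```python
def update(belief, evidence, alpha):
    """Relaxation-style update: move a fraction alpha toward the evidence.

    alpha = 0     -> never update (ignore new facts)
    alpha = 1     -> overwrite the belief with the latest datum (over-update)
    0 < alpha < 1 -> the 'superforecaster' middle ground
    """
    return belief + alpha * (evidence - belief)

# A noisy stream of evidence about a true value of 10.
evidence_stream = [12, 9, 11, 8, 10, 10, 11, 9]

for alpha in (0.0, 0.3, 1.0):
    belief = 0.0  # start from a poor initial belief
    for e in evidence_stream:
        belief = update(belief, e, alpha)
    print(f"alpha={alpha}: final belief = {belief:.2f}")
```

    With these numbers, alpha=0 leaves the belief stuck at 0, alpha=1 ends at whatever the last datum happened to be (9), and alpha=0.3 ends near 9.25: most of the way to the truth without chasing every datum.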

dlkf 6 years ago

IMO the conclusion "facts don't change our minds" is a stronger conclusion than the first two experiments show. On my reading, the first two experiments show that:

1. if I have a uniform/undefined prior (how the fuck should I know how risky/conservative firefighters are?)

2. and then I'm given an anchor

3. and then told the anchor is bunk

4. the anchor still affects me

But I suspect this hinges very heavily on the fact that our initial prior is basically non-existent. By contrast, if you:

1. picked a topic where I actually have some prior belief (What country is colder: Sweden or Germany?)

2. gave me some information "Germany is actually colder on average than Sweden because of a weird atmospheric thing that affects the nordics"

3. told me that 2 was BS

I highly doubt you'd be able to replicate 4.
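For what it's worth, the "revert to the prior" intuition is exactly what an ideal Bayesian reasoner does: fully retracting evidence restores the prior no matter how strong or weak it was, so the sticky anchor in the experiments is a departure from that ideal. A small Beta-Bernoulli sketch with made-up numbers:

```python
from fractions import Fraction

def beta_mean(a, b):
    """Mean of a Beta(a, b) belief about a proportion."""
    return Fraction(a, a + b)

def observe(prior, successes, failures):
    """Conjugate update: add the observed counts to the Beta parameters."""
    a, b = prior
    return (a + successes, b + failures)

# Flat prior (no idea how risky firefighters are) vs. informative prior.
priors = {"flat": (1, 1), "informed": (40, 10)}

for name, prior in priors.items():
    posterior = observe(prior, 8, 2)  # fabricated evidence: 8 'risky' of 10
    # Debriefing: the evidence is withdrawn, so the counts come back out...
    retracted = (posterior[0] - 8, posterior[1] - 2)
    # ...and the belief returns exactly to the prior, whatever it was.
    assert retracted == prior
    print(name, float(beta_mean(*prior)), float(beta_mean(*posterior)))
```

The anchoring results show people keep some of the retracted counts; the question dlkf raises is whether that residue shrinks when the prior is strong, which only a new experiment could settle.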

  • eagsalazar2 6 years ago

    Specific facts are orthogonal to the actual underlying positions held, which are presented outwardly as other positions for the sake of political cover; hence the illusion of facts not changing minds. What's needed is an understanding of the actual underlying, usually hidden, positions; then present facts to disrupt those positions.

    • dlkf 6 years ago

      This is unclear to me. Can you explain what you mean in the context of my alternative example?

      • NoSorryCannot 6 years ago

        Not OP, but

        You might not know anything about risk-taking behavior of firefighters but you may already have some vague belief like "firefighters are heroes" that obliquely colors your impression of their behavior.

        Or alternatively you might hold onto the info that Germany is colder because your underlying belief is more like, you don't like cold and you don't like Germany, so you'd like to also believe that Germany is colder than other places.

        This entails two things. One, your apparent position on firefighter behavior or the weather in Germany can change depending on what in the context of the conversation is being construed as good or bad. Second, trying to inform you with specific facts on these issues is unlikely to change your mind because the drivers of your positions are your more general beliefs about firefighters and Germany.

        In politics, I think partisanship often degenerates in this way. Arguing the issues is often just a facade for arguing for your party's position or arguing against an opposing party's position, regardless of merit. Facts won't work here to change minds.

        • dlkf 6 years ago

          You're just making the same claim as the authors of the study, but adding a proposed mechanism. But just as they don't have evidence, neither do you. You have to actually do the study to prove it. If you believe the experiment would work, that just means you and I have different priors on the matter.

  • syrrim 6 years ago

    Why not? There are many strange facts in the world. Some of them are even true. It is very difficult, when recalling a strange fact, to remember whether it was one of the true ones or not. So naturally, things we've heard will tend to exert a pull, even if we later found out they were wrong.

    • dlkf 6 years ago

      By virtue of the experimental design, you were just told that the "strange fact" is bullshit. If you had a clear opinion prior to receiving the strange fact, you'd revert to it.

olah_1 6 years ago

I saw that Yuri Bezmenov interview[1] ages ago and didn't really think of it until now, when crime statistics are openly denied almost as if crime doesn't really exist at all.

Then I thought back to that Bezmenov interview with what he said about "demoralization". When a population is demoralized, they cannot discern true information when it is staring them in the face.

I think ignoring facts has less to do with some kind of esoteric psychological process and more to do with raising multiple generations to believe that they've been lied to and the whole "system" is evil.

[1]: https://www.youtube.com/watch?v=wYaR7mWxuf8

  • mistermann 6 years ago

    The general public has been lied to, to a significant degree. People who have full trust in entities that have published, and continue to publish, untruths seem more irrational to me than supposedly irrational skeptics, etc. (It is unknowable what the aggregate rationality of a given group is, but good luck finding anyone rational enough to realize that.)

    • olah_1 6 years ago

      This comment is too vague and general for me to interact with. I have a notion that I disagree with what you’re getting at here, but I can’t be sure.

      I agree that the public is lied to. But that is usually through editorialization of headline news that omits or emphasizes convenient information for the sake of a narrative. What I’m talking about is being presented with raw information and considering it.

      • mistermann 6 years ago

        > I agree that the public is lied to. But that is usually through editorialization of headline news that omits or emphasizes convenient information for the sake of a narrative.

        How would you have any way of knowing this? And I mean that as a serious question, not as snark.

trabant00 6 years ago

I see quite a big problem with those studies: the facts were made up, and the truth was contingent, not necessary.

So why was it expected of the participants to change their minds? Nothing they could verify disproved their initial position.

For me all this proves is what I already knew: "garbage in, garbage out".

edit: as below comment pointed out this might not be the problem of the studies but of how the article tries to use them to prove its point.

  • simonh 6 years ago

    For example, in the study where the participant's own answer was disguised as that of another person, we can't discount the result so easily. That's also true of the studies where participants downgraded their confidence when asked to give an account of it.

    On the invented studies, bear in mind that the point wasn't to measure changing the participant's mind, only for them to rate the value of a study that either supported or contradicted their initial position. Their only basis for evaluating the value of either study was their own pre-existing bias, so objectively they had no reason to evaluate them differently.

    That's quite different from expecting them to change their minds, as the reasons for them holding their position might not even have been addressed by the study. For example, someone who disagrees with capital punishment on moral grounds may not care whether it is an effective deterrent, and so may not have any reason to doubt a study showing that it is.

TopHand 6 years ago

What politicians know that the authors of this study don't seem to realize, is that if we are told the same story repeatedly for long enough, no matter how absurd, we'll start believing it. If you throw in some scary outcome if we don't believe the story, we'll come around sooner. It seems that fear will cause us to re-examine our beliefs and values.

RoutinePlayer 6 years ago

According to 19th century German philosopher Arthur Schopenhauer, “All truth passes through three stages: First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as self-evident.”

  • danaris 6 years ago

    The problem with that quote is that it is all too often adopted by people who wish to use it to prove that their pet "ridiculed" or "violently opposed" idea is one of those that will become self-evident, when in fact, it states nothing of the kind.

    Many theories that are ridiculed deserve it.

    Many ideas that are violently opposed should never see the light of day again.

    Very, very few of those that reach either the first or the second stage ever make it to the third, and it is a classic logical fallacy to argue that being ridiculed implies that an idea will be proven true in the end.

  • rafaelvasco 6 years ago

    Oh, that's a fact right there. Take any piece of evidence or information: if it has been ridiculed or violently opposed for ages, but no one has forgotten it, then it's probably true or partially true.

    • danaris 6 years ago

      Yep, this "round earth" nonsense that has had people ridiculing good honest flatworlders for generations will die out any minute!

      Aaaaany minute....

      • rafaelvasco 6 years ago

        Ok, you found a negative for my law. That was too easy. Now look more closely and tell me a positive. You can do it!

jstanley 6 years ago

Related: Epistemic Learned Helplessness: https://web.archive.org/web/20180406150429/https://squid314....

RcouF1uZ4gsC 6 years ago

First of all have all these psychological studies been replicated?

Part of the reason “facts” don’t change our minds is that a lot of “facts” aren’t really facts like physics, but are rather the result of statistical games.

Finally, and I think the biggest issue, is that a lot of facts rely on trust, since they are practically impossible for the average person to fully verify. And I think, for a variety of reasons, trust has been lost. Think about vaccines. Back in the 1950s, you probably knew or heard of someone who died from polio. Your mom might have had a sibling who died from one of the other vaccine-preventable illnesses. The doctor recommending the vaccines was seen as a trusted friend. He (it was usually a he back then) probably spent his whole life in your town. He knew your grandparents. Maybe he delivered your parents. He would spend hours at the bedside of a sick child or a dying grandparent. Maybe he delivered your children as well. Now when he says he recommends you give your child this vaccine, you are going to listen.

Now fast-forward to modern times. You book your appointment. You go to the office, where you wait for hours. The pediatrician comes in and rushes through a 15-minute visit. Says your kid should get vaccinated. On the way home you listen to an investigative report on how doctors are paid by big pharma to prescribe drugs. By the way, you have never heard of anyone you know getting one of these vaccine-preventable illnesses.

Now the gap between the educated elites and regular people in this country is widening. They don't interact much socially. They don't even live together. In the United States, the non-college educated have seen a steady decline in their real wages and well-being. Of course they are going to distrust “facts” put out by an elite seen as out of touch.

I say this as someone who totally believes in vaccines and has persuaded many of my friends that they should have their children vaccinated. The growing gap between the rich and poor in this country is at the root of many issues.

  • treeman79 6 years ago

    Trust is a big part of it.

    Facts are closely related to statistics. It’s possible to be both true and a complete lie at the same time.

    Abusive people will often use “facts” to control victims. You learn to be very mistrustful after a while.

    • zzzcpan 6 years ago

      "Facts" is a loaded term here. Facts cannot rely on trust; those are called authoritative opinions, not facts. And opinions about vaccination are still just opinions, not facts. If you say, for example, that it's hazardous not to vaccinate, like the article does, that's not a fact; it's a judgement, and advertising judgements is the essence of propaganda. It's basically the opposite of a fact, a dystopian use of the word. An actually decent factual picture of vaccines is complicated; it's about balancing many big and small risks: catching the virus while you live your life, catching something at the clinic while getting vaccinated, having complications from vaccination, being subjected to unnecessary treatments and drugs because doctors want to profit from you (which may also cause complications), or just being able to afford vaccination, and so on. Not to mention all the unknown unknowns and not knowing how to evaluate the risks involved. Even a poor but still factual picture would at least not advertise any judgement, and would present the reasoning for everyone to draw their own conclusions.

      • ghthor 6 years ago

        Thank you, this is a well balanced comment and it's highly valuable to have this type of viewpoint in our world.

  • simonh 6 years ago

    >First of all have all these psychological studies been replicated?

    According to the article they have, many times, yes; it describes many examples of similar experiments along these lines.

    This evolutionary function of reason, and the resulting flaws in our implementation of it, supports my belief that in the grand scheme of things we are actually only just barely sentient. That is, we're at the very lowermost bound of the set of possible intelligences capable of technological civilisation. I think this because, well, we only just recently evolved enough intelligence to actually do it. If we'd become intelligent enough earlier, we'd have done it earlier.

    If that's true then sure, it would be natural to expect that our reasoning powers are still impaired by flaws and fallacious tendencies. The scientific method then is a procedural set of rules we've invented to prevent our naturally somewhat irrational tendencies to mess up our ability to determine accurate actionable information. Yay us!

    • andi999 6 years ago

      I am wondering why it is so hard for the article to properly reference the study. Then one could easily check the citations on Google Scholar, for example. It is probably Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32(5), 880–892.

  • Majromax 6 years ago

    > Finally, and I think the biggest issue is that a lot of facts rely on trust, since they are practically impossible for the average person to fully verify. And I think, for a variety of reasons, trust has been lost.

    That can explain non-movement of opinion when presented with contrary fact, but not movement away from the fact. The article here notes the experiment when students were presented with dueling articles on capital punishment: the ambiguous data acted to bolster their original position no matter the original stance.

    A lack of trust in authority is one thing, but to use the authority's agreement with your pre-existing opinion to determine trust in that same evaluation is inherently circular -- even if it is human.

  • byte1918 6 years ago

    > First of all have all these psychological studies been replicated?

    > Thousands of subsequent experiments have confirmed (and elaborated on) this finding.

  • ghthor 6 years ago

    Do you believe that vaccines contain mercury and aluminum, and that those metals are causing problems in the people who take them? Because it's a fact that they contain those metals, and they hurt people. I have a daughter who went from speaking to requiring 5 years of therapy to start speaking again. I'm not allowed to sue about that, and no one in the medical establishment would acknowledge that the only cause of my daughter's sickness could have been the vaccines. The medical establishment is a joke; I'm through with giving them any trust. There is propaganda about vaccines being harmless, and it needs to end.

    • dragonwriter 6 years ago

      > Do you believe that the vaccines contain mercury and aluminum and that those metals are causing problems in people who take that vaccines?

      That’s a lot to unpack.

      Yes, vaccines contain very low concentrations of compounds of those metals.

      Yes, those metals at the concentrations in vaccines can cause minor side effects, though there is no evidence of serious side effects.

      But, more to the point, vaccines as a whole, beyond those particular ingredients, occasionally cause serious injury. This is a rare but known risk.

      > I not allowed to sue about that

      It is true that one cannot sue in the US over vaccine injuries, but presented alone in this context that is misleading to the point of dishonesty, since there is an alternative compensation program (one where, unlike in regular court, you can be awarded costs and fees even if you aren't eligible for compensation for actual harms; in regular court, even if you win, you are still out legal costs unless you additionally prove particularly egregious conduct).

      https://www.hrsa.gov/vaccine-compensation/index.html

      That's because the “medical establishment” doesn't view vaccines as perfectly safe; rather, they are viewed as safe in the same sense as other prescription medicines. Further, the public health establishment (the relevant part of the government) feels there is sufficient public health benefit from vaccination that even harms which would not be compensable for other approved drugs are compensable on a no-fault basis for vaccines, to encourage their use.

      > and no one in the medical establishment would acknowledge that the only cause of my daughters sickness could have been the vaccines.

      If no experts would agree with your claims of causation, then allowing you to sue would just be allowing you to incur a bunch of costs to no end. A more likely explanation for no experts agreeing that the one thing you've focussed on is the cause is that there is no evidence for the claim of causation.

      > There is propaganda about vaccines being harmless

      There may be somewhere, but it's not coming from the “medical establishment” or the government, both of which acknowledge that there are both the common minor and less common severe harms from vaccines.

      • dennis_jeeves 6 years ago

        >Yes, those metals at the concentrations in vaccines can cause minor side effects, though there is no evidence of serious side effects.

        How about this to ensure safety: demonstrate that they are safe at, say, 100 times the amount used in vaccines. Has this been done? No, is what I understand. (If you do find a study to that effect, please put it here or email me.) If that can be demonstrated, then we can be reasonably sure that 1/100 of the massive dose will be relatively harmless.

        The way it is normally put, that there is no evidence of serious side effects, is disingenuous. It's the other way round: it has to be demonstrated that they have no serious side effects.

      • mistermann 6 years ago

        >> There is propaganda about vaccines being harmless

        > There may be somewhere, but it's not coming from the “medical establishment” or the government, both of which acknowledge that there are both the common minor and less common severe harms from vaccines.

        https://www.cdc.gov/vaccinesafety/concerns/autism.html

        "Vaccines Do Not Cause Autism"

        The CDC does not know whether vaccines cause autism; they only know (assuming they are telling us everything they know) that a causative relationship has not been found.

        This is just one example of vaccine related propaganda that is asserted by authoritative bodies.

        EDIT: Moving a conversation from the abstract to the concrete seems to be a reliable way to invoke this behaviour in many individuals, even in a thread devoted to the very topic. Surely there must be a name for this phenomenon, it would be interesting to read studies on it.

        • dragonwriter 6 years ago

          > The CDC does not know if vaccines cause autism, they only know (assuming they are telling us everything they know) that a causative relationship has not found.

          No, they also know that the correlation (controlled for other factors) that would indicate the possibility of causation has not been found outside of research that has been established as deliberately fraudulent.

          And they know that there has been extensive research into the question because of the popularity of the fraudulent research cited for the opposite conclusion.

          If there is a mechanism by which vaccines cause autism in some specific cases, they must also prevent autism that would otherwise manifest in other cases enough to mask the effect in aggregate.

          In any case, the evidence-based rejection that vaccines cause a specific harm is not equivalent to propaganda that they are harmless.

          The existence of the compensation program is an explicit acknowledgement that they are not harmless, as well as an easier route to compensation for harms than exists for most drugs.

          • mistermann 6 years ago

            Let's try and unmuddy the waters here a bit....

            "Vaccines Do Not Cause Autism"

            Are you saying that this assertion is unequivocally known to be true? No uncertainty or possibility of future conflicting discoveries whatsoever?

            > In any case, the evidence-based rejection that vaccines cause a specific harm is not equivalent to propaganda that they are harmless. The existence of the compensation program is an explicit acknowledgement that they are not harmless, as well as an easier route to compensation for harms than exists for most drugs.

            Scope expansion is an effective form of rhetoric (which some people classify as a form of propaganda in itself). I'm not saying this was intentional on your part; I tend to believe it is simply an innate/instinctual ability (System 1, in Thinking, Fast and Slow parlance) of the subconscious. I am surely guilty of the same thing at times.

            Also: how did you come to know what everyone working for the CDC knows?

abetusk 6 years ago

Interesting read. They're basically proposing that our anti-rational behavior arose as a type of 'hyper-socialization'. I can believe it, and, if true, it would point to why things like shifting the Overton window [1] and other mass shifts in public perception change individual perception.

I don't think it's the only way to change people's minds, and I hesitate to dive into "just employ emotional reasoning" as that seems dangerous.

From personal experience, another effective way to change people's minds is by giving them "skin in the game".

I've tried, over the years, to convince friends of the solution to the Monty Hall [2] problem. After explaining the solution and them either not believing it or not understanding it, I then play the game with them with 100 doors and revealing 98 after the first pick. Once this game is played a couple times, they understand the solution much more readily.
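The 100-door variant described above lends itself to a quick simulation. Here's a minimal sketch in Python (the `play` helper is hypothetical, not from the comment); it relies on the observation that once the host has opened every unchosen door but one, switching wins exactly when the first pick was wrong:

```python
import random

def play(doors=100, trials=10_000, switch=True):
    """Return the empirical win rate of the Monty Hall game with
    `doors` doors, where the host opens all but one unchosen door."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(doors)  # door hiding the prize
        pick = random.randrange(doors)   # contestant's first pick
        if switch:
            # The one door the host leaves closed hides the prize
            # unless the first pick was already correct.
            wins += pick != prize
        else:
            wins += pick == prize
    return wins / trials
```

With 100 doors, `play(switch=True)` comes out near 0.99 and `play(switch=False)` near 0.01; with `doors=3` the classic 2/3 vs 1/3 split appears. Seeing the lopsided numbers tends to land harder than the probability argument alone.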

My take on this is that they suddenly have a personal stake in the game, even if it's weak. There's a personal cost that takes the form as social shame or loss aversion, even for a game that's played between friends with no money involved, that gives them a stake. Once they start wanting to actively avoid losing, they're much more willing to listen to reason.

The article points out that our anti-rational behavior is at odds with survival, but I would bet there's a level of abstraction below which our survival-minded rationality kicks in and above which we don't have enough of a stake in the answer to use our rationality to good effect.

[1] https://en.wikipedia.org/wiki/Overton_window

[2] https://en.wikipedia.org/wiki/Monty_Hall_problem

082349872349872 6 years ago

An ancient (albeit trivial) argument for facts not changing minds is that rhetoric was a distinct discipline from logic.

pier25 6 years ago

> strong feelings about issues do not emerge from deep understanding

I've thought about this too on my own strong feelings. The more I know about something, the more I understand its nuances, pros and cons, etc, the less I feel strongly about it. Now when I spot myself with a strong feeling about something I try to remind myself that I'm most likely missing something.

We see this constantly in the dev world. Younger devs feel very strongly about languages, libraries, frameworks, etc, probably because they have a shallower understanding of the thing.

Isamu 6 years ago

It takes constant training and energy to follow where the facts lead you. Feynman used different approaches as a way to keep himself focused on the facts and not exclusively what he “knew” was true. He said the easiest person to fool is yourself.

Mostly people want to validate their intuition and gut feelings and don’t want to experience the discomfort of finding out that their intuition is not magically correct.

dang 6 years ago

Why they didn't in 2018: https://news.ycombinator.com/item?id=18099488

Why they didn't at the time: https://news.ycombinator.com/item?id=13810764

iconjack 6 years ago

The fundamental problem is that our beliefs become part of our identity, and thus most of the time we're not actually seeking the "truth". This is obviously true when it comes to religion, and almost as bad when it comes to politics. And these days, a lot of "science" has become hyper political: race, climate, gender, evolution. Forget changing anyone's mind on those topics, no matter what facts you have in your arsenal.

mD5pPxMcS6fVWKE 6 years ago

Truth is only important to us as long as it contributes positively to our well-being. This sort of mushroom is edible and that one is poisonous - everyone would agree on that. As far as more abstract truths are concerned: people believed for centuries that the Earth was flat. Many still do. If you said otherwise, society would probably burn you for heresy, so the cost of truth was hugely negative.

btmoney06 6 years ago

Were the New Yorker honest, they'd entitle this: "Why the Uneducated Don't Understand That You're Right." Which is a shame. This type of information should be used to help better the reader by asking them to understand their own blind spots--not indulge the reader by telling them that their adversary is ignorant and irrational.

SmokeyHamster 6 years ago

Slightly misleading headline. The study tested how much a lie persists in someone's mind even after they're told the truth.

The study found that facts do indeed change people's minds, just not as much as we'd like, because the initial impression sets expectations. Cialdini talks about this in some of his books on persuasion.

bigpumpkin 6 years ago

The Stanford experiment forgot to account for the fact that the students could've used the fake score they first received as a useful prior on how difficult the task was. It does not show that "Facts Don't Change Our Minds".

gadders 6 years ago

The New Yorker can't help itself, can it? Reasonably fair article, but then suddenly veers into:

"When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration."

And:

"(They can now count on their side—sort of—Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.)"

The thing with studies like this is that they get used by people on the losing side of elections to start complaining about "low information voters", with the subtext being "If only everyone was as clever as me and all my friends who think the same, then [thing I disagree with] would never win elections." Ironically this also lets them avoid any introspection as to whether they may lose because there are defects in their policy positions.

  • noetic_techy 6 years ago

    Exactly. Many times following the facts can paint BOTH sides as wrong, and those who espouse "follow the facts" often only mean "follow the facts I want you to follow and discard the rest".

  • Barrin92 6 years ago

    > it's used by people on the losing side of elections to start complaining about "low information voters" with the subtext being "If only everyone was as clever as me and all my friends that think the same then [thing I disagree with] would never win elections."

    It's pretty backed up by evidence (and honestly by attending a Trump rally) that the average Trump voter is less educated, much more prone to misinformation, and simply holds a ton of trivially wrong beliefs about the state of the world.

    That's without making a value judgement about the voter or saying they shouldn't have their vote which they should of course because there's no requirement for voting in a democracy, but it seems silly to pretend that such a thing as an uninformed group of voters does not exist, or even cannot exist because it would be offensive in a way.

    Autocrats and corrupt leaders have banked on them throughout all of history, and measured, intelligent and truthful discourse is not always found in the majority. If we're concerned with truth, then "they keep losing elections" or might-makes-right style arguments hold no value; in fact they're quite dangerous.

    • war1025 6 years ago

      > It's pretty backed up by evidence (and honestly attending a Trump rally), that the average voter of Trump is less educated, much more prone to misinformation, and simply holds a ton of trivially wrong beliefs about the state of the world.

      This is just as true of your "average" Democrat. The "average" person is woefully misinformed about most things. It's probably safe to say that nearly everyone, myself and the majority of the HN crowd included, is misinformed about many things that aren't critical to our day to day life.

      • Barrin92 6 years ago

        It's not as true, and there have actually been studies on voter behaviour in 2016: belief in 'fake news' (as in literally made-up stuff) was a strong predictor of defection from the Democratic to the Republican ticket, and there's solid psychological evidence for why this affects conservatives in particular[1]

        It's also very easy to see if you eyeball the size of the market for misinformation. While there is some highly partisan left-wing media in the US, and there were some Facebook pages targeting, say, Bernie voters, it paled next to the market for the Trump base, by an order of magnitude or so in revenue. Which I think is very obvious too if one looks at the size of the audiences of YouTube channels attracting those viewers, or people like Alex Jones.

        NPR in 2016 actually did an interview with one such 'entrepreneur', who actually tried to sell fake news to virtually everyone, but had very little success with liberal audiences.[2]

        [1]https://www.theatlantic.com/science/archive/2017/02/why-fake...

        [2]https://www.npr.org/sections/alltechconsidered/2016/11/23/50...

        • war1025 6 years ago

          For one, the two articles you linked are from liberal media sources. Of course they are going to find fault with conservatives.

          More importantly, just because conservatives are more likely to believe a certain form of fake news doesn't mean liberals are immune to being misled. All it means is that conservatives are motivated by different things than liberals, and will therefore latch on to a particular flavor of things that confirm their beliefs. Liberals love confirmation bias just as much as anyone.

          Find any random person on the street and ask them to explain why they hold the views they do. You'll quickly find that opinions are based on emotion and backfilled later with plausible explanations.

thisrod 6 years ago

"Knowledge advances one funeral at a time" - old physics saying.

rbecker 6 years ago

"Why some facts, on some topics, don't change our minds as much as they maybe should" would better reflect the article content.

troughway 6 years ago

Jordan Peterson studied/covered this as well; here's a short clip - https://www.youtube.com/watch?v=sWbj-2DRLps

  • war1025 6 years ago

    It's downvoted and greyed out because of a general hate of Jordan Peterson (which I've found is really independent of political affiliation), but it's a good clip in my opinion.

dutch3000 6 years ago

i very much enjoyed the article, but i do prefer apolitical content when possible. unsure why it was necessary to reference trump in the vaccine portion. people (authors included) that can’t control themselves from injecting politics where it doesn’t naturally belong are becoming more and more irritating imo.

  • squarefoot 6 years ago

    > unsure why it was necessary to reference trump in the vaccine portion.

    It wasn't necessary, however it gave the authors the opportunity to test in just one line if the summary was true, and I guess it worked.

    I also don't want politics injected into scientific topics, but the role of politicians is to rule for people's good, and talk with extreme caution and responsibility because of the trust people give them. When a high profile politician says "this is good", a lot of people will follow the advice blindly, so when a politician put people lives at risk by telling for example that Hydroxychloroquine works as a cure for the Coronavirus (to date at least one dead and one intoxicated after following that advice), it's politics actually harming lives with dangerous information, which makes everyone's duty to inject back common sense into the debate. If only because scientists don't have the same exposure, and it becomes so hard or even impossible for them to undo the damages done by clueless politicians who talk about things they don't know squat.

    BTW. I would have the same exact opinion even in the case it was Obama or Clinton doing what Trump did.

    • dutch3000 6 years ago

      i’ve purposely attempted to fully disconnect from all politically related news and i’ve begun to notice oddities in conversation patterns mostly. the pattern is mainly people injecting political comments in completely nonrelated topics being discussed. the trump references were not educational, yet a sign of the author’s inability to control himself. it’s just a complete turn off for me. people can easily make the connections in the article to today’s reality, it doesn’t need to be explicitly referenced.

    • mistermann 6 years ago

      > however it gave the authors the opportunity to test in just one line if the summary was true, and I guess it worked.

      In what way did it work?

      > I also don't want politics injected into scientific topics, but the role of politicians is to rule for people's good, and to talk with extreme caution and responsibility because of the trust people place in them. When a high profile politician says "this is good", a lot of people will follow the advice blindly. So when a politician puts people's lives at risk by claiming for example that Hydroxychloroquine works as a cure for the Coronavirus (to date at least one dead and one poisoned after following that advice), it's politics actually harming lives with dangerous information, which makes it everyone's duty to inject common sense back into the debate.

      I haven't encountered many officials who have consistently spoken with "extreme caution and responsibility" on Hydroxychloroquine, or anything related to this pandemic really. As far as I can tell, it is unknown whether Hydroxychloroquine is or is not effective in treating covid patients (there are severe limits on our ability to know many things), but the vast majority of reporting I've been exposed to on the matter has a very strong propaganda odour to it.

      https://edition.cnn.com/2020/07/02/health/hydroxychloroquine...

      https://i.redd.it/cppndepg1s851.jpg

      Personally, I now start from the default epistemic position that anything said in the media is untrue, but the degree and manner in which that is the case is unknown, and that is the part of the claim that should receive significant mental attention (which parts are objectively untrue, misrepresented (cherry picked, deliberately framed), and what noteworthy "facts" are suspiciously absent). Rare is the news story these days where nothing sets off my suspicion.

  • tribeofone 6 years ago

    > why it was necessary to reference trump in the vaccine portion

    because it's the new yorker. Facts are optional, bias is required.

MaxBarraclough 6 years ago

(2017)

baxtr 6 years ago

Is that actually a fact?

  • wizzwizz4 6 years ago

    No, all this evidence is made up. So: how likely do you think it is that humans really behave like this?

  • willvarfar 6 years ago

    I don't mean to be paranoid, but do you think the article might be misinformation in some study of the collective gullibility of the HN crowd?

    • danaris 6 years ago

      Well, given that it was published in the New Yorker 3 years ago, it clearly wasn't written primarily for us.

      Given that it is discussing the results of actual psychological studies (that I have seen talked about in a number of other places), it is vanishingly unlikely that it is in some way intended to study anyone's gullibility.
