
Thread: Why Facts Don’t Change Our Minds~or theirs

  #31
    Join Date
    Jun 2016
    Location
    State of Bliss
    Posts
    31,007
    Thanks
    7,095
    Thanked 5,196 Times in 3,829 Posts
    Groans
    433
    Groaned 261 Times in 257 Posts
    Blog Entries
    5


    Quote Originally Posted by JPF View Post
    Thank you for sharing the studies. Here are my thoughts.

    First about the studies.

    When the administrators of the experiment pulled the switcheroo and told them that the suicide notes or Frank’s bio were faked, they asked the students to make a value judgment based on what they had experienced previously. It doesn’t say so in the OP, but if you’re given an either/or choice (did we get this right or wrong… does an effective firefighter take or avoid risks?), choosing the previously held position seems like a coin flip. If it’s either/or… why not? So I’d be interested in knowing what choices the students had before I endorse the experiment. Put another way, if the students are asked a simple A-or-B question, why not “stick to your guns” when you are just as likely to be right as wrong?
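    To make that coin-flip intuition concrete, here is a minimal sketch in Python (a hypothetical setup with made-up trial counts, not the actual Stanford protocol): on a pure either/or question, a subject who ignores the debriefing and simply repeats her prior answer is right about half the time, exactly like someone guessing at random.

```python
import random

random.seed(0)
TRIALS = 100_000  # made-up number of simulated binary questions

stubborn_correct = 0
guesser_correct = 0
for _ in range(TRIALS):
    truth = random.choice(["A", "B"])   # the actually correct option
    prior = random.choice(["A", "B"])   # the position the subject already held
    guess = random.choice(["A", "B"])   # a fresh coin-flip answer
    stubborn_correct += (prior == truth)  # the "stick to your guns" strategy
    guesser_correct += (guess == truth)   # the pure-chance baseline

print(f"stick-to-your-guns accuracy: {stubborn_correct / TRIALS:.3f}")
print(f"coin-flip accuracy:          {guesser_correct / TRIALS:.3f}")
# Both come out near 0.500: on a binary question, refusing to update
# costs you nothing relative to chance, which is the objection above.
```

    In other words, the objection is that a binary response format can't distinguish stubbornness from reasonable guessing.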

    The second thing is this: put into the context of politics, the studies, in my opinion, fall short. Again, just my opinion—and your post is just as valid as any on the subject. Here is why I think that.


    Politics is more than just looking at someone’s positions and determining whether or not you think they are good or bad for your district, state, or country. Elizabeth Warren? I think she’s a great Senator based on the committee hearings I see. As a Presidential Candidate? I wouldn’t like her chances of beating a corpse. She doesn’t inspire me, she strikes almost no confident tone in her delivery, and she seems to put too much importance on pandering.

    Additionally, in our federal system, it makes zero sense today (unless you are a fan of gridlock) to vote for a President of one party and a legislator from another party. None whatsoever. So if you were to look at the facts in the race for the 2nd District in Colorado between John Doe (R) and Jane Doe (D), you might agree with John’s positions, but since you support a Democrat for President, you would be smarter in my view to vote for Jane than for John.
    So the facts that you like about John do not matter. They are overshadowed by the little consonant next to his name.

    Lastly, in terms of the politics, a very divisive national figure or position will hurt your party’s chances more than their presence helps. I am perplexed that the Democrats voted for Nancy Pelosi to be the Speaker of the House. Her effectiveness as a practitioner of the profession is well documented. So what. Every Republican who runs this year for Senate, House, City Council, School Board, Dog Catcher will have a picture of their Democratic Party opponent next to Pelosi. Whatever she gains for the Dems in the trenches, she loses for them at the ballot box. The NEA… it costs you one tenth of a penny in tax monies or something like that. It likely costs the Democrats thousands of votes every cycle because of the art it helps promote. The ridiculous acceptance of multiple genders by the Democrats? Incredibly toxic to the national party. My personal positions largely support what Pelosi would want, largely support the rationale that promoting art is in the national interest, and I think you should be able to call yourself any gender you want as long as you use the bathroom that biology assigned to you. I’m willing to sacrifice Pelosi, the NEA, and the bathroom liberation act (or whatever the heck they call it) to get more votes in MI, OH, and WI on election day 2020.

    Anyway, I guess in summary, my point is that while the Stanford experiments are good as far as showing there is a distinguishable gap between what we want to believe and what the facts suggest we should believe, it may not be easily translatable to politics.
    This is only one; there are other studies finding the same results.

    In my experience, & I have been online over 20 years, I don't think it is just politics...

    I started on message boards early on, sports boards, & the behavior, my team vs. yours, is exactly the same...

    Hypocrisy was more blatant & obvious as players got moved around more than politicians in a dive bar..

    While a player on a rival team would be called everything in the book along w/ his momma, if later he was on your team??

    lol, all that was forgotten, now defending the bs they once criticized..

    Those that not long ago claimed to believe in & "fought on the internet" for free trade are now xenophobic

    Those that not long ago hated John McCain & "fought on the internet" against him are now fond of him

    "There is no question former President Trump bears moral responsibility. His supporters stormed the Capitol because of the unhinged falsehoods he shouted into the world’s largest megaphone," McConnell wrote. "His behavior during and after the chaos was also unconscionable, from attacking Vice President Mike Pence during the riot to praising the criminals after it ended."



  #32
    Join Date
    Jun 2016
    Location
    State of Bliss
    Posts
    31,007
    Thanks
    7,095
    Thanked 5,196 Times in 3,829 Posts
    Groans
    433
    Groaned 261 Times in 257 Posts
    Blog Entries
    5


    This is one of the threads I mentioned earlier.
    "There is no question former President Trump bears moral responsibility. His supporters stormed the Capitol because of the unhinged falsehoods he shouted into the world’s largest megaphone," McConnell wrote. "His behavior during and after the chaos was also unconscionable, from attacking Vice President Mike Pence during the riot to praising the criminals after it ended."



  #33
    Join Date
    Nov 2017
    Posts
    53,520
    Thanks
    252
    Thanked 24,567 Times in 17,094 Posts
    Groans
    5,280
    Groaned 4,575 Times in 4,254 Posts


    Quote Originally Posted by Dark Soul View Post
    So that's why libs can't learn.
    You are the star of that study. This post shows how wedded you are to its conclusions.

  #34
    Join Date
    Dec 2006
    Posts
    71,685
    Thanks
    6,597
    Thanked 12,131 Times in 9,660 Posts
    Groans
    14
    Groaned 504 Times in 477 Posts
    Blog Entries
    1


    Quote Originally Posted by Bill View Post
    I found this to be quite correct, w/ few exceptions.. Someone being proven totally wrong, regardless of facts, won't change their minds

    In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.

    Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.

    As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.

    In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.

    “Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

    A few years later, a new set of Stanford students was recruited for a related study. The students were handed packets of information about a pair of firefighters, Frank K. and George H. Frank’s bio noted that, among other things, he had a baby daughter and he liked to scuba dive. George had a small son and played golf. The packets also included the men’s responses on what the researchers called the Risky-Conservative Choice Test. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. In the other version, Frank also chose the safest option, but he was a lousy firefighter who’d been put “on report” by his supervisors several times. Once again, midway through the study, the students were informed that they’d been misled, and that the information they’d received was entirely fictitious. The students were then asked to describe their own beliefs. What sort of attitude toward risk did they think a successful firefighter would have? The students who’d received the first packet thought that he would avoid it. The students in the second group thought he’d embrace it.

    Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.


    The Stanford studies became famous. Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?

    In a new book, “The Enigma of Reason” (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context.

    Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

    “Reason is an adaptation to the hypersocial niche humans have evolved for themselves,” Mercier and Sperber write. Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.


    Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them.

    Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments. One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

    The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse. At the end of the experiment, the students were asked once again about their views. Those who’d started out pro-capital punishment were now even more in favor of it; those who’d opposed it were even more hostile.

    If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”

    Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.


    A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry. Participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses, and were given a chance to modify them if they identified mistakes. The majority were satisfied with their original choices; fewer than fifteen per cent changed their minds in step two.
    mueller report failure

    impeachment failure

    have you learned?

  #35
    Join Date
    Apr 2020
    Location
    Olympia, Wa
    Posts
    70,468
    Thanks
    3,125
    Thanked 15,029 Times in 12,559 Posts
    Groans
    1
    Groaned 1,401 Times in 1,345 Posts
    Blog Entries
    1


    The Better People adjust our thinking to facts/truth/reality....as we do our best to figure out what the facts/truth/reality is.....it is in large part what makes us The Better People.

    I highly recommend the practice.
    This illegal illegitimate regime that runs America is at fault...not me.... they do not represent me and I have long objected to their crimes against humanity.

  #36
    Join Date
    Jun 2020
    Location
    Phoenix
    Posts
    38,038
    Thanks
    14
    Thanked 18,926 Times in 13,193 Posts
    Groans
    3
    Groaned 832 Times in 791 Posts


    Well, one reason is that academic fraud is rife today, particularly on the Left, like the paper submitted to and accepted by Affilia, a feminist academic journal. The authors wrote a bogus paper based on Mein Kampf, changing the target hate group from Jews to men; otherwise they kept the text almost verbatim. The Left loved it...

    http://norskk.is/bytta/menn/our_stru...y_struggle.pdf

    https://areomagazine.com/2018/10/02/...f-scholarship/

    Shallow, unvetted, and uselessly "peer reviewed" papers like this one abound on the academic Left simply because the Left has become nearly monolithic in universities, in the liberal arts in particular. Instead of real academic inquiry, like-minded thinking is becoming the norm. Propaganda replaces philosophical thought. Closed-mindedness replaces curiosity. The Left is destroying higher education with its stupidity, hatred, racism, and bigotry.

  #37
    Join Date
    Jan 2014
    Location
    The Blue Ridge
    Posts
    37,741
    Thanks
    21,918
    Thanked 12,581 Times in 9,703 Posts
    Groans
    4,312
    Groaned 1,312 Times in 1,210 Posts
    Blog Entries
    1


    Quote Originally Posted by Nordberg View Post
    You are the star of that study. This post shows how wedded you are to its conclusions.
    OK Boomer.

  #38
    Join Date
    Jul 2009
    Posts
    134,852
    Thanks
    13,246
    Thanked 40,785 Times in 32,151 Posts
    Groans
    3,661
    Groaned 2,865 Times in 2,752 Posts
    Blog Entries
    3


    Why Facts Don’t Change Our Minds~or theirs
    I don't know about you, but "facts" don't change my mind when I can see you've made them up.......
    Isaiah 6:5
    “Woe to me!” I cried. “I am ruined! For I am a man of unclean lips, and I live among a people of unclean lips, and my eyes have seen the King, the Lord Almighty.”

  #39
    Join Date
    Jul 2009
    Posts
    134,852
    Thanks
    13,246
    Thanked 40,785 Times in 32,151 Posts
    Groans
    3,661
    Groaned 2,865 Times in 2,752 Posts
    Blog Entries
    3


    Quote Originally Posted by Bill View Post
    This one of the threads I mentioned earlier.
    why did you mention it earlier?.....
    Isaiah 6:5
    “Woe to me!” I cried. “I am ruined! For I am a man of unclean lips, and I live among a people of unclean lips, and my eyes have seen the King, the Lord Almighty.”

  #40
    Join Date
    Apr 2020
    Location
    Olympia, Wa
    Posts
    70,468
    Thanks
    3,125
    Thanked 15,029 Times in 12,559 Posts
    Groans
    1
    Groaned 1,401 Times in 1,345 Posts
    Blog Entries
    1


    My peers suck so much that very few of you fuckers give a tinker's damn about what the truth is.

    That is your fault, but understand this....some of us managed to do better during this life.
    This illegal illegitimate regime that runs America is at fault...not me.... they do not represent me and I have long objected to their crimes against humanity.

  #41
    Join Date
    Mar 2020
    Location
    Texas
    Posts
    136,604
    Thanks
    46,753
    Thanked 68,628 Times in 51,918 Posts
    Groans
    2
    Groaned 2,506 Times in 2,463 Posts
    Blog Entries
    2


    Quote Originally Posted by Bill View Post
    I found this to be quite correct, w/ few exceptions.. Someone being proven totally wrong, regardless of facts, won't change their minds
    Interesting study... and a note to those who volunteer for psych studies: there's a double blind (or more), so whatever they say the test is about is not what it's about.
    God bless America and those who defend our Constitution.

    "Hatred is a failure of imagination" - Graham Greene, "The Power and the Glory"

  #42
    Join Date
    Apr 2020
    Location
    Olympia, Wa
    Posts
    70,468
    Thanks
    3,125
    Thanked 15,029 Times in 12,559 Posts
    Groans
    1
    Groaned 1,401 Times in 1,345 Posts
    Blog Entries
    1


    I did not read this, but I have seen this sort of thing over and over again. This is how it goes: "Don't worry that you suck as much as you do, so does everyone else, trust us!"

    Succeed/Fail is what the Better People care about.....you people who feel the need to feel good about your excuses for failing suck.

    DO BETTER!
    This illegal illegitimate regime that runs America is at fault...not me.... they do not represent me and I have long objected to their crimes against humanity.

  #43
    Join Date
    Dec 2016
    Posts
    1,929
    Thanks
    62
    Thanked 266 Times in 217 Posts
    Groans
    7
    Groaned 19 Times in 19 Posts


    FILL IN THE BLANK:


    If John Wilkes Booth was wearing a "MAKE AMERICA GREAT AGAIN" hat...

    Abraham Lincoln was wearing a "_______________" hat.

  #44
    Join Date
    Dec 2006
    Posts
    71,685
    Thanks
    6,597
    Thanked 12,131 Times in 9,660 Posts
    Groans
    14
    Groaned 504 Times in 477 Posts
    Blog Entries
    1


    Quote Originally Posted by Bill View Post
    I found this to be quite correct, w/ few exceptions.. Someone being proven totally wrong, regardless of facts, won't change their minds
    you might as well stfu then.

  #45
    Join Date
    Jun 2016
    Location
    State of Bliss
    Posts
    31,007
    Thanks
    7,095
    Thanked 5,196 Times in 3,829 Posts
    Groans
    433
    Groaned 261 Times in 257 Posts
    Blog Entries
    5


    Quote Originally Posted by PostmodernProphet View Post
    I don't know about you, but "facts" don't change my mind.
    tru dat!! you are a perfect example for sure, you got skillz
    "There is no question former President Trump bears moral responsibility. His supporters stormed the Capitol because of the unhinged falsehoods he shouted into the world’s largest megaphone," McConnell wrote. "His behavior during and after the chaos was also unconscionable, from attacking Vice President Mike Pence during the riot to praising the criminals after it ended."


