This post isn't really about philosophy, but philosophers I know keep linking to this article on facebook. That's my excuse for talking about a psychology paper.
Jonah Lehrer posted, on the New Yorker website, this short piece entitled "Why Smart People Are Stupid," with the more descriptive alternate title "Research Shows that the Smarter People Are, the More Susceptible They Are to Cognitive Bias." It's (supposed to be) a report of a recent article called "Cognitive Sophistication Does Not Attenuate the Bias Blind Spot" in the Journal of Personality and Social Psychology, by Richard West, Russell Meserve, and Keith Stanovich. The bias blind spot, the authors of the latter paper tell us, is "reporting that thinking biases [e.g. the base-rate fallacy, the Alice-the-bank-teller conjunction fallacy] are more prevalent in others than in themselves." What follows is about the original article, not the Lehrer post.
(The substantial gap between the Lehrer post and the actual article, which you could already figure out from the differing titles, is covered well here.)
The authors of the study used multiple instruments to judge cognitive sophistication, which includes not only cognitive ability (the authors use SAT scores to measure cognitive ability), but also 'thinking dispositions,' measured in part by a Need for Cognition test. (All the following quotations come from this site on the Need for Cognition Scale.) This test "quantitatively measures 'the tendency for an individual to engage in and enjoy thinking' (Cacioppo & Petty, 1982, p. 116)." I had heard of the Need for Cognition Scale before, but had never looked at the actual 18-question survey instrument (again, available here). Some of the questions are the sorts of questions I expected, given the name 'Need for Cognition':
"5. I try to anticipate and avoid situations where there is likely a chance I will have to think in depth about something.
6. I find satisfaction in deliberating hard and for long hours."
However, some of the questions seemed instead to measure something slightly different: the respondent's belief or faith in her rational faculties:
"2. I like to have the responsibility of handling a situation that requires a lot of thinking.
10. The idea of relying on thought to make my way to the top appeals to me.
15. I would prefer a task that is intellectual, difficult, and important to one that is somewhat important but does not require much thought."
Respondents who say that they 'agree very strongly' with 2, 10, and 15 seem like they would be much more likely to think of themselves as overcoming irrational biases. Speaking purely from my own case, I certainly agree very strongly with item #6 ('I find satisfaction in deliberating hard and for long hours'). However, I also believe humans (myself included) can exhibit irrational biases in (what feels like) rational deliberation, so I would probably disagree, or at least say I 'neither agree nor disagree,' with 2 and 15. If there is an important task whose right answer I can get without much thought, then I would prefer it to a task that requires a lot of rational deliberation. To put the point more pithily: if a decision really is important, I'd rather be right than think -- since I think biases can creep into my thinking process.
My point is just that items 2, 10, and 15 seem to already be asking 'How much faith do you have that your rational faculties aren't clouded by bias?', which is pretty close to asking about the bias blind spot. Or at least, people who very strongly agree with 2, 10, and 15 would be people who think they can overcome bias when they are responsible for making important decisions. And that seems to be halfway to the bias blind spot (the belief that biases are more prevalent in others than in ourselves) -- and it's the more substantive half, since just about everyone agrees that other people suffer from biases.
And this fits the data: the correlation between 'Composite Bias Blind Spot' (the authors asked subjects about 7 different specific biases, not just 'Do you think you are less biased than other people?') and Need for Cognition score is .260, p < .001 (Table 2). (And interestingly, this correlation was by far the largest among the four components of 'cognitive sophistication.')
So the interesting question for me would be: suppose we took each subject's Need for Cognition answers, dropped the answers to 2, 10, and 15 (or maybe just 2 and 15), and re-calculated each subject's NFC score. If I am right, the size of the correlation with the bias blind spot would drop noticeably.
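For concreteness, here is a minimal sketch of the re-analysis I have in mind, not the authors' code. It assumes a hypothetical table with one row per subject, columns nfc_1 through nfc_18 for the 18 Need for Cognition items (already reverse-scored wherever the scale requires it), and a bbs_composite column for the Composite Bias Blind Spot score; all of those names are my own invention for illustration.

```python
# Sketch of the proposed re-analysis (hypothetical column names, not the study's data).
import pandas as pd
from scipy.stats import pearsonr

def nfc_correlations(df: pd.DataFrame, drop_items=(2, 10, 15)):
    all_items = [f"nfc_{i}" for i in range(1, 19)]
    kept_items = [c for c in all_items if int(c.split("_")[1]) not in drop_items]

    full_score = df[all_items].sum(axis=1)      # standard 18-item NFC total
    trimmed_score = df[kept_items].sum(axis=1)  # NFC total with items 2, 10, 15 dropped

    r_full, p_full = pearsonr(full_score, df["bbs_composite"])
    r_trim, p_trim = pearsonr(trimmed_score, df["bbs_composite"])
    return {"full": (r_full, p_full), "trimmed": (r_trim, p_trim)}

# Usage: nfc_correlations(subjects_df)
# If my conjecture is right, the 'trimmed' correlation should come out
# noticeably smaller than the 'full' one (.260 in the paper's Table 2).
```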
6/07/2012
In the current AJP: "Overcoming Metaphysics through the Logical Analysis of Language" for the 21st C
"[T]he isolation of non-naturalistic metaphysics from other disciplines as well as from empirical refutation has made it moribund. We claim that a defining feature of non-naturalistic metaphysics is that it can have no observable consequences."
James Maclaurin and Heather Dyke, "What is Analytic Metaphysics For?", Australasian Journal of Philosophy, Vol. 90(2), 2012. (freely available reprint here)
Of course, the paper does not reprise Carnap's "Overcoming Metaphysics through the Logical Analysis of Language" exactly; Maclaurin and Dyke's central claim is that the criteria for theory-choice in non-naturalistic metaphysics are not truth-conducive. Carnap's "Overcoming Metaphysics..." instead concludes that (non-naturalistic) metaphysics is cognitively meaningless.
So the precise problem with metaphysics is different for Carnap and for Maclaurin & Dyke, but a central premise used to reach their contra-metaphysics conclusions -- namely, that (non-naturalistic) metaphysical claims make no difference to observation and are (inferentially) isolated from claims of the sciences -- is basically the same in both cases.
6/05/2012
Submit a paper for the Creighton Club