Cognitive Bias . . . or why you really don't think objectively about the most important things

4/22/2019

Fred is constantly fascinated by people who are so certain they are right that they will simply ignore reality rather than entertain the possibility that they are wrong. The desire, the willingness, nay, the insistence on standing one's ground in the face of all credible evidence and reason to the contrary is part of the human condition. What we now know is that this phenomenon has a cause (or rather a group of causes) called cognitive bias. These are psychological rationalizations that our minds employ to keep us from going crazy by constantly having to reevaluate reality.
From an evolutionary perspective, this makes sense. If we didn't have cognitive biases when we lived on the savanna, we would have to stop and think every time we heard a rustle in the undergrowth that sounded vaguely lion-like: "Hmm, I wonder if that could be a lion. It was a lion the time Ogg was eaten, and when Mooga was eaten too, but I suppose . . . AAAAAAAAA! Lion." Nope, better to just run away first and never ponder whether it was a lion. In fact, better to believe that it ABSOLUTELY WAS A LION AND I JUST ESCAPED CERTAIN DEATH! Never mind that Pooga and Dooga are laughing their a**es off as they watch you run away from a rabbit. It was a lion, you know it, and they are big, stupid Neanderthals!

Unfortunately, in the modern world some humans have figured out that if they can make other humans think there is a lion in the undergrowth, those humans can be made to seek protection without questioning whether there is a lion (or even why there would be a lion in suburbia). This is why propaganda works so well -- it plays on primal fears, most especially the fear that we will not be able to procreate.

Wait, I hear you cry, how did sex get into this? Well, it's very simple. In evolutionary terms, not getting eaten by a lion equals being desirable as a mate. More generally, having good survival instincts means being "savanna smart," and we all want our potential mates to think this is something we have and that, in turn, we can pass on to our children. We don't want to look like lion bait in front of She-ooga, because, in modern terms (according to Ron White), "You can't fix stupid." Thus, because we don't want to look stupid, we have put our cognitive biases to work to convince our potential mates that we aren't lion bait by reinforcing our certainty that we know better than to fall for something as painfully obvious as the truth.
Fortunately, because She-ooga has the same cognitive biases, it often never occurs to her that our certainty does not seem to conform to reality. Now Fred is willing to admit that he falls into the old cognitive bias trap now and again, and probably a lot more often than he realizes. Indeed, the belief that one is not susceptible to cognitive bias is itself a recognized form of cognitive bias. So, as a public service, Fred presents this list of cognitive bias phenomena with brief explanations. How many can you recognize in yourself?

Tribal Epistemology -- Information is evaluated based not on conformity to common standards of evidence or correspondence to a common understanding of the world, but on whether it supports the tribe's values and goals and is vouchsafed by tribal leaders.

Dunning-Kruger Effect -- The tendency for unskilled individuals to overestimate their own ability, and for experts to underestimate their own ability.

Availability Cascade -- A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true").

Confirmation Bias -- The tendency to search for, interpret, focus on, and remember information in a way that confirms one's preconceptions.

Backfire Effect -- The reaction to disconfirming evidence by strengthening one's previous beliefs.

Curse of Knowledge -- When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people (but see Dunning-Kruger Effect).

Empathy Gap -- The tendency to underestimate the influence or strength of feelings, in either oneself or others.

Illusory Truth Effect -- The tendency to believe that a statement is true if it is easier to process, or if it has been stated multiple times, regardless of its actual veracity.
Irrational Escalation/Sunk Cost Fallacy -- The phenomenon whereby people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong.

Negativity Bias -- The psychological phenomenon by which humans have greater recall of unpleasant memories than of positive memories, thus reinforcing the belief that bad things happen far more often than they really do.

Normalcy Bias -- The refusal to plan for, or react to, a disaster which has never happened before.

Planning Fallacy -- The tendency to underestimate task-completion times (also known as Anti-Murphy Bias, referring to "Murphy's Law").

Semmelweis Reflex -- The tendency to reject new evidence that contradicts a paradigm. This is especially problematic in the scientific community, where dogmatic insistence on a "proven" theory often results in ridicule of a new discovery that falls outside the expected result.

Third-person Effect -- The belief that mass-media messages have a greater effect on others than on oneself (also known as False Immunity Bias, that is, the belief that one is immune from being influenced by the media).

Parkinson's Law of Triviality -- The tendency to give disproportionate weight to trivial issues. Also known as bikeshedding, this bias explains why an organization may avoid specialized or complex subjects, such as the design of a nuclear reactor, and instead focus on something easy to grasp or rewarding to the average participant, such as the design of an adjacent bike shed. It is also the basis for most magic tricks, in which the magician distracts the audience by having them focus on a trivial, but easily followed, pattern.

False Consensus Effect -- The tendency for people to overestimate the degree to which others agree with them.
Publius Fred