Two envelopes are presented in front of you. Envelope A holds a raffle ticket with a 95% chance of winning $1 million. Envelope B holds a cashier’s check for amount X. How large must X be for you to choose the check instead of the raffle ticket?
One of the people I surveyed said, “$100,000 or more.” “Really?!” I was surprised. “Because $100K will change my life beyond what I can possibly imagine, so anything more has no incremental utility.” In other words, $100,000 is worth the same as $1 million to her.
What is your number?
Strictly by expected value, anyone who accepts less than $950,000 (95% of $1 million) is not rational. But that’s every one of us. People fall for things like this every day, and there are companies and individuals who prey on us, legally.
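The $950,000 threshold is just the gamble’s expected value. A risk-averse utility function shows why real answers come in lower, much like my interviewee’s. Here is a minimal sketch; the log-utility choice and the $50,000 starting wealth are my own illustrative assumptions, not from the book:

```python
import math

# Expected value of the raffle ticket: 95% chance of $1,000,000.
p_win = 0.95
prize = 1_000_000
expected_value = p_win * prize  # $950,000

# Certainty equivalent under log utility with an assumed current
# wealth of $50,000 (hypothetical figure): the sure amount that
# yields the same expected utility as the gamble.
wealth = 50_000
expected_utility = p_win * math.log(wealth + prize) + (1 - p_win) * math.log(wealth)
certainty_equivalent = math.exp(expected_utility) - wealth

print(f"Expected value: ${expected_value:,.0f}")
print(f"Certainty equivalent: ${certainty_equivalent:,.0f}")
```

Under any concave (diminishing-returns) utility, the certainty equivalent falls below $950,000, so accepting a smaller check can reflect risk aversion rather than a simple arithmetic error.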
Everyone should read Daniel Kahneman’s book, Thinking, Fast and Slow. I devoured it in several days and then, something I very rarely do, re-read it immediately, chewing through it slowly to train myself to become a better decision maker.
General notes from the book:
- The mental capacity for “attention” is limited. One cannot physically pay attention to too many things. Yet practice will ease the mental effort to perform certain tasks and increase the overall “multitasking” capacity.
- Self-control itself requires effort. Therefore you cannot carry out demanding mental activity when you are physically exhausted. And if you are tired or concentrating on a demanding mental task, your other self-control disciplines will lapse: you become rude, lazy, impatient, prone to overeating, or less politically correct. This is called ego depletion. Funny that you can restore your capacity with sugar!
- Being rational is different from being intelligent. – Keith Stanovich
- There are three ways people associate things: by resemblance, contiguity in time and place, and causality. – David Hume, 1748
- Truth Illusion: if it is cognitively easier to process, it must be true. Use repetition, fonts, text layout, or whatever else to introduce an illusion of familiarity or clarity, and recipients will be more inclined to believe it. Rhyming the message is highly effective (“Woes unite foes.”). People also believe more when they are in a good mood, and less when they are sad.
- Humans seem genetically wired for causality, particularly the perception of intention: we tend to believe that someone planned or intended the outcome. When no such intention existed, we think that God wanted it to happen. We tend to assign larger roles to talent, stupidity, and intentions than to luck; a few lucky gambles can therefore crown a reckless leader with a halo of prescience and boldness. Conversely, we tend to substitute self-confidence for accuracy or competence. (“He has no doubt, therefore it must be correct that this lottery ticket will win.”)
- Confirmation Bias: we seek evidence that confirms what we already believe, and we are biased to say yes to positive messages.
- We believe a claim when the story seems coherent and “makes sense,” not when the data are of high quality and convincing. In fact, you can convince almost anyone without any data or proof, as long as you can tell a compelling and coherent story. A rational decision maker must pay attention to the data and re-cast the story based on the so-called “base rate”: a 100% improvement on a 1% probability is still only 2%. Don’t let “the emotional tail wag the rational dog.”
- When we have painstakingly planned out a project, we become confident in its success. We tend to ignore the “unknowable unknowns.”
- Our reactions to risk are usually irrational. When faced with a sure loss, we throw a “Hail Mary”; at the same time, we buy insurance to protect ourselves against exactly such long-shot outcomes.
- We are really bad at probability. If there were a 0.001% chance of your kid getting sick, how much would you spend to prevent it? And how much would you save for her college education, which will give her 25% higher pay throughout her life?
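Putting numbers on the base-rate note and the final comparison makes the irrationality concrete. All dollar figures below are my own illustrative assumptions, not from the book:

```python
# Base-rate arithmetic: a "100% improvement" on a 1% probability
# is still only 2%.
improved_rate = 0.01 * (1 + 1.0)  # 0.02

# Prevention spend: expected loss from a 0.001% risk,
# assuming (hypothetically) the illness would cost $100,000.
p_sick = 0.00001
illness_cost = 100_000
expected_loss = p_sick * illness_cost  # roughly $1 in expectation

# College fund: a 25% lifetime pay bump on an assumed
# baseline of $2,000,000 in lifetime earnings.
baseline_lifetime_pay = 2_000_000
college_benefit = 0.25 * baseline_lifetime_pay  # $500,000

print(improved_rate, expected_loss, college_benefit)
```

Under these (made-up) numbers, the rational prevention budget is on the order of a dollar, while the college fund is worth hundreds of thousands in expectation; yet vivid, frightening risks routinely pull spending in the opposite direction.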