True unfreedom is the cage we set up inside our own hearts.

"Thinking, Fast and Slow" Reading Notes

Author: Daniel Kahneman
Recommendation: ⭐⭐⭐⭐⭐

Excerpts#

Preface#

  • Our subjective judgments are biased: we are prone to believe research findings based on insufficient evidence, and we tend to collect too few observations in our own research.

  • People use similarity as a simple heuristic (essentially a rule of thumb) to make difficult judgments. This reliance on rules of thumb inevitably produces biased predictions, that is, systematic errors in people's judgments.

  • People estimate the importance of things based on how easily they can retrieve information from memory, which is often related to the extent of media coverage. Topics that are frequently mentioned become vivid in the mind, while others fade away.

  • What the media choose to report corresponds, in turn, to what is currently on the public's mind, which is why it is no accident that authoritarian regimes put heavy pressure on independent media.

  • When faced with difficult problems, we often answer relatively simple questions while ignoring the fact that we have replaced the original question.

Chapter 1 An Angry Face and a Multiplication Problem#

  • A step-by-step calculation process is slow thinking.

  • System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to effortful mental activities, such as complex calculations.

  • When people focus too much on something, they block out other things, even those they are usually very interested in.

Chapter 2 Movie Protagonists and Supporting Characters#

  • When the mind is in a "sprinting" state, it can become effectively blind to secondary information. For most people, maintaining a coherent train of thought, or engaging in effortful thinking from time to time, requires self-control.

Chapter 3 Inertia of Thought and the Contradiction of Delayed Gratification#

  • Those who have experienced flow describe it as "a state of effortless concentration so deep that one loses the sense of time, of oneself, and of one's problems." Their descriptions of the pleasure of this state are very appealing, and Mihaly Csikszentmihalyi calls it "optimal experience."

  • Riding a motorcycle at 150 miles per hour and playing in a chess tournament both require effort; in a state of flow, however, maintaining focus on these absorbing activities requires no exertion of self-control, which frees resources to be directed to the task at hand.

  • A major function of System 2 is to supervise and control thought activities and various behaviors guided by System 1, allowing some thoughts to be directly translated into actions or to suppress or alter other thoughts.

Chapter 4 The Magical Power of Association#

  • The thoughts that things evoke in your brain trigger many other thoughts, and this associative behavior rapidly expands in your brain.

  • You think you know yourself well, but you are mistaken.

  • In an authoritarian country, the ubiquitous portraits of leaders not only convey the feeling of "Big Brother is watching you," but also gradually cause you to lose independent thought and the ability to act independently.

Chapter 5 Your Intuition May Just Be an Illusion#

  • System 1 creates a sense of familiarity, while System 2 relies on this familiarity generated by System 1 to make judgments of correctness.

  • Repetition can induce a relaxed state and a comforting sense of familiarity.

Chapter 7 The Letter "B" and the Number "13"#

  • Conscious doubt requires holding incompatible interpretations in mind at the same time, which takes effort and is not System 1's strength. Uncertainty and doubt are the domain of System 2.

  • Gilbert proposed that understanding a statement must begin with an attempt to believe it: you must first know what the idea would mean if it were true. Only then can you decide whether to "unbelieve" it.

  • When System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe; System 2, which holds the power of doubting and unbelieving, is sometimes busy and often lazy, and it frequently neglects this duty.

  • Evidence shows that when people are fatigued or depleted, they are more easily influenced by hollow yet persuasive information, such as advertisements.

  • Liking (or disliking) a person leads to liking (or disliking) everything about that person—including aspects you have not yet observed—this tendency is called the halo effect.

  • The sequence in which we observe a person's traits is often a matter of chance. Yet sequence matters, because the halo effect increases the weight of first impressions, sometimes to the point that later information is mostly wasted.

  • The method used to counter the halo effect in grading follows a general principle: decorrelate errors!

  • Jumping to conclusions in the absence of evidence is very helpful for our understanding of intuitive thinking.

  • You will often find that knowing little makes it easier to fit everything you know into a coherent pattern.

  • Framing effect: different ways of presenting the same information often evoke different emotions. The statement "the odds of survival one month after surgery are 90%" is more reassuring than "mortality within one month of surgery is 10%." Similarly, cold cuts described as "90% fat-free" are more appealing than ones described as "containing 10% fat." The two statements in each pair mean exactly the same thing; they only differ in wording, yet people usually read different meanings into them and believe that what they see is the whole truth.

  • She knows nothing about this person's management skills. The reason she has a good impression of him is that she once heard him give a brilliant presentation.

Chapter 8 How Do We Make Judgments?#

  • People evaluate a person's competence by combining the dimensions of strength and trustworthiness.

  • Divergent thinking allows us to make intuitive judgments.

  • A clear example of divergent thinking: when asked whether he thought the company was financially sound, what came to his mind was how much he liked the company's product.

Chapter 9 Goal Problems and Heuristic Problems Are Inseparable#

  • In its normal state, your mind has intuitive feelings and opinions about almost everything that comes your way.

  • A heuristic question is the simpler question you answer in place of the original (target) question.

  • When people are asked to judge a probability, they are actually judging something else and believe they have judged the probability. When faced with a difficult "target question," if the answer to a related and easier "heuristic question" comes readily to mind, System 1 will often take this "substitution" route, using the answer to the substitute question in place of the answer to the harder one.

  • The 3-D heuristic: in a picture, the object that appears farther away is perceived as larger.

  • Our impression of three-dimensional size dominates our judgment of the size of the flat image; the illusion arises from the 3-D heuristic.

  • The bias produced by this heuristic is that objects which seem farther away also seem larger.

  • Your associative memory will quickly and automatically use available information to craft the most appropriate story.

  • The affect heuristic: agreement follows from liking.

  • The dominance of conclusions over arguments does not mean that your thinking has stopped entirely, nor that your opinions are wholly immune to information and sensible reasoning.

  • System 2 is also responsible for deliberately searching memory, performing complex computations, making comparisons, planning, and choosing. System 2 appears to hold the final decision-making authority and is able to resist System 1's suggestions; it can slow things down and impose logical analysis.

  • Exaggerated emotional coherence (the halo effect).

  • Focusing on existing evidence and ignoring absent evidence ("what you see is all there is").

Chapter 10 The Law of Large Numbers and the Law of Small Numbers#

  • System 1 is very good at a mode of thinking—automatically and effortlessly recognizing causal relationships between things, even when such relationships do not exist.

  • For a rational person, unbiased and moderate predictions should not raise issues.

  • The risk of error with small samples can be as high as 50% (a simulation sketch at the end of this chapter's notes illustrates how erratic small samples are).

  • When information is insufficient, extreme predictions and the willingness to predict rare events stem from System 1.

  • Confidence is determined by the coherence of the most reasonable story you extract from available information.

  • Our intuitive prediction is very favorable, but it is probably too far from reality; let's look again at the evidence we have and regress the prediction toward the mean.

  • A general bias toward believing rather than doubting.

  • System 1 is not good at questioning.

  • System 2 can raise questions because it can simultaneously hold multiple incompatible possibilities.

  • The law of small numbers is one manifestation of the general bias toward believing rather than doubting.

  • We often feel very familiar with and knowledgeable about a person, but in fact, we know very little about them.
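
A minimal Python sketch of the law of small numbers discussed above (the coin-flip setup, sample sizes, and 10-point threshold are illustrative assumptions, not an experiment from the book): small samples produce extreme-looking results far more often than large ones, which intuition then misreads as a real pattern.

```python
import random

# Hypothetical illustration: flip a fair coin in samples of different sizes and
# count how often the observed proportion of heads strays far from the true 0.5.
def extreme_rate(sample_size, trials=10_000, threshold=0.10):
    """Fraction of samples whose observed proportion is off by more than `threshold`."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if abs(heads / sample_size - 0.5) > threshold:
            extreme += 1
    return extreme / trials

for n in (10, 50, 500):
    print(f"n={n:4d}: {extreme_rate(n):.1%} of samples miss the true rate by more than 10 points")
# Small samples are far more likely to look "extreme" purely by chance.
```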

Chapter 19 The Illusion of "Knowing"#

  • We constantly try to make sense of the world, and in doing so we inevitably produce "narrative fallacies." The explanations that attract attention are simple and concrete rather than abstract; they assign a larger role to talent, stupidity, and intention than to luck; and they focus on a few striking events that happened rather than on the countless events that failed to happen.

  • We humans constantly fool ourselves by constructing flimsy accounts of the past and believing they are true.

  • Causal explanations for random events are inevitably wrong.

  • The halo effect adds the final touch, lending an aura of invincibility to the protagonists of the story.

  • The Google story involves a great deal of skill, but luck played a far larger role in the actual events than the story suggests. And the more luck was involved, the less there is to be learned from the story.

  • We are firmly convinced that the world makes sense, and that confidence rests on a secure foundation: our almost unlimited ability to ignore our own ignorance.

  • We believe in a coherent world in which regularities (such as a run of six baby girls born in a row) appear not by accident but as the result of mechanical causality or of someone's intention.

  • The social cost of hindsight.

  • Simply put, if you follow your intuition, you will often make mistakes by viewing random events as patterned events.

  • The "I knew it all along" effect.

  • The phenomenon of "hindsight."

  • If an event indeed occurs, people will exaggerate the likelihood of their previous predictions; if a possible event does not occur, subjects will incorrectly recall that they always thought the likelihood of that event happening was low.

  • Revising one's earlier beliefs in light of what actually happened produces a robust cognitive illusion.

  • Hindsight bias has a detrimental effect on decision-makers' evaluative behavior; it leads observers to assess the quality of a judgment not based on the rationality of the judgment process but rather on the goodness or badness of the outcome.

  • We all need reassurance that our actions have appropriate consequences.

  • The anchoring effect is ubiquitous in life.

  • Philip Rosenzweig wrote a book called "The Halo Effect."

  • The anchoring effect.

  • The anchoring effect occurs when people consider a particular value for an unknown quantity before estimating that quantity.

  • Success and failure stories often exaggerate the impact of leadership style and management measures on company performance, so these stories are basically useless.

  • Because luck plays such a large role, the quality of leadership and management practices cannot be reliably inferred from observed success.

  • We should not fall prey to outcome bias: a decision can still be a foolish one even when its outcome happens to be good.

  • Suggestion is a form of anchoring effect.

  • System 1 understands a sentence by trying to make it true, and this selective activation of compatible ideas produces a family of systematic errors that make us gullible and prone to believe our own thoughts too strongly.

  • Anchors of different sizes evoke different sets of ideas in memory; estimates of the annual mean temperature draw on these biased samples of ideas, and so the estimates themselves are biased.

  • If you know little about the trees of California and are asked whether a redwood can be taller than 1,200 feet, you might infer that the number is not far from the truth: someone who knows the actual height of these trees posed the question, so the anchor may be a valuable hint. However, a key finding of anchoring research is that anchors that are obviously random can be just as effective as anchors that are potentially informative.

  • The anchoring effect is triggered by this associative activation. Whether the story is true or credible is not important at all. The powerful influence of random anchoring is an extreme example of the anchoring effect because random anchoring clearly provides no information.

  • Plans are best-case scenarios. When we forecast actual outcomes, we should avoid anchoring on the plan; thinking through the ways a plan can go wrong is one way to avoid that anchoring.

Chapter 12 Scientifically Utilizing the Availability Heuristic#

  • Being aware of one's biases is beneficial for team relationships; constantly being vigilant about biases is tiring.

  • The same bias shows up in everyday observations: most members of a collaborative team feel that they have done more than their share.

  • In any case, everyone should keep this in mind: you will occasionally do more than your share, but it is useful to know that you are likely to have that feeling even when every other member of the team feels the same way.

  • The availability bias can affect our views of ourselves or others.

  • Self-assessments are dominated by the ease with which instances come to mind; the experience of easy recall matters more than the number of instances recalled.

  • Students who list more improvement methods also rate the course higher.

  • If I find it surprisingly difficult to recall instances of my own decisiveness, I will conclude that I am not a decisive person at all.

  • This CEO has had consecutive successes, so failure does not easily appear in her mind. The availability bias makes her overly confident.

Chapter 13 Anxiety and the Design of Risk Policies#

  • Availability effect.

  • Over time, memories of disasters become blurred, and the degree of worry and precaution diminishes.

  • Availability bias.

  • Paired causes of death used in the study: diabetes and asthma, strokes and accidents.

  • For each pair of causes, subjects had to indicate the more frequent one and estimate the ratio of the two frequencies; their judgments were then compared with the health statistics of the time.

  • The world in our heads is not an accurate reflection of the real world; our estimates of how frequently events occur are distorted by how often and how intensely we are exposed to messages about them, and by the strength of our own emotions.

  • In many areas of life, the views people form and the choices they make directly express their emotions and their basic tendencies of liking and disliking, often without their being aware of it.

  • The affect heuristic.

  • When people favor a certain technology, they believe that technology has more advantages and lower risks; if they dislike a technology, they will only think of its drawbacks and a few advantages.

  • How to prevent low-probability risk events from evolving into public crises?

  • The public has a richer conception of risk than the experts do.

  • When the views and wishes of experts conflict with those of other citizens, the experts' views should not simply be accepted. Slovic argues that when experts and the public disagree on their priorities, "each side must respect the insights and intelligence of the other."

  • Every policy issue includes assumptions about human nature, especially the choices people might make and the consequences of their choices for themselves and society.

  • An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead to public panic and large-scale government action.

  • Media coverage of a certain risk can capture part of the public's attention, which can then turn into outrage and anxiety. This emotional response itself is a form of promotion, prompting the media to follow up with reports, which in turn leads to greater anxiety and a wider impact.

  • The Alar scare of 1989.

  • The Alar scare was clearly an enormous overreaction to a minor problem.

  • The Alar episode illustrates a basic limitation of our minds in dealing with small risks: we either ignore them altogether or give them far too much weight, with nothing in between.

  • Policymakers should not ignore the widespread fear, even if these emotions are baseless.

  • Policymakers must strive to protect the public from the influence of fear rather than just protecting them from real dangers.

Chapter 14 Guess What Tom's Profession Is?#

  • When you doubt the reliability of the evidence, there is one thing you can do: anchor your judgments of probability on the base rates. Don't expect this discipline to be easy: it requires a significant effort of self-monitoring and self-control.

  • Bayes' theorem (a worked sketch appears at the end of this chapter's notes).

  • The paradox of less is more.

  • Focusing on weaknesses is also common in political debates.

  • Stereotyping: Stereotyping refers to the tendency of people to extend their views of a group to every member of that group (if the group has certain problems, all its members will invariably have those problems).

  • These costs may be worth paying to achieve a better society, but denying that the costs exist while claiming both the benefits and the moral high ground is an attitude that cannot withstand scientific scrutiny. Reliance on the affect heuristic is common in politically charged disagreements: we come to believe that the positions we favor have no costs and that the positions we oppose bring no benefits. We should be able to do better.

  • We are not as helpful as we think.

  • Subjects knew that one of the others was having a seizure and needed help, but they assumed that several people might already have rushed to help, so they felt safe staying in their own cubicle.

  • This experiment shows that when someone knows others have also heard the same call for help, they feel their responsibility diminishes.

  • Changing a person's view of human nature is difficult, and changing a person's view of their own dark side is even harder.

  • Statistical results with a causal interpretation have a stronger effect on our thinking than noncausal information. But even compelling causal statistics will not change long-held beliefs, or beliefs deeply rooted in personal experience.
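
A worked sketch of the base-rate advice above, using Bayes' theorem; the numbers here (a 3% base rate, a description that fits 80% of true cases and 10% of everyone else) are hypothetical, chosen only to show how strongly the base rate should constrain the final judgment.

```latex
\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
\]
% Hypothetical numbers: base rate P(H) = 0.03, P(E | H) = 0.80, P(E | not-H) = 0.10.
\[
P(H \mid E) \;=\; \frac{0.80 \times 0.03}{0.80 \times 0.03 + 0.10 \times 0.97}
           \;=\; \frac{0.024}{0.121} \;\approx\; 0.20
\]
% Even a description that fits the stereotype well leaves only about a 20%
% probability once the low base rate is taken into account.
```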

Chapter 17 All Performances Will Regress to the Mean#

  • Life often gives us feedback that contradicts common sense.

  • Success = talent + luck; great success = a little more talent + a lot more luck (the simulation sketch at the end of this chapter's notes shows one consequence of this).

  • The illusion of validity.

  • Confidence is a feeling that reflects the coherence of the information and the cognitive ease of processing it.

  • The illusion of skill in stock investing.

  • The most active traders often achieve the worst results, while the least active investors earn the highest returns.

  • Individual investors like to lock in their gains by selling "winners," stocks that have appreciated since they were bought, and they hang on to their "losers," stocks that have dropped in value since purchase.

  • Persistent achievements.

  • Many researchers share a common view that almost all stock traders, regardless of their knowledge of stocks (few understand stocks), are playing a game of chance. The subjective experiences of traders are merely their seemingly wise guesses made under highly uncertain conditions. However, in efficient markets, wise guesses are not much more accurate than random guesses.

  • If your success mainly relies on luck, how much of your achievements can you attribute to yourself?

  • Subjective confidence and professional culture provide fertile ground for cognitive illusions.

  • Experts make mistakes not because of the content of their thoughts, but because of the way they think.

  • Hedgehogs "know one big thing," having their own theories about the world, explaining certain special events within a clear framework, often lacking patience for those who do not view things their way, and being critical of themselves.

  • Foxes, by contrast, are more complex thinkers. They do not believe that one big thing drives the march of history (for example, they are unlikely to accept the view that Ronald Reagan single-handedly ended the Cold War). Instead, foxes recognize that the outcome emerged from the interaction of many different agents and forces, including sheer luck, which often produces large and unpredictable outcomes.

  • She is like a hedgehog, with a theory that explains everything, which gives her the illusion that she understands the world.
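
A rough Python sketch of "success = talent + luck" and the regression to the mean it produces; the normal distributions, the 1% cutoff, and the two-round setup are assumptions for illustration, not a model taken from the book.

```python
import random

random.seed(0)

# Hypothetical model: performance = stable talent + fresh luck in each round.
N = 10_000
talent = [random.gauss(0, 1) for _ in range(N)]
round1 = [t + random.gauss(0, 1) for t in talent]
round2 = [t + random.gauss(0, 1) for t in talent]

# Pick the top 1% performers of round 1 and compare their averages across rounds.
stars = sorted(range(N), key=lambda i: round1[i], reverse=True)[: N // 100]
avg_r1 = sum(round1[i] for i in stars) / len(stars)
avg_r2 = sum(round2[i] for i in stars) / len(stars)

print(f"round-1 stars, average round-1 score: {avg_r1:.2f}")
print(f"round-1 stars, average round-2 score: {avg_r2:.2f}")
# The stars' round-2 average falls back toward the mean: their talent carries
# over, but the good luck that helped select them does not repeat.
```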

Chapter 21 Intuitive Judgments vs. Formulaic Calculations: Which is Better?#

  • Why are expert predictions less accurate than simple formulas? Meehl suggested one reason: experts try to be clever, to think outside the box, and to consider complex combinations of features when making their predictions.

  • Because we have little direct knowledge of what goes on in our own minds, we will never know that we might have made a different judgment under slightly different circumstances.

  • In a memorable example, Dawes showed that marital stability is well predicted by a formula: frequency of lovemaking minus frequency of quarrels (a small sketch of this kind of equal-weight formula follows at the end of this chapter's notes).

  • Do not simply trust intuitive judgments—whether your own or others'—but do not completely disregard them either.
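
A tiny sketch of the Dawes-style "improper" equal-weight formula mentioned above; the function name, units, and example inputs are illustrative assumptions, while the formula itself is the one quoted in the notes.

```python
# A Dawes-style improper linear model: equal weights, no statistical tuning.
def marital_stability_score(lovemaking_per_month: float, quarrels_per_month: float) -> float:
    """Frequency of lovemaking minus frequency of quarrels."""
    return lovemaking_per_month - quarrels_per_month

# Hypothetical couples: a positive score is a hopeful sign, a negative one a warning.
print(marital_stability_score(12, 4))   #  8.0
print(marital_stability_score(3, 9))    # -6.0
```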

Chapter 22 When Can You Trust Expert Intuition?#

  • I only respond in rare cases when I believe criticism is seriously misleading.

  • Learning professional skills, such as high-level chess, professional basketball, and firefighting skills, is complex and slow because expertise in a field involves not just a single skill but many small techniques.

  • When someone tells you to trust their judgment, do not trust them, and do not trust yourself either.

  • Professional skills are not a single skill but consist of many skills. The same professional may be an expert in her field but a novice in others.

  • Even when judging the wrong questions, one can still have high confidence in making that judgment.

Chapter 23 Striving to Adopt the Outside View#

  • We tend to favor the inside view over the outside view.

  • Executives are likely to submit overly optimistic plans in order to compete for resources, so organizations face the challenge of keeping this tendency among executives in check.

Chapter 24 Optimism is a Double-Edged Sword#

  • People cannot take into account information that never comes to mind, perhaps because they never knew it.

  • Individuals and organizations reward the providers of dangerously misleading information rather than those who tell the truth.

  • The successful scientists I have encountered tend to exaggerate the importance of their ongoing research. I also believe that those who do not like to exaggerate their importance often become demoralized when repeatedly faced with setbacks and failures, which is the fate of most researchers.

  • The premortem: a partial remedy for optimistic bias.

  • When an organization is about to make an important decision but has not yet formally issued a resolution, Klein suggests convening a brief meeting with those knowledgeable about the decision. Before the meeting, there should be a short speech: "Imagine that we have implemented the current plan a year from now, but the results were disastrous. Please take 5-10 minutes to briefly write down the reasons for this disaster."

  • When a team focuses on decision-making, especially when the leader announces his intentions, doubts about the feasibility of planned steps gradually diminish, and eventually, such doubts may be regarded as disloyalty to the team and the leader.

Chapter 25 Decisions Regarding Risk and Wealth#

  • Prospect theory.

  • It is the poor who buy insurance and the rich who sell insurance.

  • Desperate gambles of this kind are what entrepreneurs and commanders turn to when all of their options are bad.

  • Theory-induced blindness, meaning once you accept a theory and use it as a thinking tool, it becomes difficult to notice its errors.

  • You also know that your attitude towards gains and losses does not stem from self-assessment of the wealth you possess. You want to gain $100 but do not want to lose $100, not because this money changes your wealth status. You simply like to gain and dislike to lose—almost certainly, your aversion to loss is much greater than your liking for gain.

  • The core content of prospect theory includes three cognitive characteristics.

  • The third: loss aversion.

  • Many choices we face in life are mixed blessings: there are risks of loss and possibilities of gain, and we must decide whether to accept or reject the risk.

  • For most people, the fear of losing $100 is stronger than the hope of gaining $150 (the value-function sketch at the end of this chapter's notes illustrates why).

  • In gambles where both gains and losses may occur, loss aversion leads to choices that strongly avoid risk.

  • In the description of prospect theory, poverty means that a person's living standard is below their reference point. Some goods are unaffordable for the poor, so they are always "in loss." They feel that the small amount of money they receive is a reduction of loss rather than a gain. This money can help a person get a little closer to the reference point, but the poor always linger at the steepest slope of the value function.

  • For the poor, spending money means loss.

  • Bad emotions, bad parents, and bad feedback have more impact than good ones, and bad news is processed more thoroughly than good news. We are more motivated to avoid negative self-definitions than to pursue positive ones; bad impressions and bad stereotypes are quicker to form and more resistant to being undone than good ones.

  • Long-term healthy marital relationships rely not only on seeking happiness but also on avoiding negative situations.

  • We all know that a single incident can ruin a friendship cultivated over many years.

  • The boundary between good and bad is a reference point that changes over time and depends on the situation at the time.

  • We find that a basic principle of fairness is: one should not use market forces to impose losses on others.

  • Contrary to the expectation principle, the weights people assign to outcomes are not identical to the probabilities of those outcomes. Improbable outcomes are overweighted (the possibility effect), while outcomes that are almost certain are underweighted relative to actual certainty. The expectation principle, which weights values by their probabilities, is poor psychology.

  • When making a decision with long-term consequences, be either very thorough or completely casual. Hindsight stings most when you think just a little, just enough to tell yourself later, "I almost made a better choice."
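
A sketch of why losing $100 can outweigh gaining $150, using a prospect-theory-style value function; the parameter values below are commonly cited estimates from Tversky and Kahneman's later work (alpha = beta = 0.88, lambda = 2.25) and are used here only as an illustration, not as the book's own presentation.

```python
# Prospect-theory value function: evaluation relative to a reference point,
# diminishing sensitivity, and loss aversion. Parameter values are commonly
# cited estimates (Tversky & Kahneman, 1992), used here purely for illustration.
ALPHA = 0.88   # curvature for gains (diminishing sensitivity)
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom larger than gains

def value(x: float) -> float:
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

gain, loss = value(150), value(-100)
net = 0.5 * gain + 0.5 * loss  # naive 50/50 weighting; decision weights ignored for simplicity
print(f"value of +$150: {gain:6.1f}")
print(f"value of -$100: {loss:6.1f}")
print(f"50/50 gamble, net subjective value: {net:6.1f}")
# The net value is negative: the gamble "win $150 or lose $100 on a coin flip"
# feels unattractive even though its expected monetary value is +$25.
```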

Chapter 35 The Inconsistency of Experience Utility and Decision Utility#

  • The experiencing self answers the question "Does it hurt now?" while the remembering self answers the question "How was it overall?" We can only preserve life experiences through memory, so when we think about life, the only perspective we can adopt comes from the remembering self.

Chapter 36 Life is Like a Play#

  • How much can you remember about your last trip?

  • The person taking the photo does not think that the scenery at that moment is only for their brief enjoyment; they regard the scenery as a memory to be cherished in the future. Photos are very useful for the remembering self, even though we rarely look at these photos for long periods or multiple times; some photos we may not have looked at again, but taking photos is not necessarily the best way for travelers' experiencing selves to appreciate the scenery.

  • People in love may feel happy even in traffic jams, while those in mourning may continue to feel sad even when watching a comedy. However, under normal circumstances, we only feel joy or sadness based on what is happening at that moment, provided we pay attention to it. For example, to derive pleasure from eating, you must notice that you are eating.

  • The potential conflict between the remembering self and the experiencing self is more complicated than I had first imagined. In the cold-hand experiment described earlier, the combination of duration neglect and the peak-end rule led people to make clearly absurd choices. Why do people willingly endure unnecessary suffering? The choice is made by the remembering self, which prefers the option that leaves the better memory, even when that option causes more pain.

  • The only test of whether a person is rational is not whether their beliefs and preferences are reasonable, but whether they are internally consistent. A rational person can have likes and dislikes, but their preferences must be consistent over time. Rationality means logical coherence, whether or not the beliefs themselves are reasonable.

  • When we observe those whose behavior seems strange, we should consider a possibility—they may have reasonable reasons for doing so. Only when the reasons become unreasonable will psychological explanations be triggered.

  • Our views of ourselves are essentially views of System 2. System 2 makes judgments and choices, but it endorses and rationalizes the views and feelings formed by System 1. You may not realize that you hold an optimistic attitude towards a project simply because its leader reminds you of your beloved sister. Or, you may dislike someone who resembles your dentist. If you seek an explanation, you will search your memory for some decent reasons, and you will surely find some.

  • The acquisition of a skill requires a sufficiently regular environment, adequate opportunity for practice, and rapid, unequivocal feedback on one's thoughts and actions.

  • Recognize the signs that you are in a cognitive minefield, slow down, and ask System 2 for reinforcement. What will you do the next time you encounter the Müller-Lyer illusion? When you see lines with fins pointing in different directions, you will recognize the situation as one in which you should not trust your impressions of length.
