
52 Concepts To Add To Your Cognitive Toolkit | Peter McIntyre

A series of unfortunate events led me to become stronger, both by studying rationality and by training myself in it.

This is the list of rules I wrote to myself, as reminders against my own stupidity.

Here they are:

Against Akrasia

Deciding in Advance

It is tempting to delay the act of choosing as much as possible. Some plans might call for delayed action, but never for delayed choosing; that is mere hesitation. If you need extra information to make a decision and you know which information you need, you can decide in advance what you will do once you have it. Never delay a choice.

Hesitation is always easy, rarely useful

When the time to act arrives, hesitation can manifest in one form or another: sometimes as mere laziness, other times as overwhelming fear. Notice whenever you are hesitating and act immediately. If you already decided, in a calmer state of mind, that this was the right course of action, defer to your past self, ignore your mental doubts and just act.

Notice Procrastination

Procrastination is an unwinnable mental battle. You have to be able to notice whenever you are procrastinating. When you do notice, go do whatever it is you are delaying immediately and don't think about it any further. No, seriously, GO!

Trivial Inconveniences

Humans let go of many decisions, and even benefits, when an inconvenience, however trivial, stands in the way. Our mind is averse to effort and finds ways to convince us to procrastinate. This is how some European countries raised their organ donation rate to almost 100%: a law simply made everyone a donor by default unless they filled out a simple form at a government office. Beware of trivial inconveniences and stop delaying.

Trivial Distractions

When engaged in a task that requires effort, we are prone to become easily distracted by our immediate environment as a way to procrastinate. Something as simple as answering an instant message or quickly checking a website can trigger a long episode of procrastination. Notice these impulses and ignore them. Turn off mobile devices if necessary and, if you find yourself already within a procrastination episode, disengage from it as soon as you are aware of it.

Basics

Agency

Roles, emotions, impulses and biases can hijack your train of thought, making you believe you are thinking and acting of your own volition when in fact you are not. Always question whether you are acting with agency or whether something else is driving your cognition.

Patience

Our own subjective experience makes us believe our problems and issues are bigger than they really are, and we are programmed to seek immediate resolutions. Our brains are hardwired for instant gratification; long-term gains are not intuitively understood. You must abide by hard statistical reasoning and always think long-term. For this you must embrace patience: if you realize you have no agency at the moment, wait until you have a clear head, overcoming the need for an immediate resolution.

The Sunk-Cost Fallacy

This economic fallacy runs deeper into human nature than I previously suspected. Quirrell's insight of "learning to lose" and even the Buddhist teaching of "letting go" are both instances of overcoming it. Learn to accept reality as it is, for throwing a tantrum won't change anything, and keep in mind that costs already paid are sunk. Never let a sunk cost dictate your behaviour.

Make your beliefs pay rent

Subtract the words from your beliefs and notice which experiences they predict or, better yet, prohibit. Never ask what to believe but what to anticipate. Ask which experiences constrain the belief and what event would definitely falsify it. If the answer is null, it is a floating belief. If you are equally good at explaining any outcome, you may have powerful rhetorical skills but ZERO knowledge.

Learning from History

The history of the world is filled with mistakes that get repeated over and over. Subtle flaws in human reasoning are responsible for this cyclical phenomenon. Every mistake contains a hidden lesson to be learned, as most of them get repeated, albeit in a different fashion depending on the time period and cultural background. Remember that the élan vital once seemed plausible to people as smart as you. Short-term history, like the personal life experiences of people around you and biographies, also contains lessons to be learned. It is very important to grasp the stupidity of ordinary people, to see which of their follies you have imitated and which others you could avoid. Be Genre Savvy and avoid the clichés of history.

Shut up and do the Impossible!

Always play to win. If your mind believes something is impossible, you might try only for the sake of trying. You can succeed in the endeavor of making an extraordinary effort without winning; you must try your best under the assumption that the problem is solvable. Keep in mind that whenever you feel like you have maxed out, in all probability you still have a lot more to give if only you focus and muster the required mental power. Never give up until, at the very least, you have actually tried for 5 minutes, by the clock.

Heuristic Decisions

Whenever you are hesitant about a decision, force yourself to guess your answer in advance. If you can guess it, then in all probability your System-1 has already decided. Once an idea enters your mind it tends to stay there, for we rarely change our minds. Don't accept intuitions as definitive answers; make an actual effort to think for yourself.

Fallacious Wisdom

Every human culture is permeated with wisdom and advice, and the elderly and the experienced are seen as valid sources of lessons. However, since the human mind is biased to generate fallacious causal explanations that connect random events into a coherent story, a lot of this wisdom is false and biased. Lessons imparted by wise old men and successful icons might be the product of their minds cleverly rationalizing a story that ignores the factor of luck, giving more weight to merit and hard work. There are powerful lessons hidden amid the sea of fallacies, but remember: a powerful wizard must learn to distinguish truth from among a thousand plausible lies.

The premortem

When planning, imagine a worst-case scenario as something that has already happened and then force yourself to come up with an explanation for why it happened. This was originally envisioned as a technique for minimizing bias in collective decisions, but it is also quite useful for an individual. The idea is to put our uncanny capacity for rationalization and causal storytelling to rational use, revealing the perils that our own subjective experience ignores.

Uncertainty as a Virtue

System-1 constantly scans for signs of certainty in the body language of others and, when uncertain itself, is biased to believe whoever appears certain. This has translated into a culture that penalizes uncertainty and incentivizes an appearance of certainty. Even in their private mental life, people reject uncertainty and look for certainty even where there might be none. Inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid. Yet optimism is highly valued, socially and in the market; people and firms reward the providers of dangerously misleading information more than they reward truth tellers. This might be good for social relations but terrible for gaining accurate beliefs. Given that we live in an uncertain world, an unbiased appreciation of uncertainty is a cornerstone of rationality. Be wary of certainty and realize that uncertainty is a virtue, not a defect.

Emotional Bias

Notice emotionally biased thoughts and impulses

You must keep diligent watch over your own mental processes to notice whenever agency is gone and emotion or intuition is biasing your perception. Whenever you realize you are emotionally biased, say out loud "I notice I am emotionally biased" and restrain all courses of action until you have a clear head. Sneaky emotions like anger, jealousy and helplessness can make you think you are thinking and acting with agency, which is why you must notice them and restrain yourself.

Noticing Motivated Cognition

Beyond regular emotional bias, there are other instances in which you have no agency but believe you do. A flaw in your self-concept or a desire to confirm a belief can likewise constrain your mind and motivate you to think and act within a biased reasoning space. The only defense against motivated cognition is meta-cognition. Always observe your mental processes and ponder whether there are any hidden motivators behind them.

Removing Emotional Complexity

Our own subjective experience makes us see our life problems as bigger than they really are. Most of these issues look like nothing more than moments in hindsight some time later. This is because we add layers of emotional complexity and lose sight of what is really there. To see the situation for what it is, adopt a reductionist perspective and try to view it with external eyes. See it as simply as you can, as if you were assessing the situation of a stranger.

Not one unusual thing has ever happened

Feelings of surreality and bizarreness are products of having poor models of reality. Reality is not inherently weird. Weirdness is a state of mind, not a quality of reality. If at any point some truth should seem shocking or bizarre, it’s your brain that labels it as such. If the real world seems utterly strange or surreal to your intuitions then it’s your intuitions that need to change. The map is not the territory.

Reasoning

Base-rate neglect

Human minds can't intuitively grasp statistical reasoning, and because of that we live in constant self-deception. System-1 ignores statistical facts and instead focuses on its own causal and emotional reasoning, which gives people's System-2 the impression of accurate judgment. For more accurate Bayesian reasoning, do your best to anchor your judgments and predictions to the base rate and adjust from there with other information, always being wary of the quality of the evidence.
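
To make this concrete, here is a minimal worked example in Python of anchoring to a base rate and then updating with Bayes' rule. All the numbers (a 1% base rate, an imperfect test) are hypothetical:

    # Hypothetical numbers: how much should one positive test move you?
    base_rate = 0.01        # prior: 1% of cases are genuinely positive
    sensitivity = 0.80      # P(test positive | actually positive)
    false_positive = 0.10   # P(test positive | actually negative)

    # Bayes' rule: start from the base rate, adjust with the evidence.
    p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
    posterior = sensitivity * base_rate / p_positive
    print(f"P(positive | test) = {posterior:.1%}")  # ~7.5%, not 80%

Intuition jumps straight to the test's 80% accuracy; anchoring to the 1% base rate first keeps the estimate honest.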

Regression to the mean

This is a basic law of nature that the human mind is inherently designed to overlook and deny. Extraordinary events in any given scenario are mostly statistical anomalies, and statistics show that extraordinary events regress to the mean. The mind, designed for causal rather than statistical reasoning, creates fallacious narratives to explain the causes behind extraordinary events, creating biased beliefs that generate false predictions. When these predictions inevitably turn out false, a new causal explanation is generated to maintain the belief. Always remember that regression to the mean is a basic law of nature when making judgments. Regression to the mean has an explanation but not a cause, so avoid the temptation to accept causal narratives as facts.
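
A toy simulation makes the effect visible. Assume, purely for illustration, that an observed score is stable skill plus transient luck:

    import random
    random.seed(0)

    # Toy model: observed score = stable skill + transient luck.
    n = 10_000
    skill = [random.gauss(0, 1) for _ in range(n)]
    day1 = [s + random.gauss(0, 1) for s in skill]
    day2 = [s + random.gauss(0, 1) for s in skill]

    # Follow the top 1% of day-1 performers into day 2.
    cutoff = sorted(day1)[-n // 100]
    top = [i for i in range(n) if day1[i] >= cutoff]
    print(sum(day1[i] for i in top) / len(top))  # extreme
    print(sum(day2[i] for i in top) / len(top))  # about half as extreme

No cause pulls the second-day scores down; the luck component simply fails to repeat.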

Occam’s Razor

Occam's Razor indicates that the simplest explanation is the most probable one. Adding complexity to a model can lead to biases like the conjunction fallacy; however, this is not intuitive. Since humans have trouble inferring the particular from the general, we are prone to believe more in a more detailed model, even though the detail makes it less probable.

Always remember that in science complexity is costly and must be justified by a sufficiently rich set of new and (preferably) interesting predictions of facts that a simpler model or existing theory cannot explain.

Remove layers of complexity and remember that the length of a verbal statement in itself is not a measure for complexity. Ponder what must be true in order for the theory to be true to uncover hidden complexity.
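
A quick arithmetic check of the conjunction point, with made-up probabilities: adding detail can only shrink a probability, never grow it.

    # Made-up probabilities: a conjunction is never more probable
    # than either of its conjuncts.
    p_a = 0.05                     # P(A)
    p_b_given_a = 0.30             # P(B | A)
    p_a_and_b = p_a * p_b_given_a  # P(A and B) = 0.015
    print(p_a_and_b <= p_a)        # True, always

The more detailed story feels more plausible, but every added detail multiplies in another factor of at most 1.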

Overconfidence

People are inherently prone to overconfidence in their predictions and their model of the world. This biased perception arises mostly because System-1 gets its subjective feeling of confidence from the cognitive ease associated with the coherence of an explanation. We are designed for causal explanations, not statistical reasoning, which makes System-1 suppress feelings of uncertainty. To avoid overconfidence, be wary of the coherence of a story and focus instead on the quality of the evidence. Remember to make beliefs pay rent.

Mysterious Answers

When thinking about a mystery and trying to devise a solution, the probable solution must make the mystery less confusing. It is not enough that the proposed hypothesis is falsifiable; it must also destroy the mystery itself. Because if it doesn’t, then why should your hypothesis have priority over any other?

Notice Confusion

Notice whenever an explanation doesn't feel right. Pay close attention to this tiny note of confusion and bring it to conscious attention. Never try to force the explanation, for that would be a rationalization. Say out loud "I notice that I am confused" and understand that either the explanation is false or your model is wrong. Explanations have to eliminate confusion.

Defer to more rational selves

A technique for making rational decisions, or for detaching from your own opinion, is to model what someone else would do in that situation. It is recommended that the model you use be of someone you believe to be more rational than you. This forces your mind to make the decision from a fresher perspective.

Rationality vs. Rationalization

Rationality gathers evidence before proposing a possible solution. Rationalization flows in the opposite direction: it gets fixated on a conclusion and then looks for arguments to justify it. This is why the power behind hypotheses lies in what they predict and prohibit. Avoid rationalizations and make your beliefs, models, explanations and hypotheses pay rent.

Positive Bias and Confirmation Bias

Our intelligence is designed to find clever arguments in favor of existing views, which makes us inherently biased towards confirmation rather than falsification. It takes conscious effort to look for evidence against our thoughts and beliefs. Even people who set out to test whether a belief is true fall for this in the shape of positive bias, designing experiments to confirm rather than falsify their theory.

You must make a conscious effort to overcome the aversion to conflicting evidence and learn to look toward the darkness. You must make your beliefs pay rent.

Optimistic Bias

Humans are optimistic to the point of it being a cognitive bias. System-1 makes people feel at less risk of experiencing a negative event than other people are. This is born from people trusting their own subjective experience too much and neglecting base rates. People rationalize away risks and defeats in order to keep a coherent narrative within their own minds. Lean pessimistic when your intuitions conflict with hard statistics, and remember that you are not exempt from risks. Statistics are always more accurate than your own subjective experience.

Update incrementally

Even when accepting conflicting evidence as true, our intuitions tend to remain as they were. This means people generally fail to update their beliefs based on new incoming evidence. Conflicting evidence is then rationalized away in order to maintain the mental status quo and retain the belief.

Avoid this mistake by making an effort to update on new evidence. Every piece of evidence must shift your belief's probability upwards or downwards. Stop and reflect when receiving new evidence, to avoid acting and thinking the same way as before.

Do not avoid a belief's weakest points

People who start doubting a cherished belief tend to do so from a strong position, one where they are already proficient at counter-arguing in order to reaffirm the belief. When pressed to question the belief at one of its weak points, people flinch away in pain and stop thinking about it.

In order to successfully doubt, you must try to find the weakest points in a belief. Deliberately think about whatever hurts the most. Don’t rehearse standard objections whose standard counters would make you feel better. Ask yourself what smart people who disagree would say to your first reply, and your second reply. Whenever you catch yourself flinching away from an objection you fleetingly thought of, drag it out into the forefront of your mind and confront it.

The Outside View (Adjusting predictions from the baseline)

There is a prevalent tendency to underweight or outright ignore prior statistics when making predictions, which leads to many mistakes in forecasting. This is especially true when people have specific information about an individual case, as they don't feel that statistical information can add any value to what they already know. The truth is that statistical information carries more weight than specific details. A specific instance of this is the planning fallacy. Adopt the outside view diligently: check prior statistics and adjust from that baseline with any further evidence.
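
A sketch, with invented numbers, of what adjusting from the baseline can look like for a project estimate:

    # Invented numbers: blend the outside view with the inside view
    # instead of trusting the inside view alone.
    baseline_weeks = 12   # how long similar projects actually took
    inside_weeks = 6      # how long this one feels like it will take

    w = 0.7               # weight on the baseline; a judgment call
    forecast = w * baseline_weeks + (1 - w) * inside_weeks
    print(forecast)       # 10.2 weeks, not 6

The blend weight is itself an assumption; the point is only that the baseline, not the gut estimate, is the anchor.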

The illusion of validity

This cognitive bias makes people overestimate their ability to interpret a set of data and accurately predict outcomes from it. System-1's pattern-seeking tendency looks for consistent patterns in data sets, finding causal explanations where there might be none. System-2 then uses these impressions to form beliefs and make false predictions. Repeated exposure to this phenomenon in an uncertain environment can lull a person into a self-deception of expertise, as with political forecasters and stock traders. To avoid this bias, make beliefs pay rent, pay close attention to base rates and assess the uncertainty of the environment you are trying to make predictions about.

Assessing Risk

Loss Aversion and Risk-seeking behavior

For the human mind, losses are more prominent than gains. This has an evolutionary origin: organisms that treat threats as more urgent than opportunities have a better chance at survival and reproduction. The loss-aversion ratio has been estimated in several experiments and is, on average, usually in the range of 1.5 to 2.5. Loss aversion is the source of a wide array of cognitive biases that make people take irrational choices when assessing risk, so one must diligently use System-2 to fight it. Even though loss aversion is a prominent fact of human cognition, there are certain instances where humans are biased to be risk-seeking. In general, humans are loss-averse when assessing risk in the realm of possible gains but jump into risk-seeking behavior when assessing losses. Economic logic dictates the opposite: we should be risk-seeking when facing possible gains and loss-averse when facing losses. A failure to abide by this rule results in long-term costs.
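
A minimal sketch of a loss-averse value function in the spirit of prospect theory, using a loss-aversion ratio of 2 (inside the 1.5 to 2.5 range above) and ignoring the theory's curvature for simplicity:

    def value(x, loss_aversion=2.0):
        """Toy value function: losses loom about twice as large as gains."""
        return x if x >= 0 else loss_aversion * x

    # A 50/50 bet to win or lose $100 has an expected dollar value of 0,
    # yet it *feels* like a sure loss to System-1.
    felt = 0.5 * value(100) + 0.5 * value(-100)
    print(felt)  # -50.0

This is why most people demand roughly $200 of upside before accepting a coin flip that risks $100.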

Biased to the reference point

Intuitive evaluations and judgments are always relative to a reference point. This is why room-temperature water feels cold after warm water and vice versa. The reference point can be the status quo or an expectation System-1 feels entitled to. Anything above the reference point is perceived as a gain and everything below as a loss. This can bias perception, creating risk-averse or risk-seeking behaviors depending on the situation. Be wary of this, as other people set reference points cleverly, anticipating this human bias. A very tangible example of this phenomenon is The Endowment Effect, where System-1 makes an object feel more valuable merely because you own it. The reference point here is ownership itself: when trying to get rid of the object, the pain of losing it is evaluated instead of its market value. This bias must be fought when dealing with economic transactions.

Adjusting to the real probability

System-1's emotional intuitions get biased when presented with probabilities; we fail to anchor our emotions to the actual number. There are two specific effects to be wary of:

The Possibility Effect: highly improbable outcomes are overweighted by System-1. This is due to loss aversion. When faced with a pair of losing choices, System-1 is biased to prefer gambling for huge gains, even at the risk of terrible losses, over accepting a certain loss. This causes a 5% chance to be weighted disproportionately more than a 0% chance.

The Certainty Effect: highly probable outcomes are underweighted by System-1. This is due to risk aversion. When people face a certain but inferior choice against a highly probable but superior one, their intuitions are biased toward the certain option: the 5% difference between a 95% chance and a 100% chance feels like too much of a risk. Whatever the probabilities presented, we must weigh our decision by the actual numbers according to the expected utilities. This is completely counterintuitive and must be enforced by System-2.
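
A worked comparison with invented stakes shows what weighting by the actual probabilities looks like:

    # Invented stakes: the certainty effect pulls people toward A,
    # but the expected values favor B.
    ev_a = 1.00 * 900    # option A: $900 for certain
    ev_b = 0.95 * 1000   # option B: 95% chance of $1,000

    print(ev_a, ev_b)    # 900.0 950.0
    # Unless $900 genuinely matters more per dollar to you (utility,
    # not bias), the 5% risk is worth taking here.

The same arithmetic, run on lottery-style stakes (a 0.001% chance of $1,000,000 versus a certain $20), exposes the possibility effect from the opposite direction.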

Denominator Neglect

When dealing with probabilities, the human mind can neglect the denominators in question, leading to fallacious assessments: the mind compares numerators while ignoring denominators. How a probability is framed can also summon vivid imagery that biases judgment. An example is an experiment where people had to draw a red marble at random from one of two urns (A: 10 marbles, 1 red; B: 100 marbles, 8 red). Most people chose B because they focused on the larger numerator, 8, ignoring that A offered the higher chance of success. Denominator neglect leads to the overweighting of rare events. System-1 generates this bias because of the confirmatory bias of memory: thinking about an event, you try to make it true in your mind, so a rare event will be overweighted if it specifically attracts attention. Such separate attention is effectively guaranteed when prospects are described explicitly ("99% chance to win $1,000, and 1% chance to win nothing"). Obsessive concerns (the bus in Jerusalem), vivid images (the roses), concrete representations (1 of 1,000), and explicit reminders all contribute to overweighting.
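
The urn comparison takes one line to check:

    # The urn experiment above: the bigger numerator is the worse bet.
    urn_a = 1 / 10    # 10 marbles, 1 red
    urn_b = 8 / 100   # 100 marbles, 8 red
    print(urn_a > urn_b)  # True: 0.10 > 0.08, yet most people pick B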

Broad Framing as opposed to Narrow framing

For risk assessment, humans tend to use a form of mental accounting labeled Narrow Framing, where they separate complex issues and decisions into simpler parts that are easier to deal with. The opposite frame is labeled Broad Framing, where issues and decisions are viewed as a single comprehensive case with many options. As an example, imagine a list of 5 simple (binary) decisions to be considered simultaneously. The broad (comprehensive) frame consists of a single choice with 32 options; narrow framing yields a sequence of 5 separate choices. That sequence lands on one of the 32 options of the broad frame, which means the probability of it being the optimal one is low. Issues and problems have to be judged from a Broad Framing perspective to avoid biased judgments.
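
The count of combined options is just 2 to the power of 5, as a quick check shows:

    from itertools import product

    # Five binary decisions considered jointly: 2**5 = 32 combined options.
    options = list(product((0, 1), repeat=5))
    print(len(options))  # 32

    # Deciding each question separately picks exactly one of these 32
    # combinations, with no guarantee it is the best one overall.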

Narrow framing: Short term goals

People often adopt short-term goals but, to assess their progress toward them, they usually use an immediate goal, which leads to errors in judgment. This is an example of Narrow Framing, easier to grasp with an example. Cabdrivers may have a target income for the month or year, but the goal that controls their effort is typically a daily earnings target. This is easier to achieve on some days than on others: on rainy days taxis aren't empty for long, as opposed to days with pleasant weather. The daily earnings goal makes them stop working early when there is a lot of work and makes them work endlessly when fares are low. This is a violation of economic logic. Leisure time, when measured in fares per hour, is much more expensive on rainy days than on pleasant ones. Economic logic dictates that they should maximize their profit on rainy days and treat themselves to leisure on slow days, when it is less expensive. Never measure a short-term goal by an immediate goal; adopt the Broad Framing view in order to maximize utility.
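
A toy comparison with invented fares shows the cost of the daily target:

    # Invented numbers: one rainy day and one pleasant day.
    rainy_rate, pleasant_rate = 30.0, 15.0  # dollars per hour
    daily_target = 150.0

    # Daily-target policy: quit each day once $150 is earned.
    hours_target = daily_target / rainy_rate + daily_target / pleasant_rate
    earnings_target = 2 * daily_target                    # 15 hours, $300

    # Broad-frame policy: same 15 hours, shifted toward the rainy day.
    earnings_broad = 10 * rainy_rate + 5 * pleasant_rate  # 15 hours, $375

    print(earnings_target, earnings_broad)  # 300.0 375.0

Same total hours, 25% more income, purely from framing the two days as one decision.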

Narrow Framing: Moral Issues

When faced with moral issues that require a financial decision, such as altruistic donations or budget allocation, System-1 substitutes the real economic question with a simpler, emotional one: how strongly do I feel about this issue? Once the mind is armed with the answer, a monetary figure is generated by matching the intensity of the feeling. This is mostly caused by judging the moral issue in isolation. To avoid this Narrow Framing bias, a Broader Frame has to be adopted and the issue has to be compared with other possible decisions, even if that feels counter-intuitive.

Narrow Framing: Risk Policy in Financial Decisions

The most effective way to avoid Narrow Framing when making financial decisions is to adopt a risk policy and abide by it despite emotional incentives to do otherwise. Several financial biases come from Narrow Framing, one of the most relevant being The Disposition Effect: the tendency of investors to sell assets whose value is rising while keeping the ones that are dropping. This comes from the framing of the decision, as it is perceived as a choice between the pain of realizing the loss on a losing asset and the joy of selling a winning one.

Fear of feeling Regret as a source of risk-aversion

A major bias in decision-making comes from people anticipating the possibility of feeling regret if a negative outcome were to occur, and avoiding that decision in favor of a safer one where no regret would be evoked. The effect is stronger when the possible negative outcome would be the product of action rather than inaction. This happens because, when people model themselves, they expect to have a stronger emotional reaction (including regret) to an outcome produced by action than to the same outcome produced by inaction. This translates into a general aversion to trading increased risk for some other advantage, which can infect institutional risk-taking policies. To counter this bias, regret must not be factored in when making a decision about risk.

Emotional Framing

Losses and gains are processed differently by System-1 depending on how they are framed, which evokes different emotional reactions and results in biased choices. Options that are economically equivalent are processed as different due to emotional bias; losses evoke stronger negative feelings than costs. This bias is exemplified by credit-card surcharges being relabeled as cash discounts, or by permanent 2×1 promotions framed as a discount when in reality you are forced to buy 2. Framing heavily affects forecasts too: a surgery described as having a 90% chance of recovery gets much more acceptance than one framed as having a 10% fatality rate, even though they describe the same probability. Reframing is effortful and System-2 is normally lazy. Unless there is an obvious reason to do otherwise, most of us passively accept decision problems as they are framed and therefore rarely have an opportunity to discover the extent to which our preferences are frame-bound rather than reality-bound. To avoid this bias, always consider reframing the problem to eliminate the emotional slant.

Mental Accounting

To better grasp and evaluate economic outcomes, people categorize their assets into separate mental accounts even when they are the same kind of resource. This generates behaviors and decisions that drift away from economic logic. Mental Accounting biases judgment toward a reference point, in this case the particular mental account, and behavior is dictated by it. Psychological pain and loss aversion are then evoked from the mental account, depending on its size, and bias the final decision. This is why people are more willing to pay large sums with their credit cards than with their cash. To avoid Mental Accounting, assess economic decisions using economic logic: pay attention to emotional incentives (like loss aversion and pain) in order to ignore them, acquire a Broad Framing perspective to compare the decision against others, and analyze whether System-1 keeps different mental accounts for the economic resource in question.
