Fooled By Randomness 6 – Probability Blindness & Gamblers’ Ticks

Today’s post is our sixth visit to Nassim Nicholas Taleb’s modern classic Fooled By Randomness.

Probability blind

Taleb begins Chapter Eleven by looking at how difficult we find probability.

He imagines a holiday being planned by your other half.

  • You’ll definitely be going to either Paris or the Bahamas, but not both.

We can only imagine each of the holidays in turn.

  • We have no way to visualise a 50/50 split, let alone an 85/15 one.

That’s because we experience life as a single path, not a Monte Carlo.

  • The same applies to bets and investments, and to a cancer diagnosis with a specified survival rate – it’s not specific to future holidays.
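Taleb’s point – that we live one path while the probabilities only show up across the ensemble – can be made concrete with a toy Monte Carlo. A minimal sketch, assuming (purely for illustration) an 85/15 split between the two holidays:

```python
import random

random.seed(42)

# Simulate many alternative "lives": in each one, the holiday
# question is settled exactly once, with an assumed 85/15 split.
trials = 100_000
paris = sum(1 for _ in range(trials) if random.random() < 0.85)

print(f"Paris in {paris / trials:.1%} of simulated paths")
# Any single life sees exactly one outcome; only across the whole
# ensemble does the 85/15 structure become visible.
```

This is exactly the visualisation our single-path minds cannot do unaided.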

This inability to simultaneously imagine two futures leads to the framing effect, where scenarios can be described in terms of loss (e.g. death, in the case of the cancer diagnosis) or gain (survival/cure).

  • Different framing will lead to different reactions.

Consumers consider a 75% fat-free hamburger to be different from a 25% fat one, even though the two descriptions are equivalent.


Rules have their value. We just follow them not because they are the best but because they are useful and they save time and effort.

When you see a tiger, you should just run away from it.

Nobel prize-winner Herbert Simon figured out that trying to optimise everything we do would cost an infinite amount of time and energy.

  • He came up with the concept of “satisficing” – we stop when we get close to a satisfactory solution.

Taleb comments:

[According to Simon] We are rational, but in a limited way: “boundedly rational.”


Kahneman and Tversky [experimental psychologists] went in a completely different direction than Simon and started figuring out rules in humans that did not make them rational.

They called these rules “quick and dirty” heuristics.

  • The rules have side effects, known as biases.

The study of these biases has led to the development of behavioural finance, which was not readily accepted by mainstream economists.

Kahneman and Tversky showed that these biases do not disappear when there are incentives, which means that they are not necessarily cost saving [optimal/rational].

If your mind operates by series of disconnected rules, these may not be necessarily consistent with each other, and if they may still do the job locally, they will not necessarily do so globally.

You may prefer apples to oranges, oranges to pears, but pears to apples – it depends on how the choices are presented to you.

Taleb provides a table of heuristics mapped to trader rules that were common before behavioural finance existed.


First up is the “I’m as good as my last trade” heuristic (or the “loss of perspective” bias): each day/month/year we reset our counter to zero and start again from scratch.

This means that you have an arbitrary reference point and react to differences from that point, looking at the local context, not the absolutes.

You can have a good month and a bad day. Which period should dominate? Your attitude toward the risks and rewards of the gamble will vary according to whether you look at your net worth or changes in it.

The fact that the losses hurt more than the gains, and differently, makes your accumulated performance, that is, your total wealth, less relevant than the last change in it.

Psychologists call this effect of comparing to a given reference anchoring. Wealth itself does not really make one happy (above, of course, some subsistence level); but positive changes in wealth may, especially if they come as “steady” increases.

This is a form of anchoring – total wealth matters less than changes away from the number to which we are anchored (which, ironically, is total wealth).
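The asymmetry – losses from the reference point hurting more than equivalent gains – is captured by prospect theory’s value function. A minimal sketch, using Tversky and Kahneman’s 1992 parameter estimates (α ≈ 0.88, λ ≈ 2.25) purely for illustration:

```python
def prospect_value(change: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a change in wealth relative to the anchor.

    Gains are valued as change**alpha; losses are amplified by the
    loss-aversion coefficient lam. The parameter values are Tversky &
    Kahneman's 1992 estimates, used here only as an illustration.
    """
    if change >= 0:
        return change ** alpha
    return -lam * (-change) ** alpha

# A $100 loss feels more than twice as bad as a $100 gain feels good:
print(prospect_value(100), prospect_value(-100))
```

Note that the function takes the *change* in wealth, not wealth itself – the reference point is built into its definition.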

  • Let’s look at a few more heuristics.

[Availability] corresponds to the practice of estimating the frequency of an event according to the ease with which instances of the event can be recalled.

This is the reason why earthquakes in California are thought to be more likely than earthquakes in the whole of the US.


This is the representativeness heuristic, illustrated by the Linda problem – estimating the probability that (for example) a person belongs to a particular social group by assessing how similar the person’s characteristics are to those of the “typical” group member.

  • So a feminist-style philosophy student is judged more likely to be a feminist bank teller than to be just a bank teller – which is impossible, since every feminist bank teller is also a bank teller.
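The impossibility is just the conjunction rule: P(A and B) can never exceed P(A). A sketch with hypothetical numbers:

```python
# Hypothetical probabilities, purely for illustration.
p_bank_teller = 0.01            # P(Linda is a bank teller)
p_feminist_given_teller = 0.6   # P(feminist | bank teller), assumed

# The conjunction can never be more probable than either conjunct,
# whatever the conditional probability is.
p_feminist_teller = p_bank_teller * p_feminist_given_teller

assert p_feminist_teller <= p_bank_teller
print(p_bank_teller, p_feminist_teller)
```

However representative the description feels, the arithmetic only goes one way.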

Taleb notes:

I am glad to be a trader taking advantage of people’s biases but I am scared of living in such a society.


This is counterfactual thinking – what might have been (had you done the optimal thing, rather than what you actually did).


The probability of events in your mind is affected by the emotions that they produce in you.

Two systems

As most people now know, there are two systems for thinking:

  1. Fast, using heuristics
  2. Slow, using rationality

The underlying idea is that this arrangement suits the simpler environment in which we evolved.

Information is limited by the physical means of its transmission; one cannot travel fast. The number of people you would get to know in a lifetime will be small. Your life would be simple, hence your space of probabilities would be narrow.

We never needed to calculate the odds until very recently.


Damasio’s book Descartes’ Error looks at patients with collateral brain damage (from, say, a tumour removal) which results in an inability to register emotions.

  • This also results in an inability to make the simplest decisions (like whether or not to get out of bed in the morning).

We need a shortcut; emotions are there to prevent us from temporizing.

Joseph LeDoux’s The Emotional Brain notes that connections from the emotional systems to the cognitive systems are stronger than those in the opposite direction.

The implication is that we feel emotions (limbic brain) then find an explanation (neocortex).

Conditional probabilities

Taleb examines conditional probabilities using a test given to medical doctors:

A test of a disease presents a rate of 5% false positives. The disease strikes 1/1,000 of the population. People are tested at random, regardless of whether they are suspected of having the disease. A patient’s test is positive. What is the probability of the patient being stricken with the disease?

Apparently many doctors answer 95% – the accuracy rate of the test.

  • But since only 1 in 1000 of the people who will be given the test has the disease, the remaining 999 do not.
  • For each real positive, there will be 50 false positives.

So the chance of someone with a positive test having the disease is 1 in 51 or just under 2%.
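Taleb’s numbers can be checked directly with Bayes’ theorem. A sketch, assuming (as the question implies by quoting only a false-positive rate) that the test always detects a true case:

```python
prevalence = 1 / 1000        # P(disease)
false_positive_rate = 0.05   # P(positive | healthy)
sensitivity = 1.0            # P(positive | disease); assumed perfect

# Overall chance of a positive test: true positives + false positives.
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))

# Bayes' theorem: P(disease | positive)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"{p_disease_given_positive:.2%}")  # just under 2%
```

The doctors’ answer of 95% ignores the prior: with so rare a disease, false positives swamp the true ones.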


Taleb notes that when valuing options, people confuse the most likely outcome (say $0 for out-of-the-money options) with the expected (average) outcome.

Bloomberg
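The confusion Taleb describes – between the most likely and the expected payoff – can be sketched with a toy simulation. The numbers (stock at 100, call struck at 110, normally distributed terminal prices) are assumptions for illustration only; real option models use lognormal dynamics:

```python
import random

random.seed(0)

spot, strike, vol = 100.0, 110.0, 10.0  # toy parameters, assumed
trials = 100_000

# Terminal prices drawn from a normal distribution (a deliberate
# simplification), payoff of a call = max(price - strike, 0).
payoffs = [max(random.gauss(spot, vol) - strike, 0.0)
           for _ in range(trials)]

zero_fraction = sum(p == 0.0 for p in payoffs) / trials
mean_payoff = sum(payoffs) / trials

print(f"most likely payoff: 0 (in {zero_fraction:.0%} of paths)")
print(f"expected payoff:    {mean_payoff:.2f}")
```

The mode of the payoff is zero, yet the mean is strictly positive – which is why the most frequent outcome is the wrong guide to value.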

Taleb has a Bloomberg terminal, set up to display a lot of prices: currencies, stocks, interest rates, and commodities.

He looks for changes outside the normal daily range:

Unless something moves by more than its usual daily percentage change, the event is deemed to be noise. A 2% move is not twice as significant an event as 1%, it is rather like four to ten times. A 7% move can be several billion times more relevant than a 1% move!
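Taleb’s ratios follow from thin-tailed assumptions about daily moves. A sketch, assuming (for illustration) that daily changes are Gaussian with a standard deviation of 1%, so an n% move is an n-sigma event:

```python
from math import erfc, sqrt

def two_sided_tail(k: float) -> float:
    """P(|Z| > k) for a standard normal variable Z."""
    return erfc(k / sqrt(2))

p1 = two_sided_tail(1)  # chance of a move beyond 1 sigma
p2 = two_sided_tail(2)  # ... beyond 2 sigma
p7 = two_sided_tail(7)  # ... beyond 7 sigma

print(f"a 2% move is {p1 / p2:.0f}x rarer than a 1% move")
print(f"a 7% move is {p1 / p7:.1e}x rarer than a 1% move")
```

Under these assumptions the 2-sigma move is roughly seven times rarer, and the 7-sigma move rarer by a factor in the hundreds of billions – the same order of surprise Taleb gestures at.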

Wax in my ears

Odysseus famously lashed himself to a mast to avoid being seduced by the sirens.

  • He also filled the ears of his crew with wax so they could not hear.

I take this as an instruction to formulate plans in advance of the heat of battle, and then to stick to them.

  • And also, to Cut Out The Noise.

Conditional information

Unless the source of [a] statement has extremely high qualifications, the statement will be more revealing of the author than the information intended by him.

Unless you have confidence in the ruler’s reliability, if you use a ruler to measure a table you may also be using the table to measure the ruler.


Taleb notes that he once made a lot of money on a day when his cab dropped him at the “wrong” entrance to his bank’s building.

  • The next day he asked the cab to drop him there, and he wore the same “lucky” tie from the previous day.

These are gambler’s ticks, and B. F. Skinner showed that even pigeons exhibit them.

  • When food deliveries are randomised, the birds develop elaborate “rain dances” to bring forth their treat.

We are not made to view things as independent from each other. When viewing two events A and B, it is hard not to assume that A causes B, B causes A, or both cause each other.


Most of us know pretty much how we should behave. It is the execution that is the problem, not the absence of knowledge.

Take a look at the huddling smoking crowd outside the service entrance of the Sloan-Kettering Cancer Center. You will see dozens of cancer nurses standing outside the entrance with a cigarette in hand as hopeless patients are wheeled in for their treatments.


That’s it for today.

  • We’ve processed another two chapters and will complete the book in the next article (after which we’ll move on to a summary).

Next up are Carneades and Bacchus.

  • Until next time.

Mike is the owner of 7 Circles, and a private investor living in London. He has been managing his own money for 35 years, with some success.
