💎 It’s as if the less we know, the more we try to dress things up in complicated-sounding terms

Another of the wise men whose voice appears in these pages, the physicist Richard Feynman, once remarked that many fields have a tendency for pomposity, to make things seem deep and profound. It’s as if the less we know, the more we try to dress things up with complicated-sounding terms. We do this in countless fields, from sociology to philosophy to history to economics – and it’s definitely the case in business. I suspect that the dreariness in so much business writing often stems from wanting to sound as though we have all the answers, and from a corresponding unwillingness to recognize the limits of what we know. Regarding a particularly self-important philosopher, Feynman observed:

It isn’t the philosophy that gets me, it’s the pomposity. If they’d just laugh at themselves! If they’d just say, “I think it’s like this, but von Leipzig thought it was like that, and he has a good shot at it, too”.

Excerpt from: The Halo Effect: How Managers Let Themselves Be Deceived by Phil Rosenzweig

💎 Interesting test in Estonia to reduce speeding

Time is money. That, at least, is the principle behind an innovative scheme being tested in Estonia to deal with dangerous driving. During trials that began in 2019, anyone caught speeding along the road between Tallinn and the town of Rapla was stopped and given a choice. They could pay a fine, as usual, or take a ‘timeout’ instead – waiting at the roadside for a period determined by how fast they were going when stopped. In other words, they could pay the fine in time rather than money.

The aim of the experiment was to see how drivers perceive speeding, and whether loss of time might be a stronger deterrent than loss of money. The project is a collaboration between Estonia’s Home Office and the police force, and is part of a program designed to encourage innovation in public services. Government teams propose a problem they would like to solve – such as traffic accidents caused by irresponsible driving – and work under the guidance of an innovation unit. Teams are expected to do all fieldwork and interviews themselves.

Excerpt from: Unconventional Wisdom: Adventures in the Surprisingly True by Tom Standage

💎 How economics affects culture – in the 1950s average duration of US pop songs dropped to 2:30

We’ve been here before, where the business affected the show (or the tail wagged the dog). The first iteration of the phonograph could only hold about two to three minutes of music. It is said Puccini used to deliberately write arias that could be cut into three-minute segments that could fit on one side of a 78-rpm disc, arguably making him the first ever pop writer. Elderton notes that during the late 1950s and early 1960s, the average duration of American pop songs fell to 2 minutes and 30 seconds. As the mafia owned and controlled jukeboxes across America, they insisted that a record was limited to 2 minutes 30 seconds, allowing them to boost their take-per-machine considerably.

Excerpt from: Tarzan Economics: Eight Principles for Pivoting through Disruption by Will Page

💎 How language can shape our attitudes and behaviours

Consider an example from the insurance world. Back in the 1930s, executives at the Hartford Fire Insurance Company in Connecticut realized that warehouses which contained oil drums kept blowing up. Nobody knew why. The company asked a fire-prevention engineer named Benjamin Whorf to investigate. Although Whorf was a trained chemical engineer, he had also done research in anthropology and linguistics at Yale, with a focus on the Hopi Native American communities. So, he approached the problem with an anthropologist’s mindset: he observed warehouse workers, noting what they did and said, trying to absorb everything without prior judgment. He was particularly interested in the cultural assumptions embedded in language, since he knew these could vary. Consider seasons. In English, “season” is a noun, defined by the astronomical calendar (“summer starts on June 20,” people say). In the Hopi language and worldview, “summer” is an adverb defined by heat, not the calendar (it feels “summer(y)”). Neither is better or worse; but they are different. People cannot appreciate this distinction unless they compare. Or as Whorf observed: “We always assume that the linguistic analysis made by our group reflects reality better than it does.”

This perspective solved the oil drum mystery. Whorf noticed that the workers were careful when handling oil drums marked as “full.” However, workers happily smoked in rooms that stored drums marked “empty.” The reason? The word “empty” in English is associated with “nothing”; it seems boring, dull, and easy to ignore. However, “empty” oil drums are actually full of flammable fumes. So, Whorf told the warehouse managers to explain the dangers of “empty” to workers, and the explosions stopped. Science alone could not solve the mystery. But cultural analysis – with science – could. The same principle (namely, using anthro-vision to see what we ignore) is equally valuable when mysterious problems erupt in modern bank trading floors, corporate mergers, or pandemics, say.

That is because, “the least questioned assumptions are often the most questionable,” as the nineteenth-century French physician and anthropologist Paul Broca reputedly said. It is a dangerous mistake to ignore the ideas we take for granted, be that about language, space, people…

Excerpt from: Anthro-Vision: How Anthropology Can Explain Business and Life by Gillian Tett

💎 Our tendency to underestimate the variance – or noise – in business

In a well-run insurance company, if you randomly selected two qualified underwriters or claims adjusters, how different would you expect their estimates for the same case to be? Specifically, what would be the difference between the two estimates, as a percentage of their average?

We asked numerous executives in the company for their answers, and in subsequent years, we have obtained estimates from a wide variety of people in different professions. Surprisingly, one answer is clearly more popular than all others. Most executives of the insurance company guessed 10% or less. When we asked 828 CEOs and senior executives from a variety of industries how much variation they expected to find in similar expert judgments, 10% was also the median answer and the most frequent one (the second most popular was 15%). A 10% difference would mean, for instance, that one of the two underwriters set a premium of $9,500 while the other quoted $10,500. Not a negligible difference, but one that an organization can be expected to tolerate.

Our noise audit found much greater differences. By our measure, the median difference in underwriting was 55%, about five times as large as was expected by most people, including the company’s executives.

Excerpt from: Noise: A Flaw in Human Judgment by Daniel Kahneman, Olivier Sibony and Cass R. Sunstein
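As an aside, the ‘difference as a percentage of the average’ measure described in this excerpt is easy to make concrete. A minimal Python sketch (the $9,500/$10,500 premiums come from the passage; the function name is my own):

```python
def relative_difference(a: float, b: float) -> float:
    """Absolute difference between two judgments, as a fraction of their mean."""
    return abs(a - b) / ((a + b) / 2)

# The two underwriters' premiums from the passage – a 10% difference:
print(f"{relative_difference(9_500, 10_500):.0%}")  # → 10%
```

On this measure, the 55% median difference the audit found would correspond to one underwriter quoting nearly twice what the other does.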

💎 The four steps that lead to the quantification fallacy

The first step is to measure whatever can be easily measured. This is okay as far as it goes. The second step is to disregard that which can’t be easily measured, or to give it an arbitrary quantitative value. This is artificial and misleading. The third step is to presume that what can’t be measured easily really isn’t important. This is blindness. The fourth step is to say that what can’t be easily measured really doesn’t exist. This is suicide.

Excerpt from: Tarzan Economics: Eight Principles for Pivoting through Disruption by Will Page

💎 Putting safety before profit

The largest impact would come not from Detroit but from Sweden. In the mid-fifties, Volvo hired an aeronautical engineer named Nils Bohlin, who had been working on emergency ejection seats at Saab’s aerospace division. Bohlin began tinkering with a piece of equipment that had been largely an oversight in most automobiles up until that point: the seat belt. Many cars were sold without any seat belts at all; the models that did include them offered poorly designed lap belts that offered minimal protection in the event of a crash. They were rarely worn, even by children.

Borrowing from the approach to safety restraint used by military pilots, Bohlin quickly developed what he called a three-point design. The belt had to absorb g-forces on both the chest and the pelvis, minimizing soft tissue stress under impact, but at the same time it had to be simple to snap on, easy enough that a child could master it. Bohlin’s design brought together a shoulder and lap belt that buckled together in a V formation at the passenger’s side, which meant the buckle itself wouldn’t cause injuries in a collision. It was an elegant design, the basis for the seat belts that now come standard on every car manufactured anywhere in the world. An early prototype of the shoulder strap had decapitated a few crash dummies, which led to a rumor that the seat belt itself could kill you in a crash. To combat those rumors, Volvo actually hired a race-car driver to perform death-defying stunts – deliberately rolling his car at high speeds – all the time wearing Nils Bohlin’s three-point seat belt to stay safe.

By 1959, Volvo was selling cars with the three-point seat belt as a standard feature. Early data suggested that this one addition was single-handedly reducing auto fatalities by 75 percent. Three years later, Bohlin was granted patent number US3043625A by the US Patent and Trademark Office for a “Three-point seat belt systems comprising two side lower and one side upper anchoring devices.” Recognizing the wider humanitarian benefits of the technology, Volvo chose not to enforce the patent – making Bohlin’s design freely available to all car manufacturers worldwide. The ultimate effect of Bohlin’s design was staggering. More than one million lives – many of them young ones – have been saved by the three-point seat belt. A few decades after it was awarded, the Bohlin Patent was recognized as one of the eight patents to have had “the greatest significance for humanity” over the preceding century.

Excerpt from: Extra Life: A Short History of Living Longer by Steven Johnson

💎 Even short breaks can disrupt habits

Recent research suggests that anything more than a short lapse in a behavior we hope to make habitual (say, multiple missed visits to the gym rather than just one) can be costly. Seinfeld’s mantra “Don’t break the streak” is astute. It also helps explain the logic behind twenty-eight-pill packages of birth control. Scientifically speaking, the pills are necessary only on the first twenty-one days of a twenty-eight-day menstrual cycle. However, most birth control packages include seven sugar pills along with twenty-one hormone pills to ensure that people on birth control won’t fall out of the habit.

Excerpt from: How to Change: The Science of Getting from Where You Are to Where You Want to Be by Katy Milkman

💎 On the importance of updating our beliefs when presented with new information

My colleague Phil Tetlock finds that forecasting skill is less a matter of what we know than of how we think. When he and his collaborators studied a host of factors that predict excellence in forecasting, grit and ambition didn’t rise to the top. Neither did intelligence, which came in second. There was another factor that had roughly triple the predictive power of brainpower.

The single most important driver of forecasters’ success was how often they updated their beliefs. The best forecasters went through more rethinking cycles. They had the confident humility to doubt their judgments and the curiosity to discover new information that led them to revise their predictions.

Excerpt from: Think Again: The Power of Knowing What You Don’t Know by Adam Grant

💎 On the value of changing your mind

With all due respect to the lessons of experience, I prefer the rigor of evidence. When a trio of psychologists conducted a comprehensive review of thirty-three studies, they found that in every one, the majority of answer revisions were from wrong to right. This phenomenon is known as the first-instinct fallacy.

In one demonstration, psychologists counted eraser marks on the exams of more than 1,500 students in Illinois. Only a quarter of the changes were from right to wrong, while half were from wrong to right. I’ve seen it in my own classroom year after year: my students’ final exams have surprisingly few eraser marks, but those who do rethink their first answers rather than staying anchored to them end up improving their scores.

Excerpt from: Think Again: The Power of Knowing What You Don’t Know by Adam Grant

💎 On using our understanding of the natural world to your advantage

In another life-and-death situation, in 1989 Bengal tigers killed about 60 villagers from India’s Ganges delta. No weapons seemed to work against them, including lacing dummies with live wires to shock the tigers away from human populations.

Then a student at the Science Club of Calcutta noticed that tigers only attacked when they thought they were unseen, and recalled that the patterns decorating some species of butterflies, beetles, and caterpillars look like big eyes, ostensibly to trick predators into thinking their prey was also watching them. The result: a human face mask, worn on the back of the head. Remarkably, no one wearing a mask was attacked by a tiger for the next three years; anyone killed by tigers during that time had either refused to wear the mask, or had taken it off while working.

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien

💎 On the danger of only measuring the first order effects of an intervention

In 1963, the UC Santa Barbara ecologist and economist Garrett Hardin proposed his First Law of Ecology: “You can never merely do one thing.” We operate in a world of multiple, overlapping connections, like a web, with many significant, yet obscure and unpredictable, relationships. He developed second-order thinking into a tool, showing that if you don’t consider “the effects of the effects,” you can’t really claim to be doing any thinking at all.

When it comes to the overuse of antibiotics in meat, the first-order consequence is that the animals gain more weight per pound of food consumed, and thus there is profit for the farmer. Animals are sold by weight, so the less food you have to use to bulk them up, the more money you will make when you go to sell them.

The second-order effects, however, have many serious, negative consequences. The bacteria that survive this continued antibiotic exposure are antibiotic resistant. That means that the agricultural industry, when using these antibiotics as bulking agents, is allowing mass numbers of drug-resistant

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien

💎 On the illusion of explanatory depth

isolation is powerful but misleading. For a start, while humans have accumulated a vast store of collective knowledge, each of us alone knows surprisingly little, certainly less than we imagine. In 2002, the psychologists Frank Keil and Leonid Rozenblit asked people to rate their own understanding of how zips work. The respondents answered confidently — after all, they used zips all the time. But when asked to explain how a zip works, they failed dismally. Similar results were found when people were asked to describe climate change and the economy. We know a lot less than we think we do about the world around us. Cognitive scientists call this ‘the illusion of explanatory depth’, or just ‘the knowledge illusion’.

Excerpt from: Conflicted: Why Arguments Are Tearing Us Apart and How They Can Bring Us Together by Ian Leslie

💎 On the advantage of being familiar with a number of accurate models of human behaviour, rather than just knowing a series of unrelated facts

In a famous speech in the 1990s, Charlie Munger summed up this approach to practical wisdom: “Well, the first rule is that you can’t really know anything if you just remember isolated facts and try and bang ‘em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form. You’ve got to have models in your head. And you’ve got to array your experience both vicarious and direct on this latticework of models. You may have noticed students who just try to remember and pound back what is remembered. Well, they fail in school and in life. You’ve got to hang experience on a latticework of models in your head.”

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien

💎 Bayesian thinking and the importance of applying a base rate when interpreting new data

The core of Bayesian thinking (or Bayesian updating, as it can be called) is this: given that we have limited but useful information about the world, and are constantly encountering new information, we should probably take into account what we already know when we learn something new. As much of it as possible. Bayesian thinking allows us to use all relevant prior information in making decisions. Statisticians might call it a base rate, taking in outside information about past situations like the one you’re in.

Consider the headline “Violent Stabbings on the Rise.” Without Bayesian thinking, you might become genuinely afraid because your chances of being a victim of assault or murder are higher than they were a few months ago. But a Bayesian approach will have you putting this information into the context of what you already know about violent crime. You know that violent crime has been declining to its lowest rates in decades. Your city is safer now than it has been since this measurement started. Let’s say your chance of being a victim of a stabbing last year was one in 10,000, or 0.01%. The article states, with accuracy, that violent crime has doubled. It is now two in 10,000, or 0.02%. Is that worth being terribly worried about? The prior information here is key. When we factor it in, we realize that our safety has not really been compromised.

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien
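The base-rate arithmetic in the stabbing example above is simple enough to sketch. A hypothetical illustration in Python (the one-in-10,000 figure and the doubling come from the passage; the variable names are mine):

```python
base_rate = 1 / 10_000     # last year's chance of being stabbed: 0.01%
new_rate = base_rate * 2   # "violent crime has doubled": 0.02%

# The headline is accurate, yet the absolute risk remains tiny:
print(f"{base_rate:.2%} -> {new_rate:.2%}")  # → 0.01% -> 0.02%
```

The point is that a relative change ("doubled!") only means something once it is anchored to the absolute base rate.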

💎 On how we can be trapped by our own perspective

The first flaw is perspective. We have a hard time seeing any system that we are in. Galileo had a great analogy to describe the limits of our default perspective. Imagine you are on a ship that has reached constant velocity (meaning without a change in speed or direction). You are below decks and there are no portholes. You drop a ball from your raised hand to the floor. To you, it looks as if the ball is dropping straight down, thereby confirming gravity is at work.

Now imagine you are a fish (with special x-ray vision) and you are watching this ship go past. You see the scientist inside, dropping a ball. You register the vertical change in the position of the ball. But you are also able to see a horizontal change. As the ball was pulled down by gravity it also shifted its position east by about 20 feet. The ship moved through the water and therefore so did the ball. The scientist on board, with no external point of reference, was not able to perceive this horizontal shift.

This analogy shows us the limits of our perception. We must be open to other perspectives if we truly want to understand the results of our actions. Despite feeling that we’ve got all the information, if we’re on the ship, the fish in the ocean has more he can share.

Excerpt from: The Great Mental Models Volume 1: General Thinking Concepts by Shane Parrish and Rhiannon Beaubien

💎 Beware claimed data (people don’t like to admit they ‘don’t know’ when questioned)

However, serious academic consideration of public opinion about fictitious issues did not start until the ’80s, when George Bishop and colleagues at the University of Cincinnati found that a third of Americans either favoured or opposed the fictitious Public Affairs Act. Bishop found that this figure dropped substantially when respondents were offered an explicit ‘don’t know’ option. However, 10 per cent of respondents still selected a substantive answer, even when given a clear opportunity to express their lack of familiarity. Similar findings were reported in the US at around the same time by Howard Schuman and Stanley Presser, who also found that a third of respondents to their survey expressed positions on issues which, though real, were so obscure that few ordinary citizens would ever have heard of them.

Excerpt from: Sex, Lies and Politics: The Secret Influences That Drive our Political Choices by Philip Cowley and Robert Ford

💎 Beware interpreting stats on anything you have a strongly held view about (from politics to Covid and beyond)

It’s much more challenging when emotional reactions are involved, as we’ve seen with smokers and cancer statistics. Psychologist Ziva Kunda found the same effect in the lab when she showed experimental subjects an article laying out the evidence that coffee or other sources of caffeine could increase the risk to women of developing breast cysts. Most people found the article pretty convincing. Women who drank a lot of coffee did not.

We often find ways to dismiss evidence that we don’t like. And the opposite is true, too: when evidence seems to support our preconceptions, we are less likely to look too closely for flaws.

The more extreme the emotional reaction, the harder it is to think straight.

Excerpt from: How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers by Tim Harford

💎 Beware the Rosser Reeves effect when interpreting tracking data (communication effectiveness)

Research routinely shows that people who’re aware of communication from brand X are more likely to buy that brand. Sometimes used as evidence that communication drives sales, in fact causality usually runs the other way: buying brand X makes you more likely to notice its communications. This phenomenon (the so-called ‘Rosser Reeves effect’ – named after the famous 1950s adman) has been known for decades, yet is still routinely used to ‘prove’ communication effectiveness (most recently to justify social media use).

Excerpt from: How not to Plan: 66 ways to screw it up by Les Binet and Sarah Carter

💎 On the tendency of marketers to exaggerate the amount consumers change (social trends)

Marketing and advertising people can talk a load of nonsense at the best of times. But if you want to hear them at their worst, ask them to talk about social trends. The average social trends presentation is a guaranteed mix of the obvious, irrelevant and false.

Recently, we were listening to a conference speech about ‘changing lifestyles’. Life nowadays is faster than ever, said the speaker. We work longer hours. We have less free time. Families are fragmenting. Food is eaten on the run.

We’ve been listening to this bullshit for 30 years. And it’s no more true now than it was then. The inconvenient, less headline-worthy truth is that people have more free time than ever. Economic cycles wax and wane, but the long-term trend in all developed economies is toward shorter, more flexible working hours. And longer holidays. People start work later in life and spend much longer in retirement. Work takes up a smaller percentage of our life than it used to.

Related myths about pressures on family time are equally false. Contrary to popular belief, in developed economies parents spend more time with their children these days. Not less. Research shows the amount of time families spend eating together has stayed remarkably constant over the years, as has the amount of time they spend together watching TV.

Excerpt from: How not to Plan: 66 ways to screw it up by Les Binet and Sarah Carter

💎 On the benefits of brevity (sell your idea or your dream in 10 to 15 minutes)

Let’s put this in perspective. Abraham Lincoln inspired generations in a speech that lasted two minutes. John F. Kennedy took 15 minutes to shoot for the moon. Martin Luther King Jr. articulated his dream of racial unity in 17 minutes. Steve Jobs gave one of the most famous college commencement speeches of our time at Stanford University in 15 minutes. If you can’t sell your idea or your dream in 10 to 15 minutes, keep editing until you can.

Ideas don’t sell themselves. Be selective about the words you use. If they don’t advance the story, remove them. Condense, simplify, and speak as briefly as possible. Have the courage to speak in grade-school language. Far from weakening your argument, these tips will elevate your ideas, making it more likely you’ll be heard.

Excerpt from: Five Stars: The Communication Secrets to Get From Good to Great by Carmine Gallo

💎 All speeches have three versions (before, during, ideal)

“There are always three speeches for every one you actually gave: the one you practiced, the one you gave, and the one you wish you gave.”

-Dale Carnegie

Excerpt from: 100 Things Every Designer Needs to Know About People (Voices That Matter) by Susan Weinschenk

💎 On our minds working on problems even when we’re not consciously thinking about them (John Cleese)

Graham and I thought it was rather a good sketch. It was therefore terribly embarrassing when I found I’d lost it. I knew Graham was going to be cross, so when I’d given up looking for it, I sat down and rewrote the whole thing from memory. It actually turned out to be easier than I’d expected.

Then I found the original sketch and, out of curiosity, checked to see how well I’d recalled it when rewriting. Weirdly, I discovered that the remembered version was actually an improvement on the one that Graham and I had written. This puzzled the hell out of me.

Again I was forced to the conclusion that my mind must have continued to think about the sketch after Graham and I had finished it. And that my mind had been improving what we’d written, without my making any conscious attempt to do so. So when I remembered it, it was already better.

Chewing this over, I realised it was like the tip-of-the-tongue phenomenon: when you can’t remember a name, and you chase after it in your mind

Excerpt from: Creativity: A Short and Cheerful Guide by John Cleese

💎 Kleiner Perkins’s tactic for avoiding their staff developing entrenched positions in meetings (flip-flop)

Another renowned venture capitalist, Kleiner Perkins’s Randy Komisar, takes this idea one step further. He dissuades members of the investment committee from expressing firm opinions by stating right away that they are for or against an investment idea. Instead, Komisar asks participants for a “balance sheet” of points for and against the investment: “Tell me what is good about this opportunity; tell me what is bad about it. Do not tell me your judgment yet. I don’t want to know.” Conventional wisdom dictates that everyone should have an opinion and make it clear. Instead, Komisar asks his colleagues to flip-flop!

Excerpt from: You’re About to Make a Terrible Mistake!: How Biases Distort Decision-Making and What You Can Do to Fight Them by Olivier Sibony

💎 Analysing successful brands can be misleading (survivorship bias)

The models whose success we admire are, by definition, those who have succeeded. But out of all the people who were “crazy enough to think they can change the world,” the vast majority did not manage to do it. For this very reason, we’ve never heard of them. We forget this when we focus only on the winners. We look only at the survivors, not at all those who took the same risks, adopted the same behaviors, and failed. This logical error is survivorship bias. We shouldn’t draw any conclusions from a sample that is composed only of survivors. Yet we do, because they are the only ones we see.

Our quest for models may inspire us, but it can also lead us astray. We would benefit from restraining our aspirations and learning from people who are similar to us, from decision makers whose success is less flashy, instead of a few idols.

Excerpt from: You’re About to Make a Terrible Mistake!: How Biases Distort Decision-Making and What You Can Do to Fight Them by Olivier Sibony

💎 On why partial knowledge is often victorious over full knowledge (it conceives things as simpler than they are)

Such misleading stories, however, may still be influential and durable. In Human, All Too Human, philosopher Friedrich Nietzsche argues that “partial knowledge is more often victorious than full knowledge: it conceives things as simpler than they are and therefore makes its opinion easier to grasp and more persuasive.”

Excerpt from: The Myth of Experience: Why We Learn the Wrong Lessons, and Ways to Correct Them by Emre Soyer and Robin M Hogarth

💎 On the danger of a theory-free analysis of mere correlations (winter detector)

The ‘winter detector’ problem is common in big data analysis. A literal example, via computer scientist Sameer Singh, is the pattern-recognising algorithm that was shown many photos of wolves in the wild, and many photos of pet husky dogs. The algorithm seemed to be really good at distinguishing the two rather similar canines; it turned out that it was simply labelling any picture with snow as containing a wolf. An example with more serious implications was described by Janelle Shane in her book You Look Like a Thing and I Love You: an algorithm that was shown pictures of healthy skin and of skin cancer. The algorithm figured out the pattern: if there was a ruler in the photograph, it was cancer. If we don’t know why the algorithm is doing what it’s doing, we’re trusting our lives to a ruler detector.

Excerpt from: How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers by Tim Harford

💎 On the lack of data proving the effectiveness of ad campaigns (designed to boost loyalty)

The advertising industry – whose only important asset is ideas – has learned nothing from this. We keep heading in the wrong direction. We keep bulking up everything in our arsenal except our creative resources. Then we take the people who are supposed to be our idea people and give them till 3 o’clock to do a banner.

Sure, we need people who are tech-savvy and analytical. But more than anything, we need some brains-in-a-bottle who have no responsibility other than to sit in a corner and feed us crazy ideas. We keep looking to “transform” our industry but ignore the one transformation that would kill.

Excerpt from: How not to Plan: 66 ways to screw it up by Les Binet and Sarah Carter

💎 Five-pronged model for encouraging behaviour change (REDUCE)

REACTANCE

When pushed, people push back. So rather than telling people what to do, or trying to persuade, catalysts allow for agency and encourage people to convince themselves.

ENDOWMENT

People are attached to the status quo. To ease endowment, catalysts surface the costs of inaction and help people realize that doing nothing isn’t as costless as it seems.

DISTANCE

Too far from their backyard, people tend to disregard. Perspectives that are too far away fall in the region of rejection and get discounted, so catalysts shrink distance, asking for less and switching the field.

UNCERTAINTY

Seeds of doubt slow the winds of change. To get people to un-pause, catalysts alleviate uncertainty. Easier to try means more likely to buy.

CORROBORATING EVIDENCE

Some things need more proof. Catalysts find corroborating evidence, using multiple sources to help overcome the translation problem.

Excerpt from: Catalyst by Jonah Berger

💎 Six psychological biases that help explain why we fail to prepare for disasters

1. Myopia: a tendency to focus on overly short future time horizons when appraising immediate costs and the potential benefits of protective investments;
2. Amnesia: a tendency to forget too quickly the lessons of past disasters;
3. Optimism: a tendency to underestimate the likelihood that losses will occur from future hazards;
4. Inertia: a tendency to maintain the status quo or adopt a default option when there is uncertainty about the potential benefits of investing in alternative protective measures;
5. Simplification: a tendency to selectively attend to a subset of the relevant factors to consider when making choices involving risk; and
6. Herding: a tendency to base choices on the observed actions of others.

Excerpt from: The Ostrich Paradox: Why We Underprepare for Disasters by Robert Meyer and Howard Kunreuther