💎 If you give people a sense of control (even an illusory one) they’re happier with their decisions

It is not only humans who like to choose: animals prefer to have a choice as well. In fact, they choose to choose even if having a choice does not change the outcome. If rats need to select between two paths that lead to food—one path is a straight line and the other subsequently requires them to select whether to go right or left—they choose the latter path. Pigeons do the same thing. Give a pigeon two options: the first is a button to peck that results in grain being dispensed, and the second is two buttons from which it needs to select one to peck in order to receive the same grain, and the bird will pick the option with two buttons. The pigeons quickly learn that the seeds are no different; yet they prefer the seeds that were obtained by making a choice.

Excerpt from: The Influential Mind: What the Brain Reveals About Our Power to Change Others by Tali Sharot

💎 We try and avoid negative information (a lesson in stock market trading)

Figure 5.2. People’s desire to know their own worth is related to market performance. The black line represents the S&P 500, and the gray line represents the number of times people logged on to their accounts to check on their stocks. When the market goes up, people are more likely to take a peek at the value of their holdings than when it goes down.

Excerpt from: The Influential Mind: What the Brain Reveals About Our Power to Change Others by Tali Sharot

💎 If you are writing a list of three terms put the shortest first and the longest last for maximum impact

If not quite a ‘rule’, it’s at least a strong guideline for successful rhythm that you should put the shortest term in any list first and the longest last. This is the principle of climax underscoring the rising tricolon. ‘I am Scottish by aspiration, birth and choice’ has nothing of the drum-roll about it. ‘I will be fishing for cod, blue-fin tuna, the inedible but mighty basking shark, and the many-tentacled deep-sea octopus’ just, somehow, tends to sound better than ‘I will be fishing for the many-tentacled deep-sea octopus, blue-fin tuna, the inedible but mighty basking shark, and cod.’

Excerpt from: Write to the Point: How to be Clear, Correct and Persuasive on the Page by Sam Leith

💎 On reading your copy out loud and ‘where you falter, alter’

Peggy Noonan, who wrote speeches for Ronald Reagan, has said: ‘Once you’ve finished the first draft of your speech – stand up and speak it aloud. Where you falter, alter.’ That applies especially to speeches, of course: in that case you’re trying to produce something that’s hard to stumble over when spoken aloud. Tongue-twisters such as ‘red lorry, yellow lorry’ are easier on the page than in the mouth. But it is also good advice to the prose writer. There is a developmental connection between reading aloud and reading silently – and there is a neurological one too.

Excerpt from: Write to the Point: How to be Clear, Correct and Persuasive on the Page by Sam Leith

💎 On our tendency to explain behaviour too much in terms of personality and not enough in terms of circumstances

The bias runs deep. Few of us, surely, think of ourselves as having a fixed, monochrome personality: we’re happy or sad, stressed or relaxed, depending on circumstances. Yet we stubbornly resist the notion that others might be similarly circumstance-dependent. In a well-known 1960s study, people were shown two essays, one arguing in favour of Castro’s Cuba and one against. Even when it was explained that the authors had been ordered to adopt each position based on a coin-toss – that their situation, in other words, had forced their hand – readers still considered that the pro-Castro author must be, deep down, pro-Castro, and vice versa.

Excerpt from Help!: How to Become Slightly Happier and Get a Bit More Done by Oliver Burkeman

💎 On how transient our beliefs can be

In a study entitled ‘After the Movies’, some crafty Australian researchers grilled people leaving the cinema about their views on politics and morality; they discovered that those leaving happy films were optimistic and lenient, while those leaving aggressive or sad ones were far more pessimistic and strict.

Excerpt from Help!: How to Become Slightly Happier and Get a Bit More Done by Oliver Burkeman

💎 The friend of creative work is alertness, and nothing focusses your attention like stepping on to unfamiliar ground

In 1958, a young psychologist by the name of Bernice Eiduson began a long-term study of the working methods of forty mid-career scientists. For twenty years, Professor Eiduson periodically interviewed the scientists and gave them a variety of psychological tests, as well as gathering data on their publications. Some of the scientists went on to great success: there were four Nobel Prize winners in the group and two others widely regarded as Nobel-worthy. Several other scientists joined the National Academy of Sciences. Others had disappointing careers.

In 1993, several years after Bernice Eiduson’s death, her colleagues published an analysis of this study, trying to spot patterns. A question of particular interest was: what determines whether a scientist keeps publishing important work throughout his or her life? A few highly productive scientists produced breakthrough paper after breakthrough paper. How?

A striking pattern emerged. The top scientists switched topics frequently. Over the course of their first hundred published papers, the long-lived high-impact researchers switched topics an average of 43 times. The leaps were less dramatic than the ones Erez Aiden likes to take, but the pattern is the same: the top scientists keep changing the subject if they wish to stay productive.

Excerpt from: Messy: How to Be Creative and Resilient in a Tidy-Minded World by Tim Harford

💎 Anchoring – even when taken to ridiculous extremes – has an effect on people’s judgements

Psychologists Gretchen Chapman and Brian Bornstein tested this idea in a 1996 experiment, when Liebeck v. McDonald’s was much in the news. They presented eighty University of Illinois students with the hypothetical case of a young woman who said she contracted ovarian cancer from birth control pills and was suing her health care organization. Four groups each heard a different demand for damages: $100; $20,000; $5 million; and $1 billion. The mock jurors were asked to give compensatory damages only. Anyone who wants to believe in the jury system must find the results astonishing.

The jurors were amazingly persuadable, up through the $5 million demand. The lowball $100 demand got a piddling $990 average award. This was for a cancer said to have the plaintiff ‘almost constantly in pain… Doctors do not expect her to survive beyond a few more months.’

Excerpt from: Priceless: The Myth of Fair Value (and How to Take Advantage of It) by William Poundstone

💎 People lie on surveys (and in their Netflix queues) both consciously and unconsciously

Netflix learned a similar lesson early on in its life cycle: don’t trust what people tell you; trust what they do. Originally, the company allowed users to create a queue of movies they wanted to watch in the future but didn’t have time for at the moment. This way, when they had more time, Netflix could remind them of those movies. However, Netflix noticed something odd in the data. Users were filling their queues with plenty of movies. But days later, when they were reminded of the movies on the queue, they rarely clicked. What was the problem? Ask users what movies they plan to watch in a few days, and they will fill the queue with aspirational, highbrow films, such as black-and-white World War II documentaries or serious foreign films. A few days later, however, they will want to watch the same movies they usually want to watch: lowbrow comedies or romance films. People were consistently lying to themselves.

Excerpt from: Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are by Seth Stephens-Davidowitz

💎 On the underrated value of ‘interstitial time’

There’s a popular subgenre of books about writing known informally as ‘writer porn’, in which famous authors describe their daily routines, which pens they use and, especially, the secluded mountain-top cabins where they work each morning for six blissfully undisturbed hours. I don’t think I’ve ever actually met such an author, but for anyone whose job is even slightly ‘creative’, they stir envy: we’d all love such big chunks of time in which to focus. Instead, our lives are plagued with what the blogger Merlin Mann, at 43folders.com, calls ‘interstitial time’ – small chunks of minutes spent waiting at the doctor’s surgery, or for someone who’s late, or for a meeting postponed at short notice.

It feels like time wasted. But it needn’t be. The poet William Carlos Williams, for example, wrote much of his oeuvre on the backs of prescription pads during gaps in his workday as a paediatrician.

Excerpt from Help!: How to Become Slightly Happier and Get a Bit More Done by Oliver Burkeman

💎 On life being understood backwards but lived forward – what’s obvious in retrospect is often hard to see at the time

As the philosopher Søren Kierkegaard noted in his journal in 1843, ‘It is perfectly true, as the philosophers say, that life must be understood backwards. But they forget the other proposition, that it must be lived forwards.’ The day-by-day business of living feels, at this particular moment, spectacularly distant from the ways in which I and others will come to comprehend these events.

Yet this is just an extreme version of something that is always true. Human understanding is always both provisional and belated. Many things that appear obvious in retrospect were anything but obvious at the time, because the clarity we experience when looking back in time is utterly unlike the cloud of uncertainty that surrounds day-to-day existence. The world is far more complex than any stories we can tell about it; far more mysterious, far harder to predict.

Excerpt from: How to Think: Your Essential Guide to Clear, Critical Thought by Tom Chatfield

💎 Why supermarkets change prices: it’s a way of ascertaining who is price sensitive

One common situation is for two supermarkets to be competing for the same customers. As we’ve discussed, it’s hard for one to be systematically more expensive than the other without losing a lot of business, so they will charge similar prices on average, but both will also mix up their prices. That way, both can distinguish the bargain hunters from those in need of specific products, like people shopping to pick up ingredients for a cook-book recipe they are making for a dinner party. Bargain-hunters will pick up whatever is on sale and make something of it. The dinner-party shoppers come to the supermarket to buy specific products and will be less sensitive to prices. The price-targeting strategy only works because the supermarkets always vary the patterns of their special offers, and because it is too much trouble to go to both stores or to order two separate internet deliveries, carefully comparing the price of each good every time we go online. If shoppers could predict what was to be discounted, they could choose recipes ahead of time, and even choose the appropriate supermarket to pick up the ingredients wherever they’re least expensive.

Excerpt from: The Undercover Economist by Tim Harford

💎 Why failures to act tend to haunt us more than failed actions

But in his book If Only, the psychologist Neal Roese argues that when it comes to real-life choices, ‘if you decide to do something and it turns out badly, it probably won’t still be haunting you a decade down the road. You’ll reframe the failure, explain it away, move on, and forget it. Not so with failures to act’. You’ll regret them for longer, too, because they’re ‘imaginatively boundless’: you can lose yourself for ever in the infinite possibilities of what might have been. In other words: you know that thing you’ve been wondering about doing? Do it.

Excerpt from Help!: How to Become Slightly Happier and Get a Bit More Done by Oliver Burkeman

💎 Give me the freedom of a tight brief

Joyce uses the analogy of a playground. Researchers found that when you put up a fence around a playground, children will use the entire space—they’ll feel safe to play all the way to the edges. But if those walls are removed, creating a wide-open playground, the space the children choose to play in contracts: they stay toward the middle and they stick to each other, because that’s what feels safe. This, Joyce suggests, is what happens in the creative process. When there are no clear limits in the brief itself, we aren’t sure what boundaries to explore and push against. We end up without the necessary focus and passion of which Marissa Mayer speaks. In fact, one of Joyce’s surprise findings was that in the absence of explicit constraints, the unconstrained teams created more conflict, stemming from all the different unarticulated assumptions and implicit constraints that team members created in their own heads, as if to fill the void.

Excerpt from: A Beautiful Constraint: How To Transform Your Limitations Into Advantages, and Why It′s Everyone′s Business by Mark Barden and Adam Morgan

💎 If you can’t make it shorter, make it feel shorter

Indefinite waits seem longer than defined ones, writes David Maister in his paper ‘The Psychology of Waiting Lines’, which is why Disney theme parks use complex formulae to calculate and display wait-times. ‘Pre-process’ waits seem longer than ‘in process’ waits, which is why restaurants will seat you before they’re ready to serve you. Customers are happier when queues are acknowledged: when a supermarket calls ‘all staff to the checkouts’, it’s as much about you hearing it as about staffing. And occupied time passes faster than unoccupied time: mirrored walls are especially effective, apparently because most people love looking at themselves.

Excerpt from Help!: How to Become Slightly Happier and Get a Bit More Done by Oliver Burkeman

💎 Are you communicating to feel good about yourself or to persuade?

Bottom line: it’s hard to change someone’s mind when you feel morally and intellectually superior to them. As Megan McArdle memorably put it: “It took me years of writing on the Internet to learn what is nearly an iron law of commentary: The better your message makes you feel about yourself, the less likely it is that you are convincing anyone else.”

Excerpt from: The Scout Mindset: The Perils of Defensive Thinking and How to Be Right More Often by Julia Galef

💎 On the creative benefits of thinking like a child

Einstein was a great fan of this technique. He said: “To stimulate creativity, one must develop the childlike inclination for play.” Researchers at North Dakota State University agree. They conducted an experiment where they asked 76 undergraduates what they would do if college were cancelled for the day. The interesting bit was that half of them were encouraged to think as if they were seven years old. These students were found to give much more creative responses than the control group.

Excerpt from: Go Luck Yourself: 40 ways to stack the odds in your brand’s favour by Andy Nairn

💠 Optimism Bias

When looking to the future, we tend to overestimate the good stuff and underestimate the bad.

This is a draft chapter from my new book, Security Gems: Using Behavioural Economics to Improve Cybersecurity (working title).

Subscribe to read new chapters as I write them.

💠 In Sickness and In Health

Marriage. It’s a wonderful thing, isn’t it?

In the Western world, the numbers don’t agree. Divorce rates are about 40 percent.

That means that out of five married couples, two will end up in divorce. But when you ask newlyweds about their own likelihood of divorce, they estimate it at zero percent.

Good luck to them!

Optimism bias is sometimes used interchangeably with ‘overconfidence’, and refers to the phenomenon whereby individuals believe they are less likely than others to experience a negative event.

As humans we need some level of optimism. If we went into marriage thinking it would end in divorce, marriage simply would not exist.

The optimism bias is an intriguing concept that comes with a host of benefits, such as shielding us from depression and ensuring we respond positively to failure.

Sadly, though, the optimism bias in cyber security leaves us overly vulnerable to cyber attack.

💠 It’ll never happen to me

When I was growing up, there was a kid in my neighbourhood who loved climbing trees. I was always suspicious one of his parents was a monkey.

He’d shoot up them, without a second thought.

Once, thirty metres in the air, a branch broke beneath him. All of us standing below heard the crack. It sounded like lightning, followed by a heavy thud as the branch hit the ground.

Luckily he managed to quickly reach out and grab a branch above, saving himself from a long fall.

Whilst the slip didn’t bring him back down to earth, it did bring him back to reality. It took him the rest of the day to climb back down. And it was weeks before we saw him up another tree.

Being overly optimistic or self-confident can often blind us to the very high likelihood of negative outcomes.

When there’s nothing to warn us of our impending doom we get even more reckless.

Drink and drug driving is a massive problem, and is in large part a result of our unbounded optimism.

“I’ve only had a couple of beers” offers no solace to the family whose loved one has been killed as a result of impaired reaction times.

Nightclubs in Germany came up with a brilliant idea to reduce the problem of their patrons jumping into cars after a night on the tiles: piss screens.

The urinals allowed drivers to steer a car in a video game using their pee. Aim left to go left. Right to go right.

If you were too slow or swerved too much (that is, peed on the foot of the bloke next to you), the car would crash. “Too pissed to drive”, the screen would read, along with the number of the local taxi firm.

Again, in life we need moments to peg us back to reality.

When people receive emails they don’t necessarily treat them with the suspicion they deserve.

Far too often, we’re optimistic about the outcome of clicking links, and end up clicking malicious links or opening malicious attachments.

Whether it’s drink driving or clicking a link in an email, both can have catastrophic consequences.

Facebook do a great job of warning us about the result of our actions. Click an external link on your newsfeed and they’ll make you confirm the link shown is where you want to end up.

The aim here is to make the negative effects and losses of a certain action clear to the individual, and offer a clear, safer alternative.
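
To make the pattern concrete, here is a minimal sketch of that kind of interstitial check, assuming a command-line context: show the real destination, spell out the risk, and make “don’t continue” the safe default. The trusted-domain list, function name, and example URL are illustrative assumptions, not any real product’s behaviour.

```python
from urllib.parse import urlparse

# Purely illustrative allow-list; anything else triggers a confirmation step.
TRUSTED_DOMAINS = {"intranet.example.com", "mail.example.com"}


def confirm_external_link(url: str) -> bool:
    """Show the real destination and ask the user to confirm before following it."""
    domain = urlparse(url).netloc.lower()
    if domain in TRUSTED_DOMAINS:
        return True

    print(f"You are about to visit an external site: {domain}")
    print("Attackers disguise links. Check this is where you expect to end up.")
    answer = input("Continue? [y/N] ").strip().lower()
    return answer == "y"  # safe default: anything but an explicit "y" cancels


if __name__ == "__main__":
    if confirm_external_link("https://examp1e-login.com/reset-password"):
        print("Opening link...")
    else:
        print("Cancelled.")
```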

Sadly Facebook don’t do this with uploading drunk photos yet.

💠 It’ll happen to them

Now, I’m not advocating we all become pessimists. World economies rely on optimism.

Entrepreneurs need optimism.

Do you ever find yourself in situations wondering, “How hard could it be?”

As an amateur home-chef, I have a particularly bad habit of asking this type of question when dining out. How hard could it be to create a menu? Cook the food? Leave the customers wanting more?

I make a great Pad Thai.

In my town one particular restaurant unit has changed hands five times in as many years. Italian. Indian. Thai. Greek. Italian, again.

It’s not unusual. In some cities, the chance of restaurant failure in the first year can be as high as 90%. That is, nine out of every ten restaurants opened will fail!

Nine in ten! Who would want to open a new restaurant?

Restaurateurs know the numbers, but despite the well-documented failure rates, they often don’t think they apply to them. They might argue their concept is different to the others, their restaurant is in a better part of town, or the cuisine is seeing new popularity.

But do they really have a better chance of success than others trying the same thing?

In the majority of cases, no.

The problem is we don’t know the reason behind the facts. We don’t know a lot about others, but know a lot about ourselves.

We’re optimistic about ourselves, we’re optimistic about our kids, we’re optimistic about our families, but we’re not so optimistic about the guy sitting next to us, and we’re pessimistic about the fate of our fellow citizens and the fate of our country.

This plagues those responsible for creating public health messaging.

One in two people in the UK will be diagnosed with cancer in their lifetime. But despite the odds, most people don’t think they’ll get cancer [1].

In the UK, 38 percent of cancer cases are preventable, and around 15 percent of cases can be attributed to smoking alone.

Yet millions of people still smoke, pouring their hard-earned money into the pursuit of worse health.

People explain it away. They go to the gym every day. Other smokers don’t. They don’t drink like other smokers do.

Comparative optimism convinces us, even when we can’t make a direct comparison, that others are more likely to suffer negative experiences than we are ourselves.

Studies of people’s perceived privacy risks show that we believe things like unauthorised access to accounts and the sharing of personal information are much more likely to happen to other people [2].

Almost half of all UK businesses suffered some form of cyber security breach in 2020 [3].

Yet companies don’t think it will happen to them.

It’s why we can ignore network security risks while at the same time reading about other companies that have been breached. It’s why we think we can get by where others failed.

Optimism-induced invincibility needs to be accounted for, and removed. You are no better than your peers, mostly.

💠 Prevention is better than cure

Skiing. Windsurfing. Rock climbing. These are the kinds of things I love to do on holiday.

Health insurance companies don’t like me doing them. I know this because they charge me a hefty premium for coverage.

Previously I was guilty of questioning if travel insurance was worth the money.

Whilst speaking to the Swiss Mountain Rescue team one winter, I learned just how much it costs to be evacuated via helicopter. About $100 per minute. And that’s from takeoff to landing.

Perceptions of actual risk can be clouded by optimism. I don’t go on holiday to break a leg, but the chance is pretty high.

It’s not just that we don’t think bad things can happen to us or are more likely to happen to someone else. We–all things being equal–believe that good outcomes are more probable than bad outcomes.

In one study, participants were given a list of 18 positive and 24 negative events, like getting a good job after graduation, developing a drinking problem, and so on [4].

Overall, they considered themselves 15% more likely than others to experience positive events, and 20% less likely than others to experience negative events.

People are more likely to accept risks if they feel they have some control over them.

Here we see the feeling of security diverging from the reality of security.

Controlling for this feeling is important.

We all know someone that has “seen it all”.

Experience often overrides careful decision making. It offers a sense of security.

But never let it cloud the actual risks, which should be assessed with an eye of experience, but also an eye of fatalism.

💠 Security Gems

You are not invincible.

  • Set a “base rate”: Take an outside view, meaning we should look at base rates for our estimates as if we were looking at someone else’s chances (see the sketch after this list).
  • Conduct a premortem: Before making a decision, predict how a project or strategy could fail, then work backward to prevent those issues.
  • Make impending negative events caused by over-optimism clear: Bringing negative events to mind just before we’re likely to engage in an undesirable act can be a good behaviour change technique.
  • Use positive information to motivate: Instead of telling people why they shouldn’t do something, convince them with the benefits of an alternative. Remember, our optimism bias leads us to think we’re less likely to suffer negative outcomes compared to others.
  • Beware of feeling secure: Take a risk-based approach to security. Best practices are good to follow, but make sure they address the critical issues.
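
As a worked example of the first gem, here is a small sketch of taking the outside view, using made-up numbers: start your estimate from a published base rate (such as the roughly-half-of-UK-businesses breach figure above) rather than from gut feel, and only nudge it for evidence that genuinely sets you apart.

```python
# Outside view: start from the base rate, then adjust modestly for real evidence.
# All numbers below are illustrative assumptions, not data.

BASE_RATE_BREACH = 0.46  # approx. share of UK businesses reporting a breach in a year


def outside_view_estimate(base_rate: float, adjustments: dict) -> float:
    """Start from the base rate and apply small, documented adjustments."""
    estimate = base_rate
    for reason, delta in adjustments.items():
        print(f"Adjusting {delta:+.2f} because: {reason}")
        estimate += delta
    return min(max(estimate, 0.0), 1.0)  # keep it a valid probability


gut_feel = 0.05  # the inside view: "it'll never happen to us"
adjusted = outside_view_estimate(
    BASE_RATE_BREACH,
    {
        "mandatory MFA rolled out to all staff": -0.05,
        "large public-facing web estate": +0.05,
    },
)
print(f"Inside view (gut feel): {gut_feel:.0%}")
print(f"Outside view estimate:  {adjusted:.0%}")
```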

[1] Cancer risk statistics
[2] Optimistic bias about online privacy risks
[3] Almost half of UK businesses suffered a cyber attack in past year
[4] Unrealistic Optimism about Future Life Events

Security Gems: Using Behavioural Economics to Improve Cybersecurity

This post is a draft chapter from my new book. Pardon the typos.

Subscribe to read new chapters as I write them.

💠 Confirmation Bias

We seek out or interpret information that confirms our preconceptions, avoiding things that challenge them.

This is a draft chapter from my new book, Security Gems: Using Behavioural Economics to Improve Cybersecurity (working title).

Subscribe to read new chapters as I write them.

💠 Paying for confirmation

According to the flat Earth model of the universe, the sun and the moon are the same size.

You’ll find credible-looking mathematical models that argue the theory. Photographs taken from a plane showing a flat horizon. Queries about how the seas could ever exist if the earth was round.

You won’t find calculations from Eratosthenes, who is credited with discovering that the earth is round. Photographs taken from space of a round planet. Or mentions of gravity, which holds the water in the seas.

Or does it?

As humans we have a disposition to confirm our beliefs by exclusively searching for information that supports a hunch while excluding opposing data.

Confirmation bias isn’t limited to conspiracy theorists. It shapes how we vote, causes investors to make poor decisions and businesses to focus on the wrong ideas, and almost certainly led you to buy this book.

During the 2008 US presidential election, Valdis Krebs analysed purchasing trends on Amazon. People who already supported Obama were the same people buying books which painted him in a positive light. People who already disliked Obama were the ones buying books painting him in a negative light. [1]

People weren’t buying books for the information. They were buying them for the confirmation.

I’m in no doubt the people buying this book have a predisposition for product psychology.

Sound like you?

💠 Biased Search for Information

I love the word “yes”.

Yes, have an extra slice of cake. Yes, you do look good today. Yes, you are the best.

Experiment after experiment has shown that people tend to ask questions that are designed to yield a “yes”.

This is also known as the congruence heuristic [2].

Google search histories are a good demonstration of the affirmative questions we all love to ask.

“Are cats better than dogs?”

We prime Google that cats are indeed better than dogs. Google hears we have a preference for cats. Google plays ball, listing sites detailing reasons why cats are better than dogs.

“Are dogs better than cats?”

The same question phrased differently produces entirely different results. Now dogs are better.

“Which is better: cats or dogs?”

Or;

“What is the best pet for [my situation]?”

Would have been better questions. Obviously the answer is always dogs.

Affirmative approaches to reasoning are common in security.

Analysts enter an investigation digging for an answer they really want. They are worried about their manager pulling them up because they’ve not found anything juicy. The CISO needs their shiny dashboard showing the number of threats detected.

Teams lose sight of the bigger picture.

Such an approach creates blind spots because people are looking for what they know, instead of considering other possibilities: the negative test cases.
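
One way to build the negative test cases in is to check a detection hypothesis against benign activity as well as the malicious sample you expect it to catch. The toy rule and log lines below are illustrative assumptions, not taken from any real product.

```python
def rule_suspicious_powershell(log_line: str) -> bool:
    """Toy detection rule: flag encoded PowerShell commands."""
    line = log_line.lower()
    return "powershell" in line and "-encodedcommand" in line


positive_cases = [  # activity the rule should flag
    "powershell.exe -EncodedCommand SQBFAFgA...",
]
negative_cases = [  # benign activity the rule should NOT flag
    "powershell.exe -File nightly_backup.ps1",
    "choco upgrade powershell-core",
]

for line in positive_cases:
    assert rule_suspicious_powershell(line), f"Missed a true positive: {line}"
for line in negative_cases:
    assert not rule_suspicious_powershell(line), f"False positive on: {line}"

print("Rule checked against confirming AND disconfirming cases.")
```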

💠 Biased Interpretation

I hate the word “No”.

No, you can’t have an extra slice of cake. No, you don’t look good today. No, you are not the best.

It’s hard to accept something that conflicts with what we believe. So much so that our brains have developed a coping mechanism of sorts.

Imagine you’ve spent years of research into a particular area of study.

Late nights in the lab trying to uncover evidence to support your hypothesis. Weekends spent fretting over calculations. Months lost scouring obscure libraries.

All to prove the world is flat.

So much knowledge makes it easy to explain away a “no”.

A picture of earth from space.

That’s Hollywood magic at work.

Tides.

Well, “Isaac Newton is said to have considered the tides to be the least satisfactory part of his theory of gravitation”. “Duh!”. [3]

People tend to not change their beliefs on complex issues even after being provided with research because of the way they interpret the evidence.

Capital punishment is another polarising issue, but one that also draws on our moral compass.

In one experiment, a mix of participants, some in support of capital punishment and some against it, were shown the same two studies on the subject.

After reading the detailed descriptions of the studies, participants still held their initial beliefs and supported their reasoning by providing “confirming” evidence from the studies and rejecting any contradictory evidence, or considering it inferior to the “confirming” evidence. [4]

We can all be guilty of trying to explain away things that don’t conform to what we believe.

“Well, that could never happen. Our firewall will block that type of thing”.

💠 Backfire effect

And we’re a stubborn bunch.

I’ve had some silly arguments in my time. Backing down in the heat of an argument with a partner can be hard at the time, but laughable an hour later.

Politics is a similarly laughable pursuit.

Many people hold an allegiance to the same political party their whole life.

Democrats questioned why people still voted Republican with Trump on the ticket, despite all the evidence questioning the reality of his claims to “Make America Great Again”.

Evidence might hold a strong position in a court of law. In the court of public opinion it’s not so strong.

In fact, not only is it not so strong, it can work against our reasoning! When people’s preexisting beliefs are challenged by contradictory evidence, they don’t just explain the evidence away; their beliefs have been shown to actually get stronger! [5]

All is not lost though.

Whilst one piece of disconfirming evidence does not result in a change in people’s views, it has been shown that a constant flow of credible refutations can correct misinformation and misconceptions.

Think about how you disseminate your research.

💠 Biased Memory

Before forensic science became an integral part of the criminal justice system, eyewitness accounts were the basis of a prosecutor’s case.

The problem is our memory just isn’t particularly good. We remember some things and forget others. Our brains try to link memories together for easier recall, often falling victim to confirmation bias, amongst other biases, in the process.

“Was the car speeding or not speeding, ma’am?”.

“Yes, officer. I heard the engine revving loudly.”

Confirmation bias influences eyewitnesses to make non-factual assumptions.

A revving engine might be linked to speeding in one mind. A mechanic might recognise this as a badly tuned engine, completely unrelated to speed.

Hundreds of wrongful convictions have been overturned in recent years as a result of cases brought solely on eyewitness accounts, for this very reason.

The future is strongly influenced by memories of experiences in our past. It’s fundamental to becoming the best.

Which is great if you’re trying to perfect a free kick into the top corner, but often falls short in many other areas. Like reading the resumes of job applicants.

Oxford University; advance to interview. Likes cats; nope.

In one scenario, individuals were asked to read a woman’s profile detailing her extroverted and introverted traits. Half were asked to assess her suitability for a job as a librarian, the other half for a job as a salesperson.

Those assessing her as a salesperson better recalled her extroverted traits, while the other group recalled more examples of introversion [6]. Their memories told them the best salespeople were extroverted, and vice-versa.

Before long your team talks the same, thinks the same, and dresses the same. They thrive on validating their shared outlook on the world.

To quote Eminem; “Would the Real Slim Shady please stand up?”.

Management consultants love to harp on about the benefits of seeing things from a different perspective. And they’re right.

Sometimes a breath of fresh air can give you a new take on security strategy.

💠 Security Gems

Try to prove yourself wrong.

  • Be careful with your research: Read entire articles, rather than forming conclusions based on the headlines and pictures.  Search for credible evidence presented in articles from a diverse range of sources.
  • Prove assumptions wrong: Warren Buffett, one of the most successful investors of our time, is well aware of confirmation bias and one of his first actions before making an investment decision is to seek opinions that contradict his own.
  • Plan for failure: When we understand that our first assumptions will not be correct and plan for failure, we allow teams to find the correct answer instead of going with the simple and easy hypothesis.
  • Data helps, but be careful: Quantitative measures are much better to use in arguments due to their inherent factual nature. However, you should make it clear how data points should be interpreted.
  • Surround yourself with a diverse group of people: Try to build a diverse team of individuals. Seek out people that challenge your opinions, perhaps someone in a different team, or assign someone on your team to play “devil’s advocate” for major decisions.

[1] New Political Patterns
[2] Heuristics and Biases in Diagnostic Reasoning (Baron, 2000)
[3] Earth Not a Globe
[4] Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence (Lord, Ross, & Lepper, 1979)
[5] The Backfire Effect
[6] Testing hypotheses about other people: The use of historical knowledge (Snyder, M., & Cantor, N.,1979)

Security Gems: Using Behavioural Economics to Improve Cybersecurity

This post is a draft chapter from my new book. Pardon the typos.

Subscribe to read new chapters as I write them.

💠 The Isolation Effect

We remember things that stand out in the crowd. But different doesn’t necessarily mean it’s important.

This is a draft chapter from my new book, Security Gems: Using Behavioural Economics to Improve Cybersecurity (working title).

Subscribe to read new chapters as I write them.

💠 Standing out is not such a bad thing

To “stand out like a sore thumb” implies that something is noticed because it is very different from the things around it.

I’m often guilty of being the sore thumb. Dressed in shorts mid-winter, whilst those around me are being warmed by five layers of clothing.

One of the factors behind EasyJet’s success, arguably the pioneer of the low-cost flight, was to stick out like a sore thumb. The company’s early advertising consisted of little more than the airline’s telephone booking number painted in bright orange on the side of its aircraft.

“Have you heard of that orange airline?”, people would ask.

Have you ever highlighted information in a book? Then you too have used this effect to your advantage.

Psychologists have studied why our attention is usually captured by salient, novel, surprising, or distinctive stimuli [1]. Probably using a highlighter during their research.

Product designers understand our fascination with things that stand out and will spend hours perfecting the size, colour and shape of something to grab your attention, directing you on the path they want you to take.

Good products guide users to the important features and functions by making them stand out.

The big red flashing bell indicating a security alert should be distinctive, drawing attention and making it very clear that it needs to be looked at.

💠 Information overload can make standing out difficult

Being able to draw attention to something in the age of information overload is vital.

An email received from a friend or family member sticks out amid a sea of unfamiliar names.

A letter where the address is handwritten stands out, allowing me to easily filter boring correspondence from correspondence I will enjoy reading.

“YOU’VE WON A PRIZE”

“YOUR ACCOUNT HAS BEEN COMPROMISED”

These email subject lines have a similar effect.

Not only is someone shouting at you, they’re also warning you of a potentially serious event that arouses a sense of urgency.

It’s not your everyday (or hourly) “Sally has liked your photos taken in 2003 on Facebook” email. It’s serious.

In phishing school [2], you’ll find classes titled: How to grab a victim’s attention.

Successfully grabbing the attention of someone browsing their inbox is the first part of a successful campaign. You should expect the attackers to have aced that class.

💠 Not standing out can be disastrous

Digging deeper into the email inbox, or not as the case may be, it’s clear our brains weren’t designed to deal with mountains of spam.

So-called alert fatigue highlights this weakness. People stop noticing alerts, emails, texts, and [INSERT LATEST COOL MESSAGING SERVICE HERE] because there are simply too many.

People become desensitised to similar things being shown to them every day.

I once sat with a client who somewhat proudly proclaimed that the “Alerts” folder in his inbox stood at 10,000 unread emails. That was nothing, he assured me; his colleague’s folder clocked in closer to six digits!

You don’t want to foster this culture.

When my fire alarm sounds, my heart rate accelerates as adrenaline is pumped into my bloodstream. The noise stands out. It’s important. It immediately draws all my attention. Yes, even from an oh so cute cat video.

Security alerting needs to have the same effect. To point you to real fires. To prioritise what is most important. Missing critical alerts, emails, texts, or warnings of actual fires does not typically end well.
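
A small sketch of what “pointing to real fires” can look like in practice, assuming a toy alert format: collapse duplicates so repeated noise doesn’t drown things out, then surface the most severe alerts first. The severity levels and example alerts are invented.

```python
from collections import Counter

# Illustrative alerts; in practice these would come from your alert pipeline.
alerts = [
    {"title": "Disk 80% full on build server", "severity": "low"},
    {"title": "Impossible travel: admin login from two countries", "severity": "critical"},
    {"title": "Disk 80% full on build server", "severity": "low"},
    {"title": "Outbound traffic to known C2 address", "severity": "critical"},
    {"title": "TLS certificate expires in 30 days", "severity": "medium"},
]

SEVERITY_ORDER = {"critical": 0, "high": 1, "medium": 2, "low": 3}

# Collapse duplicates, then show the most severe first so the real fires stand out.
counts = Counter(alert["title"] for alert in alerts)
unique = {alert["title"]: alert for alert in alerts}

for title, alert in sorted(unique.items(), key=lambda kv: SEVERITY_ORDER[kv[1]["severity"]]):
    print(f"[{alert['severity'].upper():>8}] x{counts[title]} {title}")
```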

💠 The art of deception

The ability to recognise and remember things that stand out has long proved advantageous to our species.

As hunter-gatherers, being able to spot something that stood out was vital to finding food and avoiding becoming food.

Evolution has long realised that standing out can be a disadvantage.

Chameleons.

The Arctic hare is another great example of the evolutionary importance of blending in.

In the winter their bright white coats hide them from predators against a backdrop of snow. In spring, their colours change to blue-grey in approximation of local rocks and vegetation.

Humans are no different.

Go to a club on a Saturday night and watch the herds of men and women dressed head to toe in clubbing uniforms.

During my college years flannel shirts were the “in-thing”. One night I bumped into three other guys, who all had great taste in fashion I will add, all wearing the same shirt.

Militaries around the world understand the importance of camouflage. Soldiers don’t want to stand out. It’s a matter of life and death on the battlefield.

Neither do criminals.

Actors know downloading terabytes of data in a short period of time will stand out. Instead they slowly exfiltrate data over months, so the patterns don’t stand out.

Malware is designed to act like a user, disguising itself as a normal process on an endpoint.

Yet so much of cyber security is focused on identifying the anomalies.

Sure, anomalies are important. It’s why so many vendors consistently demo that their product proudly detected “3 failed logons, from 3 different locations, in 3 seconds, for 1 account”.

However, the things that stick out, in a world where the bad guys are doing everything they can to stay anonymous, are only part of the story.
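
To illustrate with invented numbers: a naive “big transfer” threshold catches the terabyte smash-and-grab but says nothing about the same data trickling out over months.

```python
# Illustrative daily outbound volumes in GB; all numbers are invented.
DAILY_THRESHOLD_GB = 50  # naive "this sticks out" alerting threshold

smash_and_grab = [2, 3, 2, 1000, 2, 3]           # one huge, obvious spike
low_and_slow = [12, 14, 11, 13, 12, 15] * 30     # months of unremarkable transfers


def days_flagged(volumes):
    return sum(1 for gb in volumes if gb > DAILY_THRESHOLD_GB)


for name, volumes in [("smash and grab", smash_and_grab), ("low and slow", low_and_slow)]:
    print(f"{name}: {days_flagged(volumes)} day(s) flagged, "
          f"{sum(volumes):,} GB left the network in total")
```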

💠 Breaking camouflage

In the early days of map making it took a lot of time to produce a map.

Companies had to hire someone to go out and walk every street.

Needless to say, plagiarism plagued the pre-computerised map making industry.

In the 1930s, General Drafting, a map making company, came up with an ingenious idea. In their map of New York State they included a copyright trap: a fictitious place, Agloe [3].

Fast forward a few years and the company spotted Agloe detailed on a map produced by one of their fiercest competitors, Rand McNally.

Such was the problem that Agloe continued to appear on a number of maps up until the 1990s. I can imagine the disappointed faces of day-trippers, and the ensuing arguments about wrong turns.

These traps have come to be affectionately known as Mountweazels [4]: bogus entries deliberately inserted in a reference work. Prizes for anyone who spots the one in this book.

Like Mountweazels, honeypots are traps used in computer networks.

A honeypot mimics a system that may be attractive to an attacker, but would only ever be accessed by someone snooping around.

Like a motion-activated light illuminates intruders attracted by the shiny objects in your house, a honeypot illuminates attackers attracted by the shiny potential it offers.
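
As a minimal sketch of the idea (the decoy port and banner below are chosen purely for illustration): nothing legitimate should ever connect to this fake service, so every connection that does arrive is worth a look.

```python
import socket
from datetime import datetime, timezone

LISTEN_PORT = 2222  # illustrative decoy port pretending to be an SSH-like service


def run_honeypot() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", LISTEN_PORT))
        server.listen()
        print(f"Decoy service listening on port {LISTEN_PORT}")
        while True:
            conn, (addr, port) = server.accept()
            with conn:
                timestamp = datetime.now(timezone.utc).isoformat()
                # In a real deployment this would raise an alert, not just print.
                print(f"{timestamp} possible intruder: {addr}:{port}")
                conn.sendall(b"SSH-2.0-OpenSSH_8.9\r\n")  # a little shine on the trap


if __name__ == "__main__":
    run_honeypot()
```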

💠 Security Gems

If you want people to remember something, make it stand out.

  • Make the right path clear: If you want a user to take action in a certain way, guide them by making the route stand out.
  • Beware of normal: It’s easy to remember things that stand out, but distinctiveness is not the only attribute you should be worried about.
  • Don’t focus only on anomalies: Entice those operating covertly into the open. Break their camouflage.
  • Don’t make yourself obvious: Remember, attackers are drawn to things that stand out.
  • Communicate effectively: Make important communications and events distinctive in a way that makes sense. Remove the bullshit.
  • Think about methods of communication: Sending important alerts to a mobile phone might make them stand out more than email alone.

[1] Salience, Attention, and Attribution: Top of the Head Phenomena (Taylor & Fiske, 1978)
[2] Completely fictitious.
[3] Agloe, New York (Wikipedia)
[4] Fictitious entry (Wikipedia)

Security Gems: Using Behavioural Economics to Improve Cybersecurity

This post is a draft chapter from my new book. Pardon the typos.

Subscribe to read new chapters as I write them.