💠 Optimism Bias

When looking to the future, we tend to overestimate the good stuff and underestimate the bad.

This is a draft chapter from my new book, Security Gems: Using Behavioural Economics to Improve Cybersecurity (working title).

Subscribe to read new chapters as I write them.

💠 In Sickness and In Health

Marriage. It’s a wonderful thing, isn’t it?

In the Western world, the numbers don’t agree. Divorce rates are about 40 percent.

That means that out of every five married couples, two will end up divorced. But when you ask newlyweds about their own likelihood of divorce, they estimate it at zero percent.

Good luck to them!

Optimism bias is sometimes used interchangeably with ‘overconfidence’, and refers to the phenomenon whereby individuals believe they are less likely than others to experience a negative event.

As humans we need some level of optimism. If we went into marriage thinking it would end in divorce, marriage simply would not exist.

The optimism bias is an intriguing concept that comes with a host of benefits, such as shielding us from depression and helping us respond positively to failure.

Sadly, though, the optimism bias in cyber security leaves us overly vulnerable to cyber attack.

💠 It’ll never happen to me

When I was growing up, there was a kid in my neighbourhood who loved climbing trees. I was always suspicious one of his parents was a monkey.

He’d shoot up them, without a second thought.

Once, thirty metres in the air, a branch broke beneath him. All of us standing below heard the crack. It sounded like thunder, followed by a heavy thud as the branch hit the ground.

Luckily he managed to quickly reach out and grab a branch above, saving himself from a long fall.

Whilst the slip didn’t bring him back down to earth, it did bring him back to reality. It took him the rest of the day to climb back down. And weeks before we saw him up another tree.

Being overly optimistic or self-confident can blind us to the very real likelihood of negative outcomes.

When there’s nothing to warn us of our impending doom, we get even more reckless.

Drink and drug driving is a massive problem, and is in large part a result of our unbounded optimism.

“I’ve only had a couple of beers” offers no solace to the family whose loved one has been killed as a result of impaired reaction times.

Nightclubs in Germany came up with a brilliant idea to reduce the problem of their patrons jumping into cars after a night on the tiles: piss screens.

The urinals let patrons steer a car in a video game using their pee. Aim left to go left. Right to go right.

If you were too slow or swerved too much (that is, peed on the foot of the bloke next to you), the car would crash. “Too pissed to drive”, the screen would read, along with the number of the local taxi firm.

Again, in life we need moments to peg us back to reality.

When people receive emails they don’t necessarily treat them with the suspicion they deserve.

Far too often, we’re optimistic about the outcome of clicking links, and end up clicking malicious links or opening malicious attachments.

Whether it’s drink driving or clicking a link in an email, both can have catastrophic consequences.

Facebook do a great job of warning us about the result of our actions. Click an external link on your newsfeed and they’ll make you confirm the link shown is where you want to end up.

The aim here is to make the negative effects and losses of a certain action clear to the individual, and offer a clear, safer alternative.
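That confirmation step is simple to build. Here’s a minimal sketch of a link-confirmation interstitial, assuming a Flask app; the route name and page copy are illustrative, not Facebook’s actual implementation.

```python
# A minimal sketch of a link-confirmation interstitial (assumed Flask app;
# route name and page copy are illustrative, not Facebook's implementation).
from urllib.parse import urlparse

from flask import Flask, abort, render_template_string, request

app = Flask(__name__)

CONFIRM_PAGE = """
<p>You are about to leave this site for:</p>
<p><strong>{{ destination }}</strong></p>
<p><a href="{{ destination }}">Continue</a> | <a href="/">Go back</a></p>
"""

@app.route("/external")
def confirm_external_link():
    destination = request.args.get("url", "")
    # Refuse anything that isn't a plain http(s) URL.
    if urlparse(destination).scheme not in ("http", "https"):
        abort(400)
    # Show the user exactly where they are about to end up.
    return render_template_string(CONFIRM_PAGE, destination=destination)
```

Outbound links then point at /external?url=... rather than at the destination itself, so the pause, and the real address, come for free.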

Sadly, Facebook don’t do this for drunk photo uploads yet.

💠 It’ll happen to them

Now, I’m not advocating we all become pessimists. World economies rely on optimism.

Entrepreneurs need optimism.

Do you ever find yourself in situations wondering, “How hard could it be?”

As an amateur home-chef, I have a particularly bad habit of asking this type of question when dining out. How hard could it be to create a menu? Cook the food? Leave the customers wanting more?

I make a great Pad Thai.

In my town one particular restaurant unit has changed hands five times in as many years. Italian. Indian. Thai. Greek. Italian, again.

It’s not unusual. In some cities, the chance of restaurant failure in the first year can be as high as 90%. That is, nine out of every ten restaurants opened will fail!

Nine in ten! Who would want to open a new restaurant?

Restaurateurs know the numbers, but despite the well-documented failure rates, they often don’t think they apply to them. They might argue their concept is different to the others, their restaurant is in a better part of town, or the cuisine is seeing new popularity.

But do they really have a better chance of success than others trying the same thing?

In the majority of cases, no.

The problem is we don’t know the reasons behind the facts. We know very little about others, but a lot about ourselves.

We’re optimistic about ourselves, we’re optimistic about our kids, we’re optimistic about our families, but we’re not so optimistic about the guy sitting next to us, and we’re pessimistic about the fate of our fellow citizens and the fate of our country.

This plagues those responsible for creating public health messaging.

One in two UK people will be diagnosed with cancer in their lifetime. But despite the odds, most people don’t think they’ll get cancer [1].

In the UK, 38 percent of cancer cases are preventable, with smoking the single largest contributor at around 15 percent of all cases.

Yet millions of people still smoke, pouring their hard-earned money into a habit that worsens their health.

People explain it away. They go to the gym every day; other smokers don’t. They don’t drink like other smokers do.

Comparative optimism convinces us that others are more likely to suffer negative experiences than we are ourselves, even where we can’t make a direct comparison.

Studies of people’s perceived privacy risks show the same pattern: unauthorised access to accounts and the sharing of personal information are judged far more likely to happen to other people [2].

Almost half of all UK businesses suffered some form of cyber security breach in 2020 [3].

Yet companies don’t think it will happen to them.

It’s why we can ignore network security risks while at the same time reading about other companies that have been breached. It’s why we think we can get by where others failed.

Optimism-induced invincibility needs to be accounted for, and removed. You are no better than your peers, mostly.

💠 Prevention is better than cure

Skiing. Windsurfing. Rock climbing. These are the kinds of things I love to do on holiday.

Health insurance companies don’t like me doing them. I know this because they charge me a hefty premium for coverage.

Previously, I was guilty of questioning whether travel insurance was worth the money.

One winter, speaking to the Swiss Mountain Rescue team, I learned just how much it costs to be evacuated by helicopter: about $100 per minute, from takeoff to landing.

Perceptions of actual risk can be clouded by optimism. I don’t go on holiday to break a leg, but the chance is higher than I’d like to admit.

It’s not just that we don’t think bad things can happen to us or are more likely to happen to someone else. We–all things being equal–believe that good outcomes are more probable than bad outcomes.

In one study, participants were given a list of 18 positive and 24 negative events, like getting a good job after graduation, developing a drinking problem, and so on [4].

Overall, they considered themselves 15% more likely than others to experience positive events, and 20% less likely than others to experience negative events.

People are more likely to accept risks if they feel they have some control over them.

Here we see the feeling of security diverging from the reality of security.

Controlling for this feeling is important.

We all know someone who has “seen it all”.

Experience often substitutes for careful decision making. It offers a sense of security.

But never let it cloud the actual risks, which should be assessed with an eye of experience, but also an eye of fatalism.

💠 Security Gems

You are not invincible.

  • Set a “base rate”: Take an outside view, looking at base rates for our estimates as if we were assessing someone else’s chances (see the sketch after this list).
  • Conduct a premortem: Before making a decision, predict how a project or strategy could fail, then work backward to prevent those failures.
  • Make impending negative events caused by over-optimism clear: Bringing negative events to mind just before we’re likely to engage in an undesirable act can be a good behaviour change technique.
  • Use positive information to motivate: Instead of telling people why they shouldn’t do something, convince them with the benefits of an alternative. Remember, our optimism bias leads us to think we’re less likely to suffer negative outcomes compared to others.
  • Beware of feeling secure: Take a risk-based approach to security. Best practices are good to follow, but make sure they address the critical issues.
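To make the outside view concrete, here’s a minimal sketch with illustrative numbers: anchor your estimate on the base rate, then adjust only as far as hard evidence allows.

```python
# A minimal sketch of taking the outside view (numbers illustrative).
inside_view = 0.05         # gut feeling: "it won't happen to us"
base_rate = 0.46           # share of UK businesses breached in a year [3]
evidence_we_differ = 0.2   # 0 = no evidence we differ, 1 = overwhelming

# With little evidence that you differ, stay close to the base rate.
estimate = evidence_we_differ * inside_view + (1 - evidence_we_differ) * base_rate
print(f"Adjusted annual breach estimate: {estimate:.0%}")  # ~38%
```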

[1] Cancer risk statistics
[2] Optimistic bias about online privacy risks
[3] Almost half of UK businesses suffered a cyber attack in past year
[4] Unrealistic Optimism about Future Life Events


💠 Confirmation Bias

We seek out or interpret information that confirms our preconceptions, avoiding things that challenge them.


💠 Paying for confirmation

According to the flat Earth model of the universe, the sun and the moon are the same size.

You’ll find credible-looking mathematical models that argue for the theory. Photographs taken from a plane showing a flat horizon. Queries about how the seas could ever exist if the Earth was round.

You won’t find the calculations of Eratosthenes, who is credited with discovering the Earth was round. Photographs taken from space of a round planet. Or mentions of gravity, which holds the water in the seas.

Or does it?

As humans we have a disposition to confirm our beliefs by exclusively searching for information that supports a hunch while excluding opposing data.

Confirmation bias isn’t limited to conspiracy theorists. It causes us to vote for politicians, investors to make poor decisions, businesses to focus on the wrong ideas, and almost certainly led you to buy this book.

During the 2008 US presidential election, Valdis Krebs analysed purchasing trends on Amazon. People who already supported Obama were the same people buying books which painted him in a positive light. People who already disliked Obama were the ones buying books painting him in a negative light. [1]

People weren’t buying books for the information. They were buying them for the confirmation.

I’m in no doubt the people buying this book have a predisposition for behavioural psychology.

Sound like you?

💠 Biased Search for Information

I love the word “yes”.

Yes, have an extra slice of cake. Yes, you do look good today. Yes, you are the best.

Experiment after experiment has shown that people tend to ask questions that are designed to yield a “yes”.

This is also known as the congruence heuristic [2].

Google search histories are a good demonstration of the affirmative questions we all love to ask.

“Are cats better than dogs?”

We prime Google that cats are indeed better than dogs. Google hears we have a preference for cats. Google plays ball, listing sites detailing reasons why cats are better than dogs.

“Are dogs better than cats?”

The same question phrased differently produces entirely different results. Now dogs are better.

“Which is better; cats or dogs?”

Or:

“What is the best pet for [my situation]?”.

Would have been better questions. Obviously the answer is always dogs.

Affirmative approaches to reasoning are common in security.

Analysts enter an investigation digging for the answer they really want. They’re worried about their manager pulling them up for not finding anything juicy. The CISO needs their shiny dashboard showing the number of threats detected.

Teams lose sight of the bigger picture.

Such an approach creates blind spots, because people look for what they already know instead of considering other possibilities: the negative test cases.

💠 Biased Interpretation

I hate the word “No”.

No, you can’t have an extra slice of cake. No, you don’t look good today. No, you are not the best.

It’s hard to accept something that conflicts with what we believe. So much so that our brains have developed a coping mechanism of sorts.

Imagine you’ve spent years researching a particular area of study.

Late nights in the lab trying to uncover evidence to support your hypothesis. Weekends spent fretting over calculations. Months lost scouring obscure libraries.

All to prove the world is flat.

So much knowledge makes it easy to explain away a “no”.

A picture of earth from space.

That’s Hollywood magic at work.

Tides.

Well, “Isaac Newton is said to have considered the tides to be the least satisfactory part of his theory of gravitation”. “Duh!”. [3]

People tend to not change their beliefs on complex issues even after being provided with research because of the way they interpret the evidence.

Capital punishment is another polarising issue, but one that also draws on our moral compass.

In one experiment, a mix of participants who were either for or against capital punishment were shown the same two studies on the subject.

After reading the detailed descriptions of the studies, participants still held their initial beliefs and supported their reasoning by providing “confirming” evidence from the studies and rejecting any contradictory evidence, or considering it inferior to the “confirming” evidence. [4]

We can all be guilty of explaining away things that don’t conform to what we believe.

“Well, that could never happen. Our firewall will block that type of thing”.

💠 The backfire effect

And we’re a stubborn bunch.

I’ve had some silly arguments in my time. Backing down in the heat of an argument with a partner can be hard at the time, but laughable an hour later.

Politics is a similarly laughable pursuit.

Many people hold an allegiance to the same political party their whole life.

Democrats questioned why people still voted Republican when Trump was on the ticket, despite all the evidence questioning the reality of his claims to “Make America Great Again”.

Evidence might hold a strong position in a court of law. In the court of public opinion, it’s not so strong.

In fact, not only is it not so strong, it can work against our reasoning! When people’s preexisting beliefs are challenged by contradictory evidence, they don’t just explain the evidence away; their beliefs have been shown to actually get stronger! [5]

All is not lost though.

Whilst one piece of disconfirming evidence does not result in a change in people’s views, it has been shown a constant flow of credible refutations can correct misinformation and misconceptions.

Think about how you disseminate your research.

💠 Biased Memory

Before forensic science became an integral part of the criminal justice system, eyewitness accounts were the basis of a prosecutor’s case.

The problem is our memory just isn’t particularly good. We remember some things and forget others. It links memories together for easier recall, often falling victim to confirmation bias, amongst other biases, in the process.

“Was the car speeding, ma’am?”

“Yes, officer. I heard the engine revving loudly.”

Confirmation bias influences eyewitnesses to make non-factual assumptions.

A revving engine might be linked to speeding in one mind. A mechanic might recognise this as a badly tuned engine, completely unrelated to speed.

Hundreds of wrongful convictions have been overturned in recent years for this very reason, in cases brought solely on eyewitness accounts.

The future is strongly influenced by memories of experiences in our past. That’s fundamental to becoming the best at something.

Which is great if you’re trying to perfect a free kick into the top corner, but often falls short in many other areas. Like reading the resumes of job applicants.

Oxford University; advance to interview. Likes cats; nope.

In one study, individuals were asked to read a woman’s profile detailing both her extroverted and introverted traits. Half were then asked to assess her suitability for a job as a librarian, the other half as a salesperson.

Those assessing her as a salesperson better recalled her extroverted traits, while the other group recalled more examples of introversion [6]. Their memories told them the best salespeople were extroverted, and vice versa.

Before long your team talks the same, thinks the same, and dresses the same. They thrive on validating their shared outlook on the world.

To quote Eminem; “Would the Real Slim Shady please stand up?”.

Management consultants love to harp on about the benefits of seeing things from a different perspective. And they’re right.

Sometimes a breath of fresh air can give you a new take on security strategy.

💠 Security Gems

Try to prove yourself wrong.

  • Be careful with your research: Read entire articles rather than forming conclusions from the headlines and pictures. Search for credible evidence presented in articles from a diverse range of sources.
  • Prove assumptions wrong: Warren Buffett, one of the most successful investors of our time, is well aware of confirmation bias; one of his first actions before making an investment decision is to seek out opinions that contradict his own.
  • Plan for failure: When we accept that our first assumptions may not be correct and plan for failure, we allow teams to find the correct answer instead of going with the simple, easy hypothesis.
  • Data helps, but be careful: Quantitative measures are much better to use in arguments due to their inherent factual nature. However, you should make it clear how data points should be interpreted.
  • Surround yourself with a diverse group of people: Build a diverse team of individuals. Seek out people who challenge your opinions, perhaps someone in a different team, or assign someone on your team to play “devil’s advocate” for major decisions.

[1] New Political Patterns
[2] Heuristics and Biases in Diagnostic Reasoning (Baron, 2000)
[3] Earth Not a Globe
[4] Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence (Lord, Ross, & Lepper, 1979)
[5] The Backfire Effect
[6] Testing hypotheses about other people: The use of historical knowledge (Snyder, M., & Cantor, N.,1979)


💠 The Isolation Effect

We remember things that stand out from the crowd. But different doesn’t necessarily mean important.


💠 Standing out is not such a bad thing

To “stand out like a sore thumb” implies that something is noticed because it is very different from the things around it.

I’m often guilty of being the sore thumb. Dressed in shorts mid-winter, whilst those around me are wrapped in five layers of clothing.

One of the factors behind the success of EasyJet, arguably the pioneer of low-cost flying, was sticking out like a sore thumb. The company’s early advertising consisted of little more than the airline’s telephone booking number painted in bright orange on the side of its aircraft.

“Have you heard of that orange airline?”, people would ask.

Have you ever highlighted information in a book? Then you too have used this effect to your advantage.

Psychologists have studied why our attention is usually captured by salient, novel, surprising, or distinctive stimuli [1]. Probably using a highlighter during their research.

Product designers understand our fascination with things that stand out and will spend hours perfecting the size, colour and shape of something to grab your attention, directing you on the path they want you to take.

Good products guide users to the important features and functions by making them stand out.

The big red flashing bell indicating a security alert should be distinctive, drawing attention and making it very clear that it needs to be looked at.

💠 Information overload can make standing out difficult

Being able to draw attention to something in the age of information overload is vital.

An email received from a friend or family member sticks out amid a sea of unfamiliar names.

A letter where the address is handwritten stands out, allowing me to easily filter boring correspondence from correspondence I will enjoy reading.

“YOU’VE WON A PRIZE”

“YOUR ACCOUNT HAS BEEN COMPROMISED”

These email subject lines have a similar effect.

Not only is someone shouting at you, they’re also warning you of a potentially serious event that arouses a sense of urgency.

It’s not your everyday (or hourly) “Sally has liked your photos taken in 2003 on Facebook” email. It’s serious.

In phishing school [2], you’ll find classes titled “How to grab a victim’s attention”.

Successfully grabbing the attention of someone browsing their inbox is the first part of a successful campaign. You should expect the attackers to have aced that class.

💠 Not standing out can be disastrous

Digging deeper into the email inbox, or not as the case may be, it’s clear our brains weren’t designed to deal with mountains of spam.

So-called alert fatigue highlights this weakness. People stop noticing alerts, emails, texts, and [INSERT LATEST COOL MESSAGING SERVICE HERE] because there are simply too many.

People become desensitised to similar things being shown to them every day.

I once sat with a client who somewhat proudly proclaimed that the “Alerts” folder in his inbox stood at 10,000 unread emails. That was nothing, he assured me; his colleague’s folder clocked in closer to six digits!

You don’t want to foster this culture.

When my fire alarm sounds, my heart rate accelerates as adrenaline is pumped into my bloodstream. The noise stands out. It’s important. It immediately draws all my attention. Yes, even from an oh-so-cute cat video.

Security alerting needs to have the same effect. To point you to real fires. To prioritise what is most important. Missing critical alerts, emails, texts, or warnings of actual fires does not typically end well.
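One way to keep the alarm meaningful is to let severity decide who, if anyone, gets interrupted. Here’s a minimal sketch; the sources, thresholds, and destinations are all hypothetical.

```python
# A minimal sketch of severity-based alert routing (names and thresholds
# hypothetical). Only real fires interrupt a human; the rest is batched,
# so the alarm keeps its meaning.
from dataclasses import dataclass

@dataclass
class Alert:
    source: str
    message: str
    severity: int  # 1 = informational ... 5 = critical

def route(alert: Alert) -> str:
    if alert.severity >= 5:
        return "page-on-call"   # wake someone up: this is a real fire
    if alert.severity >= 3:
        return "ticket-queue"   # needs a human, but not at 3 a.m.
    return "daily-digest"       # context, not an interruption

alerts = [
    Alert("edr", "Ransomware behaviour on host-42", 5),
    Alert("waf", "Blocked SQL injection attempt", 2),
]
for alert in alerts:
    print(f"{alert.source}: {alert.message} -> {route(alert)}")
```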

💠 The art of deception

The ability to recognise and remember things that stand out has long proved advantageous to our species.

As hunter-gatherers, being able to spot something that stood out was vital in finding food, and in avoiding becoming food.

Evolution has also long understood that standing out yourself is a disadvantage.

Chameleons.

The Arctic hare is another great example of the evolutionary importance of blending in.

In winter, its bright white coat hides it from predators against a backdrop of snow. In spring, its colours change to blue-grey in approximation of the local rocks and vegetation.

Humans are no different.

Go to a club on a Saturday night and watch the herds of men and women dressed head to toe in clubbing uniforms.

During my college years flannel shirts were the “in-thing”. One night I bumped into three other guys, who all had great taste in fashion I will add, all wearing the same shirt.

Militaries around the world understand the importance of camouflage. Soldiers don’t want to stand out. It’s a matter of life and death on the battlefield.

Neither do criminals.

Attackers know downloading terabytes of data in a short period will stand out. Instead, they slowly exfiltrate data over months so the patterns don’t stand out.

Malware is designed to act like a user, disguising itself as a normal process on an endpoint.

Yet so much of cyber security is focused on identifying the anomalies.

Sure, anomalies are important. It’s why so many vendors consistently demo that their product proudly detected “3 failed logons, from 3 different locations, in 3 seconds, for 1 account”.
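For flavour, here’s a minimal sketch of that kind of threshold rule; the event format and thresholds are illustrative.

```python
# A minimal sketch of the demo-friendly threshold rule (event format and
# thresholds illustrative): flag an account with 3+ failed logons from
# 3+ locations inside a 3-second window.
from collections import defaultdict

WINDOW_SECONDS = 3.0
THRESHOLD = 3

def logon_bursts(events):
    """events: iterable of (timestamp, account, location, success) tuples."""
    recent = defaultdict(list)  # account -> [(timestamp, location), ...]
    for ts, account, location, success in sorted(events):
        if success:
            continue
        # Keep only the failures inside the sliding window.
        recent[account] = [
            (t, loc) for t, loc in recent[account] if ts - t <= WINDOW_SECONDS
        ]
        recent[account].append((ts, location))
        window = recent[account]
        if len(window) >= THRESHOLD and len({loc for _, loc in window}) >= 3:
            yield account, window

events = [
    (0.0, "alice", "London", False),
    (1.0, "alice", "Lagos", False),
    (2.0, "alice", "Lima", False),
]
for account, burst in logon_bursts(events):
    print(f"ALERT: {len(burst)} failed logons for {account} in {WINDOW_SECONDS}s")
```

Rules like this catch the noisy attacker. The quiet one, trickling data out under every threshold, sails straight past.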

However, the things that stick out, in a world where the bad guys are doing everything they can to stay anonymous, are only part of the story.

💠 Breaking camouflage

In the early days of map making, it took a lot of time to produce a map.

Companies had to hire someone to go out and walk every street.

Needless to say, plagiarism plagued the pre-computerised map making industry.

In the 1930s, General Drafting, a map-making company, came up with an ingenious idea. In their map of New York State they included a copyright trap: a fictitious place, Agloe [3].

Fast forward a few years and the company spotted Agloe detailed on a map produced by one of their fiercest competitors, Rand McNally.

Such was the problem that Agloe continued to appear on a number of maps up until the 1990s. I can imagine the disappointed faces of day-trippers, and the ensuing arguments about wrong turns.

These traps have come to be affectionately known as Mountweazels [4]: bogus entries deliberately inserted in a reference work. Prizes for anyone who spots the one in this book.

Honeypots are the Mountweazels of computer networks.

A honeypot mimics a system that may be attractive to an attacker, but would only ever be accessed by someone snooping around.

Like a motion-activated light illuminating intruders attracted by the shiny objects in your house, honeypots illuminate attackers attracted by the shiny potential they offer.
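A minimal sketch of the idea, with the port and log format purely illustrative: a listener on a port nothing legitimate should ever touch, where any connection at all is worth an alert.

```python
# A minimal honeypot sketch (port and log format illustrative): a listener
# on a port nothing legitimate should touch. Every connection is high-signal.
import socket
from datetime import datetime, timezone

TRAP_PORT = 2222  # looks like a forgotten SSH service

def run_honeypot() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", TRAP_PORT))
        server.listen()
        while True:
            conn, (addr, port) = server.accept()
            # Nobody should ever be here, so every hit is worth waking up for.
            stamp = datetime.now(timezone.utc).isoformat()
            print(f"{stamp} honeypot touched from {addr}:{port}")
            conn.close()

if __name__ == "__main__":
    run_honeypot()
```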

💠 Security Gems

If you want people to remember something, make it stand out.

  • Make the right path clear: If you want a user to take action in a certain way, guide them by making the route stand out.
  • Beware of normal: It’s easy to remember things that stand out, but distinctiveness is not the only attribute you should be worried about.
  • Don’t focus only on anomalies: Entice those operating covertly into the open. Break their camouflage.
  • Don’t make yourself obvious: Remember, attackers are drawn to things that stand out.
  • Communicate effectively: Make important communications and events distinctive in a way that makes sense. Remove the bullshit.
  • Think about methods of communication: Sending important alerts to a mobile phone might make them stand out over email alone.

[1] Salience, Attention, and Attribution: Top of the Head Phenomena (Taylor & Fiske, 1978)
[2] Completely fictitious.
[3] Agloe, New York (Wikipedia)
[4] Fictitious entry (Wikipedia)
