Mitigating the Impact of Anchor Bias in Cyber Security Planning

Barbara Vibbert
Manager, Solutions Engineering
Automatic Summary

Understanding and Overcoming Anchor Bias in Cybersecurity

Thank you for joining us today to gain more insight into the scope and impact of anchor bias in the cybersecurity sector. This post draws on a diverse range of experiences, studies, and anecdotal evidence. My name is Barbara Vibbert; I'm a seasoned cybersecurity professional with over 20 years of experience, and I'll be your guide through this intriguing topic. If you have any questions, please feel free to drop them in the comment section below.

Firstly, What is Anchor Bias?

Anchor bias is a type of cognitive bias in which the first piece of information received is given greater weight than subsequent information, even when the later information brings greater detail and authority. It shows up in a wide range of contexts, from car shopping and appliance shopping to sports, and, most importantly here, cybersecurity.

Anchor Bias in the World of Cybersecurity

In my extensive career in information security, one thing has become evident: the impact of anchor bias is significant. The first major incident, like an outage or breach, heavily influences all subsequent cybersecurity decisions, well past the point where that influence is useful. Overreliance on specific tools like firewalls, incidents in the news, and the increasing hype around artificial intelligence (AI) can all manifest as anchor bias, potentially hindering effective cybersecurity.

Addressing Anchor Bias in Cybersecurity

If left unchecked, anchor bias can lead to severe impacts like overspending, undervaluation of risks, and reputational damage. Fortunately, there are effective strategies that can minimize its effect on security planning and response. These include:

  • Diversity: Engaging with diverse teams with differing backgrounds and experiences can dilute the impact of anchor bias in cybersecurity.
  • Data: Data and analytics enable firms to make informed decisions that are less influenced by cognitive biases.
  • Process: Implementing a structured, documented cybersecurity planning process is another way to minimize the impact of anchor bias.
  • Practice: Regularly updating cybersecurity plans ensures they never become stale, reducing the chance of falling prey to anchor bias.

Closing Thoughts

In conclusion, anchor bias is an inherent part of decision-making processes, even in the sophisticated world of cybersecurity. However, with proper understanding and strategies like diversity, data, process, and practice, its impact can be significantly minimized, leading to better outcomes.

This post only scratches the surface of the topic. If you have any relevant experiences, opinions, or queries, feel free to share them in the comment section below. Your input is highly appreciated.


Video Transcription

We're at the top of the hour and we do have a hard stop, so I'm going to go ahead and get started. Thank you so much for attending my presentation today. If you have questions during the presentation, please post them in the Q&A, or hold them until the end, when we'll have time for Q&A. My name is Barbara Vibbert, and I'm a manager of solutions engineering at SonicWall. I've been a cybersecurity professional for more than 20 years. I set up my first wireless network in the desert in Saudi Arabia during the first Gulf War, where I was an Arabic-speaking voice intercept analyst. My team searched for enemy radio signals and performed initial transcription, translation, and analysis.

The purpose of the wireless network was to collect lines of bearing, which were then triangulated to produce coordinates, which we passed on to division artillery to continue the mission by other means. Since then, I've been an information security architect for a large academic medical center on the West Coast and an information security manager for a large hospital group on the East Coast. In that time, I've seen the impact of anchor bias repeatedly. The first big incident, whether it's an outage or a breach, colors all cybersecurity decisions moving forward and disproportionately affects decisions about mitigation strategies, well past its utility.

All humans have biases, and that's not necessarily a bad thing. Cognitive biases are mental shortcuts that allow us to make decisions rapidly. Sometimes cognitive biases produce inaccurate conclusions, but most of the time they're reasonably reliable. Let's just say the humans who didn't see tigers in the tall grass? Well, they're probably not our ancestors.

Anchor bias is a type of cognitive bias where the first piece of information received is given greater weight than subsequent information, even if the subsequent information is more detailed and has greater authority. Shopping for cars is often fraught with anchor bias. Let's say you have $18,000 for a car. You go to the showroom, and the first vehicle you're shown is $50,000. Wow, that's a lot of money. The second vehicle you view is priced at $30,000. That's still a lot of money. You ask to see another car. The last car you look at is only $20,000. What a great price! But is it? Your budget for a car is $18,000. The price of the first car you were shown is anchoring your perception of what is a reasonable price for a car.

An interesting analysis published in the Proceedings of the Human Factors and Ergonomics Society 2019 Annual Meeting observed the behavior of red team members who were told in advance of their engagement that the target network might include deception; that is to say, some of the hosts on the network might not be real. (Red team is a security term for a group that performs approved penetration testing.) Even though the red team's objective was to find and exploit real hosts, compromising them and exfiltrating their data, the team became fixated on determining host validity. They spent more and more time on validation and less time on exploitation. Practically, this meant the red teams spent more time on the network, which allowed the blue team more opportunities to detect and deter them.

Other research has shown that people believe high model numbers on appliances indicate better quality or more features. Even sports aren't exempt: all other things being equal, researchers found that people assumed players with higher numbers on their jerseys would perform better.

Now I'm going to ask you to do an estimation exercise that Amos Tversky and Daniel Kahneman, two of the most influential figures in behavioral economics, used to illustrate anchor bias. I'm about to show two multiplication problems; in the original study, these were 8 × 7 × 6 × 5 × 4 × 3 × 2 × 1 and 1 × 2 × 3 × 4 × 5 × 6 × 7 × 8. Use the next five seconds to estimate the correct answer to each problem, and if you want to, please share your answers in the chat. OK, time's up. The correct answer to both equations is 40,320. However, in the study, the median estimate for the first equation was 2,250, and the median estimate for the second was 512. The order in which the numbers are presented can create an anchor that affects our ability to estimate accurately, or even to realize that the two equations are the same.
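If you'd like to check the arithmetic behind the exercise, a few lines of Python confirm that both orderings are the same product, 8! = 40,320. (The two orderings shown here are the ones from the published Tversky and Kahneman study; the actual slide isn't reproduced in the transcript.)

```python
from functools import reduce
from operator import mul

# Descending order (the version that produced a higher median estimate)
descending = [8, 7, 6, 5, 4, 3, 2, 1]
# Ascending order (the version that produced a lower median estimate)
ascending = descending[::-1]

# Multiplication is commutative, so the order can't change the result.
product_desc = reduce(mul, descending)
product_asc = reduce(mul, ascending)

print(product_desc, product_asc)  # prints: 40320 40320
```

The anchor comes from the first few partial products a person computes before time runs out: starting with 8 × 7 = 56 feels much bigger than starting with 1 × 2 = 2, so estimates diverge even though the true answer is identical.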

Now, let's look at some examples of anchor bias in cybersecurity. I'll start with an example from my own experience. I'm a network security specialist, and my first real network security tool was a firewall. Firewalls are great, and they can do a lot to improve an organization's security posture.

But they are far from the be-all and end-all of information security, and overreliance on firewalls can leave huge gaps in protection. I have to remember that I have a whole kit of tools, not just my trusty firewall.

Another common anchor in cybersecurity planning is a breach or incident. When a risk is realized, it is traumatic to an organization's psyche. There's a wound, and everyone can become desperate to make sure it never happens again, even to the point of ignoring other risks whose probability or impact is greater. Cybersecurity decisions can also be improperly influenced by something that makes the news, like a splashy breach at a major corporation, or something an executive reads in a magazine or hears at a conference.

The probability and impact of realized risks differ among organizations, and splashy news headlines can pull decision-making emphasis in inappropriate directions. Speaking of the news, right now there's a lot of hype around artificial intelligence and machine learning. There are valid concerns about making sure AI provides true information and does not reflect the biases of the humans who programmed and trained it. There are also a lot of hyperbolic articles about the miracles AI can allegedly perform. So in one company, a decision maker bans the use of AI, which could adversely impact their ability to remain competitive; in another company, the CTO becomes obsessed with the possibilities AI presents and demands that security decisions be heavily biased toward tools and solutions that incorporate AI.

When anchor bias pervades an organization's security planning, the impacts can be severe. De-emphasized risks are left unmitigated. Spending decisions are biased toward the anchor and may result in overspending or underspending relative to the threat. And unmitigated risks are exploited, resulting in theft and reputational damage.

Four key strategies can help organizations minimize the impact of anchor bias on security planning and response: diversity, data, process, and practice. Let's take a look at each of these strategies and see how they can help an organization minimize the impact of anchor bias and make better cybersecurity decisions. Diversity is the most impactful strategy an organization can implement to minimize the impact of anchor bias on cybersecurity planning.

Different people with different backgrounds and experiences have different anchors, and diluting the pool of anchors diminishes each anchor's pull. There's plenty of research showing that diverse teams are more effective at problem solving and decision making than homogeneous teams. Cybersecurity decisions are typically made by representatives from IT staff and risk management and, if you're lucky, legal and compliance. These voices are important, but they're not sufficient for effective cybersecurity planning, and they can crowd out other voices. To avoid anchors and plan for contingencies, broaden the team.

Executive sponsorship is important to security, but active participation by executives and leaders is even more impactful. Leaders have insight into the organization's goals and strategies that illuminates the bigger picture and can help put anchors in perspective. You must also include people who understand the organization's processes.

These folks know how and why things are done the way they are. Failing to include people who understand the processes results in processes and assets being undervalued or even unknown. The business process people let you know how the business breaks when it breaks. But real change starts to happen when end users are included in the cybersecurity planning process.

Now, for IT professionals, end users can seem like the bane of our existence. My father, a retired military intelligence officer, makes a joke about enlisted soldiers that kind of applies to end users as well: put one in a dark room with an anchor and a box of meals ready to eat; come back three days later, and the meals will be gone and the anchor will be broken. If there's a way to break something, end users will find it. Think about that statement for a minute. If there's a way to break something, end users will find it. End users aren't the enemy; they're our secret weapon. End users will find the easiest and most efficient way to get their jobs done. If mechanisms exist to circumvent security precautions, they know how to exploit them. More important, they know why the precautions are being avoided.

Engaging end users gives cybersecurity planners the information they need to ensure the easy way is also the right way.

Data and analytics are also effective tools for combating anchor bias; anchors form in all parts of our brains. Recall the exercise I asked you to do, where you estimated the answers to two equations. It wasn't really a fair request. If you'd had more than five seconds, you would have seen they were the same equation with the numbers ordered differently, and if I'd given you a little more time, you could have actually done the math and known for certain what the answer was. Taking the time to gather hard data and perform analysis provides better outcomes and diminishes the impact of anchors. Sometimes more information is needed than is found on an infographic. Take the time to dig in and discover the truth.

The third strategy is process: treat cybersecurity planning like any other process that is essential to your organization's operation, and document it. Why do businesses define their values?

They do it so that when hard questions arise, they can look to those values to help them make consistent decisions. Cybersecurity planning works the same way: knowing what matters guides the hard decisions that must be made quickly when breaches or incidents occur. Integrate cybersecurity into normal operations.

I've worked at organizations where the cybersecurity folks were known as the "CISOs of no." As a result, they weren't included in business process discussions. There was a joke that if you wanted to kill a project, get IT security involved. The first encounter with IT security was negative.

That first negative encounter adversely impacted cybersecurity, which adversely impacted business operations. When cybersecurity is part of business planning, it's integrated rather than a bolted-on set of solutions. It becomes normal.

Practice is the final recommendation I'm going to make for reducing anchor bias in cybersecurity planning. If cybersecurity planning only happens once a year, it becomes stale; like any skill, you have to practice. Since anchor bias can be formed by events, the news, or an executive's influence, use low-impact events, like media reporting on breaches or new technologies, to practice cybersecurity planning. Finally, I'm going to use an experience from cybersecurity operations to illustrate how effective practice is at reducing anchor bias. At a hospital where IT schedules were largely dictated by a change advisory board, I needed to make changes to the firewalls, and those changes were going to require a reboot.

The last time anyone had made changes requiring a reboot on those firewalls, there was a failure, and it took nearly an hour to get everything working again. The mitigation implemented after the failure was to put an HA pair in place; an HA pair is just two firewalls where one stays up while the other is rebooting. When it was time to make my change, everyone was on edge, and the entire IT staff was on standby. The change went fine, but the next time I needed to make a change on that firewall, I got the same amount of pushback. One successful change wasn't enough to combat the anchor bias from the outage. So I submitted changes for every Tuesday night to test failover on the firewalls, and every test went fine. After two months, the change advisory board told me I didn't need to bother them with normal firewall stuff anymore. The anchor had moved.

In conclusion: anchor bias is real, and it adversely impacts cybersecurity planning. The most effective strategies for minimizing the impact of anchor bias in cybersecurity planning and response are diversity, data, process, and practice. Thank you for attending. I'm interested in hearing your questions and thoughts.