Conversational Security Awareness: Putting Humanity into Your Human Risk Management Program
During the recent SANS Security Awareness: Managing Human Risk Summit 2023, I was privileged to co-present with Perry Carpenter, author and Chief Evangelist at KnowBe4. Our presentation focused on putting humanity into a human risk management program, also known as a cyber security awareness, behaviour and culture program.
What do we do?
Perry Carpenter kickstarted the presentation by posing an intriguing question: “What is our ultimate purpose? Why did ‘security awareness’ become a thing in the first place?” When we do so much more than “security awareness”, do we need to rebrand our work? If we examine what security awareness professionals do at the enterprise level, our work involves much more than just delivering awareness content.
Our domain also involves managing human risk, understanding and influencing human behaviour, and transforming security culture. We now use all of these terms – and more – to describe what we do, and yet for most CISOs and security leaders, it is the term “security awareness” which comes first to mind.
Our goals are to win the hearts and minds of our audience and to influence positive behaviours toward cyber security, all in service of our ultimate mandate: reducing human (and business) risk. To meet these objectives, we must build a well-rounded program: one that establishes trusted relationships between the security department and employees, draws on the experiences people share in cyber security, and uses motivating, engaging content to meet learners where they are.
When our content is the face and voice of our CISO and team, we must always consider the emotional impact of that content, and our training, events, meetings, and interactions.
In this regard, Perry noted that it is essential to avoid two common mistakes that can hinder our efforts:
Using “us versus them” language about the people on the other side of our content and training. If we fail to use empathy and active listening, we risk creating enemies and alienating ourselves from our colleagues and partners.
Labelling humans “the first line of defence” or “the last line of defence” places an excess burden of responsibility on the shoulders of our colleagues, which can backfire and ultimately reduce reporting rates. Perry encouraged us to think of people instead as “a critical line of defence”, with businesses using a variety of tools and processes to create a holistic, layered defence.
Perry talked through the lifecycle of extrinsic and intrinsic motivations over time, highlighting that the effectiveness of extrinsic motivators diminishes over time while intrinsic motivation shows the best long-term effectiveness. Building an approach to cyber security which takes this into account can help tackle a crucial challenge: how to change perceptions of cyber risk.
Perception is truth
That was the perfect pass for my part of the presentation.
Before tackling the challenge of perceptions, it's important to note that our brains operate differently from computers. We are not equipped with impeccable memories, and our interpretation of information shapes our decisions.
The curse of knowledge is a cognitive bias in which a communicator wrongly assumes that their audience shares the same background knowledge and understanding, failing to set aside information only they possess. Security professionals frequently fall victim to this bias, resulting in poor communication with employees who are not technology-focused. To overcome this barrier, it is crucial that we use language that speaks to the hearts and minds of our audience.
However, if we want to build a compelling narrative, we must understand the learners’ perceptions of cyber risk and security. Perceptions are highly influential in cyber security culture. How people perceive themselves, how they see the security team, and whether they perceive security as an important value and priority for the organization are all paramount factors to consider if we want to understand the security culture of an organization.
One of the most important perceptions is whether people think cyber security is important to the leadership in their organization. When executives lead by example, and their actions match their words, they become the influencers for fostering a positive security culture across the enterprise.
The latest ClubCISO report findings reflect this sentiment:
Most CISOs feel their security culture is an ongoing priority and has improved, yet only 21% actually measure security culture. This raises the question of what is informing the perception that security culture has improved.
Leadership endorsement of security culture is seen as the most important element of fostering a better security culture, followed by a proactive “report it” culture. Perceptions of leadership endorsement of security are more influential than simulated phishing, tailored training, security champions, gamification, or giveaways.
When it comes to behavior change, the perception of ourselves is crucial. Research suggests that the most important factor in security culture is self-efficacy - a person’s belief that they can successfully execute the recommended behaviour to reach the desired outcome.
Self-efficacy is a reliable predictor of cyber security intention and behaviour, and interventions that seek to improve self-efficacy are more likely to yield positive results than those that merely stress the threat. Raising awareness of threats is fundamental, but it will only take your cyber security culture so far. As we cover in our guide to cyber security culture, the true game-changer is empowering people, raising their confidence, and giving them the tools and guidance to practice more secure behaviours.
Of course, a lot of this comes down to professional skills – communication, empathy, and how we help people develop their self-efficacy. The importance of these skills is recognised more than ever in security, but does the industry truly value them as much as so-called ‘hard skills’? Why do we continue to call them "soft skills" when most people acknowledge they are among the hardest of all to practice?
These human/professional skills are also crucial in overcoming the curse of knowledge, the bias I opened with and one we are all susceptible to. Closely related is the illusory superiority bias, also known as the Lake Wobegon effect. According to social psychology, illusory superiority is the tendency to overestimate one's own qualities and abilities relative to those of other people. A classic example relates to driving: most people rate their driving skills as better than average, which is statistically impossible, since a majority cannot all be better than average.
In security, this translates into a false sense that we would never get hacked or be susceptible to social engineering. Of course, we are just as susceptible – if not more so – because of that over-confidence.
Speaking at the SANS Security Awareness Summit was a blast. Every speaker took the discussion about managing human risk and empowering our colleagues and communities a step further. Individuals and organizations will learn a great deal by watching the on-demand recordings when they become available.
In the meantime, check out our Ultimate Guide to Cyber Security Culture, where you can learn how to understand, measure, and transform your security culture. You can either read it online or register to receive it in your inbox as a resource to keep coming back to.
I have delivered over 80 keynotes about the human side of cyber security in more than 25 countries, to organizations including NATO, World Government Summit and RSA. Find out more and get in touch to book me for your next event.