What’s missing from your Cyber-security Training?

Posted on Oct 03, 2025 by Laurie Ibbs

Fun Fact: Studies in workplace learning show that people forget about 70% of what they learn in training within 24 hours if they don’t actively use it.

This is known as the Ebbinghaus forgetting curve. It’s why “tick-box” cybersecurity training often fails.
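For the curious, the forgetting curve is usually modelled as simple exponential decay, R = e^(-t/S), where S is a "stability" constant. A toy Python sketch follows; the stability value is chosen purely so the curve lands near the ~30% retention figure at 24 hours quoted above, not fitted to any real data:

```python
import math

def retention(hours: float, stability: float = 19.9) -> float:
    """Fraction of material retained after `hours`, using the classic
    exponential model R = e^(-t/S). stability=19.9 is an illustrative
    pick so that retention falls to roughly 30% by the 24-hour mark."""
    return math.exp(-hours / stability)

# Retention drops fast without reinforcement:
for h in (1, 8, 24, 72):
    print(f"after {h:>2}h: {retention(h):.0%} retained")
```

The practical takeaway from the model is that reinforcement (actually using what you learned) resets the curve, which is exactly what annual tick-box training fails to do.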

Our first workshop for this project was a great event. Everyone enjoyed it, and we definitely all got cake!

The review of the results is complete, and we are delighted with what we learned. Our delegates revealed numerous areas of concern. When an adverse event occurs, the conversation invariably circles back to human error: someone responded to the wrong email, reset a password, or misconfigured something.

Blame gets placed at the feet of individuals. Organisations that are capable (and let's not forget that some do not have the resources) respond with structural solutions: annual training or compliance checklists.

This observation came out of conversations between people from various sectors (tech start-ups, charities, consultants, educators) in which we saw a consistent pattern: cybersecurity behaviours are primarily shaped by individual and social factors, not by the structural or organisational training that dominates current response strategies.

We aren't saying that training and compliance are not great strategies! They really can be! But, we all know what eats strategy for breakfast, don’t we?

Culture, or to quote an appropriate definition: "The set of predominating attitudes and behaviours that characterize a group or organization". This is one of our headline findings from the workshop, but let's dig a little deeper.

The social side of security

A recurring observation from participants was that people don’t feel at risk. Whether it’s “we’re too small to be attacked” or “home Wi-Fi is safe,” complacency and misplaced confidence were everywhere. This isn’t about technical literacy. It’s about culture, perception, and influence.

We also heard how social behaviours directly undermine good practices:

  • Sharing passwords to “help out colleagues,” or because it's normalised.

  • Oversharing on social media, sometimes without realising how it exposes us.

  • People-pleasing behaviours, like customer service staff giving away more information than they should.

(Side note: avoid these things!! 👆🏻👆🏻)

These are not simple “errors” but the product of social influence and cultural norms. Re-training these learned behaviours is an uphill struggle precisely because culture is so difficult to direct positively.

Training as a "Tick-Box" Exercise

Participants described most organisational training as passive, irrelevant, or purely compliance-driven.

  • Training is often a “tick-box” activity - something to rush through rather than absorb.

  • “Free” training can be seen as having limited value (even though the reverse may well be true).

  • Certification and compliance are often pursued to win contracts, rather than building a culture of security awareness.

In other words: the why of training doesn’t resonate with individuals. Without cultural reinforcement, awareness campaigns or annual training don’t change behaviours. And so, we never escape the “Ebbinghaus forgetting curve”.

Fear, Blame, and Avoidance

We also heard that fear drives disengagement, and in the worst cases causes inaction. We become frozen in the headlights, unable to respond.

People are afraid of accountability and of making mistakes, so they would rather not engage. We frequently noted that claims of diminished responsibility are a big part of this discussion. Meanwhile, organisations often perceive digital security as “an IT problem”.

The result: a culture where people don’t feel supported, security feels like an obstacle, and individuals avoid responsibility.

What We’re Really Seeing: A Cultural Gap

When you step back, the pattern is clear:

  • Culture is absent. Digital security isn’t embedded in daily routines or values.

  • Social drivers matter most. Norms, trust, and perceptions of risk are shaping behaviours in ways which can be unhealthy.

  • Training is failing. Current approaches are not equipping people to apply skills meaningfully.

If we continue to frame breaches as human error, we will continue to miss the real issue: the cultural and social environment in which those errors occur, and the lack of social approaches to support the individual.

Towards a Healthier Culture

So where does this leave us? The workshop discussions point towards some shifts we need to make:

  • Move from policy-driven to culture-driven approaches. Without reinforcement, we risk reverting to type.

  • Build training that is engaging, role-specific, and socially aware. Awareness alone does not create action.

  • Treat cybersecurity less as compliance and more as collective wellbeing - like wearing seatbelts or eating healthily.

  • Recognise and research social influence: how trust, peer behaviour, and cultural norms actually shape security practices, and determine how we can tap into that.

Until we build a security culture that acknowledges the human and social dimensions, training will remain an expensive tick-box exercise - and people will remain the so-called “weakest link.”

How do we get there?

This is far from clear at the moment, but we hope to start getting some clues in the next part of the project. We are already hard at work trying to come up with a novel approach that brings more clarity and focus to cultural adoption of good practice in digital security and online safety.

However, I see massive parallels with other "safety" culture work. We can perhaps take inspiration from the DuPont Bradley curve and develop a similar model for the security world. Developed in 1995 by Berlin Bradley, a DuPont employee, the model illustrates the relationship between an organization’s safety culture and the occurrence of workplace accidents.

A similar online safety and security model could map adoption of security culture to the occurrence of information breaches. To get global acceptance for such a model, we will need change management programmes including individual coaching and mentoring, as well as reinforcement, to help best practice stick.

Let's not underestimate this. Technological advances may not help us here. This is a human challenge!

Next Episode! Dupont battles the CyberCAKE and a new adversary joins the fray. I am going to be King of the Pirates. 

(I am sure no one reads this far, let's see!)
 
Well, it's that time where we should probably share more detail from the output. So look out for our next post: it will give the full breakdown of the workshop, and maybe cover how we generated the various observations. We have some lovely visualisations to show you.
 
And because this project is ultimately about people, we’d love to hear from you too:
πŸ‘‰ What do you hate about training?
πŸ‘‰ When you think back on great workplace culture, what was it that enabled it? How did it get that way?
 
Drop your thoughts via LinkedIn, use our contact form or send us an email. Your feedback might even help shape our future workshops.
 
Tags: training, culture, workshop, onepiece quotes
