Fun fact: In 1990, John Romkey and Simon Hackett created one of the world’s first “Internet of Things” devices: an Internet-connected toaster. Using TCP/IP and SNMP, they could turn it on remotely and later even added a robotic arm to drop in the bread. Branded as the “Sunbeam Deluxe Automatic Radiant Control Toaster”, it certainly sounds like something straight out of The Jetsons’ promise of “the world of tomorrow, today.”
However, the real-world manifestation of this device was far more cyberpunk than classic retro-futuristic: exposed wires, low-level protocols, hacked-together automation, and a toaster doing things no toaster was ever intended to do.
It was gloriously over-engineered, the direct opposite of a “keep it simple” philosophy.
Fast forward to today, and internet-connected devices are everywhere: fridges, washing machines, light bulbs, even toothbrushes. However, digital literacy and basic security training have not kept pace. As a result, the social knowledge of how to handle this technology, especially the security implications, is mostly missing in action.
So the future really did arrive, and it really isn't evenly distributed. Many of us are still trying to catch up and come to terms with it. And some internet-connected devices are frankly baffling in their very existence. Case in point: the internet-connected wine bottle.
Let's step back from the gadgets and gimmicks for a second. This post is about our fourth-place theme: Training, Knowledge and Literacy.
If you’ve only just joined us, you can catch up on the previous instalment, where we explored how "Risk Perception" emerged as Theme 5, and took a closer look at how the workshop itself ran.
Quickly though, let's recap what the dots mean.
• Green shows that our experts felt that they understood the area and had good evidence to back it up.
• Yellow shows an anecdotal understanding that the topic is involved, but with less evidence to support the concept.
• Red denotes that the topic is important and needs more exploration and discussion.
The votes were tallied from items contributed by our delegates, which our lovely analysts grouped and collated after the event. Let's take a look!

This subject was given a lot of airtime in the discussion. One notable contribution was a ticket that stated:
"52% of working adults can’t do 20 essential digital skills for work."
In telling fashion, it earned a green vote. A quick search today pulls up similar findings, including reports highlighting that many people still struggle with the basics. As one contributor put it: "People don't know how to do the basics, and staying safe online is one of the least understood categories of the EDSF."
Another expert captured the issue even more succinctly with a single word: "Ignorance."
This perspective was echoed across all the groups working on the activity. We talked about the skills gap being driven by digital exclusion. Our delegates were also critical of training techniques used across the industry, approaches that seek to enforce compliance rather than spark curiosity and inclusion.
Training is a two-way street. The content needs to be consumable and meet delegates at their level. However, it should also encourage them to explore something new. Not everyone will have an innate interest in technology, and certainly not the niche motivations that lead someone more deeply into digital security. But clearly there is a problem with equipping people to cope in a world where understanding digital security matters more than ever.
Training crops up as a prominent topic in some of the later analysis around "engagement" or simply getting people to care in the first place. I’ll leave the deeper dive for a future article, but you might have already seen a previous post where we explored why so much training fails to land or improve real-world resilience. There is loads more to delve into on this topic.
When it comes to "knowledge and literacy" in online security it's fair to say that both are generally lacking for most people. And this seems closely tied to the "someone else’s problem" attitude we discussed in the last theme. At first glance, the two feel inseparable.
So how does this look through the kaleidoscopic post-modern lens of Jetsons vs Cyberpunk? Ah, did you miss that meeting?
To recap, there's the shiny, effortless future (the world of The Jetsons) where everything is modern, sleek, and seemingly foolproof. Then there's the other timeline, the one that perhaps looks a little more like our own reality: a cyberpunk world of overstretched, time-starved people, constantly hounded by their digital exhaust. It's the backdrop of countless sci-fi novels and films.
Both visions are forms of retro-futurism: the future imagined from the vantage point of the past. They're slightly off-target, of course, but they still capture something important: a nostalgic sense of a utopia we never reached, or a near-dystopia we're uncomfortably close to living in.
In The Jetsons the "future" arrives user-friendly. In the mid-20th century, futuristic technologies were imagined as completely effortless: push-button control, flying cars that chauffeur you to work or the golf course, and dinner that appears from a slot in the wall in an instant.
That optimism implied we wouldn't need to be trained. Human adaptation would be automatic. It would be intuitive, ready-made, almost rehydrated, just in time for use. Learning, understanding and grubby hands-on maintenance work would be secondary concerns, or non-existent.
In reality, we have found the opposite. Every new wave of technology (from automation to AI) has increased the demand for training, knowledge and digital literacy. The sleek, ready-made, self-healing technology of fiction rarely survives contact with the real world.
For the cyberpunk network slicer (or in our world, the stereotypical "hoodie wearing hacker") knowledge, and the way they wield it, carries a deep mystique. In a strangely anachronistic fashion, it's usually passed down by word of mouth. It's traded through whispered secrets, shared rituals, and tight-knit cliques.
Mentor figures arise in the fictional depictions. They take the “rookie” off the street and give them a chance to learn their secrets. These mentors are portrayed as almost shamanistic, with deep personal rituals they undertake before the run. Some are artificial constructs of friends who are now dead; a personality backup taken just before the job that went wrong.
These characters know the toll their world is taking on them emotionally and physically. They willingly suffer for it. They hone their skills constantly to react fast enough to avoid getting caught. Once they get the score, they can use the money to forget the moral and legal lines they crossed in the process, but they generally go back for more. It's precisely because their world has become so complex that their skills are valuable, and the temptation to game the system, for reward and thrill-seeking validation, is irresistible.
This fact exposes a historical blind spot: we keep inventing complex systems, but we rarely invest in preparing people to live within them. We don't lower the cognitive burden for the everyday person. New developments and advances seem only to increase complexity, wave after wave.
The technologies shaping our world today are wildly complex and incredibly powerful: AI, biotech, data economies, algorithmic decision-making, always-on surveillance. Seen through the Jetsons–Cyberpunk lens, the lesson becomes clear: training isn't just technical anymore. It's ethical. It's emotional. It's about facilitating how we reason about such arcane subject matter. Fundamentally, it should focus on helping people coexist with their tools, not merely operate them.
At first glance, the IT industry isn't doing a great job of that. If the statistics we found earlier are even close to accurate, then we're facing an enormous challenge. In the developed world alone, a huge proportion of people lack the digital skills needed to navigate everyday life, and we can safely assume the gap is even wider globally.
Preparing a nation (or the world) for an "always connected, always collecting" digital environment is a monumental task, and one we've barely started. We keep gathering more data and building more systems, but to what end? And who gets to decide? More importantly: how can the average person make informed decisions when they haven't been taught how?
Next up: Fear, uncertainty, doubt!
Third up, we have the next prominent theme, centred around fear and the unknown. Stay tuned: this is fun for me to write and explain in my own idiosyncratic way, but I am also deeply aware that it may not be landing with people more generally. Let me know.
The project is ultimately about everyday folk coping with a future we didn't really expect, so we'd love to hear from you too:
👉 Is it making sense yet?
👉 How long does it take you to really deeply understand a new shiny toy?
Drop your thoughts via LinkedIn, use our contact form or send us an email. Get in touch and be my guide!