25 November 2021

How the Bank of England educates employees about cyber threats

At the UK's central bank, good cyber practice starts with people, says John Scott.

By Zoë Grünewald

As the Bank of England's head of security education, John Scott might be expected to run the most security-conscious household around. But even the most security-conscious people make mistakes, Scott told the New Statesman's Cyber Security in Financial Services Conference yesterday morning. Just a couple of months ago, he explained, he and his wife realised that she had left her house keys in the front door all night.

Rather than revoke her key privilege or chastise her for what could have been a monumental security breach, Scott acknowledged that mistakes happen. In fact, Scott admitted, he had done the same thing just a few weeks before.

The anecdote was the opening analogy of his talk on cyber security culture and his role at the Bank of England. His theory: it is time companies put the reality of human nature at the heart of their security policies.

Scott argued that cyber security policy should be based on "championing" people to make good decisions, and that to do this, we need to understand people – what they need and why they make mistakes. Most institutional approaches to cyber security, he explained, are based on imposing boundaries – such as "three clicks and you're out" policies for phishing emails – which has led too many IT departments to become known as the "Department of No". But, as Scott pointed out, effective cyber security policy doesn't work like this.

Instead, it should be a human-based approach. Scott pointed to what he called "cognitive biases" – the shortcuts in the way humans process information – to understand why people make mistakes: "We do the things that are like the things that we've done before as much as we can, because working out from the core principles every time is tiring."


These biases cause us to make mistakes because, of course, not all tasks are the same. They surface as subconscious reactions – "I have to think fast", "there's too much (or not enough) information", "what should I remember?" – which in turn lead to mistakes, slips or lapses in judgement.

Scott also explored the human approach to risk assessment as a contributing factor. Just as people are generally more afraid of sharks than of mosquitoes – even though, globally, mosquitoes kill far more people – people often underestimate the risk of cyber threats. Sharks look and feel scarier. In the world of cyber, the bad thing that might happen when clicking a phishing link may not look or feel scary. In fact, you may never see the impact of it, or realise it has happened at all, as cyber attacks are often sophisticated enough that companies cannot trace the point of infection.


Scott also pointed out that we forget there are people out there deliberately trying to exploit employees, relying on our propensity for error. It isn't a level playing field: seasoned, expert hackers are going up against people for whom "cyber security" is just a corporate buzz phrase. Despite this, companies still spend far more of their time and resources on securing the technology and servers than on educating and empowering their employees.

So, what does a human-based approach look like? First, Scott spoke of the need for safeguarding processes, such as requiring sign-off, that limit the likelihood of human error. Second, companies should review their current procedures: employees should have checklists to follow, but those checklists must also be iterative and appropriate for different scenarios.

Here Scott pointed to the 2016 hacking of Bangladesh Bank's account with the US Federal Reserve. When the hackers' first withdrawal request arrived, it contained errors. The Federal Reserve followed the procedure set out in its checklist – initially bouncing the request, giving the bank an hour to confirm it was correct, and then releasing the funds after confirmation. Unfortunately, the checklist hadn't considered that it was evening in Bangladesh, an unlikely time for such a withdrawal. Had that consideration been built in, the transaction might have been prevented.
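The checklist fix Scott describes – asking whether a request makes sense in the account holder's local time – can be written down as a few explicit rules. The Python sketch below is purely illustrative: the function name, the thresholds and the business hours are invented for this example, not the Federal Reserve's actual procedure. It simply shows how a time-of-day check sits alongside other sanity checks in a reviewable list.

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo


def review_withdrawal(amount_usd, account_timezone, now_utc,
                      business_start=time(9, 0), business_end=time(17, 0),
                      large_amount=1_000_000):
    """Return a list of flags a human reviewer should check before
    releasing funds. Illustrative only; thresholds are invented."""
    flags = []
    local = now_utc.astimezone(ZoneInfo(account_timezone))

    # Flag requests made outside the account holder's local business hours.
    if not (business_start <= local.time() <= business_end):
        flags.append("outside-business-hours")

    # Flag requests landing on the account holder's local weekend.
    if local.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        flags.append("weekend")

    # Flag unusually large amounts for extra scrutiny.
    if amount_usd > large_amount:
        flags.append("large-amount")

    return flags


# A large request at 15:00 UTC is 21:00 in Dhaka - evening, so it is flagged.
flags = review_withdrawal(
    20_000_000, "Asia/Dhaka",
    datetime(2016, 2, 4, 15, 0, tzinfo=ZoneInfo("UTC")))
print(flags)  # ['outside-business-hours', 'large-amount']
```

The point is not the specific rules but that each one is explicit, so the checklist can be iterated on – exactly the property Scott argues procedures need.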

Finally, and perhaps most importantly, Scott advocated "treating people like adults". Just as he and his wife had made an honest mistake, there is no need to chastise employees when they do the same. As Scott put it, quoting security author Lance Spitzner: "humans are not the weakest link, they're the primary attack vector". Errors are part of human nature and threat actors know that, so by educating and empowering employees, we limit the propensity for serious breaches. And ultimately, after all this, do you really think they'll leave the key in the door again?