Nigel Cassidy: Watch out, cybercriminals about!
But it's not so much your technology as your people who are your greatest risk.
I'm Nigel Cassidy and this is the CIPD podcast.
Now, you might be thinking this podcast about cybersecurity is probably one for the IT Department or your Security Officer if you have one. Well, think again, because in 9 out of 10 cases where cybercriminals come knocking, it's the organisation's own workers who unwittingly open the door. So, given that HR isn't by tradition a cybersecurity guardian or a risk manager, how can it ensure that people take more care and maybe even spot or head off a financially devastating attack?
With us, a trio of cybersecurity experts with a deep knowledge of this "people risk" and how to combat it.
Tim Ward is co-founder of ThinkCyber. A security and IT specialist, he applies behavioural theory to transform people's security awareness. Hello Tim!
Tim Ward: Good morning.
NC: Tarquin Folliss was a long-serving senior diplomat, focussing on national security policy. He's now Vice-Chairman of the Security Awareness Special Interest Group, an industry-leading networking forum. Hello Tarquin!
Tarquin Folliss: Good morning.
NC: And Shelby Flora is MD of Accenture Security, with a clutch of Fortune 100 clients. She's driven by a keen interest in the intersection of humans and technology. Hopefully mainly humans today. Hi Shelby!
Shelby Flora: Hello.
NC: I mean threats to steal data or money or to bring an organisation down are just so many and varied, aren't they? I mean, it could be hacking your social media, getting passwords, phishing, sending bogus emails asking for information, ransomware demands, hijacking files or this thing I struggle to get my head round sometimes, distributed denial of service attacks which flood your website and then try to extort money. Now, from what I've read, most incidents, I mean the vast majority, are human-enabled. I mean, people are supposed to be your strongest asset but it looks to me like they're your weakest link.
TF: That's part of the problem that we talk about, and I think Tim will probably go into that in more detail. But, I mean, that is an issue about the culture of organisations and how they look at their people, so I don't think talking about people as the weakest link is probably the most positive way of looking at it. You have to look at how you can make people more resilient. You know, inevitably, the crimes that are committed against organisations are committed by human beings and they feel most comfortable targeting human beings. They are the easiest access point, rather than trying to hack through technology which is actually quite time-consuming and expensive.
SF: Yes, I think it's also interesting to, kind of, understand the history of the human attack surface. Cybersecurity wasn't really top of mind for anyone 10 or 15 years ago but it very much is now, you know, the headline breaches are all too frequent and oftentimes they have been attributed to some sort of human fallibility, whether malicious or negligent in intent. And we saw that number rise drastically, almost like an exponential curve, through the last many years. However, it has been interesting to see that number go down slightly in the last two or three. When you look at the Verizon breach report or the World Economic Forum analysis, it did top out at near 90%, which is really worrisome, but it has slowly gone down, to 87%, then 85%. But that's still very disproportionate to where investments are made in the cyber-resilience of an organisation: less than 5% of budgets are actually spent on shoring up the human firewall. So, it is something to be very much aware of, and to pay attention to figuring out ways to shore it up for an organisation.
NC: So, Tim, from your experience as a security practitioner, how would you assess the state of security awareness, if you like, and the importance of organisational culture in whether it gets sorted?
TW: Yeah, I mean, culture is fundamental to this, and I suppose that's where the HR side of this comes in. I think, our traditional approaches to tackle this have not been that effective because we've thought of this as an education and training issue and I suppose initially it was because people really didn't know anything about this.
So, we've turned to e-learning as the traditional solution. I think now, generally, people do understand some of the threats, and e-learning once a year is obviously not quite keeping up with the fact that those threats are evolving constantly. So, we need to think of new things, and, unfortunately, the industry landed upon quite a clever thing, which was to run these phishing simulations and to test people, essentially to trick people. That created a really useful measure, and cybersecurity's quite hard to measure, so it created this fantastic measure where you could tell your bosses that the click rate was this number and look, with all the controls we're now putting in, we're reducing that click rate. Unfortunately, that's got us into a little bit of a consequences culture, where I think we're almost using this to blame people and say, "look, people are stupid, they're clicking on these links". Now, 9 times out of 10, most of us come to work and our job involves clicking on links, so if we, every now and then, get caught out by sometimes very sophisticated emails, then that's naturally going to happen.
So, I think, it's worth stepping back and thinking about culture and where the whole organisation has a part to play.
And the important thing now, I think, is that everyone feels safe (psychological safety), they feel safe to report, to ask questions and say "look, I don't understand this or I think I've made a mistake - help". And actually, cybersecurity teams need that because the sooner you hear about a problem happening, the more time you've got to try and do something about it.
NC: This is a very powerful point, Shelby, that Tim is making because the reality is modern cybercriminals have mastered these psychological techniques to manipulate us to perform a specific action. I mean, it is a bit much to put it down to somebody's gullibility, isn't it? I've heard there are even corporations that have a zero-tolerance policy, you know, one lapse breaking the security rules and you're out. So, I mean, you're not going to admit to anything, are you? You're just going to cover it up and hope it's all right?
SF: Yeah, I really do think this attack vector, the human firewall or the human attack surface, is due for a bit of a renaissance, as both Tarquin and Tim were alluding to.
NC: Sorry, a "human firewall", this is a thing?
SF: Yes, it's part of that flip in lexicon, but also in sentiment as well, because when you think of organisational culture, it always needs to match how the organisation is wanting to position itself in the market, and as a lot of organisations have gone through this paper-to-digital transformation over the last decade, it is part of the day-to-day. And so, when we're working with organisations, we also like to frame this as skill-building, just as you would expect your leaders, you know, management and above, to have this acumen, to make sound decisions for the business.
Cyber acumen is, quite frankly, the new business acumen and everyone has a role to play. Now this will vary, executives need to be making sure that all the right regulatory obligations are met in this regard and all the assurances are in place, but managers need to be savvy enough to ask the right questions of their teams when they're looking to bring new technology into the environment, as well as just the day-to-day awareness that the bad guys and gals are looking for a way to break in to do the mischievous activities. And oftentimes, humans, when we're knocked out of homeostasis, tend to be susceptible to that. So, I think the reframing of the "human firewall" and skilling as part of being a digital company is really where we're going to get some momentum over the next couple of years.
NC: But, of course, in spite of precautions, this happens to organisations when they don't expect it. Tarquin Folliss, I know people in your forum may have come to you because of some kind of an incident. Can you just talk a little bit about the repercussions for organisations, particularly the people issues that maybe people haven't expected. I mean, there are obvious things you have to do, like shutting down servers and changing passwords and all that, but there's a lot of other issues that will suddenly arise.
TF: I was once asked a question by a journalist, which was "How should the IT function view cybersecurity?", and I wrote back saying "If you're thinking about it just as the IT function, you've got a massive problem in your organisation. This is something that impacts across the whole piece and, in fact, it's very much a Board issue." So, when you talk about an attack or an incident taking place, actually a large number of organisations don't realise they've been attacked until quite a long way into the incident itself, and if they're not prepared for it, if they haven't done the training, the exercising, the running through playbooks, there can be enormous pressure on them to respond, and that's when mistakes happen. So, the impact on an organisation can be financial, it can be reputational. It depends on what has been taken; it actually depends on whether the organisation can identify what's been taken, how it's been attacked and what further vulnerabilities exist.
So, it can be incredibly disruptive, and we tend to think of these incidents as taking place over, you know, when we train and exercise, we do it over a day. The incidents themselves can take months to resolve, and sometimes they cannot be resolved at all. If we look at SolarWinds, for example, we know there are vulnerabilities in their software, but companies are still using it because it's ubiquitous and because they can't get rid of it.
NC: And, Tim Ward, these things can be unexpected: an organisation is using a third-party provider, a managed service provider, somebody who's basically looking after the IT for them, and this third party is hacked. They kind of feel that it's all out of their hands, that they had a vulnerability they didn't even know about.
TW: Yes, I think third-party and supplier risk is becoming a very big topic in cybersecurity at the moment, because you can work really hard to secure everything that feels like it's in your control, but obviously your suppliers are not in your control. I think Target was probably one of the first cases that brought this to people's attention: when they were hacked, it was actually via an air-conditioning supplier that the attackers got in. So that creates a massive risk for organisations, and one that's quite hard to manage.
I suppose the starting point there is to try and flow down some of your approach to security into those organisations, but that can get really difficult if you're talking about very small micro-organisations who are your suppliers, because not only could there be lots of them but they're very small and they haven't got the same sort of money to have the same sort of protections that you might have as a larger organisation.
NC: So, Shelby Flora, you go into an organisation, you start working with them. What's your starting point? Again, particularly with HR and the people side of the business, to get to grips with how they are currently responding to cyber threats and how they can change?
SF: One of the first things we do when we're working with clients is make sure the value of investing in the human firewall is known. And so yes, there's the avoidance of "bad", the avoidance of a breach and all the time and cost that can incur, but there's research, including some that the World Economic Forum and Accenture recently performed, showing that organisations that embed cybersecurity, that win over leaders and the developer community and have the culture of cyber being part of the day-to-day, actually outperform their peers. They add to the top line. So, it's not just an avoidance of "bad", it's a positioning of "good". Oftentimes where organisations have landed is just focussing on "be smart, don't fall for a phishing email", versus how, as a leader, whether you're a manager of a developer community or a manager of a customer channel, cyber, and being aware of it, is actually in the best interests of the organisation.
So, when we speak with our clients, we usually work through four different dimensions. So, yes, there are the culture and the behaviours that you need to instil within an organisation. You need to break it down by persona, because what Joe Bloggs in Finance needs to do is much different from what manager Susan needs, or developers, or whoever's looking after a particular business unit. So, you work with them to understand what behaviours they're actually looking to change, or need to have, within the organisation. Then it's about the skilling piece of it, which is, yes, you can launch nice-looking e-learning for the general workforce, but it's about tailoring it to the needs of the particular groups.
So, we have an adult-beverage brewing company that we were working with, and they had ambitions to be the most connected brewer in the world, and they knew that cyber was a part of their own digital transformation. So, yes, there was a workforce layer where we were trying to educate people about, you know, "don't plug USB sticks into airport charging slots" and be aware of spoof phishing and spear phishing. But for the leaders it was, "here's how you need to respond in the case of an incident, and you are the captain of that ship". So, it's working through the various dimensions, tailoring it per group, and then taking a behavioural-change approach to get there. Because just launching nice-looking e-learning at them is highly ineffective and quite costly.
TW: I think bringing the people side, the HR side, of this into it is really important. So, one of the conversations I have quite a lot with security practitioners is "how can I do my job if the rest of the organisation thinks that this is just an IT problem?". And so, when Shelby's going into an organisation, if that disconnect exists, it can be really, really difficult to help the organisation understand that this isn't just a technical problem. Because, at the end of the day, the assets that are at risk and the problems that are going to happen are business problems, and those assets can be damaged by, I don't know, by flood or earthquake, and we're saying they can also be damaged by a technical issue happening, and so the governance and the structures the business puts around managing that should be normal business structures, if that makes sense.
And I think that's why there's a really important people angle to that, of helping people understand, culturally, that the culture needs to value security, needs to understand security. And there's quite a lot of debate around "do you need to build a certain security culture?" Some people say "oh, you can't change the culture of an organisation, it is what it is", and so a spin on that is to say, "OK, well that culture now needs to start to add on this valuing of security".
And what can be really effective is if you can tie valuing security into the overall mission of the business.
And I think that can work well with some businesses. For example, in the NHS, everyone cares about patients and patient safety and the outcomes for the patients. Well, them losing their data or them not being able to get their operation because of a cyber-attack, that's about the patient too. So, if you can pin the security aspect to the culture and to the main mission of the organisation, then that helps everyone in the organisation understand quite how important this is and the part they've got to play in it as well.
TF: One of the big, big issues for most organisations is trying to get their head round what cyber is. Because it's virtual and nebulous, it's very difficult for people. I've dealt with some of the cleverest people I've come across and they've really struggled to understand what it means. As Tim and Shelby have pointed out, you've got to put it in a business context. If you don't understand cyber in the context of business risk then, as an organisation, you're going to really struggle, and that's probably the most important point to make in this podcast in many ways. If organisations figure these things out, look at it from a business-risk perspective and ask, well, what does a cyber incident do to my ability to deliver my mission, then that goes quite a long way to resolving the cultural issues around how an organisation deals with cyber.
NC: And of course, Tim Ward, the big thing that's changed everything is that so many more people are working at home, if not all the time then some of the time, and lots of platforms have emerged to try and accommodate that, with, I expect you're going to tell me, a lot more risk.
TW: Well, I think what's interesting about that, and what a lot of security awareness practitioners realised, is that it needed us to start educating people on how to think about risk, because we'd taught them about all these risks in a very specific manner, because they were all based at work, and suddenly they're at home. People need to be equipped to think about risk, things like "am I being overheard out the window?". I remember in the first lockdown it was really, really hot and everyone was working from home: you could be overheard out the window, you could be overheard through a wall if you lived in a house with not very thick walls. And if you work in certain organisations, that's quite a significant security risk. So, if you've only trained people to think about very specific risks, and not to think about "how do I evaluate this situation I'm in and think about risk?", that's a problem. But there are also quite a lot of psychological factors that come into play: we're all a bit more relaxed potentially, we haven't walked through the doors, we haven't walked past security guards, we might not be professionally dressed, we haven't got a lanyard on, and so we're perhaps not thinking in such a secure way. We might be allowing our kids to use our device to go to certain pages. So yes, it has created lots of other security risks, and perhaps one of those risks is that slightly more relaxed attitude: you're not thinking in quite such a switched-on way about the threats out there.
SF: Yeah, I think also, remote working, which was obviously spurred on by the pandemic, is a little bit of a double-edged sword. So for a firm, the technologists go, oh my goodness, you have all these devices remoting into the network, and there are the technology controls to think about... but it actually was really beneficial from an education, from a general cyber-hygiene perspective. A lot of organisations started pushing out information and training on hygiene: making sure we reboot our computers on a regular basis, where normally you're forced to do that coming into the office each day, making sure routers are up to date and encrypted properly. You can't help but bring that cyber-hygiene home into our personal lives as well, which then creates more of an attachment to it: this is just the way the world works and this is how we stay safe and secure, not only on work devices but personal devices, there are these easy hygiene things we can do. So there's a little bit of a crossover that occurs and, fundamentally, it helps contextualise it for a lot of folks.
NC: And another thing that was mentioned a bit earlier, Tarquin, was the question of testing people out. This whole issue of either profiling people or, worse, trying to entrap them in some way to test how likely they are to respond to a cyber-attack.
TF: Well, I think there's an issue here, and I know that Tim and Shelby have probably got much greater depth of understanding of this, but I think there's an issue about what you're trying to do with your people. If you trick them, what kind of impact does that have upon them and their attitude towards the organisation? Does it actually help you to get the message across? Does it improve their ability to do the things they need to do? Does it actually make them want to be more secure? So, I think, when you're looking at this, you've got to ask: what's the kind of message that you want to come across, and how does it evolve your culture as an organisation? And profiling is actually quite a sensitive issue. One of the things I would say is that I don't think you should profile individuals; if you're looking at profiles, it's what Shelby talked about earlier, it's about roles, because each role in an organisation carries probably greater or lesser vulnerability to an attack, or opportunity for an attack. An attacker's going to be looking for particular avenues into organisations and, if a business is attuned to who are the most likely people to face that threat, then it can provide tailored education and training and advice and information to help those individuals be more secure.
NC: I know, Shelby, you like to talk about attack surfaces...
SF: Yeah!
NC: We might think that's access points or the server or something like that, but that's people too.
SF: Yes!
Yes, so, if we think about attack surfaces, the bad guys and gals are clever [laughs], they are in this for a reason, and so they are going to find the angles in. If we think about recent and historical events, with Covid we saw a lot of social engineering going on, playing on the anxieties and heartstrings around everything that was happening with the global pandemic. You know, they were using relief messaging and the like to try to trick people into giving up their credentials and such. But it's also interesting what we see as executives become more and more visible. Honestly, with AI, it's gonna make it so much easier, so we fully anticipate spear phishing to go up. And for those who aren't familiar, spear phishing is a very targeted attempt at social engineering, where folks are looking to be a bit devious by using references to something that's publicly available on social media or on the Internet to make you think that it is in fact a real person reaching out in need. So, this brings us back to the attack surface and understanding where your attack surface is. As the HR function works with the security function to know where they need to provide support and coaching and education and structures, it's knowing that it's not a flat attack surface; there will be parts of the organisation that carry more risk: your visible executives, or if you're an organisation that's part of a highly visible initiative. Likewise, your engineering group is always going to be very, very targeted...
NC: And your people, management, your HR...
SF: Exactly
NC: They've got all the employee records...
SF: Exactly. So, knowing that the risk profile will vary throughout the organisation, that's how HR can work with the security organisation and tailor the structures or the training or the coaching or the leadership enablement, to help people be aware of the risk and know what to do in case they do find themselves targeted.
TW: Going back to Tarquin's point about ethics and testing, I think a lot of this comes down to data. As a cybersecurity practitioner, you want to be able to show that you are improving the situation, and the ideal there is to show that you have changed behaviour or reduced incidents, and obviously behaviour is the leading indicator. So, people leapt on phishing simulations as a number that says "look, I ran this test and 20% of my staff clicked on it. I've done some more work, I've trained them, and now only 5% click". And that is so compelling, because we like to measure things in business, don't we? And so that became a really good measure. But if you step back from it, you need to think about what you're actually measuring, because if you make that test really hard, your click rate will go really high again, so it's not actually a particularly good measure. And obviously, you're tricking people. So, what some organisations are now thinking about is "how do we gather other data in order to better understand behaviours?", and this risks going a bit into profiling. So, the important thing, I suppose, is a cultural one: why are you doing this? You're doing this to understand your threat profile and to make sure, as both Tarquin and Shelby said, you target that training and education as effectively as possible. You're not doing it so you can say "You keep making a mistake, you're sacked!" Because if you get into that world, then people are going to clam up; they're not going to share with you, they're not going to help you and they're not going to report things.
TF: The big problem here is that old chestnut of compliance versus assurance. A lot of organisations take on phishing programmes because they're required to by their customers. That's not exactly changing behaviour, it's complying. And the trouble with compliance is that it's always a race to the bottom. It's about the most basic security requirements, but it doesn't necessarily mean that an organisation is secure. So how we change that dynamic is quite important. This goes, I think, back to the whole point about how businesses operate. The other point I'd make is about businesses putting a lot of pressure on their people to achieve objectives or goals. The NHS is a classic example, and I don't think the NHS is trying to do this, but doctors are under enormous pressure, nurses are under enormous pressure. They get lots of emails coming through. They've got to process those emails pretty quickly. And that really puts pressure on them when it comes to checking "well, is that the right link I should be clicking or not?". I think, Tim, you've got some data on how people actually make decisions, and most of the decisions we make aren't rational, are they?
TW: No, no, absolutely. And the bad guys play to that all the time. I think Shelby alluded to it earlier: phishing lures will tend to play to our cognitive biases. So, they'll be using authority against us; reciprocity, where they pretend to give you something so you give something back; social proof. All the, kind of, Amazon voucher-type phishing, where they're trying to get you to go and buy Amazon vouchers and send them the codes, that very much plays to authority, pretending to be the CEO. And because we're all in a rush, most of the time we're relying on shortcuts. We'll look at a few pointers in an email and think "Oh, it's from the CEO, they've asked me to do something, I'd better do it." But also, in that delivery-focussed situation, people will be cutting corners and going round security controls, and so what's really important, if you have a really delivery-focussed organisation, is that you highlight that delivery has to be done securely too, and that that matters to the organisation as well. It's not delivery at all costs. We've had conversations with some government-type stakeholders where they have a really strong delivery focus, and that creates real security problems because everyone's just trying to get the job done, no matter what.
SF: On the idea of getting tripped up: there's one thing we know about human behaviour, people are consistently inconsistent. And so, I want to bring you back to the role the HR function can play in this. Actually, I was just speaking with a healthcare client back in the US. They have a kind of burn-out culture happening within their IT function right now, because the C-suite and the powers that be are running a substantial transformation and they aren't staffed up fully enough, and it's creating this burn-out: folks are going out on medical leave and are working far too long hours. And that poses a cyber risk, because when folks are stressed and tired, some of them are more likely to make innocent mistakes, which is a way in for the bad guys and gals. So, as an HR function that is trying to enable leaders within the organisation to make sure the culture and balance of capacity is right, that is a key area in which they can contribute and shore up the human firewall.
NC: Because, I think you were telling me, Tarquin, that often the chief security officer isn't on the board but HR is. So, they're in a good position here, if only people would listen.
TF: Absolutely, and I also mentioned that in an exercise I was involved in with a big organisation, the head of HR, who was a woman, was making probably the most pertinent comments about how they should be dealing with it and she was being completely ignored. It may be because she was a woman that she was being completely ignored, but I suspect it was mainly because "well, she's HR" and "what does she know about cyber?". I think, given HR's critical role, their central role in dealing with people within the organisation, they also have a critical role to play in cyber.
NC: But, of course, Tim, just playing Devil's advocate for a minute, where HR might be unprepared, things might be a bit different, because people managers have to take a lot on board, as we've heard throughout this discussion. I mean, do most even know the part they're going to have to play in data disclosure, in dealing with breaches, in dealing with the reputational harm? And then there's all this about enforcing data access and policing people who break the rules.
TW: Yes, it's an interesting one, isn't it? I think people managers and HR, they don't need to be experts in cyber, I don't think that's quite what we're saying. They need to understand the people aspects of it, which I would expect to be standard. There are always going to be grievance processes, and disciplinary processes, and processes that deal with stress and things like that. I suppose they just need to understand that those also have cyber implications, or that cyber might be causing some of them. I suppose the most important takeaway from this is that HR professionals should have an open door to cyber professionals and vice versa, because we quite often have conversations with practitioners where there's a little bit of tension: Security know they need to get some communication and training and education out, and then they're having a bit of a fight because HR are potentially saying "well, look, we've got all this other training as well". So there's that tension between the demands of cyber as a department or function and all the other demands. And certainly, in what we try to do, we're very focussed on making the training and awareness as low-impact on people's time as possible, keeping it tiny, short and ongoing. Because, and I know we keep talking about the NHS, but people there really haven't got the time to sit down and do an hour-long cyber course once a year, and frankly there would be very little point, because they would forget it almost instantly. So, you need to think much harder about how you spread that across the year, how you make it tiny little snippets, and how you get it as close to the point of risk as possible. So, it's taking that behavioural lens we've already talked about and applying it to how you help people day-to-day.
NC: OK, well that's Tim's tip of the day, and now we have to bring this to some kind of conclusion. Shelby Flora, what's the biggest lesson from your work with clients? How do you move this forward?
SF: This can seem like an insurmountable problem. You know, human attack surface, human firewall: oh my goodness, that means the entire organisation fundamentally factors into business risk, and into business value. So, as I mentioned, there is going to be some kind of obligatory, regulatory education that has to occur to meet the standards of the industry sectors that organisations are operating in. But then take a very targeted and tailored approach to the areas of the organisation that pose the most risk, as well as value, for cyber. Usually that's leaders, folks in IT, folks in Finance and Accounts Payable. Often there's a lot of social engineering in terms of asking for money to be wired to all corners of the world. So, focus on that, and don't just focus on the training. Think about the structures you need to put in place to help those folks make the right decisions. I think oftentimes we don't think about the behavioural-science components, which are "what are the structures we need to put in place to make it easier?". I use a very classic example: a lot of organisations not only want to prevent phishing, they want folks to report suspected phishing. So, instead of forcing your organisation to remember some random number or some random email box that they're supposed to forward it to, put the Report Phishing button in the email client so it's much easier. These are little behavioural-science things you can do to make it easier for the organisation to make the right cyber decisions. HR will be thinking more like that, whereas the cyber engineers always want to "out-tech" the problem, and HR can help them think about it a bit more dimensionally, and about the structures the organisation can put in place for that.
NC: Great, and maybe a final bit of advice from you, Tarquin Folliss, maybe as a result of all the conversations you've had with people who have suffered from attacks but are maybe wiser now.
TF: So, I think first of all, we should remember that most people aren't security specialists; their function is to provide something else for the company, and now they're being asked to take this on, and to be honest, it's quite alien for most people. So those who are trying to implement a programme should remember: we're here to facilitate, rather than to impose further restrictions, because, as Shelby said, if further restrictions are imposed, people will find ways to work round them. Then there are two things I'd say. Make whatever programme you've got relevant to the people you're targeting, because then they'll take those lessons on; they really will see the value of it. And the other thing, which I think we've just touched on, is this "just" culture. You've got to develop this "just" culture, which means that when somebody does click on a link, or does something where they think they may have made a mistake, they can report it without feeling they're going to get absolutely clobbered by the management. In the end, with an incident, the quicker you know you've had a problem, the easier it is to resolve and the less damage is caused. The longer it's delayed, the more damage it's going to cause, and cyber moves at a phenomenal pace, so "just" culture is a fairly critical and important part of this.
NC: Brilliant, well, I'm off to change all my passwords straight away. Tarquin Folliss, Vice-Chairman of the Security Awareness Special Interest Group, Tim Ward, co-founder of ThinkCyber, and Shelby Flora, MD of Accenture Security: a lot to consider, thank you all very much. I'll leave you with a couple of quotes I came across, you may have heard them before: "Security isn't something you buy, it's something you do", and the other is, "There are only two types of organisations: those that have been hacked and those that don't know it yet". Please go to the CIPD website to check out our excellent back-catalogue on the CIPD podcast page, and subscribe so you don't miss an edition. We've had some very well-received shows recently on AI, on pay transparency and on some of the latest ideas in learning and development. Until next month, from me, Nigel Cassidy, it's goodbye.