You often hear that human beings are the weakest link in the chain of cybersecurity. Roel van Rijsewijk from Thales is very annoyed by this framing. The reason: it places the responsibility not with the people whose job security actually is, but with the end users. That must and can be done differently.
When it comes to cybersecurity, end-user awareness plays an important role these days. If end users understood what is dangerous, the reasoning goes, the organization would be less vulnerable to hacks and other cyber threats. Van Rijsewijk is clearly not a fan of such an approach. “Awareness is often used as an excuse in my field,” he says. You shift the blame for substandard security onto your end users, while organizations would do better to look at themselves.
Basically, most conversations within organizations about cybersecurity are “bad conversations,” Van Rijsewijk indicates. By this he means the way in which security is ‘sold’ internally. “Things go wrong in the area of reverse psychology,” he states. A security officer often unnecessarily inflates the problem, after which the other side starts to play it down; in response, the security officer inflates the problem even more. The result is a very poor dialogue between the two sides of the organization.
Such a poor dialogue will ultimately never yield the desired result: a more secure environment. Piling on extra security measures from the security officer’s side, often a consequence of this dynamic, is also counterproductive. A restrictive security policy always leads to creative ways to bypass it, in Van Rijsewijk’s opinion. Securing an environment 100 percent is not only impossible, it is also undesirable. It only leads to more friction and unwanted improvisation from employees.
Awareness is, as already indicated, often drawn into the discussion when it comes to making organizations more resilient. However, the problem does not lie with the employees, says Van Rijsewijk. The problem lies with bad conversations. It’s certainly not the case that awareness wouldn’t help, but it should never be the primary focus. As an organization you have to have better conversations about cybersecurity with your people. If you go straight for the awareness route, it will eventually turn into paranoia and nobody will dare to do anything anymore. In addition, it hardly ever produces the desired results, but that’s another discussion.
More possibilities than we think
If awareness turns into paranoia and your organization pursues a security policy that is extremely restrictive, you make a mistake on several levels. First of all, you lose sight of the end users and the impact that such a policy has on them. The result is the creative bypasses mentioned earlier. In addition, as an organization you also block things that do not need to be blocked at all.
As an example of the latter, Van Rijsewijk refers to something that happened at Thales during the corona crisis. It is well known that many companies were forced to enable remote access. Thales had a problem with this at one of its SOCs. SOCs have to comply with strict guidelines, so remote access to them was not allowed. During the corona crisis, however, people were often not allowed into the SOC either. Yet everything there kept running, and Thales had the obligation to protect its customers. The result? All of a sudden, remote access was possible. Better still, the security of the SOC wasn’t compromised.
A key component of a successful strategy, one in which you don’t make cybersecurity the problem of end users, is to think from that end user’s perspective. “You have to ask yourself why people use a simple password,” in Van Rijsewijk’s words. That has to be your starting point. It doesn’t have to be extremely difficult for end users to be secure. He mentions PayPal as an example of a company with the right approach. PayPal is extremely simple to use, with a user name and a password. This may not be extremely secure, but PayPal doesn’t make that the problem of the end user. The company has developed its own technology and a team that constantly searches for suspicious transactions and deals with them. So PayPal takes responsibility for this itself.
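PayPal’s actual fraud systems are proprietary, but the idea of moving the burden from the user to the backend can be illustrated with a toy risk-scoring sketch. Everything here (the traits checked, the thresholds, the `needs_review` cutoff) is a hypothetical illustration, not PayPal’s method:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float
    country: str
    hour: int  # hour of day, 0-23

def risk_score(tx: Transaction, home_country: str, typical_max: float) -> int:
    """Toy risk score: each suspicious trait adds one point."""
    score = 0
    if tx.amount > typical_max:     # unusually large amount
        score += 1
    if tx.country != home_country:  # unexpected geography
        score += 1
    if tx.hour < 6:                 # activity at an odd hour
        score += 1
    return score

def needs_review(tx: Transaction, home_country: str = "NL",
                 typical_max: float = 500.0) -> bool:
    # Flag the transaction for a fraud team instead of burdening
    # the user with extra login hurdles up front.
    return risk_score(tx, home_country, typical_max) >= 2
```

The point of the sketch is the division of labour: the user just logs in, and the organization’s own technology and people absorb the risk downstream.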
Shift focus to detection and response, but with follow-up
At the end of the day, organizations work best with minimal security friction for end users, Van Rijsewijk argues. The consequence of this, of course, is that you have to focus more on detection and response (for example EDR). You (deliberately) open up the organization as a whole more than you do when you have very restrictive security policies, or at least try to have those. This shift to detection and response is also a clear step away from awareness, which is a preventive measure. That is, awareness is meant to stop people from falling into the trap in the first place.
Awareness is not absent from Van Rijsewijk’s approach, though. He argues that the focus on detection and response should be pursued in a responsible way. That is to say, as an organization you must also practice responding adequately to detections (preferably by means of Red Teaming). If you practice, people will undoubtedly learn from their mistakes, so there will be more awareness. However, the responsibility no longer rests primarily with the end user, as it does with preventive awareness. Detection and response thus works roughly the same way as the human immune system, Van Rijsewijk argues. You detect something, deal with it, and recover to a higher level of immunity.
In itself, placing the emphasis on detection and response is not new, of course. The emphasis on actually practicing it we hear less often, though. Of course you have to have the people to adequately deal with the output of your EDR. One common problem is that you receive so many alerts that you lose track of them and take far too long to respond. Fewer alerts, and especially as few false positives as possible, are also very important. In general, EDR solutions are not always very good at this. We recently had an interesting conversation about this with Deep Instinct, who would like to see their Deep Learning platform linked to EDR, in order to achieve precisely this. That platform also clearly does not place the responsibility with the end user, by the way.
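The alert-fatigue problem described above is often tackled with simple triage rules before anything reaches an analyst. As a minimal sketch (the severity scale, field names, and deduplication window are assumptions, not any specific EDR product’s API), alerts below a severity floor can be dropped and repeats of the same host/rule pair collapsed:

```python
def triage(alerts, min_severity=3, dedup_window=300):
    """Reduce alert fatigue: drop low-severity alerts and collapse
    repeats of the same (host, rule) pair within a time window.

    Each alert is a dict with 'host', 'rule', 'severity' (1-5)
    and 'ts' (a Unix timestamp)."""
    last_seen = {}  # (host, rule) -> timestamp of last kept alert
    kept = []
    for alert in sorted(alerts, key=lambda a: a["ts"]):
        if alert["severity"] < min_severity:
            continue  # below the severity floor
        key = (alert["host"], alert["rule"])
        prev = last_seen.get(key)
        if prev is not None and alert["ts"] - prev < dedup_window:
            continue  # duplicate within the deduplication window
        last_seen[key] = alert["ts"]
        kept.append(alert)
    return kept
```

Real products use far more sophisticated correlation, but even a filter like this shows why tuning matters: the fewer (and more accurate) the alerts, the faster the response team can act.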
Role for industry and organizations
The approach outlined is one that definitely deserves our sympathy. In theory, we agree with the view that you shouldn’t blame end users within organizations by making them largely responsible for preventing hacks and other attacks via the awareness route. The security industry as a whole, however, should also examine its role in all of this. Suppliers of security or other software solutions shouldn’t just claim that they have their affairs in order, and be done with it, says Van Rijsewijk. He sees that happening too often. If you want to influence user behavior, you can do that as a supplier too. Think, for example, of the green bar when creating passwords, to indicate that a password is strong enough.
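The “green bar” mentioned above is typically driven by a strength heuristic in the signup form. A minimal sketch of such a scorer (the thresholds and the four criteria are assumptions for illustration; production meters such as zxcvbn also check against dictionaries and common patterns):

```python
import string

def password_strength(password: str) -> int:
    """Score a password from 0 to 4; a UI might show the
    'green bar' only at a score of 3 or higher.
    Heuristic only -- not a substitute for dictionary checks."""
    score = 0
    if len(password) >= 12:                       # reasonable length
        score += 1
    if (any(c.islower() for c in password)
            and any(c.isupper() for c in password)):  # mixed case
        score += 1
    if any(c.isdigit() for c in password):        # contains a digit
        score += 1
    if any(c in string.punctuation for c in password):  # symbol
        score += 1
    return score
```

This is exactly the kind of nudge Van Rijsewijk means: the supplier steers the user toward secure behavior without making security the user’s problem.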
Of course, the responsibility does not lie solely with the security industry. Organizations also need to take the necessary steps to improve their security posture. Given that Van Rijsewijk works for Thales, which among other things offers security services, it won’t surprise you that he advises organizations to invest more, and more wisely, in security. Yet there is more to it than a sales pitch, he assures us. In his role as Director of Cyber Defense, he always consciously stays away from such stories in conversations like the one we have with him. Even when he does have sales conversations, he stays away from FUD (Fear, Uncertainty, and Doubt) just to sell something, he clarifies.
Whether or not you believe that Van Rijsewijk is sincere here, he does have an interesting point of view, in our opinion. Security officers should get rid of what he calls the “negative assignment” they usually get. ‘As long as it doesn’t go wrong’ is the guiding principle in that assignment. That inevitably results in very restrictive policies. Van Rijsewijk compares the message of a security officer with that of a dentist who is about to perform a root canal treatment on you. It has to be done, but it’s not something that makes you very happy.
Be risk-seeking instead of risk-avoiding
At the end of the day, security officers need to change their mindset, according to Van Rijsewijk. They need to become more risk-seeking, instead of always being risk-averse. Don’t try to make the negative business case in case something goes wrong, but rather the positive one. In other words, you have to point out the advantages that investing in security offers your organization as a whole. If you start from a risk-averse approach, you can’t really talk about investing anyway, because in the end you don’t really get anything back in the form of an ROI. Of course, you can say that by having good protection against ransomware you don’t have to pay a ransom, but that’s not really an ROI as we know it from ‘normal’ investments.
The example of Paypal we cited earlier could be seen as a positive and risk-seeking way of doing security. That company invests in special software and teams to detect suspicious transactions. As a result, the platform is relatively easy to use. This should then have a positive effect on the user experience and thus lead to more satisfied users. As an organization, you should also take such an approach to your employees. Very few things are off the table in this respect. We have seen what is possible in the past one and a half years in terms of remote access, for example.
In order to make the change from risk aversion to risk seeking, the traditional profile of the security officer no longer works, Van Rijsewijk indicates. Traditionally, it was mainly retrained IT auditors and similar risk-avoiding roles who made it as security officers. This is now changing, he says. A common profile nowadays is that of a hacker, or at least young, dynamic people who think like a hacker. They are generally much less restrictive in their approach.
Awareness is still important
From the above, you could conclude that awareness in the broadest sense of the word is nonsense according to Van Rijsewijk. This is of course an interesting starting point for a discussion, but doesn’t hold up in practice. Awareness definitely is important. For example, there must be awareness in the boardroom that things have to change. That awareness is there, according to Van Rijsewijk. He doesn’t see much awareness when it comes to focusing on the positive business case, though.
End users are also not completely exempt from some emphasis on awareness. This awareness, however, is of a different order than shifting the responsibility ‘because man is simply the weakest link’. If you invest in EDR and you practice how cybersecurity works in practice, then of course you also create awareness, but it is a lot more constructive than when you have people completing training courses just to tick a few boxes.
In the end, the message Van Rijsewijk wants to get across is that you have to look at cybersecurity in a fundamentally different way than before. That starts with a strategic switch from risk-averse to risk-seeking and therefore also with investing more in security. To be able to sell this well internally, as a security officer – or whatever passes for one within your organization – you will have to make a positive business case. Only then will it be justified to release larger security budgets. The benefits for organizations that do it this way are considerable in the long run. Employees will be happier, which leads to fewer creative workarounds and therefore ultimately to a better secured organization. All of this while security officers aren’t perceived as dentists preparing for root canal treatment.
Will nothing ever go wrong again if you adopt a risk-seeking approach in your security strategy? Of course things will go wrong. It will never be 100 percent, but as an organization (and as an industry) you will be in control. That’s not the case if you place much of the responsibility at the feet of the end users under the banner of awareness.