Security is a question of culture – cybersecurity in corporate culture

Cyberattacks often happen as a result of a lack of security culture

When employees circumvent compliance rules and security systems, incidents are inevitable. Although investments in protective technologies are rightly increasing year on year, companies should not assume that this makes cyberattacks a thing of the past. The biggest flaw in the corporate “operating system” is often to overestimate the capabilities of technology and underestimate the creativity of cybercriminals. Most damage is caused not by automated attacks on a company’s IT infrastructure, but by a lack of security culture. And there is more to building one than creating collective awareness through rigid compliance rules. What employees really need is a “culture of security”.

Protection in its purest form

Imagine that, as an employer, you had to get thousands of employees to act in concert – within a fraction of a second. It seems impossible. Yet our body “lives” exactly that, with an impressive success rate. Some 86 billion neurons have to cooperate and communicate with each other continuously to enable us to think, feel and act. This works only under one condition: each cell must know exactly what it is supposed to do, and the chemistry between all of them must be, quite literally, right. If it is not, impairments occur and, in the worst case, major failures.

When it comes to security, this “operating mode” should become the model for security culture. The company forms the central nervous system, and the security culture acts as the synapses – the connection points between the nerve cells. Each employee embodies a highly specialized cell, and even the smallest of them contributes to a functioning “defense system”.

Uncertainty factor: responsibility

According to the latest “Cybersecurity Culture Report”, 90 percent of the 4,800 respondents see a discrepancy between the actual and the desired security culture. Most survey participants consider their own company’s defenses against internal and external attacks to be inadequate. In addition, the majority of employees do not even know how they themselves could improve (data) protection – which in turn stems from uncertainty about their own role and area of responsibility when it comes to security. Only three out of ten employees are actually clear about where that role begins and ends.

Against this background, it becomes clear what is often neglected in discussions of security culture: security must move beyond a purely technical view and place the human – and interpersonal – element above it. This rethinking is all the more crucial today, because cybercriminals have been relying increasingly on “social engineering” methods for some time now. They use psychological tricks such as CEO fraud (the “managing director trick”), manipulating employees and putting them under pressure, sometimes by invoking authority. Human traits such as gratitude, a sense of duty, good faith or pride are mercilessly exploited.

If an employee falls into a cybercriminal’s trap, he or she is in many cases threatened with dismissal. But shouldn’t it be the company’s security culture, rather than the employee, that is called into question?

“Culture of security” as an anchor of stability

“Nothing is safe except technology – least of all the workplace”: this sums up the strategy of many companies, regardless of industry and size, in our performance-oriented society. It is precisely this view that needs to be reversed. If, as a company, we merely follow a set pattern when building a security culture, we must be careful not to lose two qualities of our “human capital”: intuition and unpredictability. Or, to put it provocatively: do you know how many security incidents have been prevented by unpredictable actions?

If you want to establish a serious security culture beyond the usual marketing hot air, as an employer you should first and foremost give employees themselves more security – in a “(corporate) culture of security”. Security is one of the basic human needs; we try to protect everything and everyone we value. So why do so few companies start with their employees? An insecure employee will certainly not leave his or her (mental) path, no matter how well-trodden it is.

Teamwork between people and technology

Queen Elizabeth is said to have once asked, in a conversation about her own funeral: “I’m not sure I like the candles. Can I bring my own?” It is still a similar story with security in most companies: everyone prefers to trust their own experiences and values when dealing with technology and data. Compliance guidelines provide a sensible framework for action, but answers like “Because that’s the way it is” get neither side any further in implementing a lived security culture.

It is also crucial to finally dispel the idea of “super secure” technology. At the same time, the technology used must not become a killjoy, and people must not be seen merely as cogs in the process chain. Charlie Chaplin’s film “Modern Times” shows how people are dulled by having to adapt to technology – and this is no different today in front of the PC than it was back then on the assembly line. Security culture should therefore not be regimented: culture, in its sense of something “to be shaped by oneself”, and security, as the result of intuitive understanding, values, experiences and expectations, are both dynamic parameters.

From lone fighter to security networker

Security culture is not a static concept but a kind of social construct with dynamic factors, from which individual and collective maxims for action can be derived. In addition, the horizon of expectations plays a role that should not be underestimated: the more far-sighted it is, the more security can be expected.

Both technically and organizationally, employees need (self-)certainty – a cross between security and knowledge. They also need the confidence that the company’s security culture will not degenerate into a marginal phenomenon over time, but will remain a management priority worth continuous investment. If a reliable environment is created in this way, the security culture will slowly but surely become second nature to employees. Lone fighters can thus become true security networkers who confidently delete phrases such as “If only I had known” or “It’ll be fine” from their vocabulary and walk through the cyber world with their eyes wide open.

The safe harbor: Between authority and inspiration

In addition, the discussion of security issues is also a discussion about what we do not know. Technical systems are not completely controllable from the outset; only through “malfunctions” do they gradually become more controllable. So if even machines have to “learn”, companies should expect all the less of employees that they function faultlessly. A security culture is created not by pressure but by a safe environment in which, alongside technical and organizational requirements, there is room for active participation, “resonance spaces” and human qualities such as gut feeling and creativity in finding solutions.

“What is certain is that nothing is certain. Not even that,” Joachim Ringelnatz once said. One thing is certain: with people and technology there will always be a residual risk. This also applies to a “lived” security culture, in which knowledge, anticipation, reflection and exchange are among the most important factors. If employees are encouraged in these, cybercriminals will face difficult times.

Here are some organizational and technical tips to help you eliminate certain risks and potential uncertainty factors in advance and ensure greater security:

  • Data thrift as a countermeasure: Social engineering attacks such as CEO fraud can be curbed to some extent by “data stinginess”. Companies should consider whether an omnipresent data footprint – for example, personal e-mail addresses on the website or out-of-office notes (especially from the CEO) – is really necessary.
  • Security and compliance should go hand in hand: On both the organizational and the technical side, those responsible are well advised to make corporate policies and the use of technologies as simple, understandable and binding as possible for employees. Measures such as digital signatures, technology suitable for everyday use, and privacy- and security-friendly default settings are sensible and easy to implement.
  • Plan a realistic budget with human and financial resources, also to cushion peak loads in the company.
  • Put internal work and business processes to the test – for example, in the area of data access: Who is allowed to access which data, when, how and where? Is this really necessary?
  • Strengthen employees in their role: People in the company who are particularly at risk of becoming the target of an attack should be aware of this, and their security awareness should be promoted accordingly. Training and education can be used to jointly develop realistic areas of competence and clear, practical rules of conduct for what to do in the event of a (security) incident.
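The data-access review suggested above can be sketched as a simple role-based audit. The following is a minimal illustration, not a production access-control system; all user names, roles and resource names are hypothetical:

```python
# Sketch of a role-based access review: map roles to permitted
# resources, then audit who can currently reach a given resource.
# Roles, users and resources below are purely illustrative.
ROLE_PERMISSIONS = {
    "hr": {"personnel_records", "payroll"},
    "sales": {"customer_data"},
    "it_admin": {"system_logs", "customer_data", "personnel_records"},
}

USER_ROLES = {
    "alice": "hr",
    "bob": "sales",
    "carol": "it_admin",
}

def can_access(user: str, resource: str) -> bool:
    """Check whether a user's role grants access to a resource."""
    role = USER_ROLES.get(user)
    return resource in ROLE_PERMISSIONS.get(role, set())

def audit_access(resource: str) -> list[str]:
    """List every user who can reach a resource – the starting point
    for asking: is this access really necessary?"""
    return sorted(u for u in USER_ROLES if can_access(u, resource))

print(audit_access("personnel_records"))
```

Running such an audit per resource makes the question "who is allowed to access which data?" answerable at a glance – and surpluses (such as an admin account reaching HR data) become visible and can be challenged.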
Ildikó has worked for ESET since 2011 and deals with cybersecurity every day. She studied Media Culture at Bauhaus-Universität Weimar, with a penchant for film philosophy. Seen through that lens, the IT security world is, for her, shot through with Hollywood-worthy (virtual) duels between good and evil – an eternal cat-and-mouse game à la “Catch Me If You Can”. Her articles are published in (trade) magazines and online.
