It seems that most weeks, there is a mainstream news story about a major organisation whose failures in the cyber realm have translated to widespread real-world embarrassment (and worse).
Whether we’re talking about a data breach, an outage, a failed project or some other issue, it’s quickly becoming a normal part of the news cycle. We almost expect organisations to play fast and loose with their information security.
Such stories provide a rich vein of cautionary tales for those working in information security, many of whom just shake their heads in disbelief at each new disaster. There are plenty of good people out there to ask for advice, thousands of sources of good practice, and yet it seems that, for some reason, these are ignored. It’s worth asking ourselves why that is.
It may be that organisations get tied up in the technology. They focus on the “stuff”, the “boxes”. They get excited about the shiny new “solution” that hits the market and forget the golden rule: security is a people problem, and it cannot be solved by throwing money at it.
It might be that they don’t understand security. Convergence is bringing us to a place where there is no such thing as “cyber security” anymore – there is just “security”.
The quality of our security is directly proportional to the quality of our thinking about security. Perhaps they simply think that “it won’t happen to us”. This demonstrates a failure to understand the risks they face.
We should not forget that security management is actually a subset of risk management. This is where we should start if we want to understand our information security risks and communicate them better in our organisations.
Risk is an interesting subject, linked to psychology, sociology and mathematics. When we discuss cyber security risks, we are really discussing two things: how likely something bad is to happen and how much it is likely to hurt.
The “something bad” is usually linked to either the criticality of the information (in the form of availability) or its sensitivity (confidentiality and integrity).
The “hurt” we are discussing – usually for the organisation – materialises as lost productivity (meaning lost revenue), lost opportunity, regulatory and contractual issues and reputational harm.
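To make that a little more concrete, risk is often expressed as a simple product of likelihood and impact. The sketch below is a minimal, hypothetical scoring helper in Python; the 1-to-5 scales and the banding thresholds are illustrative assumptions, not any particular standard’s method.

```python
# Minimal, illustrative risk scoring: risk = likelihood x impact.
# The 1-5 scales and the band thresholds are hypothetical, not taken from a standard.

def risk_score(likelihood: int, impact: int) -> int:
    """Combine a 1-5 likelihood rating and a 1-5 impact rating into a single score."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("likelihood and impact must be rated 1-5")
    return likelihood * impact

def risk_band(score: int) -> str:
    """Translate a raw score into a coarse band that senior management can read."""
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# Example: a fairly likely event (4) with moderate impact (3) lands in the medium band.
score = risk_score(likelihood=4, impact=3)
print(score, risk_band(score))  # 12 medium
```

The value of a scheme like this is not precision; it is that the output is something a board can compare and act on.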
The problem is that these are abstract concepts. They exist in a future that may not happen, which is why some organisations don’t really understand risk. Instead, it might help to look at information security risk management in a different way.
Let’s start with the idea that something can only happen if conditions allow it. Whether we are talking about life on other planets or a breach of our network, if conditions do not allow it, it won’t happen. So, from a very practical perspective, we are in the business of “condition management”.
If we leave the default usernames and passwords on our routers, for example, we are allowing a condition to exist that will lead to loss on a long enough timeline. We need to apply our resources (time, money and intellect) in such a way that we affect our conditions positively, so that the bad stuff is either less likely or less impactful.
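As a purely illustrative sketch of what “condition management” can look like in practice, the snippet below flags devices in a hypothetical inventory that still carry vendor default credentials; the inventory fields and the defaults list are assumptions made up for this example, not a real product’s API.

```python
# Illustrative "condition management" check: flag devices whose credentials still
# match common vendor defaults. The inventory layout and the defaults list are
# hypothetical examples for this sketch.

KNOWN_DEFAULTS = {("admin", "admin"), ("admin", "password"), ("root", "root")}

inventory = [
    {"host": "edge-router-01", "username": "admin", "password": "admin"},
    {"host": "branch-fw-02", "username": "secops", "password": "Xk9!vR2#"},
]

def default_credential_conditions(devices):
    """Return the hosts where the risky condition (default credentials) still exists."""
    return [d["host"] for d in devices
            if (d["username"], d["password"]) in KNOWN_DEFAULTS]

print(default_credential_conditions(inventory))  # ['edge-router-01']
```

The output is a list of conditions we can remove, which is far easier to act on (and to report upwards) than an abstract probability.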
This then allows us a simpler way of communicating information security risks to our organisations. We should avoid being too technical in our communications with senior management.
Identifying risks
Nobody is interested in how many malformed packets we intercepted this month. They want to know what their risks are and whether we are implementing controls that reduce them to acceptable levels.
They want to know that we are not wasting resources. They want to know that they are compliant with relevant standards, legislation and contractual requirements. They want to know that we are managing the conditions that could lead to harm or loss for the organisation.
Information technology is an essential part of our daily lives, businesses and wider society. It makes sense to understand the risks it presents and to learn how to communicate about them in rational ways that drive engagement and support good corporate governance.
Fortunately, there are ways of learning how to do this. According to a number of industry websites, ISACA’s CRISC (certified in risk and information systems control) was one of the most desirable information security certifications in 2017. It focuses entirely on the identification, assessment, measurement and treatment of information security risks.
Being “good” at IT is not enough. We need to be good at risk, and good at explaining it. Where organisations hit the headlines as a result of uncontrolled information security risks, there is an inevitable blame storm.
If the information security professionals (who are the most knowledgeable, most certified, most experienced people in the organisation on this subject) have failed to explain the risks in a way that senior management can understand and engage with, they need to accept their part in the problem and do something to make sure that it doesn’t happen again.