A Look at How to Measure Anything in Cybersecurity Risk (Chapter 2)

by Phil Conrad

How to Measure Anything in Cybersecurity Risk, second edition
by Douglas W. Hubbard and Richard Seiersen

PART 1 - Why Cybersecurity Needs Better Measurements for Risk

Chapter 2 - A Measurement Primer for Cybersecurity

This chapter opens with a quote from Malcolm Gladwell in his book, Outliers. "Success is a function of persistence and doggedness and the willingness to work hard for twenty-two minutes to make sense of something that most people would give up on after thirty seconds."

I can be persistent for twenty-two minutes. This is a motivating statement as I think about work I have to do, material I need to present, or events I need to plan. Even when tasks will obviously take longer than twenty-two minutes, I can sort out the work ahead in twenty-two-minute chunks of time. It makes a difficult chore much more manageable.

So what does this quote have to do with measurement in cybersecurity? I don't know. It wasn't specifically addressed in this chapter. But I suppose it has to do with the idea of understanding the concepts presented not only in this chapter but throughout the book concerning measurement. The authors go on to define measurement as a quantitatively expressed reduction of uncertainty based on one or more observations. It is all about reducing uncertainty. They state, "This 'uncertainty reduction' point of view is what is critical to business.... Sometimes even small uncertainty reductions can be worth millions of dollars."

One of the ideas I found most interesting in this chapter is what we can learn from small samples. There are two "rules" they specifically speak to:

  1. Rule of Five
  2. Laplace's Rule of Succession

The Rule of Five - There is a 93.75% chance that the median of a population is between the smallest and largest values in any random sample of five from that population.

For example, if we randomly select five values of anything, "the chance of randomly selecting five values that happen to be all above the median is like flipping a coin and getting heads five times in a row. The chance of getting heads five times in a row in a random coin flip is 1 in 32, or 3.125%." The same is true for getting all tails. The chance of not getting all heads or all tails is then 100% - (3.125% x 2), or 93.75%.
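If you want to see the Rule of Five in action rather than just take the arithmetic on faith, here is a minimal Python simulation. The population, sample counts, and random seed are all made up for illustration; the point is only that the simulated rate lands near the 93.75% the book derives.

```python
import random
import statistics

random.seed(42)

# Hypothetical population of 100,000 values (purely illustrative).
population = [random.gauss(100, 25) for _ in range(100_000)]
true_median = statistics.median(population)

# Repeatedly draw five values at random and check whether the true
# median falls between the sample's smallest and largest values.
trials = 100_000
hits = 0
for _ in range(trials):
    sample = random.sample(population, 5)
    if min(sample) <= true_median <= max(sample):
        hits += 1

print(f"Simulated rate:  {hits / trials:.4f}")
print(f"Theoretical rate: {1 - 2 * (0.5 ** 5):.4f}")  # 100% - (3.125% x 2) = 0.9375
```

Running it, the simulated rate comes out very close to 0.9375, which matches the coin-flip argument above.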

Laplace's Rule of Succession (LRS) - Given that some event occurred m times out of n observations, the probability it will occur in the next observation is (1 + m)/(2 + n).

One of the examples they share is applying it to working at the same organization for six years without having observed a major data breach. "We treat each year like a random draw of a population where a year observed without a data breach is a miss, and a year observed with a data breach is a hit. We assume you had absolutely no information about the rate of data breaches except for your immediate experience at the organization. LRS tells us that the chance of a data breach the next year is (1 + 0)/(2 + 6), or 12.5%."
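The formula is simple enough to write as a one-line function. Here is a small Python sketch (the function name and the data-breach numbers below are just the book's example restated, not code from the book):

```python
def laplace_rule_of_succession(hits: int, observations: int) -> float:
    """Probability the event occurs on the next observation: (1 + m) / (2 + n)."""
    return (1 + hits) / (2 + observations)

# Six years observed, zero major data breaches seen: m = 0, n = 6.
print(laplace_rule_of_succession(0, 6))  # 0.125, i.e. a 12.5% chance next year
```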

These are simple estimation methods, but they will allow you to develop better intuition for quantifying what seems unquantifiable.

Statistics was not my favorite course in college. A lot of new concepts were thrown at me in what seemed like rapid succession. My instructor was of foreign descent and difficult for me to understand, which didn't help matters. But taking twenty-two minutes to digest a couple of estimation methods is doable and helpful in learning about measurement in cybersecurity.


Posted 1/23/24
