The Quantum Vulnerability Tunneling Effect

I know I had promised to talk about how to implement a risk management program in your small organization, but bear with me for a blog (or two).  Given that my brain has been wrapping itself carefully around risk management for the last few weeks, I have found myself revisiting ideas from my past.  One particular incident this week reminded me of a subject that I’ve talked and written about before.

One of the individuals on my client’s InfoSec team is responsible for vulnerability scanning and management.  He’s quite talented and has good insight into vulnerabilities, but like many others in InfoSec, he suffers from the blinding effects of Quantum Vulnerability Tunneling.

“The What?” you ask.

Yes, you heard me: the Quantum Vulnerability Tunneling Effect.  For those of you not familiar with physics, this is akin to the process whereby a particle can pass through barriers that it should not normally be able to surmount.  So what does that have to do with vulnerabilities?

The barrier we place to separate the vulnerabilities we will address from those we will accept is typically an arbitrary line we set that says, “We’ll address fives and fours, but we’re going to let threes, twos, and ones go for now.”  This is our barrier, and heaven help the vulnerability that thinks it is going to make its way over that line.  Except….

Did you ever do a vulnerability scan, read through the findings, and find yourself stopping on one vulnerability in particular?  You see it and the thought runs through your head, “Oh, Scheiße!”  Suddenly the world around you stops and you focus on the vulnerability.  You know how it can be exploited.  You’ve read about it in magazines, and you’ve even done some of the necessary tricks yourself in a lab using your kit of tools.  In this case the individual at my client’s site had found a vulnerability that the scanner had classified just below the event horizon of “critical vulnerabilities”.

He saw this and upon looking at it had his “Oh, Scheiße!” moment.  He went to his manager and presented his case for why this vulnerability should be remediated.  Immediately.  He proceeded, in very animated fashion, to demonstrate with his hands and his words how this vulnerability could be exploited and how dangerous it was.  His manager had some good replies to his demand, but the individual walked away unsatisfied, probably because the replies talked to business impact and other metrics that did not have meaning to a vulnerability guru.  When all you have is a vulnerability scanner, everything looks like a…

So I sat him down and had a little chat so he could consider the same answer from a different perspective.  I didn’t focus on the impact to business operations since I saw that it was not clicking for him.  What I did was ask him to do a risk assessment of the vulnerability with me:

I asked, “What is the population of threat actors?”  We had already agreed within the group to classify threat actors by loose groupings of individuals so we could compare actor populations.  We agreed on classifications of Universe/Internet, Company Internal, (specific) Department, Local Machine Users, Administrators, and No One.  He replied that it was *anyone* Internal (said with animation).

I asked him, “What is the level of difficulty of the vulnerability, keeping in mind commonly known mitigating controls in our environment?”  He commented that it was a module in Metasploit.  Ah, so it was below HD Moore’s Line.  I asked him how certain simple controls we had in place would mitigate it.  His reply: they would make it pretty difficult but not impossible, and those controls were documented.  So we agreed to put it right at HD Moore’s Line.  (We haven’t really classified difficulty qualitatively yet — we’re still working on that definition — but HD Moore’s Line is the start.)

I asked, “What is the frequency of attempts to exploit this vulnerability?”  We use attempts since there is rarely good data on actual breach counts, but with a good honeypot we’ve found we can estimate the frequency of attempts pretty well.  I’m really warming up to the importance of a honeypot in a company’s environment.  The data you can collect!  And it makes frequency something you can lump into categories.  In this case we didn’t have any data at all since no one would set up an internal honeypot, so we deferred to Threat Actors as a reference point.

I asked, “What is the value of the assets that are vulnerable?”  The individual responded, “All things on the computers!”  I whittled him down to some tangible types of data.

We merged all of his answers into a sentence that he could say.
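
If it helps to see the exercise as something more concrete than a whiteboard conversation, here is a minimal sketch in Python.  The scales, names, and scoring below are purely hypothetical illustrations of the idea, not the actual categories or weights from the client’s assessment.

```python
# Hypothetical ordinal scales -- illustrative only, not the client's real model.
THREAT_ACTORS = ["No One", "Administrators", "Local Machine Users",
                 "Department", "Company Internal", "Universe/Internet"]
DIFFICULTY = ["Theoretical", "Hard", "At HD Moore's Line", "Trivial (public exploit)"]
FREQUENCY = ["Never observed", "Rare", "Occasional", "Frequent"]
ASSET_VALUE = ["Scratch data", "Internal documents", "Regulated or critical data"]

def risk_score(actors, difficulty, frequency, value):
    """Naive additive score: a wider actor population, easier exploitation,
    more frequent attempts, and more valuable assets all push the score up."""
    return (THREAT_ACTORS.index(actors) + DIFFICULTY.index(difficulty)
            + FREQUENCY.index(frequency) + ASSET_VALUE.index(value))

def risk_sentence(actors, difficulty, frequency, value):
    """Merge the answers into one sentence, the way the exercise did."""
    return (f"{actors} actors could attempt this; it sits at '{difficulty}' difficulty, "
            f"attempts are '{frequency}', and it exposes {value.lower()}.")
```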

And then I asked the magic questions.

“How many vulnerabilities have we identified in the environment?”

He gave me a number.

“Using the same risk measures, how many of these vulnerabilities are a greater risk than the one you just pointed out to your manager?”

Silence for a moment, then a sheepish smile came across his face, and he said, “I get it.”
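
The “magic question” is just that same measure applied to the entire findings list.  Continuing the hypothetical sketch from above (the finding names and answers are made up):

```python
# Rank every finding with the same measures and ask how many outrank
# the one that triggered the "Oh, Scheisse!" moment.
findings = {
    "the scary one": ("Company Internal", "At HD Moore's Line",
                      "Never observed", "Internal documents"),
    "forgotten web app": ("Universe/Internet", "Trivial (public exploit)",
                          "Frequent", "Regulated or critical data"),
    # ...the rest of the scan results would go here...
}

scary = risk_score(*findings["the scary one"])
greater = [name for name, answers in findings.items() if risk_score(*answers) > scary]
print(f"{len(greater)} finding(s) carry greater risk than the one on fire.")
```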

I have seen this situation many times before.  In the moment of discovery we get too close to a vulnerability or a threat, and we obsess over it.  We study it intently and learn everything we can about how it works and how to leverage it.  It becomes real because we can understand it and perform at least portions of the attack ourselves.  We focus on it because it is tangible and at the forefront of our mind.  We become obsessed and let that item tunnel its way past any barriers of urgency to place itself at the front of our priorities.  The Quantum Vulnerability Tunneling Effect.  We’ve all fallen prey to it.  We’ve all tunneled our issues to the forefront out of fear and uncertainty.  That’s why I liked using the risk assessment.  It required that he re-examine his assumption that this vulnerability was critical and test it with facts.  It reset the perspective of the vulnerability in relation to everything else it should be considered alongside.  He wasn’t happy that the vulnerability was going to be accepted as a risk, but he also recognized where it belonged in the universe of risks.  He could look at the forest and see that it was filled with trees, and that some were more worth harvesting than others.

I used to do a similar exercise with my team when I was leading security.  We did an in-house risk assessment.  I had the team list all of their perceived priorities regardless of how big or small, how insane or sane, and regardless of whether they thought them urgent or not.  I wanted them to know that their ideas and concerns were going to be considered.  We then went through a highly interactive risk analysis session that resulted in a list of priorities based on those ideas.  We put the top ten that we felt we could accomplish during the year on a board at my desk, and the remainder went in a book on my desk so we could say they never got lost.

Someone on my team would invariably come to my desk, hair on fire, to say they had a risk that *had* to be taken care of right away.  My response was cool and calm.  I would simply ask, “Does it require greater attention than any of the items on that board?”  This would stop them in their tracks and make them think.  They would look at the board, think for a few minutes, and respond with a “Yes” or a “No”.  Usually it was a “No”.  If it was a No, we would pull out my book and write down the issue.  If it was a Yes, I would have them write it on the board where they thought it should go, and put their name next to it.  They could claim the success, or suffer the ridicule of our team if they were way off.  Priorities and perspective were maintained.

The Quantum Vulnerability Tunneling Effect was avoided, we stayed calm and on course, and we could react well when a real emergency came along.

But those are just the effects of thinking in terms of risk.

About Daniel Blander

Information Security consultant who has spent twenty-plus years listening, discussing, designing, and creating solutions that fit the requirements presented. President, Techtonica, Inc.