Wall St. Journal and NYTimes interest in Information Security

The subject of Information Security and its risks to the enterprise is becoming more mainstream. Last week, the World Economic Forum called out Cyber Attacks as a top risk. Today both the Wall St. Journal and the New York Times have significant information security articles:

Bassam Alghanim's Email-Hacking Allegations Against His Brother, Kutayba, Expose Hackers-For-Hire Trade – WSJ.com.

Flaws in Videoconferencing Systems Make Boardrooms Vulnerable – NYTimes.com.

The Top 10 Security Questions Your CEO Should Ask — CIOUpdate.com

The Top 10 Security Questions Your CEO Should Ask — CIOUpdate.com.

From PwC, here are the top 10 questions your CEO should be asking you:

  1. Who is accountable for protecting our critical information?
  2. How do we define our key security objectives to ensure they remain relevant?
  3. How do we evaluate the effectiveness of our security program?
  4. How do we monitor our systems and prevent breaches?
  5. What is our plan for responding to a security breach?
  6. How do we train employees to view security as their responsibility?
  7. How do we take advantage of cloud computing and still protect our information assets?
  8. Are we spending our money on the right things?
  9. How can we ensure that we comply with regulatory requirements and industry standards in the most cost-effective, efficient manner?
  10. How do we meet expectations regarding data privacy?

This article provides a paragraph or two on each one of these questions.

What is Information Security: New School Primer « The New School of Information Security

What is Information Security: New School Primer « The New School of Information Security.

I would like to comment on each of the three components of Alex’s “primer” on Information Security.

First, InfoSec is a hypothetical construct. It is something we can all talk about, but it is not directly observable, and therefore not directly measurable the way that, say, speed can be described in km/hr. "Directly" should be stressed, because there are many hypothetical constructs of subjective value for which we do create measurements and measurement scales in order to achieve a state of (high) intersubjectivity between observers (I don't love that Wikipedia definition; I use the term to mean that you and I can understand roughly the same thing in the same way).

Clearly InfoSec cannot be measured like speed or acceleration or weight. Therefore I would agree with Alex’s classification.

Second, security is not an engineering discipline, per se. Our industry treats it as such because most of us come from that background, and because the easiest way to try to become "more secure" is to buy a new engineering solution (security product marketing). But the bankruptcy of this way of thinking is evident in both our budgets and our standards. A security management approach focused solely on engineering fails primarily because of the "intelligent," adaptable attacker.

Again, clearly InfoSec involves people and is therefore more than a purely engineering exercise like building a bridge. On the other hand, if you look at the statistics from the Verizon Business 2010 Data Breach Investigations Report (page 3), 85% of the analyzed attacks were not considered highly difficult. In other words, if "sound" security engineering practices were applied, the number of breaches would decline dramatically.

This is why we at Cymbel have embraced the SANS 20 Critical Security Controls for Effective Cyber Defense.

Finally, InfoSec is a subset of Information Risk Management (IRM). IRM takes what we know about "secure" and adds concepts like probable impacts and resource allocation strategies. This can be confusing because of the many definitions of the word "risk" in the English language, but that's a post for a different day.

This is the part of Alex’s primer with which I have the most concern – "probable impacts." The problem is that estimating probabilities with respect to exploits is almost totally subjective, and there is still far too little available data from which to estimate probabilities. On the other hand, there is enough information about successful exploits and threats in the wild to give infosec teams a plan to move forward, like the SANS 20 Critical Controls.

My biggest concern is Alex referencing FAIR, Factor Analysis of Information Risk, in a positive light. From my perspective, any tool that, when used by two independent groups analyzing the same environment in different rooms, can generate wildly different results is simply not valid. Richard Bejtlich provided a thoughtful analysis of FAIR in 2007, here and here.

Bejtlich shows that FAIR is just a more elaborate version of ALE, Annual Loss Expectancy. For a more detailed analysis of the shortcomings of ALE, see Security Metrics by Andrew Jaquith, page 31. In summary, the problems with ALE are:

  • The inherent difficulty of modeling outliers
  • The lack of data for estimating probabilities of occurrence or loss expectancies
  • The sensitivity of the ALE model to small changes in assumptions
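That last point is easy to demonstrate. The ALE model is just single loss expectancy (SLE) multiplied by annualized rate of occurrence (ARO); the sketch below uses entirely hypothetical figures, but it shows how modest shifts in either assumption swing the result by an order of magnitude:

```python
# Minimal sketch of Annual Loss Expectancy (ALE), the model FAIR elaborates on.
# All dollar figures and rates below are hypothetical, chosen only to
# illustrate how sensitive the output is to the input assumptions.

def ale(single_loss_expectancy, annual_rate_of_occurrence):
    """ALE = SLE x ARO: expected annual loss from one threat scenario."""
    return single_loss_expectancy * annual_rate_of_occurrence

# Baseline estimate: a $200,000 breach expected once every four years.
baseline = ale(200_000, 0.25)

# Nudge each assumption by an amount well within honest disagreement
# between two analysts, and the "answer" moves dramatically.
pessimistic = ale(300_000, 0.5)   # 3x the baseline
optimistic = ale(150_000, 0.1)    # less than a third of the baseline

print(baseline, pessimistic, optimistic)
```

With no real data to anchor SLE or ARO, both analysts can defend their inputs, yet the budgets their numbers justify differ by 10x, which is exactly the two-groups-two-answers problem described above.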

I am certainly not saying that there are no valid methods of measuring risk; it's just that I have not seen any that work effectively. I am intrigued by Douglas Hubbard's theories expressed in his two books, How to Measure Anything and The Failure of Risk Management. Anyone using them? I would love to hear your results.

I look forward to Alex’s post on Risk.