“There are three kinds of lies: lies, damned lies, and statistics.” – Mark Twain (purportedly quoting Benjamin Disraeli)

The latest edition of the ICS Monitor, last week’s USA Today articles, and the reemergence of Joe Weiss’s secret database warrant a hard look at the numbers coming out. Specifically, two questions:

  1. Are the numbers intentionally misleading?
  2. Are there additional insights that should be pulled from collected data?

The second question is more interesting and involves less speculation about motives. Consider Joe Weiss’s database of 400 Cyber Incidents (NIST definition). No doubt a lot of labor went into collecting and documenting those Cyber Incidents, and there are trust issues that probably limit the information that can be distributed. That said, the raw number 400 is of little or no help on its own.

Start to break them down by cause of incident (attacker, mistake, malware, etc.), category of impact, qualitative measure of impact across different categories (cost, loss of life, environmental, etc.), and other divisions, and the information begins to be of value. Even then we have to be aware of collection bias: Joe’s background in the electric sector would likely lead to an overweighting of Cyber Incidents in that sector, but looking at each sector as a separate population could still provide some helpful information. A minimal sketch of this kind of breakdown follows.
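
To make that concrete, here is a minimal sketch in Python with pandas. The CSV export and the column names (cause, sector) are hypothetical, since no such public export of the database exists; the point is how quickly a raw count of 400 turns into distributions you can actually reason about.

```python
# Minimal sketch of breaking an incident list into useful distributions.
# The file "incidents.csv" and its columns "cause" and "sector" are
# hypothetical; no such public export of the Weiss database exists.
import pandas as pd

incidents = pd.read_csv("incidents.csv")

# The raw total becomes a distribution: how many incidents per cause?
print(incidents["cause"].value_counts())

# Cross-tabulate cause against sector so each sector can be examined
# as its own population.
by_sector = pd.crosstab(incidents["sector"], incidents["cause"])
print(by_sector)

# Normalize each sector's row to proportions so sectors of different
# sizes can be compared, which helps expose collection bias like the
# electric-sector overweighting mentioned above.
print(by_sector.div(by_sector.sum(axis=1), axis=0).round(2))
```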

ICS-CERT is the champion of intentionally misleading statistics (see the ICS-CERT Monthly Monitor for prime examples), but I’ll cover that in Part 2 of this article. Let’s focus now on how ICS-CERT could provide useful numbers from the data they have.

  • ICS-CERT provides data on the number of vulnerabilities reported. In 2014, they provided only a raw count of 159 and a trend line. Here is some information that would actually be useful, especially if it were tracked year over year:
    • How many of the vendors had a process to deal with the reported vulnerability? This could be broken down further into categories such as a published PGP key, a security@ email address, a dedicated CERT, and others.
    • What percentage of the vulnerabilities were verified as fixed, and by whom: ICS-CERT, a third party, or the disclosing party?
    • A distribution of the timeline from disclosure to fix … especially now that Google has set the bar at 90 days (see the sketch after this list).
    • A severity distribution relative to US critical infrastructure (since it is the US ICS-CERT)
    • … you probably have already thought of more
  • ICS-CERT provided data on “incidents”. It is very hard to hold off on the misleading and just plain wrong parts of this data, but here are some examples of useful information that could be pulled from the 245 incidents in 2014:
    • Impact, similar to the discussion of the Weiss database above
    • Identification and traits of what they called APT/Sophisticated Actors. Somehow they determined that 55% fell into this category. What was the distribution of characteristics that led to those incidents being labeled APT? (A sketch of this breakdown follows the closing paragraph.)
    • Discovery: a distribution of how the attackers were discovered, perhaps broken down by type of attack or by APT/non-APT in their parlance.
    • Most common attack trees.
    • many, many more
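
To illustrate the disclosure-to-fix timeline item above, here is a minimal sketch in Python with pandas. The dataset, file name, and column names (disclosed, fixed) are all assumptions, since ICS-CERT does not publish per-vulnerability dates; the point is how little work stands between the raw data and a genuinely useful distribution.

```python
# Minimal sketch of a disclosure-to-fix timeline distribution.
# "vulns_2014.csv" and its "disclosed"/"fixed" columns are hypothetical;
# ICS-CERT publishes no such per-vulnerability dataset.
import pandas as pd

vulns = pd.read_csv("vulns_2014.csv", parse_dates=["disclosed", "fixed"])

days_to_fix = (vulns["fixed"] - vulns["disclosed"]).dt.days

# Bucket the timeline, with 90 days as a breakpoint since that is the
# bar Google's disclosure policy has set.
buckets = pd.cut(days_to_fix,
                 bins=[0, 30, 90, 180, 365, float("inf")],
                 labels=["<=30d", "31-90d", "91-180d", "181-365d", ">365d"])
print(buckets.value_counts().sort_index())

# Share of fixes that beat the 90-day bar.
print(f"fixed within 90 days: {(days_to_fix <= 90).mean():.0%}")
```

Bucketing on the 90-day boundary makes the comparison to Google’s disclosure deadline immediate.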

If you really have hundreds of incidents that are not simply corporate-network incident noise from big companies that happen to run ICS, then there is a lot of useful information in there.
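
As one example, the APT-characteristics breakdown sketched below would let readers judge the 55% figure for themselves. Every detail here is an assumption: ICS-CERT has published no per-incident trait data, and the trait names are invented purely for illustration.

```python
# Minimal sketch of the APT-characteristics distribution asked for above.
# All records and trait names below are invented; ICS-CERT has not
# published per-incident trait data.
from collections import Counter

incidents = [
    {"label": "APT", "traits": {"custom_malware", "lateral_movement"}},
    {"label": "APT", "traits": {"zero_day", "lateral_movement"}},
    {"label": "non-APT", "traits": {"commodity_malware"}},
    # ... 245 records in the real 2014 dataset
]

# Count how often each trait appears among incidents labeled APT.
trait_counts = Counter()
for inc in incidents:
    if inc["label"] == "APT":
        trait_counts.update(inc["traits"])

apt_total = sum(1 for inc in incidents if inc["label"] == "APT")
for trait, count in trait_counts.most_common():
    print(f"{trait}: {count}/{apt_total} APT incidents")
```

Publishing a distribution like this, rather than a bare 55%, would show exactly which characteristics earned an incident the APT label.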

Coming Soon: Part 2 – The Fine Print and Methodology