Guest author Andrew Ginter is the Director of Industrial Security at Waterfall Security Solutions, the makers of hardware-enforced unidirectional security gateways.
The popular press cites an “alarming” statistic from time to time – the “dramatic” increase in cyber-security vulnerabilities being reported in industrial control system components. 129 were reported in 2011, vs only 15 in 2010 and 14 in 2009. Those of us in the industry of course groan when we read nonsense like this. We know the truth to be rather more “dramatic.”
How bad is SCADA security really? Let’s do the math.
About 18 months ago I learned of a security initiative by a major ICS vendor. One of their products, in one of their vertical markets, consisted of about 2 million lines of C/C++ code, which is pretty typical of such products. The vendor had mechanically searched the source code and found some 50,000 uses of buffer-overflow-capable C library functions such as “strcpy()” and “sprintf().” Rather than try to evaluate every one of these snippets of code to identify and repair the true vulnerabilities, the vendor had decided to simply replace all of those calls with bounds-checking versions of those functions over the course of three years. Commendable.
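To make that concrete, here is a minimal sketch of the kind of change involved. This is my own illustration, not the vendor’s code; the fixed-size buffer and the tag-handling scenario are invented for the example.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical illustration, not the vendor's code: a value received
 * off the network is copied into a fixed-size buffer. */
void handle_tag_unsafe(const char *tag_from_network)
{
    char name[32];
    /* Classic overflow: if the incoming string is longer than 31
     * characters, strcpy() writes past the end of 'name'. */
    strcpy(name, tag_from_network);
    printf("updating tag %s\n", name);
}

void handle_tag_bounded(const char *tag_from_network)
{
    char name[32];
    /* Bounds-checked replacement: snprintf() writes at most
     * sizeof(name) bytes, including the terminating NUL. */
    snprintf(name, sizeof(name), "%s", tag_from_network);
    printf("updating tag %s\n", name);
}
```

Now multiply that kind of judgement call by 50,000 occurrences, and a three-year schedule for the mechanical replacement starts to look sensible.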
But let’s do a back-of-the-envelope calculation: how many vulnerabilities are waiting to be discovered in other industrial products?
- Let’s assume there are at least ten major ICS software vendors in the world
- Each with software product lines in at least three vertical markets
- Each product line consisting of at least five products the size of the 2,000,000-line example above
- And let’s guess that 3/4 of those products are written in C/C++
- Let’s also assume that only 2% of those overflow-capable C function calls represent real buffer-overflow vulnerabilities.
And let’s ignore PLCs, hardened network gear, hard-coded passwords, smaller software packages, smaller vendors, and everything else except buffer overflow vulnerabilities in major industrial software products. The back of the envelope reads:
50,000 * 2% * (10 * 3 * 5 * 0.75) = 112,500 vulnerabilities
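For anyone who wants to fiddle with the assumptions, here is the same back-of-the-envelope estimate as a few lines of C; the variable names are mine, and the values are just the assumptions listed above.

```c
#include <stdio.h>

int main(void)
{
    /* Assumptions from the list above. */
    double risky_calls_per_product = 50000;  /* overflow-capable library calls */
    double real_vuln_rate          = 0.02;   /* 2% of those are real vulnerabilities */
    double vendors                 = 10;
    double verticals_per_vendor    = 3;
    double products_per_vertical   = 5;
    double fraction_in_c_cpp       = 0.75;

    double c_products = vendors * verticals_per_vendor
                      * products_per_vertical * fraction_in_c_cpp;   /* 112.5 */
    double estimate   = risky_calls_per_product * real_vuln_rate * c_products;

    printf("estimated latent vulnerabilities: %.0f\n", estimate);   /* 112500 */
    return 0;
}
```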
That’s a conservative estimate, so there are at least 100,000 vulnerabilities waiting to be found out there. This is consistent with reports from security researchers who say they generally need to spend only a couple of hours with each industrial software product they look at to come up with their first half-dozen vulnerabilities.
What Does This Mean?
What this means, for starters, is that the count of 129 vulnerabilities reported in 2011 is neither “dramatic” nor “alarming.” That count means only that security researchers have finally started to turn their attention to vulnerable ICS products. For the foreseeable future, we should expect reported vulnerability counts to reflect the amount of attention ICS products get, not the increasing or decreasing quality or security of ICS products and systems.
Some additional context: pretty much all industrial systems use plain-text protocols. Encryption and authentication are sometimes considered over WAN or wireless links, and often not even then. Further, industrial components are generally deployed for very long periods of time. After all, are you willing to throw out your refrigerator every three years because its CPU is out of date? Why should we expect industrial sites to replace equipment on that schedule just because it has embedded processors? So given that there are enormous numbers of vulnerabilities waiting to be discovered in industrial components, given that vulnerable plain-text protocols are used everywhere, and given that old equipment vulnerable to new attacks will always be found on industrial networks, what should we do?
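To be concrete about what “plain text” means here, below is a sketch of the raw bytes of a typical request in one such protocol. Modbus/TCP is just one example, chosen because it is so widely deployed; the argument does not depend on any particular protocol. The point is what the message does not contain: no credentials, no session key, no integrity check.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustration: the raw bytes of a Modbus/TCP "read holding registers"
 * request. Anything that can reach TCP port 502 on the device can send
 * this; there is no authentication or encryption anywhere in the frame. */
static const uint8_t read_holding_registers[] = {
    0x00, 0x01,  /* transaction id */
    0x00, 0x00,  /* protocol id (0 = Modbus) */
    0x00, 0x06,  /* number of bytes that follow */
    0x01,        /* unit (slave) id */
    0x03,        /* function code: read holding registers */
    0x00, 0x00,  /* starting register address */
    0x00, 0x0A   /* number of registers to read (10) */
};

int main(void)
{
    for (size_t i = 0; i < sizeof(read_holding_registers); i++)
        printf("%02X ", (unsigned)read_holding_registers[i]);
    printf("\n");
    return 0;
}
```

With that picture in mind, back to the question.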
The answer is clear. Yes, we need to keep fighting the fight to make control systems more secure, but we cannot be under any illusions that product upgrades, security updates or even encrypted communications will make us magically safe. We must assume that control system software is, and will always be, deeply vulnerable. We must design systems and networks to operate safely in spite of these vulnerabilities. Terms like “fail-safe” and “fool-proof” spring to mind. “Intrinsically safe” springs to mind as well, but it has a specific meaning in the safety world. Maybe “intrinsically secure”?
In particular, strong perimeter protection will always be needed in control systems, which is one of the reasons I find the recent rants against old-fashioned air gaps so very frustrating. You want to be a leader, not a follower, in the SCADA security world? Start asking “How can we block entire classes of attacks from threatening our systems?” rather than just “How can I patch the latest vulnerability somebody has finally told the world about?”
Image by Tom Raftery