The Energy Sector Cyber Security Roadmap developed by the US Dept of Energy was well received when it first came out in 2006 and was recently revised. Its success led other sectors to follow with a Water Sector Roadmap, a Chemical Sector Roadmap, and various other sector roadmap efforts. It remains to be seen whether these other roadmaps will have an impact. A key factor will be the funding and programs the government and sector organizations put behind the goals highlighted in each roadmap, as the Dept of Energy did. (Full disclosure: Digital Bond’s Bandolier and Portaledge were funded by the Dept of Energy to help address some of the roadmap goals.)
With all this roadmap mania, DHS decided a roadmap of roadmaps was needed. Eventually it became known as the Cross Sector Roadmap for Cybersecurity of Control Systems. When I first heard about this at a 2010 ICSJWG meeting, I must admit I tweeted away with a bit of derision, so I went into the reading skeptical of its value. The document exceeded my expectations.
The document is definitely more accurately titled a “Cross Sector Roadmap” than a roadmap of roadmaps. There are three goals listed in the 47-page document:
- Measure and assess security posture
- Develop and integrate protective solutions
- Detect intrusion and implement response strategies
It doesn’t get much more vanilla than that, and if the document stopped there it would be a wasted effort. But it starts to get interesting when they list 2-, 5- and 10-year milestones for each of these three goals. Some of the milestones are general, such as “implementation of new protective tools and appropriate training”. Others are more specific, such as “development of training for control room operators in identifying and reporting unusual events, breaches, and anomalies from a cyber event”.
The real payoff is in Table 1, beginning on page 4-2. This table defines how success will be measured, on a scale of 1 to 5, for seven specific measures:
- CSET adoption and use
- ISAC or CERT connection
- Employed certified professionals or accredited systems
- Use of the procurement language
- Mandatory security awareness training
- Implemented security standards
- Implemented incident response planning
For each of these measures, 25% or less = 5, 25%-50% = 4, 50%-75% = 3, and so on. The idea is that each sector would be surveyed periodically to measure progress. The likelihood and practicality of this effort can be debated, but credit is due for actually coming out with a clear way to measure progress.
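For the curious, here is a minimal sketch (my own illustration in Python, not code from the roadmap; the function name is hypothetical) of how the quoted bands would translate a sector survey result into a Table 1 score. The bands above 75% aren’t reproduced in this post, so they are left out of the sketch.

```python
def table1_score(percent_meeting_measure: float):
    """Map the percentage of a sector meeting one of the seven measures
    to a Table 1 score, using only the bands quoted above."""
    if percent_meeting_measure <= 25:
        return 5
    if percent_meeting_measure <= 50:
        return 4
    if percent_meeting_measure <= 75:
        return 3
    return None  # bands above 75% are not quoted in this post

# Example: a sector where 40% of asset owners use CSET would score 4
print(table1_score(40))
```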
The document is well written and edited and certainly can’t hurt ICS security efforts in the US. Someone new to the topic will learn from this document. Where I’d fault the effort is on opportunity cost. DHS and ICSJWG spent two years on this, and the effort required to write the documents and track the progress would be substantial and unlikely to happen without an exponential increase in the public and private resources devoted to it. I’ve argued for a while now that these broad efforts are much harder to bring to even partial success than a focused effort on the most critical vendor products and SCADA/DCS installations.
At the last ICSJWG it appeared that, years into this effort, DHS/INL still don’t have a list of which vendor products represent the majority of the SCADA and DCS critical infrastructure by sector. There are a lot of installations and products that should, quite frankly, be paid little attention by the government. It would be nice to have everything secured, but the effort to be comprehensive has diluted progress on the most important systems.
Image by Kevin Hutchinson