Let’s set aside the important question of whether the US Government’s OT cybersecurity and risk management program, led by CISA, is wise. Instead let’s focus on CISA’s own metrics on CISA’s strategy and programs.

CISA issued a Strategic Plan for FY2023 – 2025 in September 2022. It included 4 goals and 19 objectives. Each objective included a “measurement approach”. This weak term for a metric, and the vague text of each approach, perhaps foreshadowed that the measurement would never be done. CISA would not track whether its efforts were successful.

CISA issued a Strategic Plan for FY2024 – 2026 in August of this year. The plan did not mention, nor could I find anywhere on the CISA site, anything about the outcomes of the FY2023 – 2025 Strategic Plan. What worked? What didn’t? Where are the data and conclusions from measuring achievement of the objectives?

The Strategic Plan for FY2023 – 2025 was a mulligan. 

The Strategic Plan for FY2024 – 2026 has some improvements. Most importantly, the vague “measurement approach” has been replaced with “measures of effectiveness”. It’s more than a change of term. Many of the measures of effectiveness are well defined and can actually be measured.

Will they be measured?

We are two months into FY 2024, and I haven’t been able to find a published measure of effectiveness. How will we know if the CISA-led efforts are working, making a difference, if we don’t have this measurement?

If CISA is serious, it should, at a minimum, publish the measures / metrics at the start of fiscal years 2024 (right now), 2025, 2026, and 2027.


The FY2024 – 2026 Strategic Plan has been streamlined to 3 goals and 9 objectives. Each objective has between 1 and 6 measures of effectiveness.


I want to see CISA failures. We should not hear that everything CISA, or any government agency, company, or conference, is doing is a success. That would mean they are either not being honest or not attempting hard things.

It’s encouraging that CISA recognizes this in the strategy document.

“Achieving Impact Or Failing Fast

Where we determine that a given program, service, or capability is not resulting in expected impacts, we will be disciplined in ‘failing fast’ and making best use of our resources to pivot with agility.”

Whenever I interview someone from CISA or DoE on the S4 Stage, I ask them what they tried that didn’t work, and how they changed the approach or effort based on that failure. I’m still waiting for a real answer rather than the “what’s your greatest weakness” interview-question answer. One test of the strategic plan: does CISA report out when it is failing fast and identifying the pivot?

Favorite Measures Of Effectiveness 

CISA deserves credit for writing some measures of effectiveness that relate to OT cyber risk reduction rather than activity. Here are some potentially great ones, best first and in descending order.

Goal 2.2, Measures 1 and 2

“Measure 1: Increase in the average number of Cybersecurity Performance Goals effectively adopted by organizations across each critical infrastructure sector.”

Commentary: CISA has said the 38 CPGs are the most important actions and controls that every organization should take. Their implementation would reflect the security posture of the nation. This can and should be broken down by industry sector, e.g. pipeline, water, electric, etc. Gathering this data will likely be done by survey. Data scientists can figure out the sampling required to get good numbers with a low margin of error.
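As a back-of-the-envelope illustration of that sampling point (my own sketch, not CISA’s methodology), the standard sample-size formula for estimating a proportion shows roughly how many organizations per sector a survey would need:

```python
import math

def sample_size(margin_of_error: float, confidence_z: float = 1.96,
                expected_proportion: float = 0.5) -> int:
    """Minimum survey sample size to estimate a proportion (e.g., the share
    of organizations in a sector that have adopted a given CPG) within
    +/- margin_of_error. Default z of 1.96 is 95% confidence; p = 0.5 is
    the conservative worst case."""
    p = expected_proportion
    n = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

print(sample_size(0.05))  # +/-5% at 95% confidence -> 385 respondents
print(sample_size(0.03))  # +/-3% at 95% confidence -> 1068 respondents
```

The point: the survey burden per sector is modest and the statistics are well understood, so “we can’t measure it” is not a credible excuse.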

“Measure 2: Where possible, reduction in confirmed incidents in organizations that have adopted a higher number of Cybersecurity Performance Goals.”

Commentary: This is the key metric. Did implementing the CPGs reduce impactful incidents? There is so much you could do with this data, especially if you collect the impact in a number of categories. This measure should create the data that is most needed.
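To make that concrete, here is the kind of comparison this measure would enable. All numbers below are invented for illustration; real data would come from incident reporting tied to CPG adoption surveys:

```python
# Hypothetical confirmed-incident rates per 100 organizations,
# grouped by how many of the 38 CPGs each organization has adopted.
incidents_per_100 = {
    "0-12 CPGs adopted": 14.0,
    "13-25 CPGs adopted": 9.5,
    "26-38 CPGs adopted": 4.2,
}

# Reduction relative to the low-adoption tier
baseline = incidents_per_100["0-12 CPGs adopted"]
for tier, rate in incidents_per_100.items():
    reduction = 100 * (baseline - rate) / baseline
    print(f"{tier}: {rate} per 100 orgs ({reduction:.0f}% below low adopters)")
```

If the high-adoption tier really did show a large drop like this, CISA would finally have evidence that the CPGs reduce risk rather than just generate activity.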

Goal 2.3, Measure 3

Attack Surface Management – Percentage decrease in prevalence of, and time-to-remediate, vulnerabilities in all participating organizations and percentage increase in visibility across all sectors.

Commentary: This is an easy metric to report out since the service generates the data. It’s an implementation metric of a security control, reducing the vulnerable attack surface accessible from the Internet. We should be seeing this number monthly. BTW, this is the topic of tomorrow’s Unsolicited Response Show.
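Since the service generates the raw findings, the two numbers in this measure fall out of simple arithmetic. A minimal sketch with invented records (the field layout is my assumption, not CISA’s actual schema):

```python
from datetime import date
from statistics import median

# Invented findings, not CISA data: (first_seen, remediated) per vulnerability
findings = [
    (date(2023, 10, 2), date(2023, 10, 9)),
    (date(2023, 10, 5), date(2023, 10, 30)),
    (date(2023, 10, 20), None),  # still open
]

open_start, open_end = 120, 90  # open findings at start and end of the period

# Percentage decrease in prevalence over the period
prevalence_decrease = 100 * (open_start - open_end) / open_start

# Median days from first seen to remediated, over closed findings only
ttr_days = [(fixed - seen).days for seen, fixed in findings if fixed]

print(f"prevalence decrease: {prevalence_decrease:.0f}%")        # 25%
print(f"median time-to-remediate: {median(ttr_days):.0f} days")  # 16 days
```

Nothing here requires new collection; it is reporting on data the service already holds, which is why monthly publication is a reasonable ask.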

Goal 3.1, Measure 5

Increase in the number of technology providers that regularly publish security-relevant statistics and trends, such as MFA adoption, use of unsafe legacy protocols, and the percentage of customers using unsupported product versions.

Commentary: The number of vendors who regularly publish is less important than the data they publish. This big, multi-faceted measure could be helpful for OT cyber risk management if the right statistics were gathered. For example, it would be great if CISA could get numbers periodically from Rockwell Automation on the use of CIP Secure as a percentage of all CIP usage, and the same from Schneider Electric for Modbus Secure as a percentage of Modbus TCP usage. Vendors have different security models, for example legacy, standard, and strengthened. What is the breakdown in their installed base? This information could be gathered through survey and sampling as well as through vendor publishing.

I’m also a fan of Goal 1.1, Measure 3 and Goal 1.2, Measures 2 and 3.

The key is to actually measure, publish, fail fast, and pivot. We need the initial measures now. Hopefully many are putting pressure on CISA to meet its own plan. We don’t want to see a new FY 2025 – 2027 plan that completely ignores the FY 2024 – 2026 plan.

While I remain hopeful that CISA will publish some of their measures of effectiveness, historically we can’t rely on this. In next week’s article I’ll list five US OT Cyber Risk Metrics that I’ll be tracking semi-annually beginning in January 2024.