First, I’m excited to announce that Idaho National Laboratory (INL) will be running the SBOM Challenge at S4x23, next Feb 14-16 in Miami South Beach. Virginia Wright and Ethan Huffman will be leading the team there. We learned from our two OT Detection Challenges that the S4 team is fully occupied putting on the event itself, so a challenge like this needs an independent group to create and run it, and I can’t think of any group better qualified than INL.
Judging by VC funding, the next hot OT security product segment is securing the OT software and firmware supply chain. A big part of this involves Software Bills of Materials (SBOMs). These products purport to enumerate all of the components in a software or firmware product, AND to distribute that information in a way that makes it simple for asset owners to use.
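To make the component enumeration claim concrete, here is a minimal sketch (my own, not taken from any competitor’s output) of what a single component entry in a CycloneDX-style SBOM might look like. The field names follow CycloneDX conventions; the component, version, and hash values are invented for illustration.

```python
# One component entry in a CycloneDX-style SBOM, shown as a Python dict.
# All values below are illustrative, not from a real firmware analysis.
component = {
    "type": "library",
    "name": "openssl",
    "version": "1.0.2k",
    "purl": "pkg:generic/openssl@1.0.2k",  # package URL identifying the component
    "hashes": [{"alg": "SHA-256", "content": "placeholder-digest"}],
}
```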
The SBOM Challenge will test competitors in three tasks:
- Create an accurate SBOM
- Identify known vulnerabilities in the components in the SBOM
- Read in and apply vulnerability applicability data feeds (VEX, the Vulnerability Exploitability eXchange, and possibly others)
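For readers unfamiliar with how these three tasks fit together, here is a minimal Python sketch of the workflow. The file name, the vulnerability feed, and the VEX data are all invented stand-ins; a real product would pull from sources such as NVD, OSV, or supplier-issued VEX documents, at far larger scale.

```python
import json

# Task 1 output: a CycloneDX-style SBOM produced for a firmware image.
# "firmware_sbom.json" is a hypothetical file name.
with open("firmware_sbom.json") as f:
    sbom = json.load(f)

# Task 2 input: a toy vulnerability feed mapping a package URL to known
# CVE IDs. A real product would query NVD, OSV, or a proprietary source.
known_vulns = {
    "pkg:generic/openssl@1.0.2k": ["CVE-2022-2068"],
}

# Task 3 input: CVE IDs that a supplier's VEX statement marks as not
# affecting this product, so they can be suppressed rather than handed
# to an already overloaded security team.
vex_not_affected = {"CVE-2022-2068"}

for component in sbom.get("components", []):
    purl = component.get("purl", "")
    for cve in known_vulns.get(purl, []):
        if cve in vex_not_affected:
            print(f"{purl}: {cve} suppressed by VEX (not affected)")
        else:
            print(f"{purl}: {cve} actionable, no VEX exception")
```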
In addition to the judges comparing how the competitors perform on these three tasks, attendees will be able to see how each vendor presents similar information to users.
I think even more important than the full SBOM are the competitors’ analysis engines and the recommendations their products provide. After all, what good is having all this detail if it overwhelms a team that is already struggling to cope with existing security tasks?
Finally, attendees will also be able to view the “other” items identified by these products. Many of these solutions claim to identify security issues beyond just missing patches. We will see.
Up to six companies will compete in the S4 SBOM Challenge in the SBOM Pavilion. Currently signed up are aDolus, Finite State and NetRise. The results will be made public, but a numerical score will not be assigned because we feel it would be difficult to score fairly at this early stage in the product segment. How many points would be deducted for a missing component? Would it vary by component? What about a patch applicability mistake?
I do believe the results, and the ability to see the same software analyzed through the competitors’ user interfaces, will make it easy for attendees to make an apples-to-apples comparison and determine which product(s) they consider Top Tier.
The INL team is just starting to dig into this and will be organizing the SBOM Challenge and rules over the next two months. We will update the S4 SBOM Challenge page with information as it becomes available and have the INL team on the Unsolicited Response Show prior to the Challenge.