Last week, Dale had difficult conversations regarding cyber security with two vendors. Apparently, that was the week for vendor interactions, as I had one too. Mine was with a control system component vendor, while attempting to explain the premise of my upcoming S4 presentation.
I’ve been downloading as much automation software as I can over the past few weeks, and running Microsoft’s Attack Surface Analyzer against all of it, looking for common vulnerabilities and insecure changes. I plan to present the findings at S4, along with some directions for improvement. Please note, this is very different from attempting to find exploits in the software; my work is to see how the software itself can change the underlying OS to make it less secure. I’ve covered ~16 pieces of software thus far, and I’m hoping to include a few more as well.
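To give a feel for what that means in practice, here is a minimal sketch of the before/after diff that a tool like Attack Surface Analyzer automates: snapshot the listening ports and installed services on a clean machine, install the product, snapshot again, and diff. This is not ASA itself (which inspects far more: file and registry ACLs, DCOM settings, named pipes, and so on); the use of the psutil library here is my own illustrative assumption.

```python
# Illustrative before/after attack surface diff (Windows, psutil required).
import psutil

def snapshot():
    # Ports something is actively listening on.
    ports = {c.laddr.port for c in psutil.net_connections(kind="inet")
             if c.status == psutil.CONN_LISTEN}
    # Installed Windows services, by short name.
    services = {s.name() for s in psutil.win_service_iter()}
    return ports, services

# Run once on the clean baseline machine...
before_ports, before_services = snapshot()
input("Install the automation product, then press Enter...")
# ...and once after the product is installed.
after_ports, after_services = snapshot()

print("New listening ports:  ", sorted(after_ports - before_ports))
print("New Windows services: ", sorted(after_services - before_services))
```

Every new port, service, share, or loosened permission that shows up in the diff is attack surface the product added to the OS.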
The control system vendor I ran into made a zip file of the software available on their website, but required an email to get the password. Thinking this was just a formality, I sent in an email explaining the premise of my study. To my surprise, the president of the company responded that they “do not see any value in such a study”, and that their software “is as secure, or as insecure, as others that support OPC Data Access V2.0”.
While the “no value” piece had me miffed, the vendor is entirely accurate on the second point. The OPC Data Access V2.0 standard has no security requirements; it specifies how data is to be translated from each manufacturer’s native format into the interoperable OPC format. Selection of an OPC server is then based on cost, performance, and reliability, since the standard guarantees that any OPC-based application can interoperate. These three basic metrics (cost, performance, and reliability) have ruled selections of OPC servers and other automation software for the past two decades. So long as a vendor complies with the standard, everything is fine, right?
Well, in some cases it may lose you business. My main argument back to this vendor was that security is becoming another component when looking to buy automation software, a means to differentiate between many similar (some would say ‘standard’) products. In cases where every product meets the standard, engineers have historically looked only at cost, performance, and reliability, which are not governed by the standard. Security is a new metric in that mix: the software with the best security will be selected by security-conscious clients.
For instance, let’s take Data Execution Prevention (DEP). DEP is a means of marking memory pages as executable or not, limiting the exploitability of buffer overflow conditions. While DEP isn’t immunity to buffer overflows, it does make exploitation more difficult. If one automation product is built with DEP enabled but another is not, there is an argument that the non-DEP product is less secure. The less secure product would then be less desirable to security-conscious clients, all other things being equal. And since both products meet the minimum standard (such as OPC Data Access 2.0), this single fact could tip the balance to the DEP-enabled product.
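On Windows, a product’s DEP opt-in (the /NXCOMPAT linker flag) is visible in the binary itself, in the DllCharacteristics field of the PE optional header; the same field carries the ASLR opt-in (/DYNAMICBASE) that appears in the list below. You can eyeball these with dumpbin /headers (“NX compatible” and “Dynamic base” under DLL characteristics); here is a minimal Python sketch using the pefile library, with the bit values taken from the PE/COFF specification:

```python
# Check a vendor binary for DEP and ASLR opt-in flags.
import sys
import pefile

IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE = 0x0040  # ASLR opt-in (/DYNAMICBASE)
IMAGE_DLLCHARACTERISTICS_NX_COMPAT    = 0x0100  # DEP opt-in (/NXCOMPAT)

pe = pefile.PE(sys.argv[1])  # e.g. the vendor's OPC server executable
flags = pe.OPTIONAL_HEADER.DllCharacteristics
print("ASLR (/DYNAMICBASE):", bool(flags & IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE))
print("DEP  (/NXCOMPAT):   ", bool(flags & IMAGE_DLLCHARACTERISTICS_NX_COMPAT))
```

A security-conscious buyer could run this over every executable and DLL a product installs and use the tally as one of the comparison metrics.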
Use of DEP is only one example; there are many security variables associated with products that should be racked up and compared in this fashion. This is the aim of many of the certification services out there, such as the Achilles and INL certifications. Thanks in part to working with Microsoft’s Attack Surface Analyzer, I can think of the following off the top of my head (a rough check for one of these is sketched after the list):
- Use of Address Space Layout Randomization
- Strict File Permissions
- Default DCOM Permissions
- Required Network Shares
- Open Ports and Services
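As an example of the “Strict File Permissions” item, the built-in icacls tool can walk a product’s install directory and expose anything Everyone or ordinary Users can modify. The (F)/(M)/(W) tokens are icacls shorthand for full, modify, and write access; the install path and the weak-ACL heuristic below are illustrative assumptions, not a complete audit.

```python
# Flag files under the install directory that low-privilege accounts can change.
import subprocess

INSTALL_DIR = r"C:\Program Files\ExampleOPCServer"  # hypothetical install path

# icacls /t traverses the directory tree and prints each file's ACL.
out = subprocess.run(["icacls", INSTALL_DIR, "/t"],
                     capture_output=True, text=True).stdout
for line in out.splitlines():
    if "Everyone:" in line or r"BUILTIN\Users:" in line:
        if any(tok in line for tok in ("(F)", "(M)", "(W)")):
            print("Weak ACL:", line.strip())
```

An automation product whose executables or configuration files are writable by Everyone hands a local attacker a trivial path to code execution, regardless of how well it implements OPC.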
What other security variables could be used to differentiate products in this manner?