Actions have consequences, intentional and unintentional. Last year the SEC adopted specific cybersecurity disclosure rules:

The Commission adopted final rules requiring public companies to disclose both material cybersecurity incidents they experience and, on an annual basis, material information regarding their cybersecurity risk management, strategy, and governance.

This is in line with disclosure rules the SEC has published in other areas and with its mission of protecting investors: “Everyone should be treated fairly and have access to certain facts about investments and those who sell them.” The disclosure of material cybersecurity incidents isn’t a new requirement; companies have always been required to disclose material events. The rule simply highlights that a cybersecurity incident can be a material event. The annual reporting of cybersecurity risk management, strategy, and governance is new. A rule of this type was inevitable once the SEC became convinced that cyber incidents will increasingly be material events.

US public companies will adapt to these SEC cybersecurity rules, as they have in other areas over many decades. Language the SEC likes will be developed and will appear, remarkably similar, across SEC filings. The internal controls to support that language will be implemented. Any gaps in CISO coverage under Director & Officer liability policies will be filled in.

Some of the other likely adaptations, wise from the company’s viewpoint, could have consequences the US Government and the cybersecurity community won’t like.

  • Limited public and private sharing of details on attacks, incidents, and risk management approaches. It’s hard to imagine many executives and their lawyers seeing anything but downside in sharing: disclose the minimum required by the SEC or other regulators, and nothing more. The Clorox incident from last year is a good example of the future of cybersecurity incident disclosure.

    The long desired and illusory public/private partnership and information sharing runs into a buzzsaw here. Other US government agencies already realize this and are trying to find workarounds. Even if they find them, I’m skeptical that many public companies will trust or use them to share more information through a public government effort. Of course, the backchannel to intelligence agencies will always exist, and anonymized sharing programs might become more attractive.

    I worry that public company asset owners will pull back on presentations, papers, and other information sharing about their cybersecurity programs. Case studies and lessons learned will be even more closely vetted than they are today; we will likely see fewer of them, and with less value. I doubt we will see Tim Brown presenting at S4 again.
  • Internal communications on cybersecurity will also be stifled. The SEC’s use of emails, presentations, and other communications in its SolarWinds complaint is the proof. Even before this complaint there was a wide range of company practice on internal communications. The more rigid companies held training seminars, and had alert bosses, on how and what can and can’t be formally communicated in email or other forms of communication that create a record.

    This training is not telling employees to lie. It is telling them that certain statements must be fact, not opinion: provable, backed by data, and, very importantly, addressed. It tells them that anything that creates a record could become public and be used in court.

    I’ve seen this in responses to assessment reports with high-risk findings. Some clients would push back hard on wording to remove any subjectivity and flavor, and whatever ended up in the final document was addressed and tracked, finding by finding, in a formal response. The future is going to be more like this, and not just for assessment reports. Security pros will be trained that whatever they communicate could end up as evidence.

The US Government, led by CISA, is making a big push on Secure by Design and the related software liability issue. Failing to meet a documented, consensus Secure by Design approach would seem to be one avenue for opening a vendor up to software liability.

On one hand, the government is saying: help us develop and reach consensus on this solid methodology or approach. On the other hand, the government is saying: if you fail to meet this consensus approach, you have liability. Is it in a vendor’s interest to help create and support a rigorous set of requirements, even if the vendor plans to meet them with or without the government program’s consensus?

We might be heading toward the inevitable endpoint where the government is the regulator, the stick, not the partner.