How to Prepare for an Ill Wind

SolarWinds has grabbed the attention of everyone who has an interest in cyber security. And many who don’t. It has started a lot of conversations and provided plenty of talking points. Matt Stamper and Cathy Mulrow-Peattie already had a relevant webinar planned long before news of the hack surfaced in early December. It’s called “3 Ring Cyber Circus: Managing and Mitigating Third-Party Risk and Critical Liability in 2021.” Though it wasn’t planned with SolarWinds in mind, you can bet the case will come up.

Stamper is the chief information security officer (CISO) at EVOTEK, a consultancy that helps businesses shift from traditional IT to secure, multi-cloud computing. He is the co-author of the CISO Desk Reference Guide and a former research director with Gartner. Mulrow-Peattie is a lawyer who has deep experience in this area, having worked in-house at Mastercard and CA Technologies. She currently works as an outside counsel. Their webinar is scheduled for 11:00 am Pacific time on Thursday, January 28, and is free. They will be joined by Bert Kaminski, director for legal at Google Cloud, and Robert Malone, head of middle markets with a focus on cyber security and professional liability at insurer Zurich North America.

In this interview, Stamper and Mulrow-Peattie talked about SolarWinds and the kinds of things that lawyers and CISOs can do at their companies right now to make them more secure, and better prepared for the next big breach.

TAG Cyber Law Journal: When we spoke earlier, the SolarWinds hack was all over the news. And Matt recounted a talk that Ed Amoroso gave many years ago when the two of them worked at AT&T. Ed is now the founder and CEO of TAG Cyber, our publisher. Matt, tell us what he said, and why you brought it up in connection with SolarWinds.

Matt Stamper: The discussion highlighted the fact that the state of software engineering was at an all-time low. And I don't think it's improved that much. Ed asked, "If a software engineer built a bridge, would you drive across it? If a software engineer manufactured a plane, would you fly in it? If a software engineer was responsible for an elevator, would you ride in it?" And at the time, his point was if you cared about your life, the answer should be no. And error rates in software and application design are orders of magnitude higher than in other "engineering" disciplines, be they mechanical, electrical, or civil engineering. So I think one of the challenges that we're seeing is that a lot of this technical debt is surfacing now and coming up to kind of bite us in the proverbial posterior. We need to look at how we design, develop, and implement applications and software in a manner that is fundamentally secure by design, and that embraces privacy-by-design principles.

TCLJ: Would Ed’s questions elicit the same answers today?

MS: There have been some improvements at the margins. We have tools like static application security testing (SAST) and dynamic application security testing (DAST). Our development teams are more cognizant of some of the security risks. But have breaches gone down? No. Have application errors had really important consequences to our economy? Yes. Have we given adversaries an opportunity to get into our environments and networks? Absolutely. So I don't think the story has fundamentally changed.

TCLJ: At the end of January, you two are going to participate in a webinar on third-party risk and liability. I'm guessing that Target and HVAC vendors will come up at some point in your conversation, but let's talk about SolarWinds. There's a lot we don't know yet. But it does raise questions about risk and liability. So what are your initial thoughts?

Cathy Mulrow-Peattie: What SolarWinds and the Target breach and many others have shown us is that there is risk, a lot of risk, with your critical third-party vendors. And the key is to determine how to mitigate that risk and put processes as well as legal contracts in place to try to understand what risks you're getting into before you get there. You can't mitigate all the risks. What we're trying to do is mitigate most of the risks. SolarWinds is a hard one, because we don't know enough yet to know how that could have been mitigated. But there are many things that you can do as an organization. You can have contract clauses in place to make sure that you check with your vendors on a regular basis about their security. And not just check off the box, but make sure that you have a deep dive into what they're doing, and make sure that they actually attest to you what they're doing. You can also have insurance that maybe covers some of the risks. But when you have a SolarWinds issue, one thing we're going to talk about is how important is that insurance, and how does it really cover you? You have to realize that insurance and contracts aren't going to save the day. We also need to think about how we can work with our government to help secure our vendors and our companies.

MS: As a CISO, it's difficult for me to have much assurance with third parties coming into our environment, even with assessments like SOC 2 audits and appropriate legal contracts. As you noted, there are still a lot of inherent risks that, as an industry, we have to think through. When you look at least preliminarily at what took place with SolarWinds, the notion of a build server and little snippets of code being moved from one vendor into an enterprise is something that happens on a daily basis. We have Patch Tuesday from Microsoft, we have any number of firmware and software updates from our manufacturers. Historically, those have been trusted. We haven't really wondered whether that little piece of code that came from a trusted vendor actually puts my organization in jeopardy. As an industry, how we look at this is going to be all hands on deck. It requires an amalgam of voices at the table to understand this correctly. Certainly, a legal voice is critical. A security and technical voice as well as the broader risk management and governance piece are also clearly required. Collectively, we have to look at this problem and think through the appropriate approach.

TCLJ: Do you expect an endless stream of litigation from the SolarWinds breach, and where will insurance figure into this mess?

CM-P: Historically, it has been the plaintiffs bar that has brought a lot of claims for consumers against the companies where the breach happens. And it's those types of claims that we do expect out of this, as well as claims against SolarWinds to indemnify all their clients in these cases where the breaches occurred. The question that we're going to discuss pretty deeply at the webinar is how much insurance do you need to have, and how much does SolarWinds have to cover this? Their insurance policies aren't going to be able to cover the 400-plus companies that were part of this attack. The question is, what risk as a company do you take on when you have vendors? Insurance isn't going to mitigate all that risk. So one of the things you need to start to figure out is what do you do when something like this happens? Are you doing the appropriate breach notifications to your customers? Do you have an understanding of what data was breached for your particular company when you use a cloud vendor? How do you respond in a way to mitigate the breach going forward, and to mitigate those risks going forward?

TCLJ: At this stage we don't even know what consumer damage was suffered, right?

CM-P: We don't. But that's what you need to be talking about with your vendor when something like this happens. You can quickly get that information through a forensic investigation. And I'm going to turn it over to Matt to talk about that. Because I think from a technical standpoint, there are many things you can do to help you close down additional exposures that you may have.

MS: I think when we look at litigation, there's some really interesting dynamics in our economy. Most software is sold with very limited exposure from a liability perspective. The limitations of liability clauses and the disclaimers and warranty clauses with most enterprise agreements or end-user license agreements are really designed to protect the manufacturers of software. As we see these breaches occur, and there's evidence that maybe the appropriate due care or reasonable security practices weren't there, it'll be fascinating to watch whether those limitations of liability actually hold up, or whether there are challenges for the board of directors and other stakeholders within the organization, who were aware of some of these risks and didn't preclude them in an appropriate way. When you look at the United States, there's one thing we do very well vis-à-vis the rest of the world: We're extraordinarily litigious. We sue with the best of them. And so I think we're going to see some really interesting dynamics.

TCLJ: One word that keeps coming up is "reasonable." What is "reasonable security"? How does a company ensure that its vendors are reasonably secure? And what lessons should they learn from the SolarWinds experience, as far as we understand it?

CM-P: There are terms in our laws that define "reasonable security." Not on the federal level, but Massachusetts and New York have defined what security measures companies should have. On a higher level, we have standards set out by NIST that also talk about cyber security and privacy, and what the best practices are. So there are some guidelines for companies to follow. But there also is an issue about due diligence. Know where you're putting your data, and who the company is. I can draft you an ironclad contract with a limitation of liability that gets you all this money in the event of a breach, but if the company has no money, or they're an equity-backed company and someone's willing to close the company in the event of a breach—or if you're in a SolarWinds case, where you need to stand in line—then you're taking on a lot more risk. But there's a lot of safety in using a vendor that is secure: someone like a Google, or an Amazon, or a Microsoft. There is some security in that because they're technology companies that have good processes and procedures. So part of your due diligence is how good are they at what they do, and how good are they at security—not just privacy, but security as well.

MS: I offer a definition, from a CISO perspective, that reasonable security is that level of security capability that meets the organization's agreed-to risk tolerances while fulfilling regulatory requirements and contractual obligations of the firm. But to make that actionable, the CISO really does need to understand that organization's vendor environment. Who are those material vendors? The Amazon Web Services, the Googles, the Microsofts. And then looking at it very much from a full-stack perspective, in our world we grew up with the proverbial OSI model. A bit of a bastardized version is: What are the applications, databases, operating systems, network storage, backup networks, and data centers? And ask basic questions: Are we secure? Have we addressed the types of risks that are noted? We have phenomenal guidance, whether it's the NIST Cybersecurity Framework, or ISO 27001, or frameworks like ISACA’s COBIT. There are a number of standards and frameworks that allow us to ask these questions. Historically, where we have challenges is that the folks managing digital and technical risk have lacked the language of enterprise risk. We haven't translated digital or technical risk into the risk to my organization's finances, or its reputation, or operations. Or dare I say even the safety of consumers or individuals. What a webinar like ours does is allow you to look at risk from a variety of perspectives until you get to a reasonable definition. As Cathy noted with respect to New York and the SHIELD Act, what's in the act is not out of bounds. The controls are common sense. If you and I were CFOs or controllers, we'd understand GAAP, we'd know the difference between liabilities, income, assets, expenses, and the like. We in technology should demand the same. It goes back to Dr. Amoroso's point about the engineering discipline around software. We do need to think of IT and applications as services. And are we validating them appropriately?
Do we have the level of assurance from a reasonable perspective that a board of directors, executive management and stakeholders—be they external or internal—can rely upon? And so I'm confident that we're heading in the right direction. I'm also intimately aware that what we're going to see are massive, unfunded liabilities and technical debts that are going to need to be resolved. Not that dissimilar to the rusting bridges and dilapidated infrastructure we see in different parts of our country.

TCLJ: In-house lawyers and CISOs are, among other things, their companies' risk managers. Cathy, what are some of the most important lessons you've learned about playing that role, since you've been an in-house lawyer more than you've been an outside lawyer?

CM-P: One of the most important things is you need partners. You need a team approach to risk. You don't do it alone. So your business team needs to understand why you're raising these issues, and that you're on their side. And you need your CISO along with you. You need your data privacy officer or data privacy team there with you. You need to jointly understand what these risks are and raise them as they become critical. The goal is to have a plan in place when something goes wrong. Know that you can try to mitigate risk, but know too that you can't mitigate everything. If we stopped all risk, we'd never do anything in business. And that's not the goal. The goal is to mitigate risk, get rid of the critical risk, and know you're going to accept some as you go forward. And then when something goes wrong, have that plan. Have that information security breach plan, so if something goes wrong, what do we do next? Who gets involved? How do we mitigate the damage?

TCLJ: In interviews like this one, we’re talking to people who have already found ways to partner effectively with their colleagues. That's why they're featured in webinars. But there are plenty of instances, as you alluded to, Matt, where cooperation doesn't exist. How can companies encourage general counsel and CISOs to work together?

MS: In pre-pandemic times, the notion of breaking bread was really powerful. You know, go grab a cup of coffee, have lunch, go out and meet with some of your colleagues and have a discussion. What types of risk, from their perspective, are no-gos? What types of risks are ones that can be managed, and reviewed, and evaluated? And I think it's absolutely critical that when we come into these conversations, to recognize that in many cases we do speak different languages. So if I come into a discussion with an attorney, and I'm talking bits and bytes, and things like IOCs, and hashes, and memory dumps, and the attorney is talking about certain precedents, we're going to lose a conversation that would have otherwise been very meaningful.

CM-P: The general counsel needs to understand the pain points that your technology and your security leaders are facing. Where are these pain points that you're raising in the whole strategy of the business? Understanding how the business operates, and understanding where security plays a role and how it can benefit the business is really critical for everybody around the table. And it’s critical for the company's reputation and revenue. It's up to the lawyers to not make it just a legal issue. It's a company issue, where everyone needs to be at the table.

TCLJ: And that's probably an important point for both areas—for the tech side and the legal side. Because you're both cost centers, not revenue producers.

CM-P: We're service providers. We're not cost centers.

TCLJ: There you go. You need the language. That's what you've been saying. You need to make a point about what your contributions are in a positive way. And you have every reason to want to be able to do that together, because you're in a similar situation. Last time we spoke, the subject of security questionnaires came up. And, as I mentioned, we ran an article in which the author wrote about the importance of vendors' lawyers getting involved in completing them early, rather than after they're already done and ready to send back. When lawyers are the last stop, said the author—who is a lawyer—they get blamed for slowing down business. So what role do you think lawyers should play? And what role should CISOs play?

MS: These questionnaires can go from a low end of maybe 10 to 20 questions all the way up to 400, even up to 1000 questions. So you're hitting upon one of the soft underbellies of the modern enterprise. No enterprise lives in a vacuum. It relies on service providers, its supply chain, independent contractors and the like. And these different types of suppliers or third parties have different inherent risk profiles. One of the most critical things as a CISO, along with counsel and along with our vendor risk management or vendor management teams, is to look at materiality, defining what type of organization warrants which level of scrutiny. And then equally important, our organizations are recipients of questionnaires. So as a CISO, I want to make sure when we're responding to a questionnaire, that I understand the risk that the prospective client is looking to evaluate. I want to be able to review that risk, make sure that our responses are accurate and appropriate to the question at hand. Similarly, when we're evaluating and reviewing our own prospective suppliers and vendors, I want to be able to understand their risk appropriately. We have a kind of Sophie's Choice: I can't love every vendor equally. There are certain vendors that I have to really understand and know their risk profile in depth. And then there's others that I might just review at a perfunctory level. This is the quintessential challenge of modern security right now: the SolarWinds issue, the Target breach that you described earlier. In many ways a rogue insider, like an independent contractor, becomes a natural extension of this. I don't think anyone in the industry right now has a monopoly on the best way to move forward. It really is the challenge of our time.

CM-P: I think it is the challenge of our time. To have a 300-question information security questionnaire go to a very small organization and expect that to secure you when you work with them isn't necessarily the right approach. We maybe need to move away from questionnaires, and move toward better information. And that's where I see the problem. The other thing is, many of these questionnaires come in as part of the sales process. Salespeople shouldn't be answering these questionnaires. There should be a process within the organization where these questionnaires get responded to by your CISOs and the attorneys, because most of them have you certify that you're compliant with certain laws, such as the New York SHIELD Act and the Massachusetts cyber security laws. But I don't know if questionnaires solve any of these risk issues. I think it's kind of a process to say, "Look, we're trying." But I agree with Matt, we haven't solved this yet.

TCLJ: I introduced a key word in that last question: "blame." When a company experiences a security failure, the CISO nearly always seems to be in the crosshairs at a minimum, and in many cases they do seem to be replaced. Is that a fair statement, Matt? I think it's the default assumption that a lot of people make.

MS: I think the appropriate answer is, "It depends." The reality is that many CISOs are relatively short-tenured in their environment. Just at the point where they're really starting to understand risk—the infrastructure, business processes, how the organization derives value, how to protect that value stream—something happens that may have been completely out of their control. Say a third-party app that their team did not select, but that was selected by the line of business, and with which the organization shares protected health information, gets popped. Then it’s: "Let's blame the CISO and move on." I think CISOs have to recognize that dynamic and need to really press the notion of risk acceptance across the organization. So if you're my CEO, and I come to you and say, "I can address X and I can address Y. I can't deal with Z. How do you, as the CEO, want to address Z?" You have multiple choices there. You can accept that risk, you can mitigate it, insure against it, transfer it, but you can't ignore it. And one of the problems is that many CISOs are fearful for their jobs. They need to put a little bit more gravitas behind the role and say, "Listen, here are the risks that I can address. Here's what I can't. As an organization, how do we choose to deal with it?" And recognize the consequences for those discussions. I've had exceptionally tough discussions with CEOs in the past. And oddly enough, in one of those I ended up being promoted to the board of directors. So I was kind of an oddity as a CISO, where I was actually on the board. But that was because I was willing to put my job and my career at risk to state what I thought was appropriate. The reality too is that CISOs right now are kind of a rarefied commodity. Organizations are really trying to find individuals that have privacy knowledge, legal knowledge, risk, technology, and cyber security knowledge to glue it all together.
And so the role itself, I think, has grown in its importance, very similar to a data protection officer or chief privacy officer, where if something doesn't go well at one company, the likelihood of having three or four other companies pick you up in short order is pretty high.

TCLJ: The expanding role of the CISO may be similar in many respects to the expanding role of the general counsel. Cathy, do you want to talk about that?

CM-P: The general counsel and cyber security counsel—whether inside or outside—are increasingly involved in cyber security strategy and prevention. It’s critical that your cyber law adviser looks at your business holistically, from a business, privacy, cyber, and technology perspective. The increasing involvement of state regulators, notably in California with the CCPA and CPRA, along with federal regulators such as the Federal Trade Commission enforcing Section 5 of the FTC Act, which bars unfair or deceptive practices that put consumer data at risk, has revenue implications for most companies. SolarWinds should set off another alarm bell in many C-suites, where using consumer data is part of the business strategy. Consumers provide data based upon trust. Where is that trust barometer now? Legal teams, CISOs, and product and information development executives should be reviewing cyber security policies and software development processes to ensure that they’re meeting changing requirements and attack vectors.