
The National Cybersecurity Strategy: A shift in responsibility to software companies

Air Date: March 6, 2023


On March 2, 2023, the Biden Administration released the National Cybersecurity Strategy, providing guidance on how the United States should protect and secure its digital ecosystem from cybercrime and nation-state adversaries.

Check out our latest Hash It Out session between Virtru experts Dana Morris, SVP of Product and Engineering, and Matt Howard, Chief Marketing Officer, offering commentary on the newly outlined cybersecurity strategy and its implications for the software industry. Howard and Morris touch upon:

  • Software company liability - increased legislation requiring software makers to take proper precautions to secure their products and services

  • Increased investment in cyber - enlisting cloud providers and US military resources to actively disrupt adversaries such as China, Russia, and cybercriminal infrastructure

  • “Safe harbors” - implementing a framework to shield companies from liability when they demonstrate security best practices

  • Overall thoughts on the National Cybersecurity Strategy components

Matt Howard: Good afternoon. My name is Matt Howard. I'm with Virtru, and I'm joined today by my colleague Dana Morris, who leads our engineering organization. We wanted to have a quick chat about the National Cybersecurity Strategy document that was released last week by the Biden Administration. Super interesting times as we step back and look at the big picture here. My quick two cents, Dana, having watched this game play out for, God, a couple of decades now, I hate to date myself, is that this seems to be a big, big deal. I'm curious to get your high-level perspective on it.

Dana Morris: I would agree, Matt. It's funny, because with Chris Inglis transitioning out of his position, this sort of feels like a coup de grâce on the way out the door. I think back to RSA when he did the keynote, and at one point he made a comment about moving away from checklists to focusing on outcomes. As someone who is responsible for engineering and operations for a SaaS platform, you spend an inordinate amount of time worrying about FedRAMP checklists, SOC 2 checklists, and it's not that these things have no value, but at the end of the day, a lot of the time you're filling out a checklist and you've met the criteria, but what's the outcome you're driving towards? It feels like this is a switch towards that: start focusing on outcomes, even if it's not perfect. It's a really interesting first step in that direction, I think.

Matt Howard: Yeah, there's a lot to unpack here for sure. What gets to outcomes perhaps more than anything is the idea that people who build software for a living might actually find themselves in a new situation altogether as it relates to liability. Historically, as you know, software vendors have been able to put products into the market, sell software to customers, and generally absolve themselves of any real liability by using contracts to explicitly disclaim it. This seems to put on the table a topic for some type of national discussion, certainly congressional review, and the possibility of legislation that would make software companies liable for defects they put into production, not unlike the automobile industry. Your thoughts?

Dana Morris: Yeah, that's exactly what it is. I just mentioned SOC 2 and FedRAMP, some of these compliance regimes. You only focus on those if you're selling into particular industries where other regulations require them. But if you think about it, for a lot of consumer software there's no regulation controlling who owns the data, who has to put security around the data, whether that data can be sold. All of those things are happening, and there's really no government oversight, no standards being applied. So this is certainly a move towards driving a set of standards that must be met if you're going to sell something, and if you don't meet those standards, there will be consequences. I think the challenge, as a software provider thinking it through, is: where do you draw the line between negligence, just not bothering to care, or moving fast to market and figuring out the security stuff later, and the case where somebody is doing all the right things and still falls victim to an attack? That's inevitably going to happen to almost every company, because attacks keep changing, they're getting more and more sophisticated, and they're often backed by state actors. So that's going to be the hard part: figuring out where you draw the line between somebody being negligent and just not caring, and somebody who is caring but maybe makes a mistake.

Matt Howard: Yeah, that's well said. One of the things I find most fascinating about this particular document coming out of the White House is that it seems to have taken to heart what is, on one hand, a simple concept: software companies shouldn't be allowed to ship software with known defects. That sounds really simple, and if you say it like that, most people would agree. On the other hand, most people don't build software for a living, and they don't actually understand how difficult it really is to govern a software development life cycle with perfection, which arguably is impossible. So they put forth in this document the concept, loosely defined at this point, of a safe harbor, which I think would do exactly what you were describing: create incentives for software development organizations to do the right thing. Adopt best practices, minimize known defects in third-party libraries, do the right things with respect to simple-to-catch mistakes in first-party code, and whatever else might be considered table stakes. And if you're following best practices, then maybe you've gotten yourself into a safe harbor where you don't suffer the liability consequences, even if you're ultimately subject to some type of downstream attack. Is that the way you see it?

Dana Morris: Yeah, I think so. I go back to something Chris Inglis said at RSA, when he talked about how, to date, there hasn't been a reason for an organization, in a lot of cases, to prioritize security over time to market. The incentives are set up in a way that, without regulation, favors getting features to market first and moving quickly. We have to start to try to change that calculus, and I think that's what this administration is trying to do: change the calculus to say you have to prioritize security as much as you prioritize time to market. If the safe harbor implementation is done correctly, and I don't know exactly what that means yet, I think it does create this space where you can incentivize organizations to be secure without unnecessarily penalizing them if they just make an honest mistake. And to your point about the automobile industry, we certainly see this there. They are not penalized for making simple mistakes; they are penalized when they make a mistake and try to cover it up, or just don't respond to it. There are actual penalties and actual consequences. If they can structure it in a similar way and figure out how to adapt that to software, it could be quite useful and could actually make this a positive motion for the software industry. But equally, if they're not careful, it could get out of control quickly.

Matt Howard: It's going to be delicate, to say the least. I do love the idea of going back and learning from other industries that have gone before us to imagine what this could look like, the automobile industry being the perfect example. It was back in 1965 that Ralph Nader wrote his famous book "Unsafe at Any Speed", which was effectively critical of the automobile industry for not putting things like seat belts into cars and not prioritizing safety features. Through a very long public service campaign and other efforts, the automobile industry eventually came to conclude, along with regulators and others, that it was in everyone's best interest to prioritize safety features as part of the products they were selling. Today, when you watch the Super Bowl and see ten commercials for car companies, they're all competing on safety features. You can imagine a future where some software company competes decidedly on the fact that they're better at security than others, or better at privacy than others, saying: we're different, perhaps not because of our features and functions, but because we respect your data more than our competitors. Is that a plausible potential future to you?

Dana Morris: I sure hope so; that would be exciting, for people to actually make privacy and security a differentiator. On the hardware side, we've already seen indications of this from Apple. Leaving aside how their tech works or what your opinions are on what they do, for the last few years they have actually been running commercials and ads around data protection, privacy, and security as a differentiator for Apple as a brand compared to their competitors. Now, I still view them as more of a hardware company, even though they are obviously increasingly a software company too; they're kind of the marriage of the two. We haven't seen that as much from software vendors that are not cybersecurity vendors. We're not seeing it from Facebook, not to single them out, but that's not what we're seeing from those brands yet. If we start to see that shift, that would be a positive outcome, if one of the outcomes of this push from the administration is that privacy starts to become a differentiator.

Matt Howard: Yeah, one thing on that front, just to pick up that thread. It's fascinating to think about software companies, or better yet public cloud providers, potentially saying to the market: do business with us because we respect the privacy of your data more than our competitors do. Certainly Google has a massive business that is advertising-related and in many respects data-driven, and there's a lot to unpack on that side of the equation. But I do think some of the recent work Google has done with the introduction of client-side encryption, both in their Google Workspace product and in some of their EKM solutions for Google Cloud compute, is sort of a new day dawning. They're saying to the market: come do business with Google Cloud or Google Workspace and manage your own encryption keys, in a way where the data stored in the Google Cloud is private to you. No one has access to the keys to decrypt it, not even Google. Is that in line with some of what you're thinking?

Dana Morris: It is, absolutely. I think they're still operating at the perimeter, but they're certainly focusing on a differentiated story relative to who owns the keys and who can actually decrypt the data, which is interesting in the case of Google, because they have two very distinct parts of their business. That's already true, but they're making it even more so. On one hand, they're known as a company that collects a lot of data and sells access to that data through its advertising business; on the other hand, they're leaning into the idea that Google Cloud and Google Workspace are not that, and are increasingly pushed with a default stance of: you own the data, you own the keys. They're very privacy-oriented on that side. So it's interesting to have the same organization hold two viewpoints that are not diametrically opposed, but are very different businesses with different differentiation, because clearly the differentiation on the ad side is the fact that they have so much data. They can give you really great access to consumers and help you target ads to the right people. That's not always negative; it allows a lot of small businesses to compete with large businesses, because they can do the same kind of hyper-targeting. Like most things in technology, there's a plus and a minus to those two sides. But the privacy differentiation is certainly something Google is starting to lean on more, and I think it's going to become increasingly important as people start to realize the problems inherent in the lack of focus on who owns the data and how well it is protected.
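The key-ownership model Howard and Morris are describing can be sketched in a few lines: the client encrypts before upload, the provider stores only ciphertext, and the decryption key never leaves the customer. This is a toy illustration only; the HMAC-SHA256 keystream construction below stands in for a real vetted cipher (such as AES-GCM), and the `CloudStore` class is a hypothetical stand-in for a provider's storage API.

```python
# Toy sketch of client-side encryption: the provider stores only
# ciphertext, and the key stays with the customer.
# NOT real cryptography; for illustration of the trust model only.
import hashlib
import hmac
import os

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data with an HMAC-derived keystream (illustration only)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    # XOR truncates the keystream to the data length.
    return bytes(x ^ y for x, y in zip(data, out))

class CloudStore:
    """Stands in for the provider: it only ever sees ciphertext."""
    def __init__(self):
        self.blobs = {}
    def put(self, name, ciphertext, nonce):
        self.blobs[name] = (ciphertext, nonce)
    def get(self, name):
        return self.blobs[name]

# Client side: the key is generated and held locally, never uploaded.
customer_key = os.urandom(32)
store = CloudStore()

nonce = os.urandom(16)
store.put("doc1",
          keystream_xor(customer_key, nonce, b"quarterly numbers"),
          nonce)

# Only a holder of customer_key can recover the plaintext.
ciphertext, nonce = store.get("doc1")
print(keystream_xor(customer_key, nonce, ciphertext))  # b'quarterly numbers'
```

The point of the sketch is the trust boundary, not the cipher: nothing in `CloudStore` can decrypt `doc1`, which is the "not even Google" property being discussed.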

Matt Howard: Yeah, and one thing I wanted to highlight as you and I continue to pick this apart: I've been talking to a number of colleagues in the industry who have been watching this unfold and reflecting in the few days since the strategy came out, and a lot of people are harping on the idea that traditional contract law gives too much protection to the software vendor. They can hide behind terms and conditions and a EULA, and if certain organizations suffer some kind of massive breach due to some egregious error in their security operations, they aren't really subject to very significant downstream liability or penalties by virtue of lawsuits or FTC action. I would just highlight, for the benefit of everybody who may not be familiar, that I kind of agree, but I also kind of disagree, because it wasn't that long ago that Equifax had a massive breach, and in the wake of that breach, in addition to suffering a loss, I forget how big, relative to their value as a publicly traded company, they ended up settling, I think it was 575 million dollars, with the FTC, one of several suits. So clearly, in that particular case, you could argue the current system held Equifax accountable, absent the sort of software liability that's now going to be contemplated as a result of the strategy. I just think it's interesting to reflect on that. Any quick thoughts?

Dana Morris: I did think that was interesting. The FTC has certainly been more aggressive lately at finding instances where organizations are in fact violating their own terms and not disclosing it, and holding them to task for not even living up to those agreements. So, as you pointed out, there's a EULA, there are terms and conditions, but what we're also seeing are cases where vendors aren't even following their own terms and conditions. The challenge you might have on that side is that the teeth aren't big enough, because clearly they're willing to risk it and just pay. What was it recently, BetterHelp? Maybe I've got the name of the company wrong, but they just paid a five or six million dollar settlement, I believe. Five or six million bucks is not insignificant, but does it have enough teeth?

Matt Howard:  Right. 

Dana Morris: To make you change the behavior, right? I think that's where this push from the administration is going: to try to put more teeth around it. The question, of course, as we've been talking about throughout this discussion, is how do you manage those teeth and make sure they're not overly punitive in the other direction.

Matt Howard: Yeah. So I want to shift gears for a moment and highlight someone else who I think has been doing some really incredible work on behalf of taxpayers and American citizens on a very, very difficult stage: Jen Easterly at CISA. I had the pleasure of chatting with her last year at an event, and we were talking about this concept of assume breach. We all go to bed every night, we all wake up every day, and the question is: do we think we've already been breached in some form or fashion? Are they on our doorstep? Are they in our house? On a practical level, I think everybody kind of knows the answer is yes. And one of the things about the strategy doc from the White House is that, for the first time, I think it gets really explicit and sort of defines who "they" are: they being China, they being Russia, they being predominantly nation-states. For a long time there was a lot of thought about criminal organizations, and I'm not saying criminal organizations aren't a big source of risk and breach as well. I'm just suggesting, from my own view, and I'm curious if you agree, that this document from the White House goes to another level in terms of identifying that the real risk is state-sponsored.

Dana Morris: Yeah, I think it's interesting to see it in actual writing. We're based in DC, and it's sort of an open secret, if you will, that this has been the case for many, many years, but to have it stated so explicitly is an interesting shift in policy, and frankly an acknowledgment of a reality all of us have known. A lot of the focus lately on post-quantum encryption has to do with the assumption that those state actors are collecting encrypted data ahead of the eventual breakthrough that will allow them to decrypt it, which is now a real concern. So there is a lot going on in that realm, and you're seeing this shift away from the traditional criminal organization to either the state itself directly, or a criminal organization sponsored by the state indirectly. In either case, it's got the sophistication and wherewithal of a country behind it, which is an interesting challenge as a vendor. It's daunting in some regards to know that you're up against a much bigger, much better-funded organization trying to attack you, and you're trying to protect against them with a much smaller budget, a much smaller team, and a much smaller footprint.

Matt Howard: Yeah, okay, case in point. I wonder: if you could wave a magic wand and fast-forward five or ten years from now, let's assume that most if not all of the recommendations from this strategy memo are put into place in some form or fashion, and then imagine something like SolarWinds happening. What would be the outcome? I'm not sure.

Dana Morris: I'm not sure either. I mean, the SolarWinds hack in particular was super sophisticated and to your point,

Matt Howard: And we know nation state sponsored.

Dana Morris: Yeah, nation-state sponsored and very sophisticated. It's not like SolarWinds was doing something totally wrong or ignoring best practices. They were doing everything right, and they just fell victim to a very sophisticated attack from a well-funded, state-backed actor. No amount of regulation is going to help with that, right?

Matt Howard: Yeah. So my question would be: under those circumstances, would SolarWinds have been in the safe harbor?

Dana Morris: I would assume so, because they weren't doing anything directly wrong, I think.

Matt Howard: Right, right. And they were certainly likely following best practice.

Dana Morris: Yes, and I think a lot of the focus now, for an organization like that, or even like us, is where the administration is trying to go: if we can get more organizations doing all the right things, we can maybe limit the blast radius and limit the effect of any particular hack. In particular, all the focus on software bills of materials and supply chain has to do with trying to minimize the impact of an attack like that.

Matt Howard: Yeah, a hundred percent. Listen, we're running short on time, but there is one last thing I want to bring up. As much as I look at the strategy doc from the White House and think it's well done, and that it represents what some have argued is a seminal point in history, as it relates to all of us potentially aligning our thoughts and resources in a common way to up our game. Maybe that's it, maybe it's not, I don't know; time will tell. But one thing I do find missing from the current strategy memo, and it's a little disappointing, especially given how much we've seen coming out of DoD and the Zero Trust security transformation narrative over the last couple of years: there's this obvious focus on identity being critical, endpoints being critical, networks being critical, applications being critical, all of it important. But there has also been a rising tide of emphasis around the data pillar itself, which, from my perspective here at Virtru, is really refreshing and good to see. I didn't see as much of that in the strategy doc from the White House last week as I would have hoped. There was a lot of talk of data and breaches and protection of data, but in terms of practical next steps, it was sort of lacking. I'm curious what your thoughts are.

Dana Morris: I had the same reaction, Matt. I was a little surprised that it didn't at least try to link to other assets from the DoD, NIST, and other organizations that are driving zero trust maturity models and zero trust frameworks. Not that those are perfect either, but they're at least starting to put some structure around what it actually means to put in place a zero trust framework that assumes you're breached, assumes there are bad actors in your network already, and asks how you, not prevent them from doing anything, but limit their access, limit the blast radius. In particular, when we think about data: if you can take some of the controls that you apply at the network or app layer and move them to the most important data assets in the organization, it feels like that would certainly help shrink the blast radius further. If you assume they've already penetrated the network, they're already in, they're already collecting things, well, if you have different classifications and different ways of protecting different pieces of data, then perhaps you can limit them to accessing only some parts of the data as opposed to all of it. Versus at the app or network layer, where once they get into the app, they're usually able to access everything they can see, which is a lot.
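Morris's "limit the blast radius" point can be made concrete with a small sketch: protect each data classification under its own key, so that compromising one credential exposes only one slice of the data rather than everything behind the app. The classification labels, records, and `accessible` helper below are hypothetical, purely for illustration.

```python
# Sketch of blast-radius limiting via per-classification keys:
# an attacker who steals one key reaches only that classification,
# instead of everything the application can see.
import os

# Hypothetical data, bucketed by classification.
records = {
    "public":       ["press release"],
    "internal":     ["org chart"],
    "confidential": ["customer PII"],
}

# One key per classification, instead of one key for the whole app.
keys = {label: os.urandom(32) for label in records}

def accessible(compromised_labels):
    """Data reachable by an attacker holding keys for the given labels."""
    return [item
            for label, items in records.items()
            if label in compromised_labels
            for item in items]

# Stealing only the "internal" key does not expose customer PII.
print(accessible({"internal"}))   # ['org chart']
# Only a full compromise of every key exposes everything.
print(accessible(set(keys)))      # all three records
```

With a single app-level key, the first call would already return everything; per-classification protection is what shrinks the worst case for a partial compromise.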

Matt Howard: Right. I think there's a huge amount of value in just imagining that data at rest is encrypted, and a huge amount of value in imagining that access controls to any of that data are sufficiently prioritized with some type of limited availability, whether that's some type of key management or who knows what. But limiting the blast radius, as you say, limiting lateral movement: you still have to get business done, you still have to provide access to data. And that brings up a whole other point, which I know we talk a lot about here at Virtru, and I don't want to harp on it too much, but there's this other piece of the puzzle: business, and indeed government, requires that data be shared externally with third parties. It's one thing to possess data and have proper governance with respect to encryption and access controls and so forth. It's another thing to say: I need to share this data now with this third party, and therefore I want to encrypt and protect the data, apply a policy, and guarantee that access to that particular piece of information is only available to the person I intend and no one else. That would be the logical extension, I think, of what we've already seen from DoD and others around the data pillar in the Zero Trust architectures that have been widely discussed in the last year or so. The fact that it wasn't there seemed like a bit of a miss from my perspective, but no one's perfect. Nonetheless, I think it was a really interesting document. I appreciate you taking the time to chat with me today about it, and we'll catch up with you soon.
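The sharing model Howard describes, data wrapped together with a policy naming its intended recipient, can be sketched as follows. This is a minimal illustration of the idea, not any vendor's actual implementation; the `ProtectedObject` type, the single-recipient policy, and the identities are all hypothetical, and a real system would bind the policy cryptographically to the ciphertext.

```python
# Sketch of policy-bound sharing: the payload travels with a policy
# naming who may open it, and every access request is checked
# against that policy.
from dataclasses import dataclass, field

@dataclass
class ProtectedObject:
    payload: bytes                             # in a real system, ciphertext
    allowed: set = field(default_factory=set)  # identities granted access

def share(data: bytes, recipient: str) -> ProtectedObject:
    """Wrap data with a policy granting access to one intended recipient."""
    return ProtectedObject(payload=data, allowed={recipient})

def open_object(obj: ProtectedObject, requester: str) -> bytes:
    """Enforce the policy on every access attempt."""
    if requester not in obj.allowed:
        raise PermissionError(f"{requester} is not an intended recipient")
    return obj.payload

obj = share(b"contract draft", recipient="alice@example.com")
print(open_object(obj, "alice@example.com"))   # b'contract draft'
# open_object(obj, "mallory@example.com") raises PermissionError
```

The design choice worth noting is that the policy lives with the object rather than with the network perimeter, which is what lets the guarantee follow the data after it leaves the organization.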

Dana Morris: Thanks, Matt, anytime.

Matt Howard: Take care.

