The Secure Developer
36 MIN

Ep. #15, Enterprise Security with RedMonk’s James Governor

about the episode

In episode 15 of The Secure Developer, Guy is joined by James Governor, Analyst and Co-founder of RedMonk, a developer-focused industry analyst firm. The pair discusses multiple ways that companies can be incentivized, and how they can incentivize others, to invest in and improve security.

James Governor is a London-based analyst and co-founder of RedMonk, a developer-focused industry analyst firm started in 2002. You can check out his blog, James Governor’s Monkchips, here.

transcript

Guy Podjarny: Hello everyone, thanks for tuning back in to The Secure Developer. With me today I have James Governor from RedMonk. Hello, James.

James Governor: Hey Guy, how are you?

Guy: Thanks for coming on the show. I know it's been a while I wanted to get you on. Can you, just to kick us off, tell us a little bit about yourself and about RedMonk?

James: Yeah, sure. I'm an industry analyst, but don't hold that against me. I think that RedMonk looks at the world somewhat differently from the more traditional firms, the Gartners and Forresters, and so on.

They have a model that is really about client inquiries from enterprises, top-down purchasing, legal procurement, RFP theater. That sort of view of the world that technology is something that is done to the people that actually have to do the work.

RedMonk is much more about bottom-up adoption, looking at the decisions the developers are making, that engineers are making, that the admins are making, increasingly in a world with cloud. Social, in terms of, look at things like GitHub and npm and so on. The simple fact is that we're not in a world of permission anymore.

It's not to say that the top-down thing is no longer relevant, but there is another way of understanding technology adoption. And we focus on the practitioner view and really see ourselves as sort of developer advocates, practitioner advocates.

We try and encourage, whether it's enterprises, service providers, or software companies, to just do a better job of serving the developers and then hopefully they'll get better business results.

Guy: Pretty cool. Yeah, definitely. That sort of bottom-up version of the analyst, reaching out to them. Well, I definitely hear about RedMonk all the time, from the language analysis to the presence in pretty much every event, and you have the ear of, I think, pretty much any big, or even not big, cloud provider or dev tooling entity that's relevant there.

James: We've been lucky, I think. As a research firm, to deal with a secular shift is quite challenging. When you're small, I guess it's easier. But certainly I would say traditionally at least, a lot of our revenues are driven by vendors. Obviously some of those vendors are disappearing.

I mean, you look at a company like HP, and then you're like, wow, it's turned into HPE. And then Micro Focus, which is this small Newbury company is now buying a big swath of it. Obviously for the likes of IBM, there are all sorts of challenges.

The fact that we've now got clients we're doing a lot of work with that are the new generation of companies, I think, is super important. If you're not doing business with Amazon, Google, companies like that, you're not really going to know what's going on in modern software development. Yeah, I think we're in pretty good shape.

Guy: Cool. You talk about many things in the world. Specifically, a lot about DevOps, what's right and wrong, and you have a perspective around cloud, dev-driven technologies, and where we're evolving.

I guess over the course of the next little while we'll chat about how security plays into all of that world. As we dig in, what do you see? You look at the world of developers, of DevOps, of companies as a whole. They need to be secure. What do you see as sort of the biggest problems right now, or the gaps of where we are, versus where we want to be?

James: Well, I think the gaps are really massive. We've frankly got an industry that has failed. Security as a separate thing is just not a model that works anymore, and yet we've got entire industries around it: audit, compliance. Look at the security in the big banks.

Let me tell you, if you looked at any of the companies that have had big breaches, they all have huge security staffs, but unfortunately, they are not fit for purpose.

If it turns out that the problem is a vulnerability in an older version of Struts that you haven't fixed, I'm pretty sure that auditors had signed off on all of the processes there. But then you end up with a really significant breach. I think security as an industry hasn't really done itself a great deal of favors in terms of modernizing and becoming, as I say, fit for purpose in the new world.

Then you've just got all of the, well what does it actually mean to secure infrastructure when it's running on the cloud? We've gone from "The cloud is insecure," to "Don't worry, the cloud is completely secure," to "Oh shit, it really isn't."

The cloud undoubtedly can be more secure. But that doesn't by definition mean it is. That's the gap that we need to address as an industry.

Guy: I'm totally with you on the need for this inclusive security. Security is everybody's problem. I guess I think of it as the fact that nobody can keep up anymore, right?

You've got the developers, you've got the ops teams that are just like, they're running. We're innovating, we're changing stuff all the time. Whatever, move fast and break things, but don't get broken into, right? Don't have a breach.

Nobody can keep up on it. What do you see as the root causes for it? Why aren't we embracing it? Why isn't security naturally moving into being at the core?

James: Well, one is that it's hard. I mean, we as a species will tend to optimize for the easiest thing. Nobody has yet made it super convenient and easy to do security. We are lazy, we take shortcuts.

It's all the things. How do you develop a personal culture of staying fit? How can you find a thing that works for you? Not everybody wants to be the gym person. For me, I've found a thing which is that I like cycling to work, and you make sure that some of the time you really boost it, get your heart racing, and that's going to help you out.

All of the things. Flossing your teeth, brushing your teeth, the stuff you try and teach your kids to do, how do you make that fun? Basically, hygiene factors are just something that we're not good at.

I think that in terms of that delta between current behaviors and changed behaviors, we really need to understand how to package security and make it, as I say, a simple thing. I don't want to get too carried away, but try and make it fun.

Guy: I think making security fun is key. I agree with that. I do a lot of talks about security, right? Security, fundamentally, is boring. It's insurance, it's risk reduction. It's not an exciting thing. You haven't built anything. You don't have something to show for it.

But hacking is fun, you know? Hacking is invigorating. It's something where you feel like there is a problem, a challenge. I want to break in, and I've succeeded.

That's definitely one means of doing it on the education side. It definitely relates to this desire to make security fun, but there's just all this heaviness of responsibility. There's no room for error, and it's hard to have fun in that context.

James: I think that's a really interesting example, and that's one where as an industry we are beginning to do a bit better. If you just think about rewarding people at all different levels, and sometimes it's just acknowledging them, you know, bug bounties.

But just thinking about when somebody identifies a flaw, dealing with it in the right way so that everybody feels like A, they're acknowledged, but B, that the problem was dealt with. What you don't want to be is in a situation where someone identifies a problem with your code. They come to you, they bothered to come and tell you about it, rather than just announcing that they've found it.

Guy: Or selling it on the black market, or something.

James: Or selling it on the black market. So how do we reward them, and how do we make sure the social factors are there? But also, yeah, paying them; you look at some of the things out there.

Again, I feel like as an industry, if it's going to be fun, it has to be rewarding. We're certainly pretty good as an industry at supporting ego-driven behaviors, so maybe we need to think about that a little bit more.

But definitely paying people for identifying bugs, and if not that, at least do the, "Hey, such-and-such found this thing," and send them some socks for goodness sake, whatever it is. I think that that's one that we're beginning to do a little bit better on. But as you say, it tends to be for the hacker aspect, rather than the hygiene aspect.

Guy: That's the defender side. It's hard; it's a little bit more sexy, or compelling, to be on the red team, to be the one that's breaking in, versus the one defending.

I find the same sentiment from when I worked in security, then moved into ops, and now am back in security. Back then I felt like if you go to Black Hat, or any security conference, when you're back you kind of want to curl up in a corner and cry. Everything is that "the world's against me" mindset.

Whereas, when you go to a Velocity, or some big DevOps conference, everybody's kind of together and singing Kumbaya, saying, "We can make the web better." There's a community of people that love this, and together we're going to make the world a better place. That's kind of lacking in security.

I think there's a set of conferences, or a community mindset, that embraces security. I love that bug bounties are getting more widely adopted. Conferences like DevSecCon, which is running now, have more security consciousness. O'Reilly's trying to do this with O'Reilly Security, to be a more positive, be-the-defenders conference. There are definitely elements of it. I still feel like we're not gamifying, we're not getting that.

James: No, and it's very early days. I think that to draw parallels, if we think historically, testing was not something the developers did. They're like, "No, I'm a coder. I write code. I let the people that aren't quite so good do the testing. That's for someone else."

Of course now, that sort of sounds absurd. Of course developers would do their own testing. If we look at the things they are currently doing, clearly unit testing is canonical. We understand that, and it is something that we now do.

Thinking about test coverage, clearly functional testing, there are areas where we are doing more. Security testing is still not something that we've made easy enough and then baked into the tool chains. But think about the fact that we were able to take testing from something that somebody else did to something that we now do ourselves, and take pride in.

You get people bragging about how many tests they wrote for a few lines of code. That revolution, we do need to be thinking about those approaches in security.
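To make that concrete: a minimal sketch, in TypeScript with a Jest-style runner, of a security check living in the same unit-test suite as ordinary tests, so it runs on every build. The escapeHtml helper and the injection payload are hypothetical stand-ins, not any particular project's code.

```typescript
// A hypothetical input-sanitizing helper, standing in for whatever
// escaping function your codebase actually uses.
function escapeHtml(input: string): string {
  return input
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

describe("escapeHtml", () => {
  // An ordinary functional unit test.
  test("leaves plain text untouched", () => {
    expect(escapeHtml("hello world")).toBe("hello world");
  });

  // A security test in the same suite: a classic XSS payload
  // must come out neutralized, not executable.
  test("neutralizes a script-injection payload", () => {
    const out = escapeHtml('<script>alert("xss")</script>');
    expect(out).not.toContain("<script>");
    expect(out).toContain("&lt;script&gt;");
  });
});
```

The point is the placement, not the helper: once a security assertion sits next to the functional ones, it runs whenever the developer runs their tests, with no separate process to remember.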

Guy: Do you think there are examples of key shifts, if you look at, indeed, testing and embedding that? Were there key means of success that helped us drive this to the mainstream? I use testing as an example of something that we want to do, but it's hard to pinpoint how we tipped that balance.

I'm not sure if it's entirely tipped. Testing still needs more love. But how do we move it into the mainstream?

James: Well, it's a really good question. You're always looking for patterns of success and how this stuff took off. I think some of it was kind of to do with resource constraints. That's always an interesting one because at the moment it very often feels like we don't have a lot of resource constraints, at least in availability of software and hardware resources.

But you know, Kohsuke Kawaguchi was sitting there at Sun Microsystems and he didn't want to be waiting for others. He was like, "Well, I got this server under the desk. I'm going to build this thing called Jenkins."

I know that these days it's, "Oh no, I'm going to use Travis, or Circle, or whatever else," but the simple fact is, there was that notion of, actually, we've got some resource here that we can use, and we're going to use it ourselves to become more effective.

There are some interesting things there. He was an individual who made a really big impact on the industry in terms of thinking about testing. He acknowledged that it was good and important.

I know that a lot of people are like, "Wait, the user experience of Jenkins is not super great." But we're talking about a tool that was a lot easier to use because it was something that someone could do themselves. And he is profoundly about, "How do we just make things actually easier for people?"

It's difficult because over time, a product becomes so much bigger, and there's so much other stuff, and there's so much configuration. I think the spirit of the work that Kohsuke did is definitely something that we could all learn from.

To me, he changed the industry, both in terms of the tool and as an individual. It's dangerous, I think, to do too much hero worship, or to push the idea that one person changed the world. But sometimes the right tool at the right time, hitting a movement where people are like, "Yeah, we need to do more testing," can be super effective.

Guy: Yeah, agreed.

Making something easy is a core success factor, it seems, in pretty much everything that we do.

The less visible it is, the less naturally visible it is, the more important it is that it be easy. There's the level that you care, and there's the level of how easy it is, or the level of friction that it creates. And you need the friction to be lower than the level that you care, right? You need to care more than it is hard.

We spend a lot of time talking, and the news and breaches help a little bit in growing how much people care, but then we have to lower the bar. For the typical security tool, that other line of how much friction there is, how hard it is to use, sometimes how expensive it is to use, is super high. You just don't care enough to mobilize to action.

James: It's very interesting. We've been through this period, and as I say, I think security has sort of failed, in a sense. One of the things there is also just the nonsense spoken by, you know, the consultant reports you read.

It's like, "After a breach, a business is going to have this huge problem, it might be share price." There were these made-up theories: "Oh, they will have gone out of business within three years."

Let me tell you, all of the large companies have had significant breaches, and they're still there, right?

Guy: Yup.

James: Sometimes stuff does begin to change. Sometimes people do lose their jobs. I think at the moment, certainly what we're going to see with GDPR as a regulation, that's going to really concentrate the mind. When you as a company know that a negligent breach can cost you 4% of global turnover if you're doing business in Europe--

Guy: You will invest in keeping that from happening.

James: You will invest in keeping that from happening. And if you're investing in keeping that from happening, you will hopefully, as we understand it, be investing in people and in the tools that they want to use. Yeah, I mean at the end of the day, there is a trigger that is fairly universal in getting people to do stuff. That is to pay them.

Guy: Yeah.

James: I think rewarding people for better behaviors is a big part of changing them.

Guy: It would be interesting to see people explore security champion rewards inside the engineering team. Say, "Okay, in the engineering team every month, or whatever, we will give some substantial reward," whether it's financial, or sending you off on a vacation, or whatever it is. Or even just bragging rights, for the person who has contributed to security the most this month, or something along those lines.

James: Well, it's super important. As you say, it comes down to the fixations we have. We don't want to reward maintenance. "Oh, maintenance is bad." Well, let me tell you, I don't want the bridge I'm driving over to collapse.

If I think about my house, sure, just slapping on some paint is great. But actually, if you're doing a window frame, you've got to take the paint off. You've got to repair the putty, you've got to let it dry properly. That's how you're going to get results.

But we don't as a culture reward maintainers. Teachers, nurses, immigrants doing shitty jobs that are actually in many cases--

Guy: Operate the--

James: Operate the thing we live in. And then we just want to support the people that are in marketing. We need to change that as a culture; I think that's part of the problem.

Security is maintenance, right? It keeps the lights on. It's not necessarily the cool new thing. I think that's part of the challenge.

Guy: Indeed. It's invisible, right? You don't get rewarded. If you invested a lot in security, and you didn't get breached, if you bought an amazing new lock for your door and nobody broke in, was that investment worthwhile? There's no clear indicator.

It doesn't hurt until it hurts really bad that you didn't invest. But there's no intermediate feedback loop to say, "Oh look, some of the pain went away because I invested this amount in security."

James: It's really beautiful when you see it done right, actually. There's an organization, HCSC, that's a roll-up of, I think, five state Blue Cross Blue Shield plans, right? They are under a regulation called HITRUST.

HITRUST is difficult because in a way, it's an old-fashioned standard. It's not amenable to the cloud because auditors need to come in and actually check out the data center, right? That's the bad news.

The good news is when you look at the degree of probity, and care, and concern that HCSC is applying to that medical information, that's really good to see. They have a culture of, they don't just go, "Oh okay, we've got HITRUST so we're okay."

They're like, "No, no, we need to take everything as far as it can go because this is peoples' healthcare information." I mean, compare and contrast. It is staggering to me. This isn't a security flaw in a sense, but it is a, "We don't actually care about things."

Yeah, one of the NHS trusts wanted to get better results for, it was either liver or kidney, cancers. Let me tell you, if you have one of those things, you don't give a shit about the security of medical information, right? But they just said to Google, "Here are over a million medical records. Go and see if you can fix the problem."

Now, on the one hand, okay, good. I want Google machine learning, DeepMind, looking at improving healthcare outcomes for cancers. No doubt. But on the other hand, they just dispensed with all of the normal clinical controls concerning user and patient data in doing that. It's sometimes quite staggering how little respect we have for this information.

Guy: But how do we balance that with pace? One of the buzzwords of the day is digital transformation, right? Really, at its core, it's all about speed. It's all about the fact that you want to mobilize.

It's reacting to a market need faster. You make your development process continuous. You tap into cost-effective technologies like cloud and the like. You build up and you move faster.

But if you move faster, again, this is the move fast and break things. You're taking risk. We've acclimatized, or we've accepted the functional risk of it in the DevOps mindset, right?

James: Mm-hmm.

Guy: "It's okay." The gain from moving faster and seizing an opportunity is greater than the occasional risk of having broken something and it didn't work. Your system went down.

Granted, you also need to do what you can to prevent that from happening. But security is not as forgiving. You can't say, "Oh listen, I move fast, and once every year or two I'll get broken into and my data will be stolen."

How do you balance this healthcare organization, this Blue Cross Blue Shield parent company, with the fact that they will become obsolete if they don't move at the pace of the market and adjust their offerings?

Do you see it? Do you feel like there's some core guidelines? Is it always a judgment call? Where do you think we sit there?

James: Yeah, that's a fantastic question. We're certainly under pressure to move more quickly. We all are, as organizations. You've used the phrase "move fast and break things" a couple of times. That's, famously, a Facebook maxim. Well, they literally broke democracy, right? We're still feeling those repercussions.

So, quite interestingly, today I think the US Treasury is applying some sanctions against the Internet Research Agency in Russia. I'm a little bit cautious of, "Don't worry, it'll all be okay. Because that was, hey, it's just social information. It's just Facebook, right?"

And yet the law of unintended consequences means that if we have no sense of a precautionary principle, really bad things can happen.

The reason we have regulated industries is because bad shit can happen.

I was looking at pictures earlier this week. Someone had just posted some beautiful pictures of exploded steam engines with all the pipework just all splurging out like a Cthulhu. It's like this spaghetti of pipes, and they're beautiful. But you wouldn't want to be the person driving that, right?

Guy: Yeah.

James: And if we think about boilers, regulations around boilers, I want that.

Guy: Yep, you want to keep that from happening in your house.

James: Yeah, I don't want that to happen in my house. I want strong regulations about carbon monoxide making sure that we have regular audits of boilers, and stuff like that. I want a certified professional.

And here's me arguing the opposite side. I said security failed, but I still want a certified professional to deal with those things. I think that it sort of gets to maintenance. Speed is great, but I think that again, as a culture, would it really hurt to just take a step back and say, actually, it's probably a good idea to do this right?

Does HCSC get obsoleted because Google isn't held to those standards? I don't know. But I'm still actually glad that that's there.

As I say, GDPR is going to be a complete shit show. It's going to be, some US company is going to complain that this is just the EU trying to mess up US companies. There's going to be legal problems. People will go out of business, it is going to be a mess.

But actually, the ideas of right to be forgotten, you may sort of disagree with it, but privacy is becoming important again. I think almost that as a driver for security becomes absolutely key.

Speed is important. But just as a culture right now, I think it would be good to take a step back and say, actually let's reintegrate security into everything we do.

Let's have some ownership of that. Would it really hurt to just move a little bit more slowly?

Guy: I think the problem, or maybe I would say the opportunity, lies to an extent in new foundations, new best practices, new tools that allow us to walk that line a little bit better.

We kept referring to Google here, and in a sense I would venture that Google's security is probably no worse, if not better, than at that Blue Cross Blue Shield company. Maybe I'm wrong, it's a loose statement, but it's definitely pretty darn good, right?

It's not bad. You're not implying otherwise. But when you look into why, why would you think that that's the case? It's not because they have larger security teams. They do have large security teams, but I would wager that they don't have larger security teams per capita, per employee, as compared to those healthcare companies.

What they do have is full-on foundations for how software is built. Constraints built throughout the system, tooling and automation that make security something that is core but still merged in. That comes with a different constraint, which is that the threshold to be an engineer at Google is maybe higher.

It might constrain them from growth in that sense, but fundamentally it's almost the hope, the Holy Grail: balancing these two super desirable outcomes of securing my data while moving fast.

This really comes back to technology. It comes back to the changes, the substantial changes that have happened as part of enabling DevOps, or enabling cloud to just move some responsibilities to be a part of the fabric of what it is that we're building.

I guess one of the key questions is, what does that look like in security? I think we already have a portion of it. With cloud, and even more with serverless, a whole bunch of security concerns are offloaded from you.

James: Mm-hmm.

Guy: But there are others that remain, and they get more emphasis. Still, there are some concerns you can move off. The right tooling inside the application, like key management systems, helps deal with them. It's technology that doesn't quite fully alleviate the concern, but it simplifies it, makes it easy.

Maybe comes back to your core point, which is if we had a way to make it easy, we wouldn't need to create too much of that balance between the moving fast and--

James: Well yeah, and you know, look. I'm not here to say nice things about Snyk, right?

Guy: Yeah.

James: But the simple fact is that if we think about the way the world works now, everything is pulling down a package, integrating it to build a new thing. So I need lightweight tools that are able to inspect and understand the dependencies to make sure I don't mess things up. So you're doing good work there.

We could look at some of the work we're now seeing from Tidelift, and the Libraries.io folks. These are good approaches because they map to the way that modern developers are thinking about building software.

I think definitely tying into, or at least understanding that that is the way software is built now, we're beginning to identify, actually here are some best practices. You should not be deploying software if you have not used a tool to understand what's in it.

You should understand the license, but you should also understand, as I say, what are the dependencies?

We should also understand that if we're taking this approach, then dependency management, not just from a security perspective, is super important. Otherwise, you could just have a left-pad example, right? Just understand what you're building. Yeah, good luck with that, I guess.
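As an illustration of that "understand what's in it" step: a minimal sketch in TypeScript of the shape of such a dependency check, assuming a hypothetical hard-coded advisory list. Real tools like Snyk or npm audit resolve the full transitive dependency tree and match exact vulnerable version ranges; this only flags direct dependencies by name.

```typescript
// A sketch of the idea, not a real scanner: read the dependency
// manifest and flag anything on a (hypothetical) advisory list.
import { readFileSync } from "fs";

// Hypothetical advisories; a real feed would carry vulnerable
// version ranges and severity, not just names.
const ADVISORIES: Record<string, string> = {
  "left-pad": "unpublished upstream; vendored copies may be stale",
  "struts-example": "known RCE in older versions (illustrative entry)",
};

const manifest = JSON.parse(readFileSync("package.json", "utf8"));

// Merge runtime and dev dependencies into one name -> range map.
const deps: Record<string, string> = {
  ...manifest.dependencies,
  ...manifest.devDependencies,
};

for (const [name, range] of Object.entries(deps)) {
  if (name in ADVISORIES) {
    console.warn(`ADVISORY ${name}@${range}: ${ADVISORIES[name]}`);
  }
}
```

Run in CI before deploy, even a check this crude makes "you should not be deploying software if you have not used a tool to understand what's in it" an enforced gate rather than advice.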

Guy: I think the interdependent web, sort of the fragile interdependent web is indeed a problem. I'm happy to be contributing with Snyk to a part of it.

There's also third-party services. When Dyn went down with that big DDoS attack, a couple years back I think, and it takes down a whole portion of the internet because we all depend on it.

Was it the NHS site, or one of the government's sites, that was serving bitcoin-mining scripts because one of the third-party components it was using was compromised? We're definitely interdependent in a pretty big way.

James: Yeah.

Guy: That extends into the internet and the paths between us. Hopefully that's another area where these tools prop us up. These interdependencies are the reason we move fast. They're the reason we create this wonderful technology, and they're what enables us to do this podcast and record it.

James: Also to a point, look, sometimes it's the pain that makes us do better things. If you have a heart attack, chances are much higher you're going to give up smoking, right?

Guy: Yup.

James: Feeling the pain, and we have had some really big examples lately, begins to change behaviors. But I think it's important to work through the pain, and not be like, "Oh yeah, we won't worry about it." Yeah, I do think that it is important.

That's the thing again about GDPR. Someone's going to get hit, and they're going to be like, why me? Well actually, it's going to be applicable to everyone. You know what? If you are the one that gets caught, that is bad luck. But if it drives better behaviors, then--

Guy: You may be better off for it.

James: Yeah, maybe you needed to--

Guy: What doesn't kill you makes you stronger.

James: Yeah. Google, unbelievably good at security, right? I certainly wasn't saying otherwise. I just meant this DeepMind thing: if somebody just gives them data to do what they want, then it is what it is.

Guy: They're not held to the same compliance and regulations as well, so that's also a factor.

James: I think that what we do need to do in making things easy and finding automations and the right tools is helping security for the 98%, right? Even more than that, Google, Facebook, those folks, the engineers that they can hire, the amount of money they can throw at this, most people really struggle to do that.

We do need to have empathy for Marks and Spencer, or Kroger, or HCSC. The bottom line is... or Barclays. Because it used to be, the financial services companies in London were like, "Yeah, we're going to hire all the best developers." I'm like, "Oh yeah, all the best developers want to work for the web companies." They are under pressure now.

But that's true of everyone; that's part of, as you said, digital transformation. That's where we are. That's why people are using serverless, because infrastructure is hard. Not everybody wants to spend their time fiddling around with Kubernetes, right?

I don't really know what the answer is there. One of the things you asked was, who does it well? I think honestly, partly because the stuff is hard, there are not that many organizations doing this stuff super well. Where is the Stripe of security? Arguably, there kind of isn't one.

Nobody has yet done such a magnificent job of documentation developer experience that everybody's like, "Oh yeah, when I'm developing an app, I know I'm using that." I mean, there is a missing thing at this point.

Guy: I think there's a whole new breed of developer-focused security tools that needs to come in. Of course security is not one thing. Security's kind of this whole, it's like operations. There's not one operations tool. There's specific segments of it. Hopefully, like this whole sequence of tools that come in that operate in that fashion, that help us build--

James: But developers need to be given the budget to use those tools.

Guy: Yeah, and the time.

James: At the moment, it's still that security is off to the side, with all this money being spent on ridiculous firewall products.

Some of that money needs to be taken out of the hands of those organizations and driven into development and engineering, so that they can begin to spend it on third-party services that help them make more secure software.

Guy: Indeed. We've been chatting here, and I've got all sorts of thoughts but we're already far longer than planned.

I want to ask you one question that I try to ask every guest that comes on the show, which is, if you had one suggestion, or one pet peeve that you have around the security space, one practice that you would want people to embrace or stop doing, what would that be?

James: A pet peeve. I think it is probably something around forgetting the most basic thing, which is that you can do all you want with tooling and everything else, but the truth is, the easiest and best attack vector is always humans.

We're so prone to social engineering. I think security has to be about risk management, and understanding those risk factors. Sometimes it becomes too much about, "Oh yeah, we'll come up with a technical solution."

There's no technical solution to human stupidity, right? People will always be gamed.

Which is not to say we should attack people. We need to have some empathy for the fact that people will make mistakes. It annoys me sometimes when you have people going, "Oh yeah, this is a solved problem." The truth is, there are no solved problems.

Guy: Yeah, indeed. Humans are often the weakest link. That's going to be a reality, so we need to deal with it. Cool. Well, thanks, James, for coming on the show. There's a whole slew of things we need to get you back on to chat about. Some more things.

James: Well, the good thing is that you're about 100 yards away, so that should help you out.

Guy: That definitely helps. Thanks for coming on, and thanks everybody for tuning in.