Ep. #21, Managing Security with Cybersecurity Leader and DevSecOps Practitioner Julie Tsai
In episode 21 of The Secure Developer, Guy meets with Julie Tsai, Cybersecurity Leader and DevSecOps Practitioner, to discuss ways to manage secure systems and bridge the gap between security and DevOps.
Julie Tsai is an InfoSec, infra and DevOps professional of 20 years with Silicon Valley technology companies, now serving as Head of Security for The RealReal. For Walmart eCommerce from 2013-2016, she built up and led 2/3 of the InfoSec team including much of the technical cohort.
Transcript
Guy Podjarny: Hello, everybody. Welcome back to The Secure Developer. Today we have an awesome guest with us, we have Julie Tsai. Thanks for coming to the show, Julie.
Julie Tsai: It's a pleasure.
Guy: Can you tell us a little bit about yourself, about what you do and how you got into security?
Julie: Today I am a leader and director in security organizations. I am now working for a major cloud content management company. I started my career more than 20 years ago, debugging network controller hardware, and spent about 13 years in the system administration and infrastructure space, on the actual delivery side of it, TechOps. In that process I gained security knowledge, because it was often one of those second jobs that weren't named.
Guy: They were never named but had to be done anyway.
Julie: Absolutely. It was not named, but if there was a problem they'd be calling us. I gradually made my way into doing a couple of years of development, found out that I was a better sysadmin and then went back into doing more DevOps and DevSecOps types of things, and eventually moved into management.
Guy: What drew you in? What about the security aspects of that sysadmin attracted your attention?
Julie: It's an interesting thing, for me at least. I never intended in the beginning to go into security, but it was always something that needed to be handled. At some point I realized it was always those kinds of problems that I couldn't stop thinking about, either because of the acuteness of it, or because of the complexity.
Then eventually as I started to get in deeper I would find myself more frequently having conversations with my managers and executives about, "We need to do this and I need us to not forget about it, and this is how we need to do it." And it was starting to happen at company after company.
So finally, when I was interviewing about five years ago over at Walmart, I met a great VP of platform over there and originally, they were interviewing me for a DevOps role. And I was like, "I don't know. These things get defined in so many ways."
He looked at my resume and said, "You seem really passionate about security because you're always doing security problems." I was like, "Yeah. It's important." And he said, "Why don't you consider working for the security department?" And I said, "Sure. We'll give it a go." It ended up being great career advice.
Guy: Yeah. Awesome. For the biased selection of people that come on this show, there's oftentimes either a dev or an ops background, and "sysadmin" and "ops" are just different terms for similar types of activities, that evolves them into security. And that passion that draws you in puts you at a stronger starting point for it.
Julie: Absolutely. You have a love-hate relationship with it.
You may find yourself cursing out the security people you work with, but then wondering why you're always going to go talk to them or work with them, or figure out those problems and someday you will be there, too.
Guy: It sounds like you made that transition in Walmart, but part of the conversations we had here ahead of the show was around your experience spanning a good number of sizes of organizations.
Julie: Absolutely.
Guy: Maybe let's talk a little bit about being a security leader and driving security in organizations and let's go from small to big.
Julie: My career roughly progressed in that direction as well. I loved being at startups, and started out my first few professional years working for organizations that were 7 people, 30 people, and thought, "This is the way to go. Be in two rooms with a bunch of smart, difficult people and get to know each other well." Eventually over time the infrastructure problems that I needed to get good at and to solve were starting to go to slightly bigger and bigger companies, let's say 100-person companies, 200-person companies.
And I realized, "OK, if I want to understand how to do scale and performance at this stage, I need to just go there." I found great experiences there as well, bonding with the team, learning how other people did stuff. Also great to get hands-on experience implementing security at the ground level when the buck stops with you.
You need to be able to do it in a sustainable way, and you know what the answer is going to be if you go back and ask for more budget. So I started learning how to design and think about things in a way that I thought was going to be scalable for my teammates and myself, and not create more work.
My first encounter with what we now call DevOps tools for configuration management was with CFEngine, after I finished reading a bunch of articles on bootstrapping and infrastructure and some of Mark Burgess's papers from the late '90s, and said, "This is how I think it should go. I haven't had a chance to implement it, but let's go ahead and try it at this company."
At this company it was maybe about 150 to 200 people, and we had Level 1 service provider PCI compliance to meet. So actually, the standards were pretty high for a company that was trying to punch above its weight class.
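To make the configuration-management idea concrete, here is a minimal sketch of the convergent, declare-and-repair pattern Julie is describing, written in Python rather than CFEngine's policy language. The file path and the SSH settings are hypothetical examples, not something from the conversation.

```python
# Minimal sketch of convergent configuration management: declare the desired
# state up front, detect drift, and repair only what differs. Running it
# repeatedly is safe (idempotent), which is what lets it scale across hosts.
import re
from pathlib import Path

DESIRED_SSHD_SETTINGS = {           # desired state, declared once
    "PermitRootLogin": "no",
    "PasswordAuthentication": "no",
}

def converge_sshd_config(path: str = "/etc/ssh/sshd_config") -> bool:
    """Bring the file to the desired state; return True only if it changed."""
    config = Path(path)
    lines = config.read_text().splitlines()
    changed = False
    for key, value in DESIRED_SSHD_SETTINGS.items():
        pattern = re.compile(rf"^\s*{re.escape(key)}\s+(\S+)")
        for i, line in enumerate(lines):
            match = pattern.match(line)
            if match:
                if match.group(1) != value:   # drift: wrong value, fix in place
                    lines[i] = f"{key} {value}"
                    changed = True
                break
        else:                                 # setting absent entirely, append it
            lines.append(f"{key} {value}")
            changed = True
    if changed:
        config.write_text("\n".join(lines) + "\n")
    return changed
```

Run from a scheduler on every host, a check like this keeps a fleet converged on the same hardened baseline, which is the property that makes configuration controls like PCI's sustainable for a small team.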
Guy: In this context you're still in a sysadmin role?
Julie: I am.
Guy: You're not officially security, you're just doing it as one of those things that needed to get done?
Julie: That's right. There was no separate security department at the time. This is maybe 10 years ago, and when I think about the bulk of my system administration experience, your first responsibility was always going to be reliability and uptime, as well as responsiveness to the customer and the product folks. But the hidden responsibility was security.
The more I was thinking about it I realized this hidden responsibility is as big as or possibly bigger than our other responsibilities. Because there's reputational risk here for the company as well as for us individually.
And the liabilities for this are maybe more than that particular feature is going to make in terms of money. So, how do we express this to the upper brass? That became a trajectory that I saw going through my career in the beginning, just figuring out, "How do I make these changes efficiently? I'm going to use these really cool tools that are going to let us scale and keep things in perpetuity and be bulletproof."
But over time I started to see we have a really serious thing here where we have to be able to express risk about all these different things we're doing in the technical space, to people who haven't yet been quantifying it.
Guy: What were the key tips and tricks that help get that done? Do you remember some core approaches that did work and the ones that you thought would but actually ended up bombing?
Julie: Yeah.
This is one of those things that you have to use the carrot, the stick, and every bit of social influence that you need to have.
And part of what makes careers in security so interesting is that every professional skill you have, you will use. In the beginning, we'd use the lever of compliance. Because in their world, I started to realize, compliance was in some way our product version of security, when we were able to say, "OK, we're PCI compliant," or, "We're meeting SOX obligations, or GLBA."
We were providing a level of assurance to the customers that they could use to gain more trust, and therefore productize in their own fashion. So I realized that for so long I'd been working in this space where we weren't sure how to quantify what we were doing, but this helped give a label and a price to it.
Guy: Yeah. And it's a clear cut accomplishment as well.
Julie: Absolutely.
Guy: You achieve compliance, versus you reduced risk. Which is this nebulous thing that's out there.
Julie: Yeah. That is 100% true.
Security is both science and art.
We have to find these very objective results and metrics, and at the same time it becomes a judgment call. "Was that good enough? Was that risk brought down just enough? What do I think is going to happen in the future?" And eventually over time I started seeing that at some of these companies we're selling our reliability and we're building products that are security focused, so there's something beyond just the compliance piece of it.
And once we could get past compliance, it was also about elevating the conversation to the next level. Once the leaders understood, "This is important. This is the language around compliance and security. And by the way, it's still not enough." Because if you look at all these companies that did get breached, they were compliant. And we all know on the back end of it that the paperwork is only part of it.
Guy: Compliance is a minimum bar, it's not a sufficient bar.
Julie: Yes, that's right. You haven't gotten there; it's your starting point.
Guy: To an extent they're on the same continuum. It resonates with me, the notion that the first element is compliance. It's a stick, to an extent.
Sometimes some companies demanded that, but for many others, once we have it, it's this stamp that says, "OK, we are secure, or we're at a certain caliber of security," which actually helps foster the conversation. That's the compliance version of the second bit that you're referring to. But in general,
the more you can humbly boast about your security posture, the more you can communicate to your customers that you are a trustworthy partner to work with.
Julie: That's right. Also along that continuum of bringing it past the stick into something positive, because we're dealing with things that people generally don't like to think about. They don't necessarily like to think about penalties and regulation or failure. So, making it a point of pride. What we found was that eventually, when people were starting to solve hard problems, sometimes under difficult circumstances, man-made or not,
you could find that sense of pride in what they had accomplished and in the work. And over time, even if they hadn't identified as security people before, like me and like a lot of us, they would say, "Actually, I really care about this. I am passionate about security." Finding that passion in the activity of it. Once they started to identify with it from that standpoint, "This is something that's important to me and I'm proud of it," you're able to bring the practice to where you want it.
Because what you really want is people to internalize it at that level of, "How can I better do security and think about security for our results and for the customer?" It's not about the book, it's not about passing the cert, it's not about the grade. It's about, "How do we do this better?" So it's getting it moving along that continuum.
Guy: I love that, making it a source of pride.
Oftentimes we talk about security in the context of the shame, and pride is basically the reverse of it.
Julie: Exactly.
Guy: I love that. Making security a source of pride. That's the evolution of techniques, and this is in the context of a company that is still relatively nimble. You don't need too many procedures, you just need to make it make sense as you're driving, because it's a 200-ish person company.
Julie: Yeah. That's right. In that context I could see it start to happen in individuals, but in my career I got to see it at a greater scale as I moved through to bigger companies. I moved into companies that were either in the process of going to IPO or past that, into SOX compliance, and you could see, "OK, now the problems are not just about how do I solve this locally within my own division, but how do I get these other departments to also move down this parallel work stream of equally complex, complicated stuff, and we all meet at the same time?" And we have all these different divisions that are responsible for the people and the financing, and they also have to be on board with it.
I spent about three years over at Walmart e-commerce, and one of the things that I saw my CSO at the time do very, very smartly was the alignment of interests between the development teams and security teams. They were measured on their performance on their security bugs, and security bugs were qualified as quality defects, not just security issues. It was not a separate thing, so it spoke to their language and their values, and it grafted onto their existing process. And getting back to, "What is it that a particular group feels pride in, or values?" You can see that continue to resonate.
Once a particular group was becoming an example of how to do things, they would want to continue that example and you could see that evolution.
Guy: Treating security bugs as quality bugs makes a lot of sense, it sort of streamlines the process. How did the prioritization of those bugs come into play, do you know?
Julie: Great question.
Guy: Because to an extent if there was a burning customer problem, then that bug gets prioritized more than if your colleague just gets annoyed by it.
Julie: It can. This is where you need to have good expertise on both sides, and then the alignment of the interests. The groups would have their own prioritization standards and finding what those standards were in terms of, "What do you consider a P0, 'have to get done within one or two days or as soon as possible?' What do you consider a P1, 'get done within a week?'"
Whatever that vernacular was, being able to slipstream into it becomes really important, to say, "I need this group to evaluate this. We consider this security bug to be an ASAP bug, so it needs to graft onto how you guys treat ASAP issues, whether or not this is new or par for the course."
In the security world there are a lot of objective definitions that are debated but are out there.
Especially with regard to NIST standards and CVEs and vulnerabilities. As much as possible I encourage my teams to borrow from best-in-class standards that we know. If we can, we leverage what we know to be the definitions within NIST and the CVSS ratings, then bring in whatever mitigations we know of as context and prioritize there. Say, "That's where I want this to land ultimately." Take the objective standard, figure out how it maps in our world, and then give that over to the developer teams and infrastructure teams.
If we have to have a discussion or negotiation with the leadership there on why that is or isn't considered important, then it does have to click up to that management level. But the thinking behind it is that it's not your priority versus my priority. It's, "We understand this to be an objective priority. What does it take to fit that within your group's urgency?"
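As an illustration of that mapping, here is a small, hypothetical sketch (in Python, not anything from a specific company's tooling) that takes a CVSS base score plus a bit of mitigation context and translates it into the internal P0/P1/P2 vocabulary and remediation windows a development team already uses.

```python
# Hypothetical sketch: translate an objective CVSS rating into a team's own
# priority vocabulary, so the debate is about the mapping, not whose bug wins.
from dataclasses import dataclass

@dataclass
class Finding:
    cve_id: str
    cvss_base: float          # CVSS base score, 0.0-10.0
    mitigated: bool = False   # e.g. not reachable, compensating control in place

def internal_priority(finding: Finding) -> tuple[str, str]:
    """Return (priority label, remediation window) in the dev team's own terms."""
    score = finding.cvss_base
    if finding.mitigated:
        score -= 2.0          # context can lower urgency, never erase the finding
    if score >= 9.0:
        return "P0", "1-2 days"
    if score >= 7.0:
        return "P1", "1 week"
    if score >= 4.0:
        return "P2", "next sprint"
    return "P3", "backlog"

# Example: a critical RCE maps straight onto the team's existing P0 process.
print(internal_priority(Finding("CVE-2017-5638", cvss_base=10.0)))  # ('P0', '1-2 days')
```

The thresholds and windows here are made up; the point is only that the severity comes from a shared external standard, while the response process stays the team's own.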
Guy: That makes a lot of sense. If I go way back, you can say something like, within each of these silos, for the infrastructure world or the ops world it might be that an outage is a P0. Your servers are down.
Julie: Exactly. That's liability.
Guy: On the development side, it might be a major bug that is blocking a big deal. That might be your top element, and it might be a CVSS-10-type vulnerability from the security side, a standardized severity score of the top rank. But at the end of the day, once you've classified it as a P0 in that context, your methodology of dealing with P0s doesn't need to differ between these three buckets, or whatever other buckets you have.
Julie: Yeah.
A big part of us being able to leverage known art on the standards is being able to take some of the egos out of it, and also help people develop their thinking about these things.
Because a lot of times we are working with a lot of smart, disciplined engineers and they're going to come up with their own system. Which is great for the thinking process, but again, it's well-tread territory so take what's out there and then develop on top of that, as opposed to starting completely new.
Guy: Yeah. At the end of the day this is not that dissimilar from evolution of engineering practices or others, when you're a 200-person company or even before that, when you're a 20-person company. Your organization, most people are engineers. You start by just having smart people and they know what good looks like. Later on, I need to define what good looks like, but they still can work a little bit more methodically and as you grow you have to at least get better at the definition and communication of what good looks like.
Even if you keep the autonomy of reaching that goodness level within every one of these groups, you still need better definitions. And you started on those, talking about alignment of organizations and tracking it. Is this also the way that these metrics basically got reported up? Were security issues reported separately, or, from a development perspective, were my security bugs just counted alongside my outages and quality bugs?
Julie: They were very much reported up, and also reported out. In terms of how we managed things in the large enterprise company, there were regular SLA measurements of how well they had met the bar for the different criticality levels, for the criticals versus highs and whatnot. Who was adhering to them or over-performing, and which groups were not?
To some extent there is alignment at least on, "This is going to feed into how you measure your group's performance," and managers were left to their own discretion in terms of some of the mechanics on that. But there was also a fun competition aspect of it, in a place that allowed it. We had a leaderboard of, "These are the groups that are ahead, these are the groups that are not." It was not a problem in terms of punitiveness. It was just--
Guy: Just gamification.
Julie: Yes, exactly.
Guy: To get some competitive juices going and to get people running.
Julie: That's right. Which is an important part of it, it has to fit with the culture that you have and be able to keep people constructively oriented.
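As a toy illustration of that kind of measurement, with entirely made-up team names and numbers, the sketch below computes each group's SLA adherence for security bugs by criticality and ranks the groups, the same way an outage or quality metric would roll up into a leaderboard.

```python
# Hypothetical sketch: per-team SLA adherence for security bugs, rolled up and
# ranked into a simple leaderboard. All data here is invented for illustration.
from collections import defaultdict

SLA_DAYS = {"critical": 2, "high": 7}   # agreed remediation windows, in days

# (team, criticality, days_to_fix) for closed security bugs -- sample data
closed_bugs = [
    ("checkout",  "critical", 1), ("checkout",  "high", 6),
    ("search",    "critical", 4), ("search",    "high", 3),
    ("inventory", "critical", 2), ("inventory", "high", 9),
]

def leaderboard(bugs):
    met, total = defaultdict(int), defaultdict(int)
    for team, severity, days in bugs:
        total[team] += 1
        if days <= SLA_DAYS[severity]:
            met[team] += 1
    rates = {team: met[team] / total[team] for team in total}
    return sorted(rates.items(), key=lambda item: item[1], reverse=True)

for team, rate in leaderboard(closed_bugs):
    print(f"{team:10s} {rate:.0%} of security bugs closed within SLA")
```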
Guy: The notion of engaging people and maybe back to your previous comment of making it a source of pride, you need to somehow be proud of something. It might be intrinsic. Have you seen good recognition models that work for security? You've been in all these elements, were there good ways to reward security that you've seen work well?
Julie: Yeah, I believe I've seen pockets of it. We had a program for the security enthusiasts and embeds throughout the org who might be interested, and we would have some regular meetings or help subsidize trainings for certs like the CSSLP or CISSP. That was one incentive. There are various ways; again, back to the hard and the soft recognition, it is really important. Especially once your organization has gotten to the point where security has its own division.
A lot of times security is a completely business-critical and mission-critical function.
But if the organization is separate, at this point a lot of what has to happen has to happen through influence and goodwill toward the rest of the org: the groups that are running the systems, and the people. Every opportunity that you have to have a good interaction, as opposed to something that's felt to be a negative or penalizing interaction, is really important to leverage. And the models I've seen at different companies depend on what's valued there. Some of the more startup-y culture companies might be interested in having public accolades in various forms, or swag, or things that are indicators of stuff.
Companies where things are more formalized might lean toward formal recognition, maybe things like performance recognition, bonuses, and that kind of thing. So it's important at that point, as a security leader or a security professional, when you come in, to really look at what seems to drive behavior at that company, and also the makeup of the employees there, because it is going to change. It depends on many things: the profile and the character of the folks that are there.
Guy: But you do come back to the notion of alignment. Not just alignment within their organization, but alignment between the means through which you incentivize or drive security practice and the way that you incentivize other stuff in the organization, other activities, which makes a lot of sense. I love this pattern, almost this evolution of security and retro-evolution of security practices. And it feels like your path took you from small companies where security needs to permeate.
It's no individual person's job, so it needs to be to an extent everyone's job. A lot of it is done through these informal interactions, through growing and creating a security department and then going to a place in which the organization is already large, and you need to have your tentacles out.
You need to have some strong advocates that are to an extent almost coming back to that first tier. Which is you need enough people that are local and that are embedded within a smaller team, back to the original small team, that get it done despite the fact that it's not their job. You're now permeating many small teams.
Julie: That's an excellent point, because ultimately what good looks like at the end is not too different in the practice. Everyone has to have internalized it. With
security becoming a deeper and more pervasive challenge nowadays, it really does become everybody's job on some level.
The engineering groups get the messaging and the focus first, because so much of what we touch for the product has a security implication and we're going to get that direct feedback. But if you think about it, every group that brings in a vendor, that's a vector for you.
You look at how many companies had issues with third parties or contractors or consultants, and so every part of the business has a role to play. A lot of what you're trying to do in creating a larger company that works, especially for security, is replicating the healthy intuitive behaviors that happen in those small environments. In a startup that's on the right trajectory, the people understand each other and understand the mission, and almost through osmosis start to absorb each other's priorities.
Then as the companies get bigger and you get more distance in the people, as well as just what's going on and the knowledge levels, things have to work on their own. At that point the structures, the incentives, the goals and the communication channels all have to start reinforcing what was there in the beginning and the intuitive stage.
Guy: The mistake that you might make, which frankly goes counter to agility as a whole, is to just continue that original structuring and take it to the extreme. Because you've gone from osmosis to giving it more structure, but when you're larger, the continuation of that is not even more structure and a mandated path. It's actually to say, "OK, you've given it structure."
I've grown very fond of the word "scaffolding." That structure needs to become scaffolding, and you need people climbing on it. You need some element of the local champion and knowledge that permeates it, and you see a lot of companies fall into that mistake along the path.
Julie: Absolutely. It is very common, and you're starting to hear about the various companies that have those kinds of different groups.
The good thing for security is that at some stage of career, a lot of engineers want to be involved with security. There's an aspect of it where even just being part of the evangelical or embed aspect of it becomes appealing to folks.
And that's great for us in terms of being able to reach out. The part that we have to make sure is really there is that we're not mistaking soft influence for the hard functional lines and authority that also need to be there.
And you bring up a great point that it's not just the things or the structures that are important, it's the behaviors and the internalized actions that are really what you're looking for in security. I think that's a big reason why there's so much around DevSecOps these days. What is it about DevOps that was allowing people to build infrastructure and scale in an agile and cost-efficient way? It was because all of these ideas and technical controls were embedded and internalized upstream. It's not that they went away. Similarly for security, that technical piece of it does need to be embedded upstream.
But then as you're looking at how a security program is functioning for a company, the things you end up wanting to measure ultimately aren't just the activities of your security team. I always start out and say, "OK, great that we got a report of what we did and the number of touches we had, the number of tickets we handled, the number of this and that. But ultimately, how many security incidents were averted? How many were reported? How much did we minimize by shrinking our attack surface?"
Because ultimately you want to be looking at the risk and the overall impact, not just the activities that we did to get there. And in that way, companies that are comfortable with moving quickly and evolving quickly at an organizational level as well as a technical level have an advantage. Because over time, a lot of the structures that emerged in the industrial age were designed for a non-networked world. And we have a lot of procedures and artifacts that are built up around this idea of, "What are we doing to show control and show rigor in a non-networked world?" In a digitally connected and interconnected world--
Guy: And fast moving.
Julie: It is very fast moving. Both your attacks are fast moving, your defenses are fast moving, and if I did the control correctly how many artifacts do I need along the way, after the fact?
Guy: Yep. I think the essence of it, the eventual outcome of DevSecOps, is indeed agility. But I'm entirely with you. I had this whole article talking about how DevSecOps is oftentimes confused, because it's a buzzword and you can interpret it however you want. You can interpret it, at the end of the day, as security for DevOps technologies, or security for DevOps methodologies, whether that's container security or microservices security. But the true Holy Grail is the DevOps shared ownership that you want to embrace.
You want to say, "This is about 'we all own it together, it's everybody's problem, it's inclusive security,'" and it comes back to that osmosis. So this is a great evolution, it naturally builds on itself, and it sounds like a super interesting career and flow here. I have a whole bunch of other questions to ask you, but we're right at the edge of time.
Before I let you go, I like to ask every guest on the show, if you had one piece of advice or maybe a pet peeve or something that you would want to share or tell teams that want to level up or improve their approach to security, what would that be?
Julie: I would say it's encapsulated well by a joke that an infrastructure friend told me about security. He said, "How do you tell the difference between a good security person and a bad security person?" He said, "You can't. They both say 'No.'" And as a former infrastructure person I said, "That just sucks. I got into this to build stuff, not just to tell people 'No.'"
It reminds me that the whole point of this is for us to be fueling creativity and product for the customer in a secure way.
Being able to keep that bigger picture in mind on all sides is for me, that Northern Star.
Guy: Remember, at the end of the day you are there to drive business and to make it successful. Julie, thanks a lot for the great conversation. Thanks for coming on the show.
Julie: You're welcome. It was a pleasure.
Guy: Thanks everybody for tuning in, and I hope you join us for the next one.