Season 4, Episode 42

News Media Security With Kate Whalen

Guests:
Kate Whalen

Kate Whalen: “The boundaries between personal and professional lives are blurring an awful lot. I want to have a similar level of security on both. People are worried that they shouldn't get involved because security is something that they don't know much about and that they might do something wrong. Once people start learning a bit about all of the scary things out there or all of the ways that you can be insecure, it can feel that it's an unsolvable problem. The work is endless but at least you can make progress to being in a better place than you were yesterday.”

[INTRODUCTION]

[0:00:33] Guy Podjarny: Hi, I'm Guy Podjarny, CEO and Co-Founder of Snyk. You're listening to The Secure Developer, a podcast about security for developers covering security tools and practices you can and should adopt into your development workflow. It is a part of the Secure Developer community. Check out thesecuredeveloper.com for great talks and content about developer security, and to ask questions, and share your knowledge. The Secure Developer is brought to you by Heavybit, a program dedicated to helping startups take their developer products to market. For more information, visit heavybit.com.

[EPISODE]

[0:01:12] Guy Podjarny: Hello, everybody. Thanks for joining us back at The Secure Developer Podcast. Today, we have a great guest. We have Kate Whalen from The Guardian. Thanks for joining us on the show here, Kate.

[0:01:16] Kate Whalen: Thank you for having me, Guy.

[0:01:17] Guy Podjarny: Kate, we're going to dig a lot into The Guardian and how you work on security and the like. But before we do that, can you tell us a little bit about yourself? What is it that you do? Maybe, how did you get into this world of security?

[0:01:28] Kate Whalen: Okay. I'm a security engineer at The Guardian, which means that my day-to-day is writing a lot of code, but also doing a bit of security advocacy as well. I haven't actually been a security engineer for that long. I moved from a regular developer role into security engineering because it's an area that I've always been interested in. My background is actually in pathology and microbiology, so I like to think that I came into security mostly because I'm constantly interested in how systems get compromised and how infections and viruses spread.

[0:02:03] Guy Podjarny: I guess it's not so different. You went from one type of virus to another.

[0:02:05] Kate Whalen: A lot of the terminology is the same, though. That's nice and familiar, at least. It's been an interesting three years, upping my security knowledge. I actually went to DevSecCon in London about three years ago, and that's where I started learning about security-related things and trying to teach myself as much as I could in my free time. Eventually, I applied for the security engineering role.

[0:02:31] Guy Podjarny: How was that? What was the response in the company to the idea of moving from dev to security? Was that a frequent practice? Were you a pioneer?

[0:02:40] Kate Whalen: I'm the first person to do it, to my knowledge. It is actually quite encouraged. At The Guardian, we encourage people to move around inside the organisation, and people tend to have quite long tenures, which I think is amazing. We encourage our developers to move back and forth between a developer and a manager role, or maybe a tech-lead role, or to pursue careers in other areas of the business that interest them, which allows them to try out different skillsets or practise in areas that they haven't had a chance to. So, yes, for me, it was a bit of a big move, at least an internal big move. But I'm really enjoying it so far.

[0:03:18] Guy Podjarny: Very cool. Now that you're in the security engineering role, do you feel it's useful that you came from the dev team in the company? Or is it all the same? How helpful is it that you were a developer before you got into the security engineering role?

[0:03:34] Kate Whalen: I think it's really, really helpful, because a lot of companies have an issue whereby different teams or different groups are a bit siloed. It can definitely be an issue where maybe you have a quality team or an application security team, and it's more of a throw-it-over-the-wall attitude, whereby developers and security don't get a chance to talk, or developers and operations don't get a chance to talk. I know a lot of people around the company, which is nice, because then when I have a problem, or I need some support, or I need someone to review a PR, I've got a large team of developers, whose team I used to be part of, that I can still go to and rely on. It allows us to be a bit more cross-functional.

[0:04:19] Guy Podjarny: Yes. I love that idea. I think there's a good personal, people-type aspect: you connect, you can relate to people individually and to their problems. It's also sometimes professionally valuable, because you know the systems, you come with a certain amount of knowledge, and you bring that into the security team and help educate.

[0:04:41] Kate Whalen: From the point of view of bringing in developer tooling or sharing best practices, there are things that a lot of our developer teams are already using, or internal tooling, that we in InfoSec can really benefit from adopting.

[0:04:54] Guy Podjarny: Very cool. Did you find you brought along a good number of those? I mean, are there a couple of examples to tout of dev tools that you brought along on the journey?

[0:05:03] Kate Whalen: Definitely at the moment. My focus for the quarter is a lot more on CI/CD, so continuous deployment for some of our InfoSec tooling. Being able to rely on the tooling that other members of my company are working on, and being able to request features or help from them, has been really helpful.

[0:05:24] Guy Podjarny: Give us a slightly bigger picture of the security org. You're a security engineer; what are the different teams in the security org, and how do they relate?

[0:05:34] Kate Whalen: To be honest, there are only five of us in InfoSec. We are a small team. I'm a security engineer within it, but we've all got different roles and different specialisations. What we want to do is be able to secure an awful lot of The Guardian, and that can be everything from helping employees manage their passwords, to advocating for secure development practices, to building out tooling so that we can be more strategic and less tactical. Our remits are all quite broad, because as well as doing the development side of things, we might also be looking at joiners and leavers processes, or how we do account management or email auditing.

[0:06:20] Guy Podjarny: Yes. How is it different to be a security engineer versus maybe the other titles in the team?

[0:06:28] Kate Whalen: To be a security engineer in the team, for me, means that I'm trying to look at our current workflows and design tooling or solutions that can automate some of them, or at least automate the boring parts. Also, we have an awful lot of alerting and monitoring that isn't really integrated at the moment, either with itself or with all the different systems.

I would quite like my inbox not to be quite so inundated with all of these alerts, so I am trying to find better ways to ingest and respond to security alerting and data. That would be one of my other big remits. So, I'm looking at building tooling, and then also supporting the developer teams around things such as security reviews, if they maybe want to look at how their AWS accounts are configured. A lot of our infrastructure is deployed on AWS; we are quite a big user of cloud services. But sometimes you need a second pair of eyes, or you want someone to look through how you've configured all of your security groups or your applications, to make sure that you haven't left any security holes. That's something I can help with, along with other things such as doing security reviews of applications to see if people are adopting secure coding practices.
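
A minimal, hypothetical sketch of the kind of "second pair of eyes" security-group review described here, not The Guardian's actual tooling: a short boto3 script that flags AWS security groups with inbound rules open to the whole internet (0.0.0.0/0). The region default and the function name are illustrative assumptions.

```python
# Hypothetical sketch, not The Guardian's tooling: flag AWS security groups
# that allow inbound traffic from anywhere (0.0.0.0/0).
# Assumes boto3 is installed and AWS credentials are already configured.
import boto3


def find_open_security_groups(region="eu-west-1"):  # region is an assumption
    ec2 = boto3.client("ec2", region_name=region)
    findings = []
    for group in ec2.describe_security_groups()["SecurityGroups"]:
        for rule in group.get("IpPermissions", []):
            # Each inbound rule lists the IPv4 ranges it allows.
            for ip_range in rule.get("IpRanges", []):
                if ip_range.get("CidrIp") == "0.0.0.0/0":
                    findings.append((group["GroupId"], group["GroupName"],
                                     rule.get("FromPort"), rule.get("ToPort")))
    return findings


if __name__ == "__main__":
    for group_id, name, from_port, to_port in find_open_security_groups():
        print(f"{group_id} ({name}): ports {from_port}-{to_port} open to the world")
```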

[0:07:52] Guy Podjarny: Basically, you're a security engineer in two capacities. On one hand, you are an engineer yourself. You build stuff. You put the security tools into the pipelines and the like. Then on the other side, you support the engineering teams, working with them. I guess your engineering skills come to the forefront when you need to review code or you need to understand how an application works. Is that a fair assessment?

[0:08:15] Kate Whalen: Yes. That's an excellent summary.

[0:08:16] Guy Podjarny: Cool. So, you're in The Guardian, and that's a fairly influential organisation. It's got a lot of good news that comes out of it.

[0:08:25] Kate Whalen: Thank you.

[0:08:25] Guy Podjarny: How do you look at, indeed, all these different kinds of risks that you face? How do you approach the threat model of The Guardian?

[0:08:35] Kate Whalen: I suppose, through different attack vectors. We might have a digital threat model. If we're worried about how we might be attacked through infrastructure or through applications, that's one risk assessment or threat model to do. Then, the other one might be more of physical safety or operational safety and practices. How do we ensure that people's accounts don't get hacked or that their passwords don't get compromised? Doing that type of threat modelling, or risk assessment.

[0:09:13] Guy Podjarny: When you look at the population in The Guardian, of course you have the tech side, as you build the web content and all the different technology pieces. You also have a fair number of journalists in the company, as one would expect. How does that change the security work? Does that have any impact versus the technology side?

[0:09:32] Kate Whalen: Yes. Absolutely. The assessments of threat models we might want to do would depend on the individual, and also sometimes the locations. If you are travelling abroad, you might want to consider which devices you bring with you, or what checks you put in place, or other security practices you might want to adopt before doing that.

I apply that to myself as well. If I'm travelling to different countries, I'll have a different threat model for each one of those countries according to how concerned I feel I should be about me going there. You might want to do the same thing with your employees, if they are travelling. It might be to cover a protest or it might be to cover a sporting event, but sometimes you do have to think about, “Okay, how do we look after them and their devices in the field?”

[0:10:23] Guy Podjarny: So, this is really the level of safety you might attribute to cellular connectivity if you're in more of a dictatorial-regime-type surrounding, or your phone being physically stolen if you're in the midst of a protest. Is that what you're referring to? That you have to think about those types of threats?

[0:10:41] Kate Whalen: Yes. You'd have to go through all of the worst-case scenarios, and it might not just be an employee; it might be someone that you're meeting with. What are their expectations around their confidentiality and their privacy or anonymity? They want to have certain boundaries respected, so how do you ensure that employees or individuals can do their jobs, but also look after themselves and whoever they might be in communication with?

[0:11:11] Guy Podjarny: Interesting. I'm tempted to drill into that journalist route, but I think we're on The Secure Developer podcast here, so let's take a look at the other population, which is your developers. You mentioned a bunch of this when you described your role and how you work with developers to do code reviews and the like. But if you had to look at the learnings over the course of these years that you've been in security engineering, what did you find was effective in working with the developers? How do you try to engage the team in helping you make the software secure?

[0:11:42] Kate Whalen: Since before I even joined The Guardian, we've had a system of security champions or security agents; we switch between the two names. The idea is to get developers who are interested in security, or just want to learn more about it, to come to semi-regular meetings where we might try and learn something together, or run through one of the small training games that various websites have, or maybe use it as a bit of a knowledge-sharing opportunity.

That's been really good as a way of ensuring that there is somewhere that people can ask for help, or talk about things that might have happened to them. It's also a really good forum to discuss security incidents or potential security incidents. If someone's noticed something strange going on, then they might mention it at that type of meeting, and it drives a bit of discussion. Then you might have someone ask a question about a particular vulnerability, and someone else explains it. So, that's a really good way of doing communication and engagement.

[0:12:44] Guy Podjarny: How does this group logistically operate? You mentioned sharing and meeting; how often does this group meet? What are the rough ratios of security agents to developers?

[0:12:57] Kate Whalen: I'm the only one that organises it, so if I'm being organised, it's once a month. It might just be myself from the InfoSec team there, and then hopefully developers from our different development teams: a mix of people from different seniority levels and different projects, so hopefully covering most of the areas. That's good, because then if we have an announcement to make, or a bit of guidance or advice to give out, we can encourage them to share it on with their teams.

On top of that, you might also want to do additional sessions. I'm running a Halloween session so I can share all of the scary stories that have happened over the last year.

[0:13:42] Guy Podjarny: That's a great one. You talked a lot about these people coming along and sharing problems and experiences within the group. Do they also serve as good advocates, or as extensions of you, inside the different teams? Do you have any learnings about what did and didn't work in trying to achieve that?

[0:14:04] Kate Whalen: I think something that works quite well is having smaller, focused sessions, and not necessarily having too many people along, because if you've got lots of people in the room, it tends to make some people less likely to speak up, or more worried about admitting that they don't know something or asking questions.

Particularly with new developers, I like to try and have a one-on-one or maybe two-on-one intro session to explain how we approach security at The Guardian, the fact that it's a shared responsibility, and that they should never feel bad about asking questions about it, even if it's just saying, "Is this quite right?" It's also a good time to talk to people about how to look after passwords, how to set a strong password, and why multi-factor authentication is amazing. Because, like most places I imagine, our developers have got quite privileged access. They can run pretty much anything on their laptops and they can access an awful lot of systems, so you really, really, really want to make sure that they understand how not to get hacked.

[0:15:15] Guy Podjarny: Big fan of two-factor authentication. Definitely one of the greatest things since sliced bread. We need one of those for all the security problems, a couple of those magic-bullet solutions. So, you're doing these one-on-one, two-on-one developer conversations, not interviews, around security. What are some common misconceptions you come across?

[0:15:37] Kate Whalen: I find a lot of people think that they have a password system which solves the unique-password problem. They'll have a starter password, and then they'll iterate on that and do variations of it, believing they have a unique system that no one else could ever figure out. That's an interesting one to talk to people about and maybe suggest alternatives.

Other misconceptions about security, I haven't run into too many. I suppose the main one is that people are worried that they shouldn't get involved in it, because security is something that they don't know much about and they might do something wrong; surely there's an expert or someone else who should be looking after all of that? So, I try to persuade everyone that best efforts are better than no effort. Even if you feel like you don't know that much about security, just ask questions. We're never going to be annoyed if someone sends an email to us asking, "Does this look quite right?" or "Should I click on this link?" We'd much rather everyone send that than get phished or open that malware.

[0:16:47] Guy Podjarny: That's a great one, and very much an emphasis, I guess, or an aspect of this shared responsibility: they need to accept that shared responsibility. Earlier, you used the term security nihilism, which is kind of the description of it. How do you battle it with this type of "Just know that it's okay, we're going to be receptive"? How has the response to that been? Do people use that? Do they embrace the responsibility?

[0:17:14] Kate Whalen: Yes, I hope so. I definitely had a new joiner flag an interesting blip in some of our monitoring to us, which was good to see and then go chat to them about. It was about a week after they'd joined as well, so it was wonderful that they immediately thought, "Oh, I should tell InfoSec about this." It definitely feels like it's been helpful having these conversations. People also know who you are, where you sit, and what you look like, so you're a bit more approachable, rather than it being something scary to involve security or InfoSec; we don't want to only turn up when something bad happens. But with security nihilism, often when I'm doing training internally at work, or at external meetups, once people start learning a bit about all of the scary things out there, or all of the ways that you can be insecure, it can feel that it's an unsolvable problem. There's too much to fix and we'll never be secure; where do you even start? I try to combat that feeling: just pick up something. The work is endless, but at least you can make progress to being in a better place than you were yesterday.

[0:18:28] Guy Podjarny: Indeed. If you don't do it, nobody else will; the reality is that nobody else can keep up, really. Developers have to embrace a lot of that work. Also, from having spoken to you before, and from your earlier comments on CI/CD, a lot of your approach is through tooling. You bring tools into the mix, you put them into CI/CD. What do you look for in the solutions that you bring in? How do you assess tools that would work in your context versus ones that you don't think would be a fit?

[0:19:00] Kate Whalen: Tooling is one half, and the other half is adoption, because you really need to make sure that if you're getting tooling in, or if you're building tooling, it's solving a problem that developers actually have and that they want to use it. Otherwise, it's just going to collect dust, and no one's going to log in or check any of your dashboards.

If I'm building internal tooling, then that's great, because my users are normally trapped in the same building as me and they can't get away. You can do an awful lot of UX testing and feedback: ask people what they would like to be automated, show them what you're currently displaying, and then get feedback on that. That also makes sure that they are aware of all the features of the product. If you have a short sit-down and chat with them and show them through it, they will actually say, "I didn't realise that I could get that information from this panel."

So, that's really useful. Making sure that people actually have a chance to spend some time with the tooling means that they can understand how they can get value from it. With picking out tooling, it's really great if our developers have some tooling in mind that they're already using. Funnily enough, with Snyk, one of the reasons why we adopted it as an organisation is that we had about four or five different teams all trying out the free version. When we went to the enterprise version, that was interesting to reconcile, because we had five different Guardian Snyk accounts to try and integrate. But that was ideal, because we'd had five different teams of developers all decide that they wanted to use Snyk.

[0:20:41] Guy Podjarny: I always appreciate that. I love the comment on the developer usability bit. Doing it like you do makes perfect sense, but it's still, unfortunately, not terribly common to do this usability testing for your security tools with your developers and iterate on the tools with them. It seems like a really bright idea, and yet not one that is done often enough.

[0:21:06] Kate Whalen: It's really important for developers, because we tend to have a low threshold for bad UX, particularly if we're working on applications ourselves. We've done A/B testing. We know what's a good user flow and what's a bad user flow. So, when we're confronted with a very unintuitive system that doesn't seem like it's had any user testing, it feels like it's intentionally frustrating us.

[0:21:28] Guy Podjarny: That's excellent. It's also using the in-house talent and skills for how to do it right. It probably taps a little bit into your own skills, having come from the dev side of the fence within the organisation, but it also gets people to have skin in the game. If you've commented and given your feedback, and your feedback was implemented, then you're that much more inclined to actually embrace and use the solution, because you feel you had a hand in creating it.

Before I let you get back to that security work, I like to ask every guest that comes on the show: if you had one bit of advice, one tip to give a security team looking to level up their security fu, what would that be?

[0:22:08] Kate Whalen: Probably to adopt password managers. It's a very quick win, and then get everyone else to adopt password managers. Ideally, for their professional life and personal life, because I think the boundaries between personal and professional lives are blurring an awful lot. I know that my GitHub password reset goes to my personal email account, so I want to have a similar level of security on both and ideally, I want to be using password managers everywhere.

With password managers, you can also get an enterprise account and get team shares, so that you don't have developers sharing secrets or API keys via Slack or other less secure channels. So, they're not just good for passwords; they're good for everything else you don't want shared in the clear.

[0:22:58] Guy Podjarny: Not on sticky notes on the board, or shared over Slack. Excellent tip. I very much appreciate it. I'm a big fan, and I don't know any of my passwords. They're all in the password manager, as they should be.

[0:23:11] Kate Whalen: Ideal, yes.

[0:23:12] Guy Podjarny: Kate, this has been a pleasure. Thanks a lot for coming on the show.

[0:23:14] Kate Whalen: It's been lovely. Thank you so much for inviting me.

[0:23:16] Guy Podjarny: Thanks everybody for tuning in, and I hope you join us for the next one.

[OUTRO]

[0:23:22] Guy Podjarny: That's all we have time for today. If you'd like to come on as a guest on this show, or get involved in this community, find us at thesecuredeveloper.com, or on Twitter, @thesecuredev. Visit heavybit.com to find additional episodes, full transcriptions, and other great podcasts. See you next time.