Cybersecurity Sessions #9: Security and Privacy

Could we be more secure online by enforcing stricter privacy measures?
Many businesses argue that they need to collect information about customers to verify who they are and secure their accounts. However, this is at odds with online privacy advocates, who say organizations are compromising our security by collecting far too much information about us.
In this episode, Netacea co-founder Andy Still talks to ZDNet cybersecurity journalist Charlie Osborne about the intrinsic link between security and privacy, how legislation like the GDPR has strengthened both, and where responsibility lies for protecting our privacy online.
Charlie Osborne, Cybersecurity Journalist at ZDNet
Charlie Osborne is a cybersecurity and finance reporter for ZDNet and has written about business tech, innovation, and cybersecurity since 2011 for CBS Interactive, Informa, and Mastercard. She is also a freelance journalist for several top-tier security titles including Cybersecurity Ventures and The Daily Swig. In addition to her passion for writing, Charlie also has a keen interest in photography and occasionally tries her hand at a bug bounty or two.
- How important is privacy to security and vice versa?
- Where should responsibility lie for protecting our privacy online?
- Does businesses collecting more information about their customers help or hinder security?
- Has GDPR changed attitudes to privacy and security, and how does this differ in the EU and the USA?
[00:00:00] Andy Still: Hello everybody, and welcome back to the Cybersecurity Sessions, our regular podcast talking about all things cybersecurity with myself, Andy Still, CTO and founder of Netacea, the world's first fully agentless bot management product. In this episode, we're gonna discuss the relationship between security and privacy. Both online and offline security have always relied on compromising individual privacy. The more we know about individuals, the more we track what they're doing, then the more we can identify any threat they pose. However, the very data captured can then become a target. If we cared more about privacy, would overall security be improved? Well today, we're lucky to be joined by Charlie Osborne to talk about how security and privacy are intrinsically linked. Welcome Charlie. Great pleasure to talk to you today. Before we start, could you quickly introduce yourself for our listeners?
[00:00:47] Charlie Osborne: Absolutely. Hi, my name is Charlie Osborne. I'm a cybersecurity and finance journalist for ZDNet, which is part of Red Ventures. And my work is also shown on the Daily Swig and Cybersecurity Ventures.
[00:00:59] Andy Still: Thanks Charlie. So let's start with the big question. How important is privacy to security and vice versa?
[00:01:06] Charlie Osborne: It's absolutely fundamental. But unfortunately I think it's a commodity which we're going to lose in the future, personally. Privacy is the fundamental thing when it comes to boundaries, whether that's us choosing what data or information we share with people we trust, businesses we trust, whether it's healthcare providers, whether it's retailers. And the moment that is eroded, we start to have problems. Now it's not just about criminal groups that are targeting individuals for that information, it's also about whether or not businesses treat it with the respect it deserves.
[00:01:42] Andy Still: So just drilling down into that a little bit, speaking from the point of view of a business, if you can't collect enough data about the individuals who are interacting with you, how can you be secure? How can you protect yourselves from malicious users, aiming to hide behind anonymity when interacting with you?
[00:02:00] Charlie Osborne: Well, I think this is the issue: that data is cheap. So whether you look at dark web marketplaces, where you have big batches of information that can be used for everything from identity theft to cloning cards, or whether you want to use it for marketing and tailored advertising purposes. I think quite a few businesses either have become complacent when it comes to collecting data from their customers, or they don't quite understand how important it is on a personal level, on an individual level, to protect it. Unfortunately security is often not really considered when it comes to production chains, business transformation, business projects, generating revenue streams, or it's an afterthought. And that's really what has to change in the future. And it's probably a bit too late now, considering how common data breaches are, whether it's an SMB or one of the, you know, Fortune 500. We have a lot of catching up to do, but the damage is done.
[00:02:54] Andy Still: And have things like GDPR helped in this situation?
[00:02:59] Charlie Osborne: I think it's helped, at least in terms of awareness; whether or not it's helped on the ground is a different matter. Certainly there are some businesses, at least in the EU, that have had to invest far more in their infrastructure. They now realize that they are data controllers and they have certain responsibilities when it comes to managing their customers' information. But I can give you a good example. When GDPR came in, I acted as a consultant for a relatively small business in the UK, generating a few million pounds a year in revenue, and they asked me how to go about it, because there was some confusion over whether you could approach GDPR as a sort of soft measure or a hard measure. So where do you draw the line when it comes to getting consent from your customers? But the problem was, when I looked at their actual legacy infrastructure, the way they were storing the information was one incredibly old server in an office block. There were no backups. There was no security, you could walk straight in. So they had serious fundamental issues when it came to protection anyway, or even just backing up their data. Because if one person had realized how easy it was to get into that particular business and used an old vulnerability to delete everything, the entire business would've gone down. It would've been catastrophic. So they had to invest a huge amount of money just in bringing their basic infrastructure up to speed before they could then implement the GDPR controls and protections they were legally required to.
[00:04:25] Andy Still: Okay. So it sounds like it was actually quite a positive impact, in that it's got people seriously considering not only the data that they capture in the first place, but also what they do with that data once they've captured it.
[00:04:39] Charlie Osborne: That's it. And one of the fundamental questions we looked at, once they'd actually started upgrading their systems and had gotten rid of some of the legacy architecture and replaced it with modern alternatives, was what data were they actually collecting and why did they need it. Because one of the issues I think is still very prevalent is businesses collecting more data than they need. And part of that is also a very slow erosion when it comes to privacy, when it comes to, you know, what information should we share and what information isn't necessary for us to share. You can walk into, say, an Apple Store and you might be asked, you know, what your job role is and what this is and what that is. You're just going to buy a computer. But I think there are more fundamental issues that are beginning to creep into society. And it's those slow changes and slow erosions which are also going to become a very big problem in the future.
[00:05:29] Andy Still: Yeah, I think it's the challenge of balancing capturing information, because from a business point of view the more information you can capture about customers the better, whether that's for security reasons or for business reasons, against understanding the responsibility that then comes with having captured that data, because you then become a target for other people who want to source that data from you.
[00:05:53] Charlie Osborne: Exactly. I mean, you know, if it comes up in the T&Cs that some information is shared with third parties, that's totally fine, as long as there are controls in place to make sure that data's treated properly, whether that's being secured in transit and in storage, or whether it's anonymized. You know, these are the sorts of controls that businesses need to look at, because privacy now, I don't think, is just a matter for the consumer. It's also about the business taking its shared responsibility.
[00:06:20] Andy Still: And do you see that there is a general move and a general concern, in the public as well as in the kind of legislative world, to actually treat online privacy as a very serious matter? Do you see that driving policies within companies?
[00:06:36] Charlie Osborne: I think it does to an extent, yes. Because data breaches are now so commonplace. You know, every week we hear about either a major security incident, or we hear about a bucket that hasn't been secured and has been exposed on the internet for the last five years, or, you know, an engineer loses a laptop and leaves it in a taxi somewhere, and that has all of his business's customers' details on it. But I also think there's an element of fatigue when it comes to it, because we hear about it so often it just seems, "well, you know, my information's out there anyway, so what can I do about it?" We're also creatures of habit, I think. So we like to reuse passwords. We like to make things as easy for us on a daily basis as we can. You introduce elements of friction when you are implementing security controls for your business or for a consumer service, and if you go too far with them, so for example a password, 2FA, a CAPTCHA, all of it, you can also put customers off using your service, I think. So a balance has to be maintained in that respect.
[00:07:42] Andy Still: Yeah, it's always a challenge, I think from a cybersecurity point of view, how much you have to take responsibility for the, for want of a better word, laziness of your users. Reusing passwords is the obvious one, in that, for you as a company, it shouldn't be your concern, but you end up taking responsibility for it, because if it ends up out there in a breach, other companies are then at risk. One of our areas of protection, we look at a lot of credential stuffing attacks. One of the things we talk about is the fact that it doesn't matter how much you secure your data, other people's breaches are a threat to you. They're a threat to other companies, but they're a threat to you as an individual as well.
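The credential stuffing attacks Andy mentions have a characteristic signature: because the attacker replays credentials leaked from other sites, failed logins from one source are spread across many distinct usernames rather than concentrated on one account. A minimal hypothetical sketch of that one signal (all names and thresholds here are illustrative; real bot management systems use far richer behavioural signals):

```python
from collections import defaultdict

def find_stuffing_suspects(events, min_failures=10, min_usernames=8):
    """Flag sources whose failed logins span many distinct usernames.

    events: iterable of (source_ip, username, success) tuples.
    A stuffing tool replaying a breached credential list fails across
    many different accounts, unlike a forgetful user retrying one.
    """
    failures = defaultdict(int)       # failed attempts per source
    usernames = defaultdict(set)      # distinct usernames tried per source
    for ip, user, success in events:
        if not success:
            failures[ip] += 1
            usernames[ip].add(user)
    return {
        ip for ip in failures
        if failures[ip] >= min_failures and len(usernames[ip]) >= min_usernames
    }
```

A user who mistypes their own password a few times never trips both thresholds, while a bot cycling through a leaked list does, which is the basic trade-off this kind of heuristic makes.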
[00:08:23] Charlie Osborne: That's right. Now, I think there are some controls coming in which are definitely beneficial for trying to stop that from happening, or at least mitigate the effects. Whether it's a password manager that alerts you when your password's been involved in a breach, so you can change it immediately, or whether it's that appearing on your credit report, which is a relatively new sort of feature I've found in some of them; Experian does it, for example. So it's also about introducing an element of awareness, so people actually know the information is out there, that their passwords need to be changed here, there and everywhere, and the other ways they can be used to access a different service. Just generally educating and improving the situation over time, I think, is what's happening right now.
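One way the breach alerts Charlie describes can work without the service ever seeing your password is the k-anonymity range query popularised by Have I Been Pwned: only the first five characters of the password's SHA-1 hash are sent, and the match is done locally against the returned candidates. A rough sketch using HIBP's public Pwned Passwords endpoint (error handling and rate limiting are omitted for brevity):

```python
import hashlib
import urllib.request

def sha1_prefix_suffix(password):
    """Split the uppercase SHA-1 hex digest into the 5-char prefix
    sent to the service and the suffix that never leaves the machine
    (the k-anonymity model)."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def pwned_count(password):
    """Return how many times the password appears in known breaches
    (0 if not found), via Have I Been Pwned's range endpoint."""
    prefix, suffix = sha1_prefix_suffix(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        # Each response line is "HASH_SUFFIX:COUNT"; match locally.
        for line in resp.read().decode().splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0
```

Because the server only ever sees a five-character hash prefix shared by hundreds of passwords, it cannot tell which one you were checking, which is what makes this kind of alerting privacy-preserving.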
[00:09:04] Andy Still: Okay. Do you see there's much variation in companies' attitude to privacy and security, kind of geographically? Because obviously GDPR is an EU innovation. Do you see that as being stronger in the EU and the UK versus, say, the USA?
[00:09:21] Charlie Osborne: I do, actually. I think things have improved in the EU over the last couple of years, mainly because of GDPR. No matter how unwillingly it was imposed, a lot of organizations sort of had to get it right, whether they liked it or not, or they'd face potential penalties, fines, and investigations. Whereas when I've, for example, covered data breaches that involve US companies, or unsecured buckets, or this, that and the other, they either don't want to listen or they're not as quick off the mark, shall we say. I'd say the exception to that is the larger enterprise companies in the United States. There was one which was a non-secured bucket, and the company's entire database was open. So I was able to have a look in it. And I noticed they kept sending my email informing them of it to the bin, because the system was open online. They would just rather brush it under the carpet than actually tackle the issue. And I think potentially part of that is because, in terms of something with the strength of GDPR, I don't think there is anything really happening in that respect in the US at the moment, but there will be.
[00:10:26] Andy Still: Yeah. I think, like a lot of standards, what GDPR required was largely common sense, but it actually forced companies to go through that journey of considering what they should be doing, and to take it a bit seriously.
[00:10:39] Charlie Osborne: Yeah. And I think to an extent it is understandable. You know, if you've got a retailer or a restaurant. I mean, cybersecurity really isn't necessarily part of their business process. And so you're asking people to learn about, you know, another part of their industry in terms of IT, and to make sure they understand concepts that can be quite difficult to grasp and learn about in the beginning. So I think in that respect, some businesses just are frightened by it. They don't want to go near it. Or they don't have the in-house specialty or expertise and they don't know where to turn or how to start improving things.
[00:11:17] Andy Still: Yeah, absolutely. And I think, you know, it's had an impact on businesses. Do you feel that, for us as individuals, we are more or less vulnerable to things like social engineering or fraud as a result of the push towards more online privacy?
[00:11:35] Charlie Osborne: I think as there's been a push towards stricter controls, threat actors, cyber criminals, will always try and go for the weakest link in the chain, if you like. And a lot of the time that means trying to appeal on a personal level to people. So social engineering: getting the smallest bits of information, whether that's from a couple of different data leaks or from oversharing on social media, you create a story with that, and then you can, say, go to a telecom provider, impersonate someone, and conduct a scam without the victim even being involved. SIM swap attacks are a great example of that. Or when it comes to things like phishing, you find themes which will elicit an emotion, whether that's panic or urgency, and then you force your victim to make a decision without really thinking about it. And before you know it, your bank account's been cleaned out. So whether or not you have things like encrypted systems, whether you have 2FA, they can only go so far if people aren't aware and looking out for these sorts of scams, which are becoming more sophisticated on a daily basis.
[00:12:40] Andy Still: Yeah, I think for individuals, the amount of data that can be harvested legally, just through kind of large-scale automation and then intelligent grooming of that data, pulling small bits of data together, can actually build quite a large picture of you as an individual, one that can very easily build a compelling rationale for you to take some action. I know someone was relaying a story to us the other day of a phishing attack that had been launched on a business. The finance officer got an email from what was supposedly the CEO, asking them to pay an invoice urgently. And that email had mentioned the fact that the CEO was just leaving Tokyo because he'd just left a conference, and the invoice needed to be settled before he was off the plane. So they'd harvested all that information from various sources to make a very compelling story. The only mistake they made was at the end of the email, they said "thanks". That raised the suspicions of the finance officer, because that CEO never said thanks for doing anything. So they got in touch, questioned them, and it turned out it was a phishing attack. They could have built such a compelling story just from publicly available bits of information via social media. So there's real complexity in the attacks that can be built, even from things people aren't particularly worried about, like going on social media to say, you know, I'm at a conference in Tokyo at the moment, the kind of information most businesses do put out there. Do you feel that that is a privacy issue? Should we be more cautious about what we're saying? Or is that a security issue? Is that about getting the processes in place, manual or automated processes, to try and stop those kinds of phishing attacks?
[00:14:25] Charlie Osborne: I'd say that's a privacy issue. But it's still one that's incredibly important for maintaining personal security. I think it's important to note, as well, that in the same way social networks have brought us all together with our friends, our family, everything else, privacy now has to be a collective effort. So, for example, you could be very good at maintaining your own personal privacy online and not oversharing, but if you've got an aunt or a brother or a boyfriend or a girlfriend who shares absolutely everything online, that can still be part of your story, and that can still impact your own security and your own risk of falling prey to social engineering attacks, or similar kinds of fraud and scams.
[00:15:08] Andy Still: To what extent do you think this is a matter of personal responsibility? Or do you think that the companies should be taking some responsibility for the information that's being shared, particularly on social networks? That information is out there. People can be using that to then target you. Should the social networks be taking some responsibility, protecting people from that kind of danger?
[00:15:33] Charlie Osborne: It's a difficult question to answer, really. Because if you start saying social networks should be censoring what people are sharing, then they're also gonna be criticized for not allowing freedom of speech, I suppose, right? Of course they can impose certain types of censorship when it comes to extreme violence online, graphic videos, sexual content, whatever. But I think in an ideal world, it would be collective and people would just be more mindful about what they share, whether that's information concerning themselves, or whether it's a family member or a friend. But in order to have that, you've also got to have a basic level of education, knowledge about current scams that are going on, what information you should share and what you shouldn't. And we're growing up in a world, now we've moved away from dial-up, where oversharing is pretty much standard, right? As for going back and deleting everything, well, once it's online, it's online. It's very difficult to get rid of certain things. I'd say something over time would have to improve, but I'm not entirely sure how, unless, I suppose, you could compare it to, say, financial education. There's not really much of it in, I'd say, the UK, or arguably the US, from a young age, right? With IT I'd say it's the same thing. You might have been taught how to use Microsoft Word and Excel, but we're not taught anything about basic security or privacy. A lot of the time, for example, it was expected to be down to the parents to make sure their kids knew not to talk to strangers online. But we need to go a lot further than that. If I had my way, we'd start teaching them about cybersecurity, the risks, and how you can protect your digital persona, if you like, and I'd have them doing it now. I can't really see that happening, unfortunately, but that's what I would have.
[00:17:10] Andy Still: Yeah, I think that's a very valid point. Because I think the challenge you've got with social media is that it's a combination of a communication tool and a publishing tool. So you use it to talk to each other, you talk to your friends via it, but it's very easy for those things that you say to your friends to accidentally leak out, to be published to the entire world. Imagine everything you said late at night, whilst drinking in the pub, was available for the entire world forever. It's not a situation you generally want to be in, but it's a situation you can easily end up in on social media. You see examples of people who said stupid things that were meant just for a small group of people when they were young now coming back to haunt them, and they're having to take the consequences in a way they wouldn't have had to if that hadn't been available online. So I think that element of considering what you say, and the fact that there is no privacy, regardless of your privacy settings, via social media... you've said it to someone else. They can then relay it on. You're dependent on the people you've said it to. You know, it's easy to see examples where you share a picture online with just some friends and it then becomes available to their friends and their friends, and suddenly you're getting comments on pictures from people that you don't know, and you had no intention of having that picture seen by anyone else. So I think the idea, as you're saying, of teaching that kind of personal responsibility in school is essential, because it isn't necessarily something that you can secure.
[00:18:38] Charlie Osborne: That's correct. And it's something, once it's there, it's there, and it is very difficult to erase your digital footprint, as it were. So if you choose to participate in social networks and this, that, and the other, then you have to be really, really careful about what you say, especially at a younger age. There was a case a fair few years back where a young woman was joining the police force, I believe it was. And there was a big fanfare about her being appointed, until someone found a tweet that was quite racist that she'd posted years back when she was basically a teenager. Because of that, she then lost that prospect. You type her name into Google now, and you can guess what comes up. Because she overshared and said something online, it's not only impacted how she's viewed, but also her reputation.
[00:19:27] Andy Still: Yeah, the thing is, like you say, this is not just celebrities. This is not just people in the public eye. This is individuals. It's something, you know, you do when you're young, and you can carry it for the rest of your life. And like you say, it's there on Google every time someone searches you. And I think increasingly people are doing that kind of diligence as part of the recruitment process as well, doing a basic online presence search to make sure that you're not bringing any previous baggage with you into a role.
[00:19:53] Charlie Osborne: That's correct. And I think it should be noted that when it comes to maintaining privacy, you ought to consider employment as well. So if you have a social media account, perhaps have a separate one for work, or keep your personal one far away from the public eye, maybe using a different name or a different surname or something so you're not so easily found. In the same way that working from home has sort of merged work and home life, you can probably say the same about our digital lives, so we ought to try and keep those separate as well. So I think that's also part of making sure that you're maintaining your own privacy.
[00:20:28] Andy Still: Yeah, I think that's a really great piece of advice. And it's certainly something we've seen as social media has grown, particularly when you look at the likes of celebrities, but also with the growth of influencers. One of the parts people find really attractive is that you get let into the private life of those people. But it does mean that you lose that private side of your life. You put your whole self out there. That means you don't have any way to turn away from it. You don't have any way to turn off. And I think the idea of keeping your personal and professional life completely separate on social media is a really good idea. I think it makes a lot of sense from a privacy point of view, but also probably from a mental health point of view as well. So Charlie, just to finish off, if there was any message you'd like our listeners to take away from today, what would it be?
[00:21:14] Charlie Osborne: I'd say that privacy is something we've treated as just part and parcel of our lives, something we're always gonna have. I don't think that's going to remain the case. So you have organizations, whether that's the EFF or Amnesty International, fighting for the rights to privacy and security, whether that's through technological means, education, or sort of grassroots initiatives, but I think it's something we as individuals also need to take a bit more responsibility for ourselves. Because if we don't, then we're going to lose it. And we've seen what happens in certain countries when it's not considered a fundamental right. We have governments that, you know, perhaps are going towards a big brother sort of state when it comes to surveillance, spying, and laws that allow them to tap into our phones and our online accounts and everything else. If we as individuals want to keep a level of privacy, we need to educate ourselves on the best way to do that.
[00:22:11] Andy Still: Thank you, Charlie. That's excellent advice there. Thanks again for joining us today and thanks everyone for listening in. As usual, if you've got any feedback, please send it via either our Twitter account @cybersecpod, or you can email us at email@example.com. And we look forward to seeing you again on the next podcast episode. Thank you very much again, Charlie.
[00:22:35] Charlie Osborne: Absolutely. Thank you.
[00:22:37] Andy Still: Thank you.