Listen early, and ad-free!

207: Cyber biowarfare, giant ladybugs, and strippers


Fears are raised about cyber bioterrorists, there's a widespread blackout for IoT devices caused by a cloud cock-up, and what role do strippers play in a revamp of the United States' computer crime laws?

All this and much much more is discussed in the latest edition of the "Smashing Security" podcast by computer security veterans Graham Cluley and Carole Theriault, joined this week by Mark Stockley.

And don't miss our featured interview with Steve Salinas of Deep Instinct, discussing ransomware.

Follow the show on Twitter at @SmashinSecurity, or on the Smashing Security subreddit, or visit our website for more episodes.

Remember: Subscribe on Apple Podcasts, or your favourite podcast app, to catch all of the episodes as they go live. Thanks for listening!

Warning: This podcast may contain nuts, adult themes, and rude language.

Theme tune: "Vinyl Memories" by Mikael Manvelyan.

Assorted sound effects: AudioBlocks.

Special Guests: Mark Stockley and Steve Salinas.


Transcript

This transcript was generated automatically, and has not been manually verified. It may contain errors and omissions. In particular, speaker labels, proper nouns, and attributions may be incorrect. Treat it as a helpful guide rather than a verbatim record — for the real thing, give the episode a listen.



CAROLE THERIAULT. All right, hi everybody, Carole Theriault here from Smashing Security. Something a little different this week. We have had quite the year, so Graham Cluley and I have decided that any monies we receive via Patreon during the month of December 2020 will go directly to our local food bank. We're doing this because there are a lot of people that are hungry, it's getting cold out there, and it's Christmas. If you're not a Patreon supporter, which is totally fine, I do urge you to look at your communities to see how you might be able to help bring a little bit more joy this season to those that are having a hard time. And lastly, just a huge thank you for all your support this year. It has meant the world to us. Now let's get this show on the road.


MARK STOCKLEY. So hang on, hang on. There's a website. Yes. Put together by some guy—


GRAHAM CLULEY. Scientists.


MARK STOCKLEY. I've seen the people that make websites. I'm already scared. And then they send you DNA in the post.


GRAHAM CLULEY. Yeah, they send you the genes.


MARK STOCKLEY. Okay, I give up. I give up.


ROBOT. Smashing Security, Episode 207: Cyber Bio-Warfare, Giant Ladybugs, and Strippers, with Carole Theriault and Graham Cluley. Hello, hello, and welcome to Smashing Security, Episode 207. My name's Graham Cluley.


CAROLE THERIAULT. I'm Carole Theriault.


GRAHAM CLULEY. And we're joined this week by Mark Stockley again. Hello, Mark. Again?


MARK STOCKLEY. Yeah, yeah, Jesus.


GRAHAM CLULEY. Well—


CAROLE THERIAULT. And here's Mark again.


GRAHAM CLULEY. Now, Mark, after your last appearance on the show, you had a bit of feedback on the old Twitters, didn't you?


MARK STOCKLEY. I did, I did. Yes. Somebody wanted to tell me that I was the poshest-sounding guest you've ever had.


CAROLE THERIAULT. What? What, one party told you that?


MARK STOCKLEY. Yeah.


CAROLE THERIAULT. It must be 100% true.


MARK STOCKLEY. Well, it's good enough for a marketing survey. It's good enough for me.


GRAHAM CLULEY. Thanks to Mrs. Stockley for leaving you that message. Very good of her.


MARK STOCKLEY. It was a real surprise. But you have some very, very illustrious guests, like BBC journalists and—


GRAHAM CLULEY. Yes.


MARK STOCKLEY. You know, serial chess world champions and things like that. And so I felt like that was something I could cling on to. At least, I felt that way until you piped up and said, actually, he's not the poshest-sounding guest we've ever had, because we've had Dr. Jessica Barker on.


GRAHAM CLULEY. She's very posh-sounding. Although apparently from Newcastle.


UNKNOWN. There you go.


GRAHAM CLULEY. Surprising. And that's something which we found out on our recent YouTube livestream, which we did. I know a lot of our listeners joined and watched that video monstrosity in action. And Carole, we've got some news, haven't we, on the livestream front?


CAROLE THERIAULT. Why are you calling it a monstrosity?


MARK STOCKLEY. Because you watched it afterwards.


UNKNOWN. Is that why?


GRAHAM CLULEY. Carole, what's happening on the livestream front?


CAROLE THERIAULT. Well, we are going to have a second inaugural livestream party. We're going to do it Thursday, 17th of December, and it's going to be our little pre-Christmas party on YouTube. So Graham Cluley, talent and friends, maybe Mark, you would like to join us as a friend, but you might have to perform. We're talking to people who are thinking about doing songs or street dances. So just saying, high caliber.


MARK STOCKLEY. I can sound reasonably posh.


CAROLE THERIAULT. Yes. Does that count? Or maybe you could just bring a chicken along. That'll work.


MARK STOCKLEY. Okay. Yeah.


GRAHAM CLULEY. So if anyone wants to pre-register for this or find out more, all you have to do is go to smashingsecurity.com/live. And as Carole said, it will be on December the 17th, Thursday, December 17th at 8:00 PM UK time. And what other times around the world, Carole?


CAROLE THERIAULT. Can you not do the math? 3:00 PM Eastern Standard, noon Pacific. People can work it out. Same time as last time.


GRAHAM CLULEY. Okay.


MARK STOCKLEY. Well, I think we know what talent you are gonna be showing off.


GRAHAM CLULEY. Okay, that's fabulous. So what else is coming up on the show this week, Carole?


CAROLE THERIAULT. Well, first let's thank this week's sponsors, Deep Instinct, Culture AI, and LastPass. Their support helps us give you this show for free. Now coming up on today's show, Graham is going to scare us with research on cyberbiological attacks. Mark Stockley laments a broken smart vacuum, and I find out why the US Supreme Court is talking about the Computer Fraud and Abuse Act. And we also have a featured interview with Deep Instinct's Steve Salinas. We do a deep dive into ransomware and how it's impacting us all in 2020. All this and much more coming up on this episode of Smashing Security.


GRAHAM CLULEY. Now, chums, chums, have you ever been accused of releasing some kind of biological weapon into the atmosphere?


CAROLE THERIAULT. I'm so bored of these ridiculous questions.


MARK STOCKLEY. Yes.


GRAHAM CLULEY. Interesting. Okay, well, I've been thinking about this, and certainly there is a lot of concern out there that bioterrorists may begin to get up to all kinds of shenanigans.


MARK STOCKLEY. I thought this was a fart joke.


GRAHAM CLULEY. No, not that kind of admission into the atmosphere.


MARK STOCKLEY. What have I just admitted to? Well, it's been lovely being on the podcast. I think the police are here.


GRAHAM CLULEY. Now, one of the most interesting groups of researchers who I've come across in my years of investigating cybersecurity are the boffins at Ben-Gurion University of the Negev over in Israel. They produce some of the most fascinating and wacky, crazy ape bonkers security research that's going. In the past, they've described how data could be leaked by your computer monitor's brightness, how your headphones could be reprogrammed to record your conversations, how data could be stolen from air-gapped PCs through the fan or ultrasonic emissions— not that kind of emission, Mark— through your built-in speaker, and much more besides. Really crazy, bonkers stuff. And now those researchers claim to have discovered a new end-to-end cyberbiological attack.


CAROLE THERIAULT. What?


GRAHAM CLULEY. Yes.


CAROLE THERIAULT. Okay, I need you to explain this like I'm 5. Tell me how this would work.


GRAHAM CLULEY. Well, what they're saying is that they have found a way of compromising, Carole, the synthetic DNA supply chain.


CAROLE THERIAULT. Okay, again, I'm still not following.


MARK STOCKLEY. Hang on, hang on. Is this— are we talking about actual DNA here, or are we talking about Bitcoin marketing bullshit?


GRAHAM CLULEY. No, no, no, no, no, no, no, no, no. This is actual, actual DNA. This is real DNA.


CAROLE THERIAULT. Yeah, he got this in the Daily Mail, so.


GRAHAM CLULEY. It's definitely not just the Daily Mail, Carole. I'm sure you also read about it this week in Nature Biotechnology magazine.


CAROLE THERIAULT. No, not in my reading list.


GRAHAM CLULEY. You didn't pick that up?


MARK STOCKLEY. I'm still on last week's. I haven't got to this week's yet.


GRAHAM CLULEY. Now, you might have imagined in the past that a bad guy would have to get their hands on a dangerous substance such as a toxin or some kind of poison or biological virus to produce and deliver it to an unsuspecting world.


CAROLE THERIAULT. No, I'm not imagining that. I still don't even understand what synthetic DNA is.


MARK STOCKLEY. We're going to come to that.


GRAHAM CLULEY. We're going to come to that. I'm just saying, I'm just correcting you.


CAROLE THERIAULT. I can't imagine it.


MARK STOCKLEY. It's bitcoin. I'm telling you, it's bitcoin. This is going to be a bitcoin story.


GRAHAM CLULEY. Well, Mark, when you've previously emitted dangerous toxins into the atmosphere—


CAROLE THERIAULT. He passed out.


GRAHAM CLULEY. You probably had to work hard to produce them yourself, didn't you?


MARK STOCKLEY. No, it's quite easy, to be honest.


GRAHAM CLULEY. Just eating some sprouts. Well, here's the thing, right? In the old days, Carole, making genes— so genes are made up of DNA.


MARK STOCKLEY. Right?


CAROLE THERIAULT. We're not talking Levi's here.


GRAHAM CLULEY. Okay. No, no, no. Making genes from scratch used to be laborious, time-consuming. You know, you couldn't do that many before lunchtime. But it isn't anymore. Today, you can literally log into a website, upload the DNA sequence you want, which is like, you know, G-A-T-A-A-A-T-C-T-G-G-T-T. You know, DNA sequences, all those characters.


CAROLE THERIAULT. I do know, yeah.


GRAHAM CLULEY. It's very odd, isn't it? G-A-T-C. I'm not sure how that works. Mark, you probably understand that. Anyway, you upload the DNA sequence that you want.


CAROLE THERIAULT. Do you think that we're all like secret medical scientists?


MARK STOCKLEY. Mark reads a lot.


CAROLE THERIAULT. Yeah, but—


MARK STOCKLEY. I assumed it was to do with the shape, because there are interlocking pieces. And it's A, T, C, G, isn't it? I assumed it has to do with the shape of the interlocking pieces.


GRAHAM CLULEY. I don't know. You upload the DNA sequence you want, and you order however many genes you want online, right? So it's just like making any other online order. And a couple of—


MARK STOCKLEY. So hang on, hang on. There's a website. Yes. Put together by some guy.


GRAHAM CLULEY. Scientists.


MARK STOCKLEY. I've seen the people that make websites. I'm already scared. You type in ATGCTG.


GRAHAM CLULEY. Or cut and paste.


MARK STOCKLEY. And then they send you DNA in the post.


GRAHAM CLULEY. Yeah, they send you the genes according to the sequence that you've requested.


MARK STOCKLEY. Okay, I give up, I give up.


GRAHAM CLULEY. It arrives. It arrives a couple of weeks later in the post.


MARK STOCKLEY. We've gone too far. We've gone too far. Stop.


CAROLE THERIAULT. But how does it arrive? Okay, what arrives? In the post. No, no, okay, I understand that. What, like you open up the box and what is inside?


GRAHAM CLULEY. Well, I imagine it's some receptacle.


MARK STOCKLEY. Genes.


GRAHAM CLULEY. Containing genetic material or something.


CAROLE THERIAULT. What, in a syringe?


GRAHAM CLULEY. Well, I don't know.


MARK STOCKLEY. Is it just like an organ? Just like an organ that no one's ever seen before?


CAROLE THERIAULT. Like 3D printed?


MARK STOCKLEY. Like a placenta or something? Is it refrigerated?


GRAHAM CLULEY. I don't know why you're making this sound so complicated, because the powers that be, the powers that be are worried about how simple this is, and they are worried about how to stop this technology being abused. So they're worried about terrorists, for instance, going to these websites, uploading the DNA sequences of bioweapons or dangerous viruses, or changing a harmless bacteria into one that could be a deadly toxin. So what they've done, the US Department of Health and Human Services has put together guidelines for scientists and for these websites for how to screen requests for synthetic DNA to stop naughtiness happening.


MARK STOCKLEY. Hang on, hang on, hang on.


GRAHAM CLULEY. Hmm.


MARK STOCKLEY. So these guidelines, like step one on the guidelines is obviously don't make a website where somebody can type in some words —some letters.


CAROLE THERIAULT. And they get biotoxins.


MARK STOCKLEY. Well, well. Right? Is that— I mean, surely.


CAROLE THERIAULT. Yeah, you'd like to think that as you were typing in your sequence, they might go, "Ah, stop right there, mister." Well, that's not a good one.


GRAHAM CLULEY. So let me tell you what they do, right? And in some ways, this is— what would you rather they did, Mark? Would you rather they wrote it in crayon and put it in the post to people? That would— imagine the transcribing of that. That could go terribly wrong as well, wouldn't it? What's so wrong with using the web?


MARK STOCKLEY. Hang on, I don't even understand why we're debating a service whereby some random person just randomly writes something out, and then they get the thing, like the organ comes in the post.


GRAHAM CLULEY. Can I explain what the first screening test is? The first screening test is, who the hell do you think you are ordering synthetic DNA, right?


MARK STOCKLEY. Oh, well, that's a relief, 'cause those work really well.


GRAHAM CLULEY. So they ask you for your name, contact information, billing and shipping address. And they also ask some other questions. If you answer these kinds of questions incorrectly, it rings a few alarm bells. Like, would you rather pay in cash or cryptocurrency? Do you want the DNA shipped to you in an unusual way? Or, you know— What does that mean? Do you have any special requests? Like, don't tell the feds what I'm doing, or, you know—


MARK STOCKLEY. Would you like it shipped to you in some sort of projectile? Could you send it to me in an artillery shell, please?


GRAHAM CLULEY. Would you like it shipped to the White House directly? Something like that would ring an alarm bell, thinking, hmm, I wonder what's going on here. So they do, they first of all say, who the hell are you to be asking for this stuff?


MARK STOCKLEY. Are you a terrorist? No.


GRAHAM CLULEY. It's like when you fly to the United States, you have to fill in the form and tick the box if you're a member of the Nazi Party or something like that. So similarly, they have that kind of test. So you have to get through that hurdle first. And of course, a web form always, 100%, will weed out any ne'er-do-wells at that point.


MARK STOCKLEY. I mean, you know, as long as it's got a captcha, right? Yeah. Yes, Carole?


CAROLE THERIAULT. I'm guessing this site was online, alive and kicking before these regulations came into place?


GRAHAM CLULEY. Oh no, these regulations have been in place for about 10 years.


CAROLE THERIAULT. All right, so this is how the environment tries to control who gets their hands on synthetic DNA.


GRAHAM CLULEY. This is how it's done right now, although sadly, not all websites which are producing synthetic DNA are following these guidelines at the moment. But these are the official ones which they are encouraging these websites— because of course, anyone can set up a website if they want to. And you may be some bizarre foreign state which is doing this as well. But let's not even get there. I hate this.


CAROLE THERIAULT. It's 2020, we're all having a hard year, and now you've just added another layer of complexity.


GRAHAM CLULEY. Let me reassure you, because there's now a second screening you have to pass. So once you've convinced them that you have a legitimate request— You are who you are. The next thing is, what on earth do you want? So what they do is they look at this DNA sequence, which you are ordering, and they check whether it contains any sequences which have the potential to pose a severe threat to human life, animals, plant health, and other things like that. Life, basically. Yes. So they've got, they've created some kind of database of common, like, bad stuff and say, if we spot any of this, I imagine they do something like a grep.


MARK STOCKLEY. I imagine they're just looking for a sequence of characters. Oh my God.
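If the screening really is, as Mark suspects, just a search for sequences of characters, a minimal sketch might look like this. To be clear, the denylist entries and the function here are invented for illustration; they are not taken from any real screening tool or real pathogen database.

```python
# Toy illustration of "something like a grep": checking an order
# against a denylist of known-bad DNA fragments. All sequences
# below are made up for illustration.

DENYLIST = [
    "GATAAATCTGGTCTTATTTCC",  # hypothetical flagged fragment
    "ATGGCCAAAGTT",           # another invented entry
]

def screen_order(sequence: str) -> bool:
    """Return True if the order looks safe (no flagged fragment found)."""
    sequence = sequence.upper()
    return not any(bad in sequence for bad in DENYLIST)

print(screen_order("AAATGGCCAAAGTTCCC"))  # contains a flagged fragment -> False
print(screen_order("GATTACA"))            # nothing flagged -> True
```

Exact substring matching like this is exactly the kind of check the researchers argue is too naive on its own.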


CAROLE THERIAULT. So I wonder if someone was trying to— If someone was trying to get like weed killer, right? Right. Would it kind of, you know, and they're putting the sequence for some kind of— Why would you have a DNA weed killer? Who knows? But if you did—


GRAHAM CLULEY. Then it might try.


MARK STOCKLEY. I feel like I've detected a flaw in this plan. Well— A lot of these things are dose-dependent, aren't they? Yeah, it is. If you take too much salt, it'll kill you. Can I order a ladybug, but one that's 9 feet wide?


CAROLE THERIAULT. Well, I wouldn't worry about that. She's not gonna get into your house. What if she was 9 foot long though?


MARK STOCKLEY. Are you telling me Amazon don't have cardboard boxes for that? 'Cause I'm telling you, you're wrong. On a more serious note, I have other concerns. So I'm not a geneticist, but my understanding of genetic code is that there are very large parts of like the human genetic code that we don't know what it does and are quite possibly useless, just kind of remnants from, you know, previous iterations of humans. And who's to say if they're harmful or harmless if they were activated? I'm very concerned. Mostly concerned about the giant ladybug.


GRAHAM CLULEY. Yeah, I'm sure the US government department have got this all under control and they've created a totally thorough database.


CAROLE THERIAULT. So we can say they've logged Mark's concern.


GRAHAM CLULEY. If your request for the DNA, the genes, the synthetic DNA has triggered either the 'Who the hell are you?' or 'What on earth are you doing ordering a giant ladybird?' challenge, then they look at it a bit more closely, and they can eventually pass it over to the nearest FBI field office. Specifically, the Weapons of Mass Destruction Coordinator.


MARK STOCKLEY. So there is— Whoa, that jumps up the chain pretty quickly.


GRAHAM CLULEY. There's someone whose job it is at the FBI field office to look after WMDs and coordinate them. And so he'll say yes or no. So, back to the eggheads over in Israel. They say that there's a few problems. One is there's no comprehensive database of pathogenic sequences. So all the bad stuff. So they said the guidelines are fundamentally outdated already. So they say that's a bit of a problem. But more than that, they created a proof of concept cyber attack which could obfuscate a nasty toxic DNA sequence in such a way that it wasn't picked up by the screening.


CAROLE THERIAULT. So what, so they basically encrypt it and then slip it through?


GRAHAM CLULEY. Well, encrypted is perhaps the wrong term. Okay, that's true. But it was something which would effectively emulate the same sequence of characters, although it didn't look the same. So it would muddle it up, but you'd get the same result.


MARK STOCKLEY. So like steganography?


GRAHAM CLULEY. Yeah, maybe in a way. So how can they inject their toxic, nasty, malicious DNA sequence into one of these systems? What they're talking about is they could actually infect legitimate laboratories who are asking for synthetic DNA. And with a browser plugin, so if they managed to install a browser plugin, when the scientists cut and paste their DNA sequence, they could actually intercept that and inject some of their own nasty DNA in there as well. And in their tests, of 50 obfuscated DNA samples, 16 of them were not detected. So around about a third of their attempts to sneak effectively malicious, toxic DNA past the screening were successful, and that DNA could then end up in the hands of people.


CAROLE THERIAULT. Yeah, but they weren't actually using toxic DNA sequences, presumably, right? So they basically obfuscated DNA samples and said, does this get past or not?


GRAHAM CLULEY. No, I think they looked at the databases which they felt were out of date, and they found the toxic pathogens, and they obfuscated those, and they put them into the requests. And they went through and they passed the tests.
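As a toy illustration of why exact matching is brittle: the researchers' real obfuscation is far more sophisticated and preserves the sequence's biological function, whereas this invented example merely breaks up the pattern so a naive substring screen no longer sees it. Every sequence and the spacer below are made up.

```python
# Toy demonstration of the screening gap. A naive exact-substring
# screen flags the "toxic" sequence, but a trivially rearranged
# version of the same payload sails through. Invented data only;
# the real attack preserves biological function, this does not.

DENYLIST = ["GATAAATCTGGTCTTATTTCC"]

def screen_order(sequence: str) -> bool:
    """True means the naive screen passes the order as safe."""
    return not any(bad in sequence.upper() for bad in DENYLIST)

def obfuscate(sequence: str, spacer: str = "CGC", every: int = 6) -> str:
    """Break a sequence into chunks with a spacer between them."""
    chunks = [sequence[i:i + every] for i in range(0, len(sequence), every)]
    return spacer.join(chunks)

toxic = "GATAAATCTGGTCTTATTTCC"
print(screen_order(toxic))             # flagged -> False
print(screen_order(obfuscate(toxic)))  # same payload, passes -> True
```

The point of the sketch is just that anything resembling grep can only catch what is stored, verbatim, in its database.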


CAROLE THERIAULT. Oh, great. So we now have a research team with 16 toxic DNA sequences. Yeah.


GRAHAM CLULEY. Great. They're Israeli students.


MARK STOCKLEY. We can trust them. If anybody's looking for some biological weapons, you'll find them at Ben-Gurion University. Yeah. So it's kind of interesting, I thought.


GRAHAM CLULEY. Oh, yeah. This kind of thing. As with much of the stuff done by this particular university when it comes to cybersecurity threats, it's not necessarily something you should lose too much sleep over. Despite Mark's nightmare vision of a giant ladybug at night. But clearly, better screening is required. Just relying on a computer to look for particular sequences of characters is something which has to be maintained, making sure that someone's not trying to slip something past you. So there you go. We've all learned about cyber biosecurity today.


CAROLE THERIAULT. I don't feel I've learned very much at all, really. But thank you. It was entertaining.


GRAHAM CLULEY. I'll put links in the show notes. Yeah, thanks. Mark, what's your story for us this week?


MARK STOCKLEY. I'm sorry, I'm still terrified. I'm gonna start with a question. So, what would you do if your hoover stopped working?


CAROLE THERIAULT. Panic. What? Who cares? I don't know, buy a new one.


MARK STOCKLEY. You'd buy a— Yes. That was a first-world answer. No, no, buy a new one. I was gonna say on mine, it normally stops working 'cause there's a ball of fluff in the tube. Yeah, yeah, that's true, that's true. And I just have to fish the ball of fluff out. But you know, another way of getting rid of the ball of fluff is obviously to throw the whole thing out. Buy a new one.


CAROLE THERIAULT. I'm like you, I would get in there. I enjoy doing that actually, cleaning out a vacuum cleaner. Unplugging it from some mat of hair somewhere.


GRAHAM CLULEY. I think when I was a student and I had a vacuum cleaner, it never really occurred to me to empty the vacuum cleaner. So it just naturally stopped working on the 3 times during the year that we used it.


MARK STOCKLEY. So we don't need to worry about ordering foul genetic creations on the internet. There's a Hoover somewhere that's never been emptied in some student flat somewhere. So what about your doorbell? What would you do if your doorbell stopped working, do you think?


GRAHAM CLULEY. I'd replace the battery in the doorbell. Yeah. Or the piece of string.


CAROLE THERIAULT. I'd assume people would knock.


MARK STOCKLEY. That's true, that's true. Yeah, continue to use the door as has been used for centuries. Yeah. Easy. Well, what about your lights? What would you do if your lights didn't turn on?


CAROLE THERIAULT. Headlamps.


GRAHAM CLULEY. Start a fire? Turn on the oven?


CAROLE THERIAULT. If all these three things happened at the same time in my house, I think I would be thinking poltergeist, actually.


GRAHAM CLULEY. No, I think the fuses have gone.


MARK STOCKLEY. Okay. Poltergeist or fuses, one of the two. Exactly. Last one, last one. What about your computer? What would you do if your computer wasn't working properly?


GRAHAM CLULEY. Well, has it ever worked properly? I'd try and fix it and then go and buy another one if I was unsuccessful. What about turning it off and on again? Oh yeah, you've got to turn it off and on again. That's the first thing you learn, isn't it? What's the first thing IT tell you to do? People in IT support.


MARK STOCKLEY. Have you tried turning it off and on again?


GRAHAM CLULEY. Yeah. Yeah. Sometimes I've made the mistake of saying, "Have you tried turning it on and off again?" Which is the wrong way round. That's a mistake. That's probably why—


CAROLE THERIAULT. If they said no, then you have to say, "Aha! I have diagnosed the problem. We need to turn this thing on, baby." Funny.


MARK STOCKLEY. If they see a brief flicker and then it goes dead, then it was off in the first place. Now, the reason I'm asking the question is because on November 25th, so just last week, a bunch of people in the eastern United States suddenly found themselves with exactly these problems. So, vacuum cleaners that didn't do what they were supposed to do, doorbells and thermostats that stopped working, podcasts that wouldn't upload. Don't know if that affected you guys. No.


CAROLE THERIAULT. Oh, this is all smart tech stuff, isn't it?


MARK STOCKLEY. Lights that wouldn't turn on. Very clever.


CAROLE THERIAULT. Yeah, yeah, yeah, okay.


MARK STOCKLEY. And basically, a bunch of mysterious technical issues apparently unrelated. And of course, as Carole has guessed, what all of these things had in common was that they are all modern, internet-connected smart devices, or what we like to call part of the Internet of Things. So in other words, they are part computer. Ugh. Now unfortunately, what was affecting these devices is that the computer part was on the fritz. And like all computers, when they break, as we just discussed, you just need to turn them off and on again. Right, yeah. But there was a problem. Okay. Because these things were all part of the Internet of Things. Okay? So they're internet-connected devices, and what that means is the computer part, or at least a very important part of the computer part, isn't in the device. In fact, it's not even in the same house. It's actually out there somewhere in the cloud. Ah. And so the question is, how do you turn the cloud off and on again?


GRAHAM CLULEY. Well, yes, you don't have the plug for that, do you? Someone else does.


CAROLE THERIAULT. So yeah, but if your router goes down, it would impact all these things. Or your router. Sorry, I was trying to be posh there.


MARK STOCKLEY. But if all the routers went down at the same time, that would be something else. This isn't some guy whose Hoover stops and his light switch. This is a whole bunch of people in the eastern USA, suddenly all these vacuum cleaners stop and all these light switches stop and all these thermostats stop working.


GRAHAM CLULEY. So, so something's gone wrong with the cloud.


MARK STOCKLEY. Something has gone wrong with the cloud. And when something goes wrong with the cloud, you have to talk to Amazon, right? Because in this case, the part of the cloud that failed was actually an Amazon service called Amazon Kinesis, which is one of thousands of Amazon cloud services that you've probably never heard of, but that, it turns out, your entire life depends on. It's amazing, isn't it?


GRAHAM CLULEY. I think, because I think the average person in the street, I'm sure our listeners aren't like this, but the average person in the street thinks of Amazon as a company which delivers you cardboard boxes with small things inside them.


MARK STOCKLEY. But in fact— Massive boxes, small things.


GRAHAM CLULEY. Yes, exactly. But they also have so many services which rely upon Amazon cloud computing services.


MARK STOCKLEY. It is mind-blowing. I mean, I can't keep track of all the different names of things that they have. And it's all happened very, very quickly. So, I mean, it's a very interesting story actually, because you think about Amazon as, you know, back in 2000, Amazon was just— it was basically a shop, wasn't it? It was a bookshop. But it was a very, very big bookshop, and it was on the internet. And what they wanted to do was they wanted to start letting other people use their shop, like sort of create websites for other people who could use their shopping technology. And they realized that all their stuff was a total mess. And so as they kind of cleaned it up and worked out how to get all the bits of their own company working smoothly together, they realized in about 2003 that they had this fantastic cloud infrastructure. Yeah. And they could start selling that infrastructure to other people. And I first heard about this in about 2007, and it was mind-blowing even back then, but it was just the whole idea of these sort of servers that weren't servers, they were just like computing blobs.


CAROLE THERIAULT. And all these people who tried to, you know, think, actually, maybe I won't shop at Amazon to try and, you know, rebalance the landscape out there. They actually looked at their houses and how much of their services rely on Amazon.


GRAHAM CLULEY. Or the websites they're going to, which are probably still being hosted by Amazon. Exactly. Yeah.


MARK STOCKLEY. So one of the services, one of these many, many services is this thing called Amazon Kinesis. Yeah. Which in its own words, it's there to collect and process and analyze real-time streaming data. So it's things like video and audio, but it's also telemetry from Internet of Things devices. So what that means is you have apps and devices and websites that use Kinesis, and Amazon uses it as well. So its own services make use of Kinesis. So there's a thing called Cognito, which is used for authentication, that relies on Kinesis. There's a thing called CloudWatch, which is used for monitoring, that relies on Kinesis. And what happened was Amazon decided, because Kinesis is so popular, it needed to increase the capacity of Kinesis in its US-East-1 data center.


CAROLE THERIAULT. Okay.


MARK STOCKLEY. So the Amazon universe is divided into a kind of small number of these fantastically boringly named data centers. Yes. So they were increasing the capacity in US-East-1. I can sense something bad is about to happen. They've produced this fantastically impenetrable very, very serious document to explain what happened. But I am going to attempt to translate this into short words that, you know, that simple people like me and Carole will understand. My attempt for an explanation for people like you. And so what you want to imagine is that this isn't one thing. This is actually thousands and thousands of servers, right? Yeah. That are all receiving these requests. And all of these thousands and thousands of servers are all aware of each other. So they all keep count of each other. So you imagine that each of these servers has got two hands, okay? And they're counting on their hands all the other servers. And what happened was Amazon added too many new servers. And so the existing servers, which have to keep count of all the other servers, they no longer had enough fingers on which to count all of the servers.


CAROLE THERIAULT. So they added servers but didn't tell the other servers, hey, expect new servers to connect.


MARK STOCKLEY. Modern computing, you don't have to tell them, they figure it out for themselves. So what happens is you have this pool of computers, and they're all kind of watching each other going, "Oh, I see more computers are being added. I need to keep track of that one. I need to keep track of that one." And it exceeded each computer's capacity to keep track of other computers. So they ran out of fingers to count the number of computers on, and then they all stopped working. So how do you fix a problem like that?
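The fingers analogy above can be sketched in a few lines. This is a loose caricature with made-up numbers, not Amazon's actual implementation; in the write-up, the real cap was reportedly an operating-system limit on how many threads each front-end server could use to track its peers.

```python
# A minimal sketch of the failure mode Mark describes: every server
# keeps one tracking slot ("finger") per peer, and each server has a
# hard cap on slots. The cap and fleet sizes are invented for
# illustration.

MAX_SLOTS_PER_SERVER = 10  # how many "fingers" each server has

def fleet_healthy(num_servers: int) -> bool:
    """Each server must track every other server; if the peer count
    exceeds its slot cap, that server (and so the fleet) falls over."""
    peers_per_server = num_servers - 1
    return peers_per_server <= MAX_SLOTS_PER_SERVER

print(fleet_healthy(11))  # 10 peers each: right at the limit -> True
print(fleet_healthy(12))  # one more server and everyone breaks -> False
```

Note the nasty property: adding capacity is what breaks things, because every existing server's tracking load grows with the fleet.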


GRAHAM CLULEY. Maria. You turn it off and on again.


MARK STOCKLEY. You turn it off and on again.
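Mark's finger-counting picture can be turned into a toy model: every server tracks every other server, with a hard cap on how many it can track. The cap, names, and mechanism below are purely illustrative, not Amazon's actual Kinesis internals.

```python
TRACKING_CAPACITY = 10  # "two hands": ten fingers per server (illustrative)

class Server:
    def __init__(self, name: str):
        self.name = name
        self.peers = set()

    def track(self, other: "Server"):
        # One peer too many and this server falls over.
        if len(self.peers) >= TRACKING_CAPACITY:
            raise RuntimeError(f"{self.name} ran out of fingers")
        self.peers.add(other.name)

def grow_fleet(size: int):
    # Every server automatically learns about every other server.
    fleet = [Server(f"srv-{i}") for i in range(size)]
    for server in fleet:
        for other in fleet:
            if other is not server:
                server.track(other)
    return fleet

grow_fleet(11)    # 10 peers each: exactly at capacity, fine
# grow_fleet(12)  # 11 peers each: every server overflows at once
```

Adding one server past the limit breaks every server at once, which is why the only fix was to turn it off and on again.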


GRAHAM CLULEY. So was there some poor soul in an office somewhere who has the extension cable which is controlling all of these computers, and he went flick?


CAROLE THERIAULT. Is that what happened? It reminds me of my time on Air Canada, every single flight I've ever taken where they have to stop and restart the television or whatever entertainment system due to some glitch somewhere.


MARK STOCKLEY. So the question is, the question is, so Yes. They had to turn it off and on again. Obviously—


CAROLE THERIAULT. Meanwhile, people are crying over the fact they can't vacuum.


MARK STOCKLEY. Why will my vacuum cleaner not automatically start?


CAROLE THERIAULT. I think Roombas are depressed.


MARK STOCKLEY. The guy I felt most sorry for was the one who couldn't turn his lights on. I mean, limited sympathy, because obviously he did decide to buy lights that depended on the existence of a supercomputer. And I don't understand—


GRAHAM CLULEY. No sympathy, actually.


MARK STOCKLEY. I don't understand why that's important. But it does raise the question, what are we doing? Now, back in the '60s, when they invented the internet, they had this idea of a communications network that was so resilient, it could withstand a nuclear attack. Like, the whole point of the internet is that it routes requests through the undamaged part of the network. And then our generation has taken this amazing blueprint and gone, what we're going to do is we're going to put this massive centralized system smack bang in the middle of it, and all of your devices are going to be dependent on this centralized system, which occasionally will fall over and someone will have to kick it, and then it will restart. Maybe. Maybe. Yeah. What's clear from the write-up of it is that there was an awful lot of sweating. Mm-hmm. And, you know, I imagine there was a lot of whiteboarding.


CAROLE THERIAULT. I'm glad he didn't say defecating.


MARK STOCKLEY. I mean, how do you model and test something like a data center? What will happen if we add 1,000 computers to our data centre? Or what will happen if in an emergency we take 1,000 out? Because, you know, it's all—


CAROLE THERIAULT. You'd think that there'd be simulated environments to do exactly that before you do it live though.


MARK STOCKLEY. But even if there are, they clearly don't work perfectly because you wouldn't get in a situation like this in the first place. I mean, if they had a fire or something, maybe you'd understand. But this was of their own volition. They added more computers and caused their own problem. If there are simulations, the simulations are obviously incomplete.


CAROLE THERIAULT. Have people dried their tears now that— and their lights are back on and the vacuum's running?


MARK STOCKLEY. Well, now their internet-connected fans are working again. They can get some assistance in drying the tears. But I think that the kind of worrying aspect of this is this is just where we are now. Like the speed with which we are acquiring devices that are dependent on an internet connection. Yeah. Not just that have one, but don't work without one. It's rather alarming. Okay, if your Hoover doesn't work, well, Graham managed to get through like 3 years of being a student without a Hoover that worked. So, you know, it's not that bad, but there are all sorts of things connected to the internet now. So if you own a Tesla car, for example, your Tesla car is effectively part of the Internet of Things. Tesla can turn off your car whenever they like if they want to, because it's dependent on that umbilical. And who knows what else?


CAROLE THERIAULT. And come back in 10 years. Yeah, we'll have Mark back on the show with an update.


MARK STOCKLEY. I told you so, from my cave.


GRAHAM CLULEY. Well, that was cheery. Thanks, Mark. Any solutions? Any sort of wise words on how to improve the world?


CAROLE THERIAULT. How dare you accuse him of cheeriness after you talk about biological cyber warfare? I mean, is he talking about Roombas going offline for a bit?


GRAHAM CLULEY. Carole, what have you got for us this week?


CAROLE THERIAULT. Well, we have a very cheery topic, Graham.


CAROLE THERIAULT. The 1986 Computer Fraud and Abuse Act. The CFAA. Okay. And the reason we're talking about this is it's in front of the US Supreme Court this week. This is the first time the Supreme Court has ever heard arguments for or against how the Computer Fraud and Abuse Act is currently designed. This is America's main anti-hacking statute. Okay. And the Supreme Court are looking at the scope of the CFAA law and how it is and can be interpreted. And the court's 9 justices have a range of views on the question. And I'm inviting you all to don your Supreme Court judge hats, or robes, rather. A wig?


GRAHAM CLULEY. Do they have wigs? No, no, no.


CAROLE THERIAULT. They won't have wigs there, but they'll certainly have robes, I believe. And we'll see what you would do here. All right, I'll get my bathrobe on.


MARK STOCKLEY. I'm wearing my robe already and my wig, but I always wear those for podcasts.


CAROLE THERIAULT. So I'm going to start with the year 1986. That was when the CFAA was put into law, and that is almost 35 years ago. Now, is it just me, or do we feel like, as you've just talked about, Mark, we're going through this incredibly huge technical revolution, and one of the richest countries in the world is depending on a 35-year-old law?


MARK STOCKLEY. I find that shocking. This law is designed to stop teenage hackers breaking into WOPR and playing tic-tac-toe, basically. That's what it's there for.


CAROLE THERIAULT. Well, that's interesting. What it's there for and what it actually covers, that's what we're going to talk about, okay? All right, okay. So it turns out that many, many Americans and organizations in the US have inadvertently broken this federal law repeatedly because inside the 1986 law, there is a broad definition of what's considered hacking. Okay, so quote, the law considers any intentional access to a computer without authorization to be a federal crime. Okay, now as CNET point out, this is broad enough that sharing a Netflix password could be considered a CFAA violation. So that's a— that's—


MARK STOCKLEY. does that make it a federal crime? Yes, a federal crime to share your Netflix password.


CAROLE THERIAULT. It makes it a federal crime. Granted, it's unlikely to warrant federal attention on a normal case-by-case basis, right? But it does mean that Americans are extremely reliant on how individual prosecutors and individual judges understand and decide to enforce this law.


GRAHAM CLULEY. But yeah, okay, but there are always extreme cases like this, aren't there? I mean, it's not like you say, it's not as though they're going to pursue it. And sharing your Netflix password is a naughty thing to do.


CAROLE THERIAULT. Have you ever done it? Oh yeah. Well, there you go. So you should be in jail.


MARK STOCKLEY. So next time you go to the States and it says, are you a terrorist on the little thing?


CAROLE THERIAULT. Have you ever committed a crime?


MARK STOCKLEY. You're going to have to Yeah, well, I, I— You committed a felony, Graham, and you've admitted it.


CAROLE THERIAULT. So if you take that, so the law considers any intentional access to a computer without authorization to be a federal crime. Would that mean that a 12-year-old who starts a Facebook page breaks the rules because she's basically not authorized to have an account? Does that mean it's a federal crime? What if someone shares their logins with a third party in order to get IT support from someone?


GRAHAM CLULEY. There are gazillions of laws like this, and it's all down to interpretation. No, there's—


CAROLE THERIAULT. well, we're talking— I don't know about the other laws. I know about this one. And I think, as you'll see very quickly, there's a loophole here that is kind of scary for our industry, Mr. Cluley. So the core issue that they're discussing is: should violating something like the terms of use on a website or a computer system lead to legal trouble at a federal level, using the CFAA as your umbrella? So that's one. Now there's a second angle to consider. There's a group of people who are not happy about this 1986 law and its potentially incredibly broad remit, and this is our cybersecurity researchers, because many cybersecurity researchers' work involves finding vulnerabilities in software and gadgets without a company's authorization. So election security researchers at MIT uncovered issues with voting machines without the approval of the manufacturers. So they wrote it up and presented it at the USENIX Security Conference earlier this year. Okay, and they called it 'The Ballot is Busted Before the Blockchain.' So this is about a blockchain e-voting company known as Voatz, V-O-A-T-Z. Oh yeah, these guys, uh, piggybacking on an unrelated CFAA case, argued to the Supreme Court, right, this was back in September, that the security research conducted by MIT on their machines, which found several security flaws, breaks the CFAA and should not be allowed, not because the research was wrong, but because they were not authorized to conduct said research. Right. And that's what's being discussed right now.


MARK STOCKLEY. That's a terrific example, because you're absolutely right that the way that we think about computers now versus the way that we thought about computers in '85, '86 is completely different. Even the idea that there's computer security: I can almost remember the day that computer security arrived in my life as a thing, and it's a long time after that. Do you know, it reminds me of when you read about people that try hacking into cars, because obviously, you know, cars are part computer now, just like everything else. The trouble that they have to go to, to avoid poking any of the bits that they're not allowed to poke, is really quite excruciating. And when you read it, my reaction on reading that stuff is: surely there's something wrong here. They're assuming that they are going to be ethical with what they discover. It seems like there's something wrong with them not being able to look. And in whose interest?


CAROLE THERIAULT. Yeah, because I think we're all fans of responsible disclosure.


MARK STOCKLEY. Yeah. Yeah. Yeah. But whose interests are being— I guess maybe there's concerns about intellectual property or something like that, that, OK, here's a black box and it's full of proprietary stuff and you're not allowed to look inside it. But it seems to me the greater good is normally served by people being able to poke around.


GRAHAM CLULEY. So how are people calling for the legislation to be changed, Carole?


CAROLE THERIAULT. Well, people like, for example, a bunch of security researchers obviously have had a beef with Voatz's point. And right, soon after its tap dance in front of the Supreme Court, these guys responded publicly in an open letter saying, you know, security research is vital to the public interest. And they say a broad interpretation of the CFAA, which is what we currently have, risks undoing many of these positive advancements, like, you know, being able to discover security vulnerabilities in election machines, for example, which is a big deal. They say Voatz's actions threaten good-faith security research and are indicative of what may come should the courts decide that a breach of contractual terms constitutes a criminal CFAA violation. They urge the courts to adopt a narrow interpretation of the CFAA.


MARK STOCKLEY. Can you imagine if the outcome of this is that we will have to start reading the terms and conditions of every website we visit? Like, if you think it's bad with the cookie pop-ups now.


GRAHAM CLULEY. Yeah. Combined with eye-tracking technology to see if you really have read it. As you scroll down, have you actually taken in each of those words? As you plug your brain into your IoT device, so it looks at your brain activity to make sure— maybe they should actually insert inside their terms and conditions some random words about unicorns or—


MARK STOCKLEY. I thought you were going to say they should insert something into the brains of the people who are looking.


GRAHAM CLULEY. Well, you could do that with a malicious synthetic DNA. There's a website for that. So, I mean, this sounds very interesting, Carole, but surely there's a concern that this could go too far the other way as well.


CAROLE THERIAULT. We're not even close to going too far the other way. Like, right now it is so, so broad, Graham. And what that means is, for example, let's say, um, they wanted to arrest you, right? But they couldn't, for whatever reason, some side reason, they couldn't get you. They could get you on this federally.


GRAHAM CLULEY. What I did with the Hoover did not, did, did not cause any harm or damage to any other individual because you brought up cyber biological warfare on the show, right?


CAROLE THERIAULT. The FBI now got you earmarked and they want you, right? Because you've now opened up this huge floodgate. And the way they'll get you is by sharing your Doctor Who password with someone.


MARK STOCKLEY. Anyway, it's Garry Kasparov, isn't it? Yeah.


CAROLE THERIAULT. Politico are the only ones that said, you know, they are hopeful, I feel, is the way I would interpret what they wrote, saying that they saw a number of the Supreme Court justices indicating reservations and questions about the ambiguity and the scope, and they feel maybe they need to review it. So there is a silver lining if they do. I think they totally should. I think it's, uh, insane that we're relying on something that is almost 35 years old. It's insanity.


MARK STOCKLEY. Well, those Supreme Court justices, they, they all look pretty tech savvy. I'm not concerned.


GRAHAM CLULEY. Yeah, they've got their finger on the pulse.


CAROLE THERIAULT. Yeah, yeah, don't worry. Justice Thomas has got us, uh, covered well enough. Do you want to know how this all got in front of the Supreme Court? You might remember this story, actually. So this is back in 2017. So this guy says he's keen on this stripper, right? But he's nervous. He's nervous that she might be an undercover agent. So he goes to a Georgia police officer, and he pays the police officer to look up her license plate in a confidential database. You know, the way we see every single cop do in every single show that we've ever seen, right? Just check her out, check her out online. Anyway, in a nasty twist, the unnamed guy, the one who was keen on the stripper and paid for the intel, was in fact an FBI agent.


GRAHAM CLULEY. So— Sorry, no, I'm confused. There was a man who was interested in whether a stripper was actually an FBI agent, and he himself was an FBI agent? Yes.


CAROLE THERIAULT. And it was all a way to capture, to get the Georgia police officer. So it was kind of a— what's it called?


MARK STOCKLEY. A sting? A honey trap.


GRAHAM CLULEY. Oh, so they were after the man who the FBI agent asked to do the looking up?


CAROLE THERIAULT. Yeah, they were after the Georgia police officer who did the looking up of the license plate. And— Okay.


GRAHAM CLULEY. Have they not got anything better to do?


CAROLE THERIAULT. But the point is, this is still being fought, right? Because he keeps— you know, some people are saying it's not right, because he did have access, legitimate access, to that database. So it's not that he gained unauthorized access to this database, but he did misuse that database for other purposes. So everyone's discussing this, and this is the case that opened up all these floodgates and why it's in the Supreme Court. So it all came down to strippers.


GRAHAM CLULEY. Well, he wasn't authorized to do that lookup, was he? He wasn't authorized to do that lookup because that hadn't come through the correct route.


CAROLE THERIAULT. Yeah, but that's not explicitly explained in the CFAA. So legal quagmire ensues.


GRAHAM CLULEY. So they're going to need some sort of sub-clause about stripper lookups. Exactly.


CAROLE THERIAULT. You understand. You know, Graham's got it. Graham's got it. Stripper lookups.


GRAHAM CLULEY. That's what we need. I've got it. I think I've got this all under control, Mark.


MARK STOCKLEY. So what you're saying is that this law has been around for 35 years, and the Supreme Court has never ruled on it, because it's not clear what the law actually is. And then a story about strippers came along, and now it's like, yep, that's the one. We'll have that one.


CAROLE THERIAULT. All comes down to strippers. That's all you got. That's your takeaway.


GRAHAM CLULEY. Most people agree that the most effective way to reduce the cost of an attack is to prevent it from happening in the first place. Deep Instinct strives to prevent all known and unknown threats using deep learning, making detection and response automated, fast, and effective for any threat that cannot be prevented. Check out a report by the Ponemon Institute which studied the cost savings of adopting an efficient prevention model. Go grab it at smashingsecurity.com/deepinstinct. And thanks to Deep Instinct for sponsoring the podcast. This episode of Smashing Security is sponsored by LastPass. Now, everyone knows about LastPass's password manager for end users, but it's also a great solution for businesses. In fact, tens of thousands of companies rely upon LastPass to protect themselves. LastPass Enterprise simplifies password management for companies of all sizes and helps you secure your workforce. So whatever the size of your business, go and check it out. Go and visit lastpass.com/smashing to find out more. And thanks to LastPass for supporting the show. Security training sucks. It's boring, users hate it, they aren't paying attention, it doesn't work. For security training to actually work, you'd have to find out what each person in the company is doing that's risky, send them phishing emails, monitor logs, check for passwords in Have I Been Pwned, and then you'd have to train them in a way that doesn't send them to sleep, and try and track what they're doing to see if it worked.


CAROLE THERIAULT. Who's got time for any of that? Culture AI do. What? Culture AI. They make this amazing software that plugs into your company, runs your phishing campaigns, integrates with Slack, tests if your users accept phony MFA requests, Yes, that's a biggie. And pulls in tons of other behavioral metrics from your existing apps. It basically figures out what everyone needs to know and then creates personalized training that is not boring. And it even checks that it's working. And it's all done automagically. And they've got a deal just for our listeners. Sign up at culture.ai/smashing and your first 50 employees are free. For life. Cool. More information, culture.ai/smashing. Stop your whining, Graham.


GRAHAM CLULEY. And welcome back. And you join us at our favorite part of the show, the part of the show that we like to call Pick of the Week. Pick of the Week.


CAROLE THERIAULT. Pick of the Week.


GRAHAM CLULEY. Pick of the Week is the part of the show where everyone chooses something they like. Could be a funny story, a book that they've read, a TV show, a movie, a record, a podcast, a website, or an app, whatever they wish. It doesn't have to be security-related necessarily. Better not be. Well, my pick of the week this week is something that I'm sure many people have already checked out. I'm a few weeks late to this. I did have a Smashing Security listener, a couple of them actually, contact me saying, Graham, are you going to talk about this TV show on the podcast? It seems right up your alley, mate. And eventually I got around to watching it. And I've nearly finished it. I haven't quite— I've got a couple more episodes to go, but I can already tell—


MARK STOCKLEY. You've spoken about Doctor Who loads of times.


GRAHAM CLULEY. It's not Doctor Who, it is chess-related. There is, of course, a Netflix hit TV series called The Queen's Gambit, all about a chess prodigy in the 1960s called Beth Harmon. It's a drama, she didn't really exist. And it's a really good show, it's quite enjoyable. And the amazing thing is, of course, that normally when chess is presented on screen, it's all a load of old nonsense and it's not actually anything like chess. But with The Queen's Gambit, I'm sitting there saying, yeah, that is the Caro-Kann Defense. Yes, that is, you know, the Queen's Gambit Declined or whatever. They're referring to things in chess and they're absolutely on the nail. And maybe the reason for this is that Garry Kasparov, who of course is famous for being the former world chess champion, and even more famous for being a past guest on Smashing Security, acted as a consultant on the show. So well done, Garry. I know he's listening. Well done, Garry, for doing that. And coming up with a great TV series, The Queen's Gambit on Netflix. I enjoyed it. Have either of you watched it?


CAROLE THERIAULT. Yeah, I told you about it repeatedly. In fact, the other day I told you, why don't you watch it tonight? You went, okay, yeah, maybe I'll do that. Yes, I will. Yes.


GRAHAM CLULEY. Yes, I will. Well, I've got round to it now. Yeah. And also chess related, but slightly security related: I saw on Twitter in the last couple of days that security researcher Sarah Jamie Lewis discovered that they were able to exploit the popular Stockfish chess engine by feeding it malformed chess positions, which can cause it to crash and do naughty things when trying to find a best move, or even trick it into believing there were no valid moves, even when there were. So if anyone's interested in that, it's a bit nerdy.
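The class of bug Graham describes comes from an engine trusting that the position it is given is well-formed. A defensive caller can sanity-check a FEN position string before handing it to an engine. The helper below is hypothetical, not Stockfish's code or the actual exploit, just a sketch of the sort of structural validation involved.

```python
def fen_looks_sane(fen: str) -> bool:
    # Minimal structural check of a FEN (Forsyth-Edwards Notation) string:
    # six space-separated fields, eight ranks of exactly eight squares,
    # recognised piece letters, and a valid side-to-move. It deliberately
    # skips deeper legality checks (king counts, impossible checks, etc.).
    fields = fen.split()
    if len(fields) != 6:
        return False
    ranks = fields[0].split("/")
    if len(ranks) != 8:
        return False
    for rank in ranks:
        squares = 0
        for ch in rank:
            if ch.isdigit():
                squares += int(ch)  # a digit means that many empty squares
            elif ch in "pnbrqkPNBRQK":
                squares += 1
            else:
                return False
        if squares != 8:
            return False
    return fields[1] in ("w", "b")

# The standard starting position passes; a rank claiming nine squares does not.
```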


MARK STOCKLEY. Even if you're not, I fully endorse your pick of the week. I thought it was a brilliant series, but I would like to fully endorse Sarah Jamie Lewis as well. And she does endless amounts of really, really interesting stuff, whatever she gets into. She does very interesting things with it. So I'm not at all surprised to hear her name attached to this.


GRAHAM CLULEY. Well, I am linking to her research in the show notes. You can check it out some more. It is security related. I'm sorry about that, Carole. And on that point, I pass over to Mark Stockley for his pick of the week.


MARK STOCKLEY. So it's basically an antidote to all this guff about hoovers that break when they can't talk to supercomputers, and ordering your own anthrax online, and all that stuff. So it's a beautiful book. I listened to it as an audiobook over the summer. And it's by an unassuming Japanese research scientist called Masanobu Fukuoka. And he basically turned away from science and back to a quiet life on the farm after what he describes as a profound spiritual experience in 1937. So he trained as a microbiologist and an agricultural scientist, and he became very disillusioned with what he saw as kind of westernized ideas about agriculture and farming. And this is back in 1937, remember?


CAROLE THERIAULT. It turns out he was right.


MARK STOCKLEY. Before World War II and before the kind of mass use of chemical fertilizer. So even then he was—


CAROLE THERIAULT. Well, the Industrial Revolution had started though. Yes.


MARK STOCKLEY. But it was a huge change in farming, particularly because you need nitrogen for explosives. And our capacity to produce chemical nitrogen for bombs ramped up massively during the war. There was this huge chemical industry that existed after the war that wasn't there before, which is where the sort of chemical fertilizer industry comes from. Anyway, he predates that, but his alarm bells were going off. So long story short, he basically pioneered all sorts of techniques for growing food that were way ahead of their time. So he called it his do-nothing technique, but we would call it things like no-till, organic, sustainable agriculture, things like that. But it's a fascinating read because it's not just an instruction manual, it's kind of part manual, part memoir, so it's his life story as well. But also there's this kind of dose of Eastern mysticism in there as well. One person described it as Zen and the Art of Farming. And so there's a very kind of Japanese quality to it, where you can see that he's trying to be a bit kind of mysterious and a little bit mystical with it, as well as being very practical. So it's a great little book.


CAROLE THERIAULT. Yeah. I might get this for my in-laws for Christmas. This could be perfect for them.


MARK STOCKLEY. So that's my pick of the week. It's called The One-Straw Revolution. The One-Straw Revolution. Cool.


GRAHAM CLULEY. Carole, what's your pick of the week?


CAROLE THERIAULT. Okay, my pick of the week is a podcast. Now it's hosted by the Telegraph journalist Cara McGoogan. I'm not a Telegraph reader, but I happened upon this podcast and I decided to give it a whirl. It's called A Bed of Lies, or Bed of Lies. Now, it's kind of true crime, kind of investigation-y. And so in the intro, Cara says of the show that it looks at one of the biggest scandals in recent British history. So I'm holding back on agreeing with that or not, because I haven't finished it yet.


GRAHAM CLULEY. What's it about? What is the spin?


CAROLE THERIAULT. Well, that's the thing. I can't really give you the big reveal, okay? It starts off with the backstories. I can't, I can't. So you gotta just trust me because that's part of the way she kind of does it, is she holds off on telling us what the biggest scandal is until episode 3 or 4.


GRAHAM CLULEY. But can you give us a sort of genre? Can you sort of say whether it involves corruption? Yeah, yeah.


CAROLE THERIAULT. Can you just hold on? I'm getting there, baby. It starts off with the backstories of 4 women, Rosa, Lisa, Alison, Lindsay, all pseudonyms. They were all part of a lively activist community a few decades back, and they found partners who shared their passions for activism and seemed perfect until it totally wasn't. So big 180 happens.


GRAHAM CLULEY. Yeah, I think I know what this is going to be about.


CAROLE THERIAULT. Okay, well, don't say, don't say, because it's, it's kind of part of the way it's built. I think it's kind of silly. I think they should have given it away at the beginning and then just gone with the story personally, but you know. The first episode is hesitant, but I don't know, it's almost like the host and the producer are finding their feet or something, but it gets a bit pacier and the story's pretty juicy. And it's, I don't know, the stories from the women are actually pretty honestly told. It's quite good. So if it sounds like your thing, check it out wherever you get your podcasts. It's called Bed of Lies, hosted by Cara McGoogan.


GRAHAM CLULEY. So I don't know what you're talking about, Carole, but if it's what I think it is, it is a fascinating news story which has spanned a few decades. And, um, well, I don't want to give anything away.


CAROLE THERIAULT. I know, I hear you.


GRAHAM CLULEY. I understand. But yes, involves activism and maybe law enforcement.


MARK STOCKLEY. Why? Yeah. Is the punchline that it turns out they're all actually connected to US-East-1?


CAROLE THERIAULT. Mark!


GRAHAM CLULEY. So mysterious. Now, Carole, um, it's time for our featured interview, isn't it?


CAROLE THERIAULT. Who have we got this week? We have an interview with Deep Instinct's Steve Salinas, who's gonna get us all up to speed on ransomware and what's going on today. So ears open, people. So hi, Steve Salinas, product marketing manager at Deep Instinct.


STEVE SALINAS. Thanks for having me.


STEVE SALINAS. I'm glad to be here.


CAROLE THERIAULT. It's really lovely to speak with you because we're gonna talk all about ransomware. But first, I wanted to ask you about the lady on your homepage. So for our listeners, there's this woman who looks very lovely, and then you look at her and her eyes are like neon phthalo blue, and then you zoom into her eye and it's all like sci-fi. So I just want to know what the thought process was behind that.


STEVE SALINAS. Yeah, it's kind of a, it's kind of a funny little picture there, but I think the whole point is, you know, we want to get across the point to someone that visits our website that we are applying artificial intelligence to solve cybersecurity problems. And it's kind of an interesting way to get someone's attention. So the idea is, you know, kind of the way a human brain works, the way that our brain makes decisions: we're using technology that works in the same way to solve cybersecurity problems. So totally, it works.


CAROLE THERIAULT. It works. I mean, I just brought it up as the first question, and I'm sure loads of listeners are now going to your homepage to see what I'm talking about. So tell me first, tell me a bit about Deep Instinct and what you do?


STEVE SALINAS. Sure. So the company name is Deep Instinct, and we are a deep learning cybersecurity solution provider. What that's a long way of saying is that we're applying artificial intelligence, which we interact with every day, all day. We're applying a form of it, the most advanced form, which is known as deep learning, to identify threats as early as possible, which is known as pre-execution, using a deep learning neural network so that we can identify these threats and prevent them from ever having the chance to run in an environment. So really, what we call ourselves is a prevention-first company. Our whole idea and philosophy around security is that the best way to protect yourself is to stop a threat from having the chance to run in your environment. We offer a lot of different solutions that extend that, but that's where we start: with prevention.


CAROLE THERIAULT. So can you tell us a bit about deep learning? Just how does it work? And just for some of our listeners so they can understand exactly.


STEVE SALINAS. Sure. So the best way to kind of think about deep learning is if you think about, you know, ourselves. Brains are very complex. There are tons of things going on in there we don't have any idea about. So think about when you were a child, when you learned how to, let's say, ride a bike, right? As a kid, you know, the first time you rode a bike, you probably fell off a lot. I did.


CAROLE THERIAULT. I broke my arm, actually. Ouch.


STEVE SALINAS. Yeah, I definitely had my share of scraped knees and whatnot. You know, then you got the hang of it, right? And as a kid, I rode my bike all over the place. The phrase that we're all used to is, it's like riding a bike. That's not a mistake. So even though you probably haven't been on a bike in years, if you saw a bike and you got on it, your brain would remember how to ride that bike. It just would. Yeah. So our brains are very complex and they're very advanced. So very smart people, a lot smarter than me, they kind of looked at that, the way that the brain works, and they said, all right, we can take the same approach to solving lots of different problems. So a very brief history of artificial intelligence: it's been around since the '50s. If you're familiar with Alan Turing, you know, the whole idea of the Turing test was whether you can distinguish between a machine answering questions and a person. So that was like a form of artificial intelligence, a very, very initial form. And then, in the '80s or so, this concept of machine learning came out, where you could train a machine based on a set of data to come up with some sort of decision or to take some sort of action. And machine learning has been around for a while. The latest version of artificial intelligence is known as deep learning. So this is what we do. So what we do is we take vast amounts of data. Machine learning, deep learning, artificial intelligence: it's all about data. So we take a lot of data around threats, and then we take a lot of information about good files, you know, files we use all the time, applications we use all the time. Not just a few, millions.
And we have data scientists, the founders of the company, who created a very proprietary set of algorithms that can take this data. We feed it into the model, which is called a neural network, and the only thing we tell the model about this data is: this set of files is known to be malicious, and this set over here is good. So we feed it in and we look at the results. After training, and we train in the cloud, it takes a lot of horsepower, the model develops an innate ability, getting back to my bicycle analogy, to identify a threat. It's really astounding: once it's developed this ability, it doesn't need to know anything else about the files at all, so when we point it at a file it has never seen before, it can come to a decision about whether it's malicious or what we call benign in milliseconds, and it's extremely accurate.
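For readers who like to see the idea in code, the train-once, then-classify-unseen-files flow Steve describes can be sketched very loosely. This is a toy only: a single logistic "neuron" trained on made-up 4-number feature vectors, standing in for a real deep neural network trained on millions of files. Every feature, number, and name here is invented for illustration.

```python
import math
import random

random.seed(0)

# Each "file" is reduced to a hypothetical 4-number feature vector,
# labelled malicious (1) or benign (0).
def sample(center):
    return [random.gauss(center, 0.5) for _ in range(4)]

data = [(sample(-1.0), 0) for _ in range(200)] + [(sample(1.0), 1) for _ in range(200)]

# Training happens once, up front ("in the cloud"): plain gradient descent
# on one logistic unit -- nothing like a production model, but the same
# train-then-infer split.
w, b = [0.0] * 4, 0.0
for _ in range(200):
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        p = 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, z))))
        err = p - y
        w = [wi - 0.1 * err * xi for wi, xi in zip(w, x)]
        b -= 0.1 * err

def classify(features):
    """Score a never-before-seen feature vector: no signature lookup, no update."""
    z = sum(wi * xi for wi, xi in zip(w, features)) + b
    return "malicious" if z > 0 else "benign"

print(classify([1.1, 0.9, 1.2, 1.0]))      # a vector from the "malicious" cluster
print(classify([-1.0, -1.1, -0.9, -1.2]))  # a vector from the "benign" cluster
```

Once the weights are frozen, classifying a new vector is just a dot product: that is the sense in which the trained model needs no further updates to make decisions.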


CAROLE THERIAULT. So this must play into ransomware, because we're actually here to talk about ransomware today, right? So talk to me about ransomware: what is deep learning telling you about it?


STEVE SALINAS. Yeah, so ransomware is one of the most insidious threats that can hit an organization. And the attackers are pretty ruthless in the way they deploy these threats: they can be targeted, or they can use automation to hit vast numbers of IP addresses. But the fact of the matter is, if an organization gets hit by ransomware, they could be crippled, right? For anyone listening who's not familiar with it: essentially, ransomware gets into an organization, and once someone initiates it, it goes through and encrypts all the data on your computer. Think about all the data on your machine, the stuff you rely on and use all the time. It encrypts it and holds it hostage. What the attackers do then is display a ransom note on the machines. It's called ransomware for a reason: just like when a person gets held hostage, you get a ransom note that says, we have your data, you need to pay us X amount, usually in some sort of cryptocurrency, Bitcoin is their favorite, or we're going to destroy your data. Or, with some of the newer forms of ransomware, they say, we're not going to destroy it, but we're going to leak it, which is obviously very concerning to an organization. You know, it's all sensitive data.


CAROLE THERIAULT. Oh, well, hey, if they were going to leak my diary, I'd be concerned. Right.


STEVE SALINAS. So if an organization finds themselves in this situation, it's full-on panic at this point. But there are definitely things we can talk about in terms of mitigation strategies; there are several you can use.


CAROLE THERIAULT. I was wondering first: are companies targeted more than individuals, or is it more or less the same?


STEVE SALINAS. Well, the attackers target companies a lot because they have deeper pockets, right? There are definitely lots of stories you hear. A few months ago, I believe it was a university, I'm not going to say the name because I don't want to get it wrong, but they were doing some COVID research and got hit by ransomware, and they had no choice but to pay the ransom. It was in the millions of dollars, because this was extremely valuable data that they needed for their research. A university is going to have a lot more money to pay than just Joe Public.


CAROLE THERIAULT. Yeah, I mean, like, here's $50, does that work?


STEVE SALINAS. Yeah, you know, they're looking to get the big paydays. One of the other things that's happened is people's work environments have changed.


CAROLE THERIAULT. Oh, right. Yeah, yeah. So people are working from home, so that makes their kind of— now it's kind of an individual and a company problem, I guess.


STEVE SALINAS. Exactly. So it's really bad. I remember hearing a story, this was years ago, but it's relevant. Again, I'll leave out the names to protect the innocent. At any rate, there was an executive from some large financial institution who would regularly use his home computer to access his work email and things like that. Well, he also had a son who would use that machine to play online games. And somehow the attackers realized, oh, this kid is using this machine, and they did some sort of social engineering, got access to LastPass, and figured out, oh, this is an entryway into this bank.


CAROLE THERIAULT. I don't think I know a parent these days who hasn't, out of exhaustion, handed over their phone or tablet or computer to a kid just to get, you know, ten minutes' silence.


STEVE SALINAS. Especially now, when you think kids are all doing school from home. Yeah, this is broadening the area the attackers can target. Before, companies had everyone behind the firewall, in buildings. Now the firewall has virtually disappeared and everyone's at home. So the smart attackers, and they're doing it, are identifying: okay, all of these machines are now all over the place. If I can penetrate Steve's machine, I can get into the company. Let's face it, people's protections at home are a lot, lot less sophisticated than an organization's would be, right?


CAROLE THERIAULT. Totally. That must be keeping so many IT security people awake at night, just thinking about all the doom and gloom that could happen because one employee doesn't have the right infrastructure in place to protect the company and protect themselves.


STEVE SALINAS. So let's talk a little bit about how you protect yourself against ransomware as an organization. There are a couple of different ways you can do it. The old-school way, which doesn't work very well, is what I'd call legacy antivirus. This is where a piece of software is installed on all the, let's call them laptops just for simplicity, and it pulls down a list of known ransomware, to be very generic, and its signatures. If the software identifies a file that matches one of these signatures, it doesn't let it run. But guess what? The attackers know that, so they've gone long past it. It's easy for them to make small modifications to their ransomware, and it will completely evade that type of protection.
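The signature approach Steve describes, and why a tiny modification defeats it, can be shown in a few lines. The "payload" bytes below are made up purely for illustration; a real signature database is far more elaborate, but the weakness is the same.

```python
import hashlib

# A "legacy AV" signature list is, at its simplest, a set of known-bad hashes.
ransomware_v1 = b"LOCK-YOUR-FILES payload v1"
signatures = {hashlib.sha256(ransomware_v1).hexdigest()}

def legacy_scan(file_bytes):
    """Block a file only if its hash exactly matches a known signature."""
    return hashlib.sha256(file_bytes).hexdigest() in signatures

print(legacy_scan(ransomware_v1))   # exact match: the scanner blocks it

# The attacker makes a tiny modification -- same behaviour, brand-new hash:
ransomware_v2 = ransomware_v1 + b"!"
print(legacy_scan(ransomware_v2))   # no match: it sails straight past
```

One appended byte changes the whole hash, which is why signature lists need constant updates and still lag behind the attackers.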


CAROLE THERIAULT. So if you're looking for a specific thing, they can just change it slightly so it doesn't match specifically.


STEVE SALINAS. You know, if someone's looking for a person and that person puts on a disguise, they walk right by the police, right? They'd never be seen. So then there's the machine learning approach, which is the category I call the next-gen players. They look at the features of ransomware, so it's a little bit better, and they train a machine learning model to identify threats. But the attackers are really smart: what they've done is identify the features these machine learning models use to spot the threat, and then they simply don't use them, or they change them. So they've bypassed two different types of protection. Now, our protection seems to be the best. One of the things I do all the time is look at the latest ransomware and run it against our model, and it's very, very effective, because we're not using features. Again, we're training against this vast amount of data, and the deep learning neural network is making a decision. It's taking any sort of route to get to that decision. It's hard to say impossible, but it's virtually impossible for an attacker to figure out what decisions it's making in order to avoid it. One good example: there's a really bad strain of ransomware that was hitting a lot of healthcare organizations. It's called, I don't know how you pronounce it, I call it Ryuk. It's R-Y-U-K. It's been causing major issues. So the other day, and there's a video on our website, I pulled down almost 100 samples of this ransomware and ran them against our neural network. And one thing I want to be clear on: once we've trained the model, no additional training happens. The one I was using was trained in November of 2018, so two years old. And I said, okay, analyze all this ransomware. And it identified every single file as malicious. Wow. Yeah. So it's really powerful.
And that's why we're finding a lot of interest. To get back to the whole point of, all right, my employees are all over the place, what do I do? What you need is protection that is what we call resilient, right? Protection that doesn't need daily updates. I could provide this two-year-old model to most organizations in the country, in the world, and they're probably going to get better protection than what they have today, even if I don't ever update it again.


CAROLE THERIAULT. Yeah, I agree with that sentiment totally. You don't have to have perfect protection, you have to have better protection.


STEVE SALINAS. You know, even though our solution shows extremely good results, you can't guarantee 100% all the time. So we do do other things, like behavioral analysis, to look for things that look like ransomware. It's very rare that a piece of ransomware would be able to get past that first phase. But that's why we're seeing a lot of interest in our approach: it's resilient. It doesn't even require being connected to the internet. All the decisions are taken on the machine it's protecting, and it doesn't need updates.


CAROLE THERIAULT. So what about the behavior of individuals? We're talking about these employees who are working from home, right? I'm guessing rule number one is: please don't lend your computers to your kids if you can avoid it. But what other advice do you have? What do you do at home? What do you tell your family to do?


STEVE SALINAS. If you get an email from someone you don't know that has a link in it, don't open it. Or if you get an email that says, oh, I'm from a company you work with a lot, someone you buy stuff from, but it looks a bit weird, you know, "we need you to update your information here" or "we're going to terminate your account," be very suspicious. You can also look at the URLs. If it has a really weird URL, say it claims to be from a big box store, like, I don't know, a Target or something, but the URL is completely different, it's probably not them.
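That "check whether the URL actually belongs to the brand" habit can be sketched as a toy heuristic. This is a deliberately crude check, not how real mail filters work, and the domains used are examples only.

```python
from urllib.parse import urlparse

def looks_suspicious(claimed_domain, href):
    """Flag a link whose real host doesn't belong to the domain it claims.
    (A crude toy heuristic -- real phishing filters do far more than this.)"""
    host = urlparse(href).hostname or ""
    return not (host == claimed_domain or host.endswith("." + claimed_domain))

# "Says it's from a big box store, but the URL is completely different":
print(looks_suspicious("target.com", "https://www.target.com/account"))         # genuine host
print(looks_suspicious("target.com", "https://target.com.evil.example/login"))  # flagged
```

Note the check compares the end of the hostname, not a substring: `target.com.evil.example` contains "target.com" but does not end with `.target.com`, which is exactly the trick phishing URLs rely on.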


CAROLE THERIAULT. These are really, really good points, because right now we're kind of sitting ducks. A lot of us are looking for information on elections that may have just happened, we're looking for information on coronavirus and what's going on there, we're getting ready for the holidays, and we're buying loads of stuff online because a lot of us are not allowed out. So we've kind of told the bad guys exactly how to get us.


STEVE SALINAS. I know. And I'll give you one example. You mentioned COVID. Some attackers, because people were desperately looking for information, actually embedded malware in an interactive map of COVID cases. No way. Fortunately, we supported that file type, so if someone downloaded it, we would identify it as malicious, but not everyone would. You know, the attackers don't have any qualms about doing things like that. I know.


CAROLE THERIAULT. Well, I'm glad there are people like you, Steve, and Deep Instinct to help protect us in this crazy, crazy world we live in right now. No, we're doing our best.


UNKNOWN. But I think, you know, again, you know, just be vigilant, be suspicious. It's okay to be suspicious. You're not going to hurt anybody's feelings.


CAROLE THERIAULT. Yeah, and stay calm, maybe, is another good piece of advice. A lot of these things try to incite some kind of outrage or emergency scenario. Just take a second. It's okay to think for a minute or two, and to ask a trusted friend who knows more about this stuff if you're not sure.


STEVE SALINAS. And one last thing I'll mention, from a personal standpoint; this is good for individuals too. There are lots of online solutions and ways to back up your data, and they're not very expensive. So if you do happen to get hit by ransomware, well, I always say don't pay the ransom. The attackers are only doing it because they get paid. So if you have other mitigation strategies, have backups, and, obviously if you're an organization, use a solution like ours, then the lower we can make the incentive. If the attackers are not going to get paid, they'll go away in the long term. Yeah.
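The "have backups" advice can be reduced to its simplest possible form: copy a folder to a new timestamped directory each time. This is only a sketch with hypothetical paths; for real ransomware resilience, backups should be versioned and kept offline or on storage the infected machine cannot overwrite.

```python
import shutil
import time
from pathlib import Path

def snapshot(src, backup_root):
    """Copy a folder into a fresh timestamped directory under backup_root.
    The simplest versioned backup imaginable: older snapshots survive even
    if ransomware later encrypts the originals (provided the backup target
    itself is out of the attacker's reach)."""
    dest = Path(backup_root) / time.strftime("%Y%m%d-%H%M%S")
    shutil.copytree(src, dest)
    return dest

# Usage (hypothetical paths):
#   snapshot("/home/me/Documents", "/mnt/backup-drive/snapshots")
```

Because `copytree` refuses to overwrite an existing directory, each run creates a new snapshot rather than silently replacing the last good one.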


CAROLE THERIAULT. Choke the green, right? Choke the green. Exactly. Steve Salinas, product marketing manager at Deep Instinct, thank you so, so much for coming on Smashing Security. Thank you.


GRAHAM CLULEY. Well, that was terrific. And that just about wraps it up for this week. Mark, I'm sure lots of our listeners would love to follow you online. What's the best way for folks to do that?


MARK STOCKLEY. You can find me on Twitter @MarkStockley. Terrific.


GRAHAM CLULEY. And you can follow us on Twitter @SmashInSecurity, no G. Twitter won't allow us to have a G. You can also join the subreddit for Smashing Security. And don't forget, if you want to be sure never to miss another episode, subscribe in your favorite podcast app such as Apple Podcasts, Spotify, or Pocket Casts.


CAROLE THERIAULT. A few announcements first. You each have a VIP invitation to our YouTube Live Smashing Security Christmas special on Thursday, 17th of December, 8 PM UK time. If our last session was anything to go by, where hundreds of you joined us, asked questions, and made friends with other Smashing Security listeners, it was just awesome. And if our plan for this one on December 17th comes together, it's going to be a YouTube sesh to remember. We really hope to see you there, guys, because we need to see this shitshow of a year out in style. And remember, Patreon supporters: any support we receive via Patreon during the month of December 2020 will go directly to our local food bank. And can I urge you all to look at your communities to see how you can help bring a little joy this season to those who are having a hard time? There are some awful stories out there. Lastly, a huge shout out to this week's Smashing Security sponsors: Deep Instinct, Sophos, and Bitdefender. For details on past episodes, sponsorship details, or how to join our Patreon community, check out SmashingSecurity.com. Plus, you'll find all the details for how you can get in touch with us.


GRAHAM CLULEY. Until next time, cheerio, bye-bye. Bye. Bye-bye.


CAROLE THERIAULT. Do you think anyone's going to show up on December 17th, Graham?


GRAHAM CLULEY. What, on to our live show? Well, hopefully Mark will, and, and a chicken.


CAROLE THERIAULT. Are you thinking of giving it a pass, Graham?


GRAHAM CLULEY. No, I'm going to be there if my internet connection holds up. Be interesting to see.


CAROLE THERIAULT. Sorry, I couldn't quite make that out.

-- TRANSCRIPT ENDS --