Listen early, and ad-free!

206: Robo dogs, deepfakes and dirty deceptions - with Tim Harford


Author and broadcaster Tim Harford joins us as we discuss the merits of robotic canine security guards, deepfakes, and the curious tale of an art forgery.

All this and much much more is discussed in the latest edition of the "Smashing Security" podcast by computer security veterans Graham Cluley and Carole Theriault.

And don't miss our special featured interview with James Moore from CultureAI.

Visit https://www.smashingsecurity.com/206 to check out this episode’s show notes and episode links.

Follow the show on Twitter at @SmashinSecurity, or on the Smashing Security subreddit, or visit our website for more episodes.

Remember: Subscribe on Apple Podcasts, or your favourite podcast app, to catch all of the episodes as they go live. Thanks for listening!

Warning: This podcast may contain nuts, adult themes, and rude language.

Theme tune: "Vinyl Memories" by Mikael Manvelyan.

Assorted sound effects: AudioBlocks.

Special Guests: James Moore and Tim Harford.


Transcript

This transcript was generated automatically, and has not been manually verified. It may contain errors and omissions. In particular, speaker labels, proper nouns, and attributions may be incorrect. Treat it as a helpful guide rather than a verbatim record — for the real thing, give the episode a listen.



CAROLE THERIAULT. Hey everybody, Carole Theriault here. This is our little moment to say thank you, Patreon supporters, for helping us give this show to everybody for free. Shout out this week goes to Mansui Dejean, Jacob Lofgren, Alexander Hoogerhuis, Donald Wilson, David Warren, Shelter, Herman A., Emily Lau, and special mention goes to Jan Torkinton, Ask Your Husband Why, also Heartful Dodger. Thank you very much. If you want to join this amazing group of Patreon supporters, go to smashingsecurity.com/patreon. Now let's get this show on the road.


TIM HARFORD. I know they're robots. They look a little bit like dogs, but you know what they look more like to me is, well, you guys may not know this. Did you ever watch the children's television program Willo the Wisp?


GRAHAM CLULEY. Oh, it looks a bit like the Moog.


TIM HARFORD. Looks like the Moog.


GRAHAM CLULEY. Who—


TIM HARFORD. doo doo doo doo doo.


GRAHAM CLULEY. Yeah. Copyright, Graham.


ROBOT. Smashing Security, Episode 206: Robo dogs, deepfakes and dirty deceptions, with Carole Theriault and Graham Cluley.


GRAHAM CLULEY. Hello, hello, and welcome to Smashing Security, Episode 206. My name's Graham Cluley.


CAROLE THERIAULT. I'm Carole Theriault.


GRAHAM CLULEY. And we are joined this week by a special guest, someone who hasn't been on the show before, but may well be known to many of our listeners. It's Financial Times columnist Tim Harford.


TIM HARFORD. Hello.


GRAHAM CLULEY. Tim, hello. Welcome to the show.


TIM HARFORD. Thank you very much. It's a delight to be on the program and bluff as though I know something about security. Terrific.


CAROLE THERIAULT. It's so thrilling to have you here because you— I mean, I'm a podcast lover and you do a lot of radio work as well as being a columnist for the Financial Times.


TIM HARFORD. I do, yeah.


TIM HARFORD. Yes, I've got an American podcast with Pushkin Industries, the empire of Malcolm Gladwell himself. That podcast is called Cautionary Tales. It's all about things going wrong, mishaps, catastrophes, fiascos, some of them hilarious, some of them very, very not hilarious. But in each case, the idea is there's some geeky lesson. There's something to be learned from the stories of disaster. And some of the disasters are, I think, security adjacent, so conmen and forgers and that sort of thing that I think are potentially of interest.


GRAHAM CLULEY. And many of our listeners as well may know you from the Radio 4 show More or Less, of course, where you dig into statistics and try to find the truth in the numbers.


TIM HARFORD. Yes, indeed. I have a new Radio 4 show as well called How to Vaccinate the World.


CAROLE THERIAULT. It's fantastic. I've listened to it. I just think the work you do is incredible.


TIM HARFORD. Really good. Thank you. Oh, you're so kind. You're so kind.


CAROLE THERIAULT. No, but I mean it. I'm kind of starstruck that you're here.


TIM HARFORD. Well, I mean, I'm here for a very good reason, which is that I have a book out, and therefore I'm appearing on as many podcasts of quality as possible.


CAROLE THERIAULT. Tell us about your new book.


TIM HARFORD. The book is called How to Make the World Add Up. It is a guide to thinking clearly about the world, and my argument is that one of the things we need in order to think clearly about the world is numbers, good solid data. But another thing that we need is to get a handle on our own filters and biases and mental shortcuts. So that's what the book's about, and there is a story in it that I think is relevant to my pick this week, and there's also a story in it that's relevant to your pick, Carole. So exciting! That'll be an excuse to talk about the book every three minutes or so.


CAROLE THERIAULT. Excellent.


GRAHAM CLULEY. Well, Carole, what is coming up on the show this week?


CAROLE THERIAULT. Well, first, let's thank this week's sponsors, CultureAI and LastPass. Your support helps us give you this show for free. Now coming up on today's show, Graham turns his interest to an Air Force base in Florida with an unusual security system. Tim will tell us of a notorious forger. And I have a tricky misinformation dilemma for us all to contemplate. And we have a featured interview with James Moore, the CEO of CultureAI. All this and much more coming up on this episode of Smashing Security.


GRAHAM CLULEY. Chums, I don't know whether you've ever found yourself in the unusual position of breaking into a military base. Have either of you ever done that?


TIM HARFORD. I couldn't possibly comment.


CAROLE THERIAULT. Well, yes. Yeah, exactly. Just last week, Graham. Just last week.


TIM HARFORD. Yeah.


GRAHAM CLULEY. Dressed up as a ninja, scaled the walls in order to steal the microfilm. Well, if you were to do that in Florida today, then you might find yourself in something of a sticky pickle, because there is an Air Force base in Florida which has hired some new security guards to patrol its facility. And they're not using humans. They're not even using geese. They are using robotic dogs.


CAROLE THERIAULT. How long before— robotic dogs. How long before they have robotic seagulls? They're way worse than dogs because they can fly and shit on you from above, right?


GRAHAM CLULEY. It's a good point, Carole, as to why on earth would you choose a dog? Why is the dog the perfect form of—


CAROLE THERIAULT. He's man's, he's supposed to be man's, he's human's best friend, is he not? Is that how he's—


GRAHAM CLULEY. I think he's man's second best friend, the dog. Oh really? We've just learned what his best friend is. But yeah, it is an interesting choice, isn't it? Because if you were to try and protect something with an animal, I'm not sure a dog is the first thing I would think of when making a robot. I would think of maybe something like an alligator. Or a rhinoceros? Much more terrifying, I would say, than a dog.


CAROLE THERIAULT. Well, if you live in a swamp, maybe, then you'd have the environment for that said alligator to get around.


GRAHAM CLULEY. It is Tyndall Air Force Base in Florida.


CAROLE THERIAULT. You live in Oxford, dude.


GRAHAM CLULEY. Well, yep, well— Hey, an alligator or crocodile in Oxford is much scarier, I think, than its natural habitat. It's gonna be pretty upset.


TIM HARFORD. It would scare you off. Are they dogs though? Because I know they're robots. They look a little bit like dogs, but you know what they look more like to me is... Well, you guys may not know this. Did you ever watch the children's television program Willo the Wisp?


GRAHAM CLULEY. Oh, it looks a bit like the Moog.


TIM HARFORD. The Moog. It looks like the Moog.


GRAHAM CLULEY. Doo doo doo doo doo doo doo doo. Copyright, Graham.


TIM HARFORD. Yeah.


GRAHAM CLULEY. Willo the Wisp with Kenneth Williams, Carole. You have missed out on something culturally.


CAROLE THERIAULT. I will look it up. Okay.


TIM HARFORD. They look like the Moog because they basically don't have necks; they don't even really have heads. And the Moog didn't have a head. He just had a sort of a face on the end of his body, if I remember rightly. And these creatures are just like robo-Moogs. But substantially less cute than the Moog.


GRAHAM CLULEY. I think you're onto something.


CAROLE THERIAULT. I'm kind of surprised you led with dogs when they're actually beheaded, you know? Well—


GRAHAM CLULEY. Which normally makes dogs considerably less scary.


CAROLE THERIAULT. They don't even have a tail.


GRAHAM CLULEY. They've got a red light bulb at their back. I think that's like their—


CAROLE THERIAULT. Oh, okay, right.


GRAHAM CLULEY. Stop something driving into them. So it's a bit like a baboon, I suppose. But when I think of a dog, I think of something like a Rottweiler or one of those bull terrier things, you know, which is basically a chainsaw controlled by something which has a brain the size of a walnut. And I think that's kind of terrifying, isn't it? That kind of dog. Anyway, let's get back to the point. Tyndall Air Force Base in Florida. They are one of the first bases to incorporate these semi-autonomous robot dogs into their arsenal. These mechanical pooches have been developed by a couple of companies. Ghost Robotics are doing the hardware. Another company is doing the augmented reality. And that company's called Immersive Wisdom. Because what happens is you put your pooch out—


CAROLE THERIAULT. Immersive Wisdom is the company name?


TIM HARFORD. Wow.


GRAHAM CLULEY. That is the company's name.


CAROLE THERIAULT. I like that.


GRAHAM CLULEY. So, the idea is they want to free up security officers so that they don't have to patrol the grounds. But these dogs, if we're gonna call them dogs, have 360-degree cameras on them, and they can be monitored remotely by people wearing those sort of VR headsets. So they can see everything that the dog can see. And they can look around. It's almost like they've dressed up as the dog and are going around on all fours.


CAROLE THERIAULT. So it's basically CCTV that can walk around and jump around and piss on lampposts.


GRAHAM CLULEY. I don't know if it has the ability to expel lubricant like that or not. There are some things that they can do. So the dog's driver, the real human, can use the speaker built into this robot dog to talk to any intruders and say, 'What on earth are you doing here? Should you really be in here?' But it seems to me that that's somewhat inefficient, because if you see on your camera and identify that someone shouldn't be there, isn't the natural next step, rather than sending humans in to deal with this person, who may well run away or put themselves somewhere which the dog or the alligator will find difficult to get to, wouldn't it be better, and I can see this happening in the future, to equip these dogs with tasers or something like that instead, which the security guards could operate?


TIM HARFORD. It's just a matter of time, isn't it?


GRAHAM CLULEY. It is.


CAROLE THERIAULT. Or a chainsaw mouth. Right. Or—


GRAHAM CLULEY. Right? Some sort of armaments.


CAROLE THERIAULT. Yeah, just take the leg off the person in front of you that shouldn't be where they are.


TIM HARFORD. Or just a bear trap. Isn't the monster in Gummi Bears a robo-dog with a bear-trap face? That's what we want.


GRAHAM CLULEY. Oh my goodness. Yes.


TIM HARFORD. But no, but Carole, you're right. It's just, it does seem to be just a way of getting a camera to move around. And as such, you wonder, you know, why can't they just put it in a drone? Can't you just have several cameras and put them on posts? I mean, it's— It's a bit odd.


CAROLE THERIAULT. Can you get off your fat, lazy butt and walk around the compound, maybe?


GRAHAM CLULEY. Well, it's probably quite big.


CAROLE THERIAULT. Okay, fair enough. Too much exercise.


GRAHAM CLULEY. You have to pay them, you have to give them a 401(k), you have to give them salary, you know, you have to look after your staff and all the rest of it. Robot dogs, provided they're charged. Apparently they have a range of about 7 miles before they have to return to their kennel to be recharged. But maybe over time they're thinking this actually will be a money saver.


CAROLE THERIAULT. 7 miles. So that's like, what, about 2 hours of walking? I guess.


GRAHAM CLULEY. So I was thinking about this and I was thinking, well, would I be put off by this if I was a criminal breaking into a military establishment?


CAROLE THERIAULT. You wouldn't be able to put steaks in your pocket to distract them, though.


GRAHAM CLULEY. And a seed ball.


CAROLE THERIAULT. Right? There's a lot of things that people need to rethink about how they're going to get to the air force base after this.


GRAHAM CLULEY. Well, I'm thinking maybe what I need is a robot cat doing my dirty work for me. And the cat can snoop around and spy and chomp through wires or pee on electricals, whatever I want it to do, or take photographs of the secret plans or the plane that they don't want photographed.


CAROLE THERIAULT. When are you planning to start working on this cat of yours?


GRAHAM CLULEY. Well, I've got a bit of time on my hands at the moment under lockdown, so potentially I could.


TIM HARFORD. I'm going to start with a robot mouse that will tease your robot cat and kind of hit it with irons and ironing boards and just generally get up to all kinds of tomfoolery.


GRAHAM CLULEY. And that would also disable my robotic elephants, which I was planning, which would be terrified and jump on the table.


CAROLE THERIAULT. I'm just gonna have the robo killer hornets and you guys are all screwed.


GRAHAM CLULEY. So I think there's a number of concerns here. One is, why is this just for surveillance? Surely, especially it being a military base, they're going to at some point strap on some kind of missile or something.


CAROLE THERIAULT. Water gun. Let's start with a water gun, right?


GRAHAM CLULEY. A Super Soaker, something like that.


CAROLE THERIAULT. Then a Nerf ball.


GRAHAM CLULEY. These dogs apparently— there are some amazing videos and news reports of these dogs in action. They've even got pictures of them sort of rolling on their backs and being tickled on their tummies, and some of the army officers are sort of patting them like they're a dog.


CAROLE THERIAULT. And it's like, they're headless machines.


GRAHAM CLULEY. Yeah, well, don't you think— don't you think it's interesting that they mimic animals? Do you think that makes it more unsettling or less?


CAROLE THERIAULT. Yes, I, I— okay, so you know that even in old age homes they've been trialling robotic plush toys, effectively, to try and make some people feel less lonely, and it's worked a treat. I think there was one in Japan, it was a seal, and they would give it to the people in the home and they loved the seal, you know, and they would share it around amongst themselves. So I think a face helps you understand it as a being, and I think it confuses the brain a bit, you know, when it has big eyes and it's looking at you. So in a way, maybe it's better that it doesn't have a face. It's not pretending to be anything other than a machine, a CCTV camera on four legs.


TIM HARFORD. Yeah. I mean, it looks a bit like a server rack, doesn't it? It's kind of, you know, a sort of box. It's very utilitarian, and it's very eerie indeed, the way that it moves. Very unsettling. I would run.


GRAHAM CLULEY. So it's not just the military who are beginning to use robot dogs. There's a Norwegian oil company which has just put some robot dogs on to patrol its ships in the Norwegian Sea. Why?


CAROLE THERIAULT. 'Cause people come over and steal oil?


GRAHAM CLULEY. Well, no, I don't think it's necessarily to stop pirates and things like that, especially in the Norwegian Sea. I think it's more about places that are dangerous, where they don't necessarily want humans working. But if they had a device popping around, visiting different things and seeing if anything bad was happening, then that maybe is a better idea. In Japan, they've been really worried about wild bears. So I heard a story earlier this week about—


CAROLE THERIAULT. Wild bears, as opposed to all the tame ones?


GRAHAM CLULEY. Well, so apparently the bears are really pissed off, Carole, because there's a lack of acorns and nuts, which they're normally scoffing to fill their bellies. There's been a real dearth of those lately. So they've been venturing closer to humanity and into farms. And so there is now a robotic monster wolf, which is scaring away the bears. I'll put a little link in the show notes so people can check it out.


TIM HARFORD. I wish you hadn't put that link. It's just, there's some things you can't unsee.


SPEAKER_03. I'm not watching.


CAROLE THERIAULT. I'm not looking.


GRAHAM CLULEY. I'm not looking. Let me describe it. If I took a couple of bicycle lamps and a rotating washing line and a Sony Walkman, and an old fur coat and mangled them together.


CAROLE THERIAULT. You look like my husband. He won't listen, right?


GRAHAM CLULEY. That's what the monster wolf is like. Anyway, I get the feeling that we're going to see more of this. And I don't know, it doesn't— on this show, we do tend to be— well, I tend to be a bit of an old fogey. I don't really like technology. And this sort of scares me a bit.


CAROLE THERIAULT. You don't think you're sounding a little conspiracy theory on this?


GRAHAM CLULEY. No, I just don't like—


CAROLE THERIAULT. Do you think people should lose sleep over this?


GRAHAM CLULEY. The world's just changing too quickly for me, Carole. I'm getting confused by things. Worried where it's all gonna end up. I'm not sure.


CAROLE THERIAULT. We'll have this conversation again tomorrow, don't worry.


GRAHAM CLULEY. And I'll have forgotten I did today. Tim, what story have you got for us this week?


TIM HARFORD. Well, this is a story that has fascinated me since I first heard it. And it's in chapter 1 of my book, How to Make the World Add Up. And I'm gonna make a Cautionary Tales podcast about it, for those people who want to subscribe. The story begins in the 1930s in Monaco, where a charming Dutch lawyer called Gerard Boon shows a painting to the world's leading art critic, a gentleman called Abraham Bredius, who is in his 80s and is nobody's fool. He has debunked many a forged artwork. He is an expert on Rembrandt and an expert on Vermeer, and Gerard Boon shows him this painting and says, 'We think it might be a Vermeer. Can I have your opinion?' And Bredius is completely spellbound by this painting. He writes a piece for the Burlington Magazine, the art magazine, saying, 'When I first saw this work, I had difficulty controlling my emotion. It is not only a Vermeer, it is Vermeer's greatest work.' And anyway, well, you can see where this is going. It wasn't a Vermeer. It was a rotten fake. It doesn't even look like a Vermeer. That's the weird thing. You look at it and you look at a Vermeer and you go, well, I don't know much about art, but those two paintings don't look anything like each other.


GRAHAM CLULEY. Oh, really? And it also didn't even look like a Vermeer.


TIM HARFORD. It didn't look like a Vermeer. It was hardened with industrial plastic. What has fascinated me about this story, and what I think is so instructive, is how did Bredius, this incredibly well-respected, incredibly expert guy, how was he fooled by a forgery that wouldn't have fooled me and wouldn't have fooled you?


SPEAKER_03. What went wrong?


CAROLE THERIAULT. Was he fooled, or did he just get a payoff?


GRAHAM CLULEY. Did he not investigate the provenance of the paintings?


CAROLE THERIAULT. Oh, Graham, you're so excited to say that word in situ as well. I knew it was coming.


GRAHAM CLULEY. Provenance.


TIM HARFORD. I love Carole's assumption that it's all pure corruption, which is probably a good go-to. But no, what happened was, Bredius had a pet theory about Vermeer, who's quite a mysterious figure, an amazing painter, not that much known about his life. And there's a gap in Vermeer's work: he painted some early paintings, he painted some late paintings, but what was he doing in the middle of his life? Where are those paintings? Who influenced those paintings? And he'd written about this, and the forger, who was a very clever little man called Han van Meegeren, basically painted a painting that fit Bredius's preconceived ideas of what Vermeer might have been doing and who he might have been imitating. And it contained all kinds of very subtle clues that I would not notice, you would not notice, but Bredius noticed, because Bredius is the world expert.


GRAHAM CLULEY. Right.


TIM HARFORD. So for example, there's a 17th-century vase in the painting. It's a genuine antique. It's painted on a 17th-century canvas. It uses Vermeer's color palette, the pigments, the dyes, all perfect.


GRAHAM CLULEY. Yeah.


TIM HARFORD. All of these things that I wouldn't notice, but Bredius noticed. And because he was able to identify all of these little pointers, plus this was confirmation that he had been right all along, he fell for it. And once he fell for it, everybody else fell for it, because he's Abraham Bredius. And this links into the sort of social science that I talk about in the book, which basically says that if you are motivated to reach a particular conclusion, if you want to believe it, being more expert, having more knowledge, more intelligence, more information doesn't help you, because you simply deploy all of that intellectual armory to reach the conclusion you want to reach.


CAROLE THERIAULT. Yeah. And it's self-fulfilling. No, but it's self-fulfilling as well based on your education because then you can go through and you can go, "Oh, but you see, I knew that he's using the Zorn palette," or, "I knew that they were using this and I was aware of all these points, therefore it must be right." And if someone plays you at your own game, you're screwed.


GRAHAM CLULEY. Ignorance is bliss.


CAROLE THERIAULT. Yeah, well, Graham, you should be blissful.


TIM HARFORD. Ignorance is bliss. There is a sort of social science literature on this, which I describe in the book, that gives people the task of evaluating certain political arguments on hot-button issues like abortion or same-sex marriage or gun control, things that Americans have very, very strong views about. And basically, people who have more knowledge about politics are more subject to biases in their reasoning. They find it easier to generate ideas that support their own conclusions, harder to generate ideas that support opposing conclusions, because the whole cognitive arsenal is being focused on reaching the conclusion you want to reach. So it's not just about technical expertise. Thinking clearly is about noticing your own emotional reaction. And Bredius even said, 'Oh, I had difficulty overcoming my emotions.' He more or less said, 'It doesn't look anything like a Vermeer, but it's as great as Vermeer. I know it must be. It must be.' It was incredible.


SPEAKER_03. I love it.


GRAHAM CLULEY. But hang on, Tim, hang on, because you're telling us this story of this chap, Han van Meegeren.


TIM HARFORD. Yes.


GRAHAM CLULEY. Forger. How do we know about this? How did he get found out? Which presumably he—


CAROLE THERIAULT. Well, someone did a test, and they found plastic.


TIM HARFORD. No, no, no, it's better than that. It's an amazing story. So Van Meegeren, this all went down in the late '30s. Van Meegeren was arrested at the end of the Second World War.


CAROLE THERIAULT. Okay.


TIM HARFORD. He lived in this mansion in Amsterdam funded by all of these fake Vermeers because he produced, I mean, tens of millions of dollars worth of these things. 'Cause once you've done one, you can produce all these others that look similar.


CAROLE THERIAULT. Well, they've got the seal of approval from the art critic du jour.


TIM HARFORD. Absolutely, absolutely. And he was arrested by two officers from the Allied forces. The war was coming to an end and they said, 'Well, Mr. van Meegeren, it's very awkward. We have found this treasure trove of stolen Nazi art, the collection of Hermann Göring, Hitler's right-hand man, and it includes a Vermeer. And the Germans, being Germans, kept the receipts, and they say they bought it from you.' And so Van Meegeren was up for treason. He could have been hanged for that. And so he had to prove that in fact he had forged it, rather than obtained it in some other way, stolen it and sold it to the Nazis.


GRAHAM CLULEY. 'Oh yeah, I was just conning the Nazis.


TIM HARFORD. I'm one of the good guys.' That's what he said. So he was able to paint himself as this kind of Robin Hood figure. The Dutch were sick of the war, they were sick of collaborators, they were ashamed. Anne Frank wasn't the only Jew who was shipped out of the Netherlands to the extermination camps. People just wanted a hero. And here's Van Meegeren, and he's kind of done one over on Hermann Göring. Actually, when you look at the evidence, he was probably a Nazi; he was certainly very friendly with Nazis, producing all kinds of antisemitic work, and just a really nasty character. But when he died, he was the most popular man in the Netherlands, other than the prime minister, who bizarrely was extremely popular as well. He was a folk hero, because not only did he sell all these fake Vermeers, but he then sold the Dutch people the story of this guy who poked Hermann Göring in the eye. And people would rather have believed that than the truth, which is that he was a really nasty piece of work.


SPEAKER_03. Wow.


CAROLE THERIAULT. That is an incredible story, Tim.


TIM HARFORD. It's—


GRAHAM CLULEY. yeah. And you can read more about it in How to Make the World Add Up.


TIM HARFORD. The book contains other stories.


GRAHAM CLULEY. So the real message here is, you know, even though you might be an expert in a particular topic, and a lot of people who listen to this show know all about computer security, for instance.


CAROLE THERIAULT. Yeah, Graham.


GRAHAM CLULEY. Oh, well, might be. But you can still be fooled. If you read something which ticks your boxes or confirms some beliefs you already have, then you can be easily lured into thinking you're seeing what they want you to see.


TIM HARFORD. Absolutely. The subtitle of the book is "Ten Rules for Thinking Differently About Numbers." And this is rule number 1. And rule number 1 is: notice your emotional reaction. Whenever we see a claim on social media, or we see a newspaper headline, very often we'll have an emotional reaction. We'll be like, oh, that can't be true, or, oh, this proves I was right. And what I'm saying is, you can't overcome that reaction, and you shouldn't be trying to suppress your emotions, but you should notice them. And I think if Bredius had been a little bit more aware of his own state of excitement, he might have noticed that and thought, hang on a minute, maybe I need to calm down. And of course, we all know that some security exploits, you know, the— I'm not sure what you call it— the human factors hacking, you know, where you're— what do you call that?


GRAHAM CLULEY. Social engineering.


TIM HARFORD. Social engineering, yes. That is all about understanding people's emotions: getting people to feel they need to make a decision in a hurry, or getting people to feel really comfortable. Manipulating people's emotions is a great way to get them to do something that they will later regret.


GRAHAM CLULEY. Well, Carole, talking of things we might regret, let's go straight to your story right now. Let's hear more.


CAROLE THERIAULT. What have we got for us? You want to try that one?


GRAHAM CLULEY. What have you got for us, Carole?


CAROLE THERIAULT. Okay, so I am very, very pleased that, Tim, you're here because Graham, I rarely admit this publicly, but Graham, you're a smart guy, right? And Tim, you're obviously a very, very smart guy. And I know that just from listening to More or Less and being a diehard fan. So, I have a dilemma for us all to noodle on. As this butthole of a year nears a close, we are all looking at 2021 with, I don't know, I'd say, for me, incredible hope. I don't know if you guys have some diehard wishes for the next year that you're kind of praying come true. Like—


GRAHAM CLULEY. One of my wishes is that my butthole doesn't close, Carole. It was a strange sort of— image which you gave me there. I would rather that—


CAROLE THERIAULT. Butthole of a year.


GRAHAM CLULEY. I would rather that, yes, I would rather that 2020 was expelled. Yeah. Would leave.


CAROLE THERIAULT. I pray that you get your car replacement soon so I can have my wheels back.


GRAHAM CLULEY. Oh, for goodness' sake. Okay. I've just moved house, so I might be a little bit echoey, and this is combined with no longer having access to a car. And so Carole has very kindly lent me her car.


CAROLE THERIAULT. Weeks ago.


GRAHAM CLULEY. Weeks ago.


TIM HARFORD. Weeks ago.


CAROLE THERIAULT. Yes.


GRAHAM CLULEY. Well, I'm living sort of out in the wilds of Oxfordshire.


CAROLE THERIAULT. Yeah, anyway, I can't wait to get it back, right? That's something I want. Okay, great. Very sooner than 2021.


GRAHAM CLULEY. Yeah, it'll be back soon.


CAROLE THERIAULT. Thank you. You know, we want a sharp drop in political upheaval. You know, wouldn't it be nice to have a side-effect-free, affordable vaccine for the coronavirus pandemic?


TIM HARFORD. It's coming.


CAROLE THERIAULT. It's coming. It's coming. And maybe even a plan to tackle this onslaught of misinformation. So one vector of misinformation involves the world of computer-generated people. These are people that have never existed in real life. And so question number one: are these deepfakes? Because it's not an image of a real person, duping people into believing they've said something that they haven't, but it is still the image of a person.


GRAHAM CLULEY. They're still fake images, aren't they? Or video, right?


CAROLE THERIAULT. Yeah.


GRAHAM CLULEY. I think it's fair enough to call them deepfakes.


TIM HARFORD. Because they are quite strikingly convincing.


CAROLE THERIAULT. They are. So, I mean, I just think whether it's trying to pretend to be Jeff Goldblum, or it's just a pretty face selling bitcoin or perfume, the idea is that you identify with that person, right? That person's helping you believe something or buy something or do something. They're often used by organizations to help us, you know, get things done. And so one question is, how is that really different from hiring an actor to, you know, sell your chocolate bar or sell your newspapers? Is this worse, using these non-people people?


TIM HARFORD. We can do it much cheaper and at scale, which I suppose changes— I think it was one of those things that Stalin never said but is supposed to have said, that quantity has a quality all of its own. So just the fact that you can mass-produce these images that seem to be people is, I'm sure, something that can be worked out.


GRAHAM CLULEY. It's certainly worse for the actors as well, isn't it? I mean, it puts them out of work.


CAROLE THERIAULT. You can't stand there and smile and hold a yogurt.


GRAHAM CLULEY. If you're being a spokesmodel.


CAROLE THERIAULT. There are even businesses selling fake people. Okay, this was in the New York Times, an article by Kashmir Hill and Jeremy White. So, quote: on the website Generated Photos, you can buy a unique, worry-free fake person for $2.99, or 1,000 people for $1,000. If you just need a couple of fake people, for characters in a video game, or to make your company website appear more diverse, you can get their photos for free on thispersondoesnotexist.com. Hey, and if you want your fake person animated, a company called Rosebud AI can do that and even make them talk.


TIM HARFORD. So the idea is that you go, "Oh, our company is just a bunch of white guys. Can we have some brown faces? Can we have some women in there? But we don't actually have them, so we're just going to fake them." So this is—


CAROLE THERIAULT. I wanted to give you this. So the New York Times have this interactive tool, which is quite fun. But I think it'll just show you how far we've come in, what, two years on this front? So if you click on that link—


GRAHAM CLULEY. This is the link in the show notes.


CAROLE THERIAULT. Yeah, the link in the show notes. So if you scroll down, you'll see a number of faces and you'll see this scroll bar. You can change genders, you can change race and ethnicity, you can change a person's perspective (where they're looking in the picture), their mood, their age, their eyes. It's shocking.


GRAHAM CLULEY. Oh, this mood thing is— oh, that'd be quite handy in real life, actually. Yes. Sometimes— stop scowling at me. I don't like being—


CAROLE THERIAULT. Next time we're on YouTube.


TIM HARFORD. It also has some clever advice as to weird ways to spot the fakes. So for example, there's a guy I'm looking at who looks very convincing, except that one hinge on his spectacles is different from the hinge on the other side of his spectacles. And there's a lady with two odd earrings. And it's that sort of thing that—


GRAHAM CLULEY. Yeah.


TIM HARFORD. Yeah. The kind of deep learning, as I— I'm just an economist, what do I know? But the way that the algorithm—


CAROLE THERIAULT. We're just podcasters, we don't know anything either.


GRAHAM CLULEY. I would quite like to upload my own photograph here and then be able to change my age, my eyes, my mood, because quite often, you know—


CAROLE THERIAULT. What, to be what?


GRAHAM CLULEY. Well, you know, it's like—


CAROLE THERIAULT. What would you go for?


GRAHAM CLULEY. I'd quite like to have, can I have a photograph where I'm actually smiling nicely and then I could adjust the dial or if I could change, you know, if I could take off a couple of years or something or make my eyes slightly larger. Then it might be quite— my eyebrows slightly less bushy. That'd be quite— it would be quite a fun thing to do. And oh my goodness, I can change my gender. Look at that.


CAROLE THERIAULT. You could just go get plastic surgery. That already exists. If you're really concerned about these things.


GRAHAM CLULEY. I thought it'd be easier with a scroll.


TIM HARFORD. There is a really, really easy way to take a few years off, which is to use an old photo. This is what we journalists do. It's really not that hard.


CAROLE THERIAULT. Okay, I'm gonna pivot here. I'm gonna pivot here. Have you guys heard of the term "the liar's dividend"?


TIM HARFORD. It rings a vague bell, but remind us.


CAROLE THERIAULT. The gist of it is this, okay? So the mere existence of a conspiracy theory, say, gives more credibility for the believers to cling to. It's a hard sentence to say. But where it's unclear what's real and what's fake, the fact that people are simply aware that there's misinformation floating around actually benefits those who create and spread fake information.


GRAHAM CLULEY. Give us an example, Carole. Dream up a scenario and then we'll ask you.


CAROLE THERIAULT. Okay, okay, okay. So let's say I was talking to someone and they were saying that the royal family were blood-drinking, flesh-eating, shape-shifting extraterrestrial reptilian things in human form.


GRAHAM CLULEY. Yeah, but what would be the conspiracy theory in that case?


CAROLE THERIAULT. And let's say I question that idea, like, "Really?" Right? Or something like that. The liar's dividend says that I'm actually attributing more credibility by simply being aware of the concept of this lizard elite conspiracy theory.


GRAHAM CLULEY. I see. Just the mere existence of this crazy theory.


CAROLE THERIAULT. That I know about it, and that I go, "I don't believe that," because you can't believe your eyes anymore. We just saw that in the New York Times article, right? We can't believe our eyes. We don't know if those are real people or not real people.


GRAHAM CLULEY. So no one is going to think, for instance, that, I don't know, the Prince of Wales has one of those space hoppers tucked up at the front of his shirt and has done for the last 14 years, because no one has ever heard that theory from me before. But once it's been said, then it becomes a little bit more believable, or similar conspiracy theories might be believable. That's the sort of principle of what you're saying, the liar's dividend.


TIM HARFORD. I don't know if it's related to something that worries me or whether it's a subtly different point, but what worries me is not so much that you will be fooled by a deepfake. It's that everybody knows deepfakes exist, which means you can now do something on film and then plausibly claim that it wasn't you, it was a deepfake. So you think about the Access Hollywood tape that came out just before the 2016 presidential election and seemed certain to doom Donald Trump's chances of getting the presidency, remember?


CAROLE THERIAULT. And the pussy-grabbing one?


GRAHAM CLULEY. Yeah, he might have become president if it hadn't been for that, wasn't it? That would really—


TIM HARFORD. Yeah, Donald Trump as president for four years if it hadn't been for the release of that tape. But if that came out now, Trump would just be able to say, "That's not my voice on the tape, it's fake news." At the time, you know, there wasn't enough currency around the idea that you could fake an audio recording. I mean, you can fake an audio recording, you can fake photos. But it's not so much, "Oh, people will be fooled by these deepfakes." It's the idea that people won't believe things that they should believe, because the deepfakes create deniability.

And even before we get to Van Meegeren in my book, the introduction talks about a very famous statistical book called How to Lie with Statistics, probably the most famous book about statistics ever written. It's a very witty debunking of all kinds of statistical misinformation and all the different ways that people will fool you. The argument I make is that this might not actually be that helpful, even though everything this guy Darrell Huff, the author of the book, is saying is correct. The fact that all the emphasis is on misinformation, with no acknowledgement that you might use statistics to actually figure something out or tell something true about the world, that's corrosive.

And in fact, Darrell Huff ended up using stories from his book to shill for big tobacco and to attack the epidemiologists who were arguing that smoking is quite likely to give you lung cancer. He deployed the same ideas from his book to say, well, you can't really believe all these medical statisticians, we've had enough of experts. It took us to a very, very dark place. And I think the deepfakes are a similar thing. It's not that we'll believe stuff we shouldn't; it's that we'll refuse to believe stuff that we should.


CAROLE THERIAULT. Exactly. Perfect segue. So this is where, in my view, things get a little sticky. There are experts like you, Graham, and you, Tim, and academics and technologists and journalists all around the world, who have been advocating that the general public learn about misinformation and deepfakes, to make sure that they're forearmed, or better armed, against malicious use of these types of communications. But have we all been duped? Could it be that the more we talk about it, the more validity we give to nonsense, because we're basically saying it exists?


TIM HARFORD. Yep.


GRAHAM CLULEY. Yep.


TIM HARFORD. I think that's something I worry about a lot.


GRAHAM CLULEY. So well done for talking about it on the podcast, Carole.


CAROLE THERIAULT. Sorry, was that you telling me how smart I was? Sorry, I didn't hear that, Graham.


GRAHAM CLULEY. I said, well done for talking about this on the podcast.


CAROLE THERIAULT. Well, it just seems to me if people can't believe their eyes anymore, maybe it comes down to who believes it more, right? Whoever believes more is the winner of whatever said argument.


TIM HARFORD. So for me, I think it comes down to people have to be willing to put in that little bit of extra work, to show a little bit more curiosity, not just reflexively believe or disbelieve the first thing they see, to retweet, to like, to share based on their emotional affiliation with what they're seeing. They have to go, hang on, what's going on behind this? And ask a few extra questions, get a second opinion. And if we're not really interested enough in the world to do that, then we've got problems.


CAROLE THERIAULT. But I don't know if you've seen people in these kinds of emotional fervours. I totally understand what you mean. I have seen people that I would say are categorically very sound-mind, sound-reasoning people. And when they're caught by the bug, it is really hard. I mean, when they show me something and I go and do just a tiny bit of Googling, I can find a debunking immediately. And these are smart people who I think under normal circumstances would go and double-check. But somehow there's been some pre— like maybe the person who said it has been pre-vetted by them as someone worthy of trust or something. There's something weird that happens, but it's very frightening. I've seen it in my own circles and it's shocking. And it may be happening to me. That's the other thing. Like, how do I know? I'm an emotional being.


TIM HARFORD. Yeah, no, we're all emotional.


GRAHAM CLULEY. Maybe you need someone, Carole, all the time to make you question yourself. Someone who will say, are you sure about that, Carole? Are you sure you've got that right? Well, I just think I just need to believe more.


CAROLE THERIAULT. So I just really, really believe I'm the funniest person here, Graham.


CAROLE THERIAULT. I'm really, really funny.


CAROLE THERIAULT. I'm funnier, funnier than you. Definitely, definitely. I believe from the bottom of my heart. I believe it.


GRAHAM CLULEY. I believe it. I believe it. Security training sucks. It's boring. Users hate it. They aren't paying attention. It doesn't work. For security training to actually work, you'd have to find out what each person in the company is doing that's risky, send them phishing emails, monitor logs, check for passwords in Have I Been Pwned, and then you'd have to train them in a way that doesn't send them to sleep, and try and track what they're doing to see if it worked. Who's got time for any of that?


CAROLE THERIAULT. Culture AI do.


GRAHAM CLULEY. What?


CAROLE THERIAULT. Culture AI. They make this amazing software that plugs into your company, runs your phishing campaigns, integrates with Slack, tests if your users accept phony MFA requests, that's a biggie, and pulls in tons of other behavioral metrics from your existing apps. It basically figures out what everyone needs to know and then creates personalized training that is not boring. And it even checks that it's working and it's all done automagically. And they've got a deal just for our listeners. Sign up at culture.ai/smashing and your first 50 employees are free for life.


GRAHAM CLULEY. Cool.


CAROLE THERIAULT. More information, culture.ai/smashing. Stop your whining, Graham.


GRAHAM CLULEY. This episode of Smashing Security is sponsored by LastPass. Now, everyone knows about LastPass's password manager for end users, but it's also a great solution for businesses. In fact, tens of thousands of companies rely upon LastPass to protect themselves. LastPass Enterprise simplifies password management for companies of all sizes and helps you secure your workforce. So whatever the size of your business, go and check it out. Go and visit lastpass.com/smashing to find out more. And thanks to LastPass for supporting the show. And welcome back. Can you join us on our favorite part of the show? The part of the show that we like to call Pick of the Week.


TIM HARFORD. Pick of the Week.


CAROLE THERIAULT. Pick of the Week.


GRAHAM CLULEY. Pick of the Week is the part of the show where everyone chooses something they like. Could be a funny story, a book that they've read, a TV show, a movie, a record, a podcast, a website, or an app. Whatever they wish. It doesn't have to be security-related necessarily.


CAROLE THERIAULT. Better not be.


GRAHAM CLULEY. And my pick of the week this week is not security-related. I discovered that Ravensbourne University in London, who I think are based in Greenwich, have done something rather remarkable in coordination with the BBC: they have created something called the BBC Motion Graphics Archive. And you may be wondering, well, what is the BBC Motion Graphics Archive?


TIM HARFORD. I am wondering.


GRAHAM CLULEY. Well, it is an online resource where you can look up TV title sequences that you may have long forgotten, and a few you may remember, dating back to the 1940s up to the present day. The thing which seems to connect all of these title sequences is that they largely involve some sort of graphical element. So I've done a quick perusal, and there are some marvelous old things, things which you won't find on YouTube, but which you're able to download.


CAROLE THERIAULT. No, it's really cool actually, Graham.


GRAHAM CLULEY. And quite often they have quite a lot of detail regarding the thinking behind the title sequence and the design. I found this quite enjoyable, so I looked up all kinds of shows. I found Emu's Broadcasting Company from back in the '70s; I enjoyed that. I, Claudius, one of my favorites. Discovering Portuguese wasn't a show I watched regularly, but I was interested in what I read about the thinking behind it.


CAROLE THERIAULT. This exists because people took a punt in the old days. They thought, you know what, I think Auntie: The Inside Story of the BBC is what we need to put out. Who cares what the viewership is? Let's just try. You know, Auntie's Bloomers, for example, is another one.


TIM HARFORD. Well, I love the idea that you can just see— so these are the title sequences? This is the music and whatever would've been shown at the beginning of a show?


GRAHAM CLULEY. Yeah. Love it. They're quite fun, like 20, 25 seconds or so. They're not all very long. And I love the fact that these have been preserved and now they're available digitally to everybody. And—


TIM HARFORD. So what's your favourite, Graham? What's the best title sequence ever?


GRAHAM CLULEY. Well, I'm—


CAROLE THERIAULT. The answer, Graham, is I haven't seen them all. Okay, that's the answer.


GRAHAM CLULEY. Obviously. That is a smart answer. I am a huge fan of Doctor Who, and this week was, of course, the 57th anniversary of Doctor Who. And I have to say, although I don't think it's up on this archive, the original 1963 Doctor Who title sequence, which was done in a remarkable way through a howl-around technique of having a camera pointing at its own monitor and basically picking up the feedback and the weird distortion, I think that was a remarkable title sequence for way, way back then. So I have to say Doctor Who, but there are some other crackers. I'll tell you what I found wasn't in there, though: Willo the Wisp. I just did a quick look, and it's not to be found. Oh dear. Very disappointing.


TIM HARFORD. That's a blow. That is a blow. Well, maybe they'll expand it. For me, Box of Delights, from the early '80s.


CAROLE THERIAULT. You're connected to it.


TIM HARFORD. Amazing title sequence.


GRAHAM CLULEY. Yeah. Put in a word, Tim. Put in a word. Tim, what's your pick of the week?


TIM HARFORD. My pick of the week is a New Yorker article by Cal Newport titled "The Rise and Fall of Getting Things Done." Now, I'm a bit of a productivity geek, and I like Cal Newport's books on this, particularly his book Digital Minimalism. I also like David Allen's productivity bible Getting Things Done, and so I was interested to see Cal reflecting on GTD, in The New Yorker of all places. There's lots to enjoy about the piece, but what he really gets you thinking about is: at what point did it become your problem, my problem, to be productive, versus a systemic problem? In manufacturing, being productive was regarded as a system thing. A factory has to be productive; an assembly line has to be productive; we need to get our processes all sorted. And Cal says the same is true for programming. But for a lot of knowledge work, it's all just, well, everything goes to email and we'll figure it out. It's all very ad hoc. And if that feels very stressful and everyone feels overwhelmed, that's an individual problem to sort out rather than a system problem. That's what he's questioning and getting us to try and rethink.


CAROLE THERIAULT. Because I, I mean, I don't know, I grew up certainly in a household that had to be busy all the time. You know, you were— if you weren't doing, if you weren't doing something, you were wasting time. You know, it was always, what are you doing? What are you doing now? So I don't know if that was of the time in the '70s, '80s, and it's just kind of come through a generation. But it's certainly true.


TIM HARFORD. I don't mind being busy. I tend to feel I have to be doing something useful, which is probably, you know, I probably need some kind of therapy about that. But the problem with email though is, you know, the answer to the question, what are you doing now, could always be, well, I'm just going to do some email. There's always more email. And maybe that's not really a very good way of getting stuff done.


CAROLE THERIAULT. Oh, it totally isn't. I just don't do it. Are you on social media? How have you minimized your digital sphere?


TIM HARFORD. I am not on Facebook in a serious way. I decided it was too much hassle to delete my Facebook account, and I do have a—


CAROLE THERIAULT. I'll do it for you, Tim, if you want me to.


TIM HARFORD. But I don't check Facebook more than— maybe once every 3 weeks I'll pop on for 5 minutes and pop off again, because nothing's happening. I have a sort of automatic posting, so stuff from my blog will go to a Facebook page, but I don't have anything to do with that. And Twitter, I'm on less than people might think. I'll pop up and put some links to my articles and various other things, and then they'll disappear again. I don't find it— I mean, I've got about 160,000, 170,000 Twitter followers.


CAROLE THERIAULT. Show off.


TIM HARFORD. But I don't— yeah, I don't like Twitter. And I have these kind of mixed feelings of, well, you know, I can reach the population of my hometown every time I want to. But do I want to? I think I probably want to do some real work.


CAROLE THERIAULT. Yeah, but I mean, a lot of people do those things to kind of unwind, I guess, right? But I don't think it actually unwinds anyone, really.


TIM HARFORD. Twitter certainly doesn't unwind me.


CAROLE THERIAULT. That's the irony of the whole thing. Yeah.


GRAHAM CLULEY. I've certainly put a lot of time into building filters and rules on my email client, to try and put emails which I think are less important, or from people I interact with less often, into different folders, to try and triage that kind of thing, so that I've got time to spend perusing the BBC Motion Graphics Archive. Because that's real work in my view. That's how I'm going to unwind: watching those title sequences.
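[The filter-and-triage approach Graham describes can be sketched in a few lines. This is purely illustrative: the folder names, threshold, and addresses below are invented, and real clients like Thunderbird or Gmail have their own rule engines. The idea is simply to score each sender by how often you've exchanged mail with them, and file the rare ones somewhere quieter.]

```python
# Toy sketch of sender-frequency email triage: frequent correspondents
# stay in the inbox, rare senders get filed for later.
from collections import Counter

def build_sender_counts(history):
    """Count past interactions per address (sent and received)."""
    return Counter(history)

def triage(sender, counts, threshold=5):
    """Route a new message based on how often we've seen its sender."""
    if counts[sender] >= threshold:
        return "INBOX"
    return "INBOX/Later"

# Invented interaction history for illustration.
history = ["carole@example.com"] * 12 + ["newsletter@example.com"] * 2
counts = build_sender_counts(history)

print(triage("carole@example.com", counts))      # frequent contact stays put
print(triage("newsletter@example.com", counts))  # rare sender gets filed
```

[An unseen sender counts as zero interactions, so it is filed for later by default; that matches the spirit of triaging unknown mail out of the way.]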


TIM HARFORD. That's the deep work.


GRAHAM CLULEY. Carole, what's your pick of the week?


CAROLE THERIAULT. Okay. You're going to give me some stick for this. Okay. But as some of you know, I host other podcasts, one of them being the brand spanking new Sticky Pickles, a hilarious weekly podcast. "How many weeks have you promoted Sticky Pickles on Smashing Security?" It's been 8 weeks, okay? And the whole idea is that, you know, each host drops a tangle of a situation and we try to wiggle out and find the best course of action.


GRAHAM CLULEY. Couldn't you just sponsor Smashing Security if you want to promote your podcast every single week?


CAROLE THERIAULT. I don't think so. It's half mine, babycakes. Now, Sticky has enjoyed more than 5,000 downloads, which isn't bad for a silly pandemic project, I think.


GRAHAM CLULEY. Okay, well done.


CAROLE THERIAULT. Well done. Thank you very much. But we've suffered. Sticky Pickles has suffered its very own sticky pickle.


GRAHAM CLULEY. Oh?


CAROLE THERIAULT. Because after 8 episodes, my wonderful co-host Anna Breiding had to bow out. And she said it was because she was having a baby. But it's probably because she has issues with me.


TIM HARFORD. Do you believe that?


CAROLE THERIAULT. I don't, I don't, because it's a pandemic and I can't see her. So I think it's all a lie. I feel dumped, kicked to the curb. "It's not you, it's me." Yeah, okay. Anyway, so what do I do now, right? Do I stop and let it float away into nothingness, or do I scramble like a little bug and get a shit-hot replacement? And I got someone amazing. Smashing Security favorite Maria Varmazis is my pick of the week this week. She has agreed to come in and be a co-host with me for some of the Sticky Pickles. Picking Stickles.


TIM HARFORD. Was your pick of the week just one long advert for your own podcast? Yes.


CAROLE THERIAULT. Learn from me, Tim. Learn from me.


TIM HARFORD. Oh, don't worry. I think I already did that.


CAROLE THERIAULT. I learned from you. That's what I meant to say. Anyway, this past Sunday, we recorded season 2, episode 1, which is scheduled to drop tonight, Thursday at midnight. And I just edited it and it sounds awesome. So check it out.


GRAHAM CLULEY. And your podcast is called Picking Stickles. Is that right?


CAROLE THERIAULT. Sticky Pickles. It's an excellent name for a podcast.


SPEAKER_03. Come on.


CAROLE THERIAULT. With the lovely Maria. So Maria, you are my pick of the week.


GRAHAM CLULEY. Oh wow, my goodness. Now, Carole, I heard you spend some time chatting with James Moore from Culture AI.


CAROLE THERIAULT. I did. We had a really interesting talk, so check it out. So, Mr. James Moore, CEO of Culture.AI. Welcome to Smashing Security.


SPEAKER_03. Thank you very much. Thank you.


CAROLE THERIAULT. Now, we haven't met before, though I hope one day we actually can in real life.


SPEAKER_03. It'd be nice.


CAROLE THERIAULT. As I was preparing for today, I read your bio and I really liked it. So, I want to read it out here to start this off. So, it says—


GRAHAM CLULEY. Oh, God. Oh, God.


CAROLE THERIAULT. It says, James Moore is the founder and CEO of the human-centric cybersecurity company, Culture.AI. He's allergic to traditional awareness training and has a passion for finding new ways to empower people and keep their organizations secure. Now, I've been living security awareness for donkey's years, and I am so thrilled that you're here, because you might be able to give me a fresh perspective on things. So let's go back. Let's talk about you first. What led you to actually start Culture AI?


SPEAKER_03. I started life as a pentester, right? And I think every pentester goes through this journey of realization. You start out testing web apps and then mobile apps, and then you do a bit of social engineering, and then you land your first red team job and you get in and you think, oh, that's amazing, I've got in. And then you do the next red team job and you get in again.


CAROLE THERIAULT. So the boss might want to say, let's see if you can break into our super solid defenses. And your job is to act like a bad guy and try and break in and then give them a report. Okay, got you.


SPEAKER_03. Yeah, exactly, exactly, exactly. And I think you do your 10th or your 20th or your 30th, and you kind of go, well, I get in every time. This is insane. And it doesn't matter how many blinky boxes are sitting on the network. I've had so many clients go, oh, we think you'll get captured or caught because we've got this blinky box. And it never quite works out like that. Sometimes it slows us down a little bit.


CAROLE THERIAULT. But you know what? Pentesters, they all have stories, right? You're the best dinner party guests in the old world when we were allowed to have dinner parties. You must have a good one you can share with us.


SPEAKER_03. Well, I've got a good one, which I'm going to get killed for bringing up.


CAROLE THERIAULT. Um, we won't tell, we won't tell.


SPEAKER_03. So I did a conference a while back, with several hundred people watching. I stood up live in front of an audience and said, look, every time I do a red team, it's human behavior that lets me into an organization, typically phishing. And it's normally something that people do that lets me move around that network. So I'm that confident that people typically fall for things like email phishing that I'm going to stand up in front of everybody and phish my own mother. Which, you know, I think is a little bit taboo. Doing it probably wasn't the best maneuver, and it certainly damaged our relationship for a little bit of time. But we made it look like it had come from something to do with her work, and she fell for it.

There was this really awkward moment, actually, when we launched the attack, because I had the stats up live on screen behind me. And for the first minute and a half, nothing happened. We'd sent it to about 15 people inside her company, including her, and nothing happened for about a minute and a half. So I stood there panicking, thinking, oh my God, what happens if nothing happens? Anyway, she fell for it. A few other people fell for it.

The worst part of that for me was actually not the fact she fell for it. We captured her password as part of the attack, and we masked the password on screen so no one could see what it was. And everybody wanted to know what the password was. And I'm just stood there thinking, I've got my mum's password. Do I really want to see this? It could be something terrible.


CAROLE THERIAULT. It could be like, James is a dick for trying to phish me.


SPEAKER_03. Yeah, exactly. It could have been anything. Absolutely anything. Anyway, we revealed it, and it was worse than anything. I think it was either "password" exclamation mark or "password one." It was one of the two. I know, it was so bad. But I remember the conversation with her off the back of it. I was talking to her about it, and she goes, well, what I don't think you understand, James, about this security thing is hackers would never guess that. And I said, what do you mean they're never going to guess "password exclamation mark"? She's like, well, it had a capital P. I was like, oh my God, this is not how— it's not how the world works, mother.
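[The "it had a capital P" logic is exactly what attackers bank on. Here's a minimal sketch of the rule-based mangling that password-cracking tools apply to a dictionary; the three-word wordlist and handful of rules are a tiny invented subset (real tools like Hashcat ship thousands of such rules), but it's enough to show that "Password!" falls within the first few guesses.]

```python
# Illustrative sketch of rule-based password guessing: take each
# dictionary word and apply common mangling rules (capitalise,
# append a digit or symbol). "Password!" is generated almost
# immediately, capital P and all.

WORDLIST = ["password", "letmein", "qwerty"]  # tops any real wordlist

def mangle(word):
    """Yield candidate guesses derived from one dictionary word."""
    for base in (word, word.capitalize(), word.upper()):
        yield base
        for suffix in ("1", "!", "123", "2020"):
            yield base + suffix

def candidates(wordlist):
    for word in wordlist:
        yield from mangle(word)

guesses = list(candidates(WORDLIST))
print(guesses.index("Password1"))  # within the first ten guesses
print(guesses.index("Password!"))  # ditto, out of millions a real run makes
```

[Against a real cracking run generating billions of candidates per second, both of his mum's likely passwords would fall in the first fraction of a second.]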


CAROLE THERIAULT. That is a great story. But you know what? Your mom is not alone. Listen, we all have family members that are exactly, exactly the same. So no big deal. OK, so what led you to Culture AI then from that exciting life?


SPEAKER_03. Well, so, I started out initially saying, well, I'd like to solve the email problem. Because what you just said is absolutely right: a lot of end users who aren't exposed to the security world think similarly, and rightly so. So I said, well, I want to try and fix that. Why don't we start doing simulated phishing attacks against people? So I founded a company called Phished, and I did that between 2014 and 2018. And we saw a lot of success with what we were doing, right? I think the biggest insight we got from that was that where we were able to personalize the education and the campaigns we sent to people, to those people as individuals, we got really good results at changing behavior. And I've always said that people all behave differently for different reasons, right? The reason that somebody clicks on a phishing email will differ between people. For some people it'd be an awareness thing; for some people it'd be an attitude thing. And you can break that down further.


CAROLE THERIAULT. Yeah, exactly.


SPEAKER_03. It could be anything. That really frustrated me with Phished: we got good results, but we were only focused on email phishing, and we didn't collect a huge amount of data around why people were behaving the way they were behaving. So we couldn't really tailor things enough to users. So we sold Phished in 2018 to F-Secure, right?


CAROLE THERIAULT. Great company, awesome company.


SPEAKER_03. Yeah, awesome company. I mean, they do some amazing stuff. But I took a step back and said, well, knowing what I know now, could I go back and build something a bit different? We're at a time where there are a lot of companies investing quite heavily in cloud, so there are lots of different apps being used, as well as existing infrastructure. And a lot more companies are open to this concept of doing attack simulations. So I said, well, can we not build something that aggregates all this data from lots of different sources and turns it into some kind of behavioral insight? What are the things that our employees are doing, way beyond just email phishing, that are putting the company at risk? Can we use that data to try and change behavior? Let's forget this generic security awareness training rubbish that everybody's been doing. Can we actually start to personalize training and nudges and content, and deliver it down different channels? And the answer was, yeah, we can. And we looked at it and said, why has this not been done before? But I think the problem is everybody's just wanted to push out easy, boring, generic awareness training.
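[The aggregation idea James describes, pulling security-relevant events from many sources into a per-person behavioural picture and then training each individual on what they actually struggle with, might look something like this in miniature. The event names, weights, and scoring below are all invented for illustration; this is not CultureAI's actual model.]

```python
# Toy sketch: aggregate per-user security events from different sources
# (phishing sims, password checks, MFA tests) into a behaviour profile,
# then pick each person's riskiest behaviour as their training focus.
from collections import defaultdict

# Invented event types and risk weights; good behaviour scores negative.
RISK_WEIGHTS = {
    "clicked_phish": 5,      # from an email phishing simulation
    "weak_password": 4,      # from a weak/breached password check
    "approved_fake_mfa": 5,  # from a simulated MFA push notification
    "reported_phish": -3,    # reporting a phish lowers the score
}

def behaviour_profile(events):
    """events: iterable of (user, event_type) pairs from many sources."""
    profile = defaultdict(lambda: defaultdict(int))
    for user, event in events:
        profile[user][event] += 1
    return profile

def training_focus(profile):
    """Return each user's highest-risk behaviour, instead of generic training."""
    focus = {}
    for user, counts in profile.items():
        scored = {e: n * RISK_WEIGHTS.get(e, 1) for e, n in counts.items()}
        worst = max(scored, key=scored.get)
        if scored[worst] > 0:
            focus[user] = worst
    return focus

events = [
    ("alice", "clicked_phish"), ("alice", "clicked_phish"),
    ("bob", "weak_password"), ("bob", "reported_phish"),
    ("carol", "approved_fake_mfa"),
]
print(training_focus(behaviour_profile(events)))
```

[The point of the sketch is the shape of the pipeline, not the numbers: many data sources feeding one per-user profile, so the training each person gets reflects their own behaviour rather than a one-size-fits-all course.]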


CAROLE THERIAULT. Tried and tested, maybe, but just to a level, not actually pushing the envelope.


SPEAKER_03. Yeah, exactly. And then everybody's frustrated when they don't get good results with it, and they kind of go, well, this awareness training stuff's a load of rubbish. Which, you know, it is. So yeah, that's where we went with Culture AI. We tried to do something a little bit different, I guess.


CAROLE THERIAULT. You must occasionally encounter organizations, when they're talking with you, where you hear this kind of blame attitude: the users are the bane of my life, they cause all the problems.


JAMES MOORE. Yeah, we see this all the time. We quite often hear the phrase that the humans, or the people, or the users are the weakest link. And then we hear the opposite, which is people saying we're trying to turn humans into the human firewall. I think the truth is somewhere between the two, but further along towards the human firewall side, right? I think human firewall is a bit of a weird phrase and it puts an unrealistic expectation on users. But I think what organizations need to do is say, well, they're people, let's treat them as people. Let's see how we can support them. Just because they've clicked a link in a phishing email, it doesn't mean we should immediately fire them. We should look at how we support them and help them. And you might have a user that's really good at spotting phishing emails, but they set weak passwords, or they post stuff online that is quite sensitive, or they allow tailgating. There are lots of different behaviors that people struggle with. And for me, it's about supporting and empowering those users rather than damaging their relationship with the security team by shouting at them. That's not what security should be about. You speak to a lot of CISOs and they always say they want to come across as enabling the business. And I think that historically, a lot of security teams have come across as blockers. And one of the reasons is people are scared of them, especially when they're doing simulated phishing campaigns and things like that.


CAROLE THERIAULT. Yeah. So for the companies out there, are there specific areas you might recommend they actually focus on in terms of security awareness training?


JAMES MOORE. Yeah. I mean, email phishing is the obvious one. A lot of companies are already doing it, and they could be doing it better in a lot of cases. But email phishing is definitely something the company should be measuring and should be improving. SMS phishing is another one. They're two pretty easy ones to measure. Another one, though, that's a bit more recent is multifactor authentication. We're actually about to do a white paper on this, because some of the results we've got are quite, well, I say quite, they're very, very interesting. We've recently put in the functionality in the platform to issue things like push notifications, to imitate what happens when somebody signs into an application: they get a push notification to say, did you sign into this?


JAMES MOORE. Yes, accept.


JAMES MOORE. Okay, it logs you in. We've actually put in the functionality to imitate that. So users get a seemingly legitimate push notification that they didn't initiate. And we found that over half of the people we've tested have accepted it. They've just gone, okay, I'm used to seeing this, I'm going to hit accept. Which completely negates the use of multifactor authentication, because if a real attacker did it, the user would just accept and let the attacker in. Which is really scary.


CAROLE THERIAULT. Yeah, and I can now see how that happens, because it happened to me the other day with my other half. We run a company, and he was doing some of our accounts. I'm the principal owner of the email account. I had my phone with me, so I assumed it was him, pressed okay, and let him through. And then suddenly I thought, my God, what if it wasn't him? Now, I called him and it turned out to be him, but I literally just went through it, because I made it make sense in my head without double-checking.


JAMES MOORE. Yeah, exactly. We're so used to it. There's this concept of System 1 and System 2 behavior: System 2 is typically where you stop and think about something, and System 1 is more or less automatic. When somebody clicks on a link in a phishing email, that's normally System 1 behavior causing it. And it's a very similar thing here, because you immediately get the notification and you're just so used to going, okay, accept. You don't stop and think. Actually, when the team at Culture AI built this into the platform, the first person they targeted with it was me. I didn't know it was coming, I'm not going to lie. And the only reason I spotted it was that I was actually coming out of the gym at the time, which is a small miracle because I'm very rarely near the gym. I thought, that's really weird, because it's for the VPN and I'm not near my laptop. That's very strange. And that's the only reason I spotted it. When we started to test clients with this, we saw similar stories. So that's the kind of stuff that we're setting out to measure. I think a lot of organizations should definitely focus on MFA, because I just think there are some hidden stats there. A lot of companies are looking at MFA at the moment and going, it's not a silver bullet, but it will have a big impact in terms of reducing the effect of phishing. And I suspect maybe it doesn't have quite as big an effect as a lot of places are hoping. So that's a big one.


CAROLE THERIAULT. Now what about home users? We do have a few of them that listen to the show as well. The day that we publish this show is Thanksgiving in the United States, and Christmas is just around the corner for many of us. So any tips for home users?


JAMES MOORE. Yeah, definitely. Around Thanksgiving and Christmas in particular, the number of delivery-based email phishing attacks goes through the roof. We see quite a lot of users get targeted by emails that say your shipping for such-and-such gift has been delayed, or your Amazon order requires you to update your payment details. Attackers know people are expecting deliveries around this time of year, and they really, really look to exploit that. So there's one big tip we can give for this time of year: watch out for emails that you may even feel like you were expecting, and double-check them. Make sure that it really is Amazon, or whichever other website you've ordered from, sending you that email. Look at the link really carefully, and again, don't just click without thinking. I think that's really important.


CAROLE THERIAULT. Okay. Now, your company name is Culture AI, and AI as a term, in our industry at least, sometimes causes a little bit of confusion, because people go, well, actually, there is no AI, AI doesn't exist, it's really just algorithms. What do you think about that? What are your thoughts on using that term inside your company name?


JAMES MOORE. Yeah, it's a really good question. To an extent, maybe we don't regret putting AI in our name, but I think there's a real risk that people just go, are they using it as a buzzword? Because that happens so much. For us, the phrase AI is not about 100% replication of a human mind inside a computer. It's about the ability to make very, very good predictions based on data. We use machine learning to try and make predictions around how and why people are behaving the way they're behaving, so that we can work out what the best type of training and the best messages are to give to that individual user, at scale. The AI side for us is machine learning. It's using machine learning to make predictions based on the data we're getting, and those predictions allow us, to a reasonably high degree of accuracy, to predict how a user is likely to behave based on the data we've got about them, and why they're doing it, so that we can tailor training better than we could if we were just using a traditional kind of if-else statement.


CAROLE THERIAULT. Brilliant. Well, James Moore, thank you so, so much for sharing all this. I'm excited to see how this can change the landscape, because people often complain about security awareness training, and being able to tailor it might make it a heck of a lot more useful and interesting to people, because they feel it's actually talking their language. So, anyone who would like to learn more about Culture AI, they've actually created a whole page just for Smashing Security listeners. You can see that at culture.ai/smashing. Plus, they have a deal just for Smashing Security listeners: sign up at culture.ai/smashing and get your first 50 employees for free for life. Can't beat that. James Moore, everybody, CEO and founder of Culture AI. Thanks so much for coming on the show.


JAMES MOORE. Fantastic.


GRAHAM CLULEY. Marvelous. Well, that just about wraps it up for this week. Tim, thank you so much for joining us. I'm sure lots of our listeners would love to follow you online or find out more about your new book.


CAROLE THERIAULT. Yeah, buy your book.


GRAHAM CLULEY. Or Cautionary Tales podcast. What is the best way for folks to find out all about that kind of stuff?


TIM HARFORD. The single place to find out is my website, timharford.com. The book is called How to Make the World Add Up.


GRAHAM CLULEY. Terrific. And you can follow us on Twitter @SmashingSecurity, no G, Twitter doesn't allow us to have a G, and also join the Smashing Security subreddit. And don't forget, if you want to be sure never to miss another episode, subscribe in your favorite podcast app such as Apple Podcasts, Spotify, or Pocket Casts.


CAROLE THERIAULT. Huge, huge thank-yous to all of you for listening to us each week. We hope we eased the horror that is 2020 at least a teeny bit this week. Of course, high five to this week's Smashing Security sponsors, Culture AI and LastPass. And of course, huge thank-yous to our Patreon supporters. Your support makes Smashing Security free for all. Check out smashingsecurity.com for past episodes, sponsorship details, and information on how to get in touch with us.


GRAHAM CLULEY. Until next time, cheerio, bye-bye, bye-bye, bye.


CAROLE THERIAULT. All right, how do you feel, Tim?


TIM HARFORD. It's great. That was really good fun.


GRAHAM CLULEY. Thank you.


TIM HARFORD. Thank you so much, guys.


CAROLE THERIAULT. A full hour of your time means the world. Thank you so much.


TIM HARFORD. My pleasure.


GRAHAM CLULEY. Have you ever thought of writing a book or anything like that? And if you did, what would you call it? How to Make the World— I can't think of what— what should it be?


TIM HARFORD. That should be— yeah, something. We're getting there somewhere.


GRAHAM CLULEY. We're getting there.


CAROLE THERIAULT. How to Make the World Add Up. It's already in my basket.

-- TRANSCRIPT ENDS --