Listen early, and ad-free!

291: Deepfake dangers, AI image opt out, and controlling your urges


Anti-porn "shameware" apps take a privacy pounding, is your image already being used by AI, and deepfake danger continues to deepen.

All this and much much more is discussed in the latest edition of the "Smashing Security" podcast by computer security veterans Graham Cluley and Carole Theriault, joined this week by Host Unknown's Thom Langford.

Warning: This podcast may contain nuts, adult themes, and rude language.

Episode links:

Sponsored by:

  • Kolide – the SaaS app that sends employees important, timely, and relevant security recommendations concerning their Mac, Windows, and Linux devices, right inside Slack.
  • Bitwarden – Password security you can trust. Bitwarden is an open source password manager trusted by millions of individuals, teams, and organizations worldwide for secure password storage and sharing.
  • The Cyber Security Inside podcast – Relevant cybersecurity topics in clear, easy-to-understand language. With every episode, you’ll walk away smarter about cybersecurity, and have fun while you’re at it!

Support the show:

You can help the podcast by telling your friends and colleagues about “Smashing Security”, and leaving us a review on Apple Podcasts or Podchaser.

Become a Patreon supporter for ad-free episodes and our early-release feed!

Follow us:

Follow the show on Twitter at @SmashinSecurity, or on the Smashing Security subreddit, or visit our website for more episodes.

Thanks:

Theme tune: "Vinyl Memories" by Mikael Manvelyan.

Assorted sound effects: AudioBlocks.

Privacy & Opt-Out: https://redcircle.com/privacy

Transcript:

This transcript was generated automatically, and has not been manually verified. It may contain errors and omissions. In particular, speaker labels, proper nouns, and attributions may be incorrect. Treat it as a helpful guide rather than a verbatim record — for the real thing, give the episode a listen.



GRAHAM CLULEY. It's interesting. The system claims to be able to distinguish between porn and non-porn images.


THOM LANGFORD. Right.


CAROLE THERIAULT. I had a professor who said that anything that was longer than it was wide was a phallic symbol. So, you know, just saying. I was like, a toaster?


THOM LANGFORD. Um, fridge?


CAROLE THERIAULT. Smashing Security.


UNKNOWN. Episode 291, Deepfake Dangers, AI Image Opt-out, and Controlling Your Urges with Carole Theriault and Graham Cluley. Hello, hello, and welcome to Smashing Security episode 291. My name's Graham Cluley.


CAROLE THERIAULT. And I'm Carole Theriault.


GRAHAM CLULEY. And Carole, who have we got in the hot seat joining us this week?


CAROLE THERIAULT. Well, it is the sometimes wonderful Thom Langford from Host Unknown. Welcome, Thom.


THOM LANGFORD. Sometimes? Well, let's hope I'm wonderful today then.


CAROLE THERIAULT. We'll see. How you doing, Thom?


THOM LANGFORD. I'm very good. It's always a pleasure to be here, I have to say.


CAROLE THERIAULT. You've been a busy boy though.


THOM LANGFORD. No, I know, I know. It's, uh, we've been all over the place and it's, yeah, just busy, busy, busy. What can I say? I was, I was photographing a wedding just the other week actually.


CAROLE THERIAULT. There you go, you see, multi-talented.


THOM LANGFORD. Well, you know, something like that.


GRAHAM CLULEY. One talent at least.


THOM LANGFORD. Yeah, that's right.


CAROLE THERIAULT. Let's not waste Busy Thom's time here and let's kick off and thank this week's sponsors: Bitwarden, Kolide, and The Cyber Security Inside podcast. It's their support that helps us give you this show for free. Now, coming up in today's show, Graham, what do you got?


GRAHAM CLULEY. Oh, I'm going to be telling you how technology can help you get over your filthy little habit.


THOM LANGFORD. Oh God, why would I want to get over that?


CAROLE THERIAULT. It's too early for this. Thom, what about you?


THOM LANGFORD. I'm going to be talking about how the internet really doesn't forget, even when it shouldn't have remembered in the first place.


GRAHAM CLULEY. Intriguing.


CAROLE THERIAULT. And I'm talking deepfakes, Clint Eastwood style. The good, the bad, and the ugly. All this and much more coming up on this episode of Smashing Security.


GRAHAM CLULEY. Now, chums, chums, I've got a question for you, which is this. Do you have a porn problem? Thom, I'm looking at you principally.


THOM LANGFORD. I wouldn't call it a problem, more a hobby.


GRAHAM CLULEY. Pastime?


THOM LANGFORD. Yeah.


GRAHAM CLULEY. Right. Okay. All right. Well, some people think that they do have a problem. Maybe they think they spend too much time bashing the bishop, polishing the lighthouse, pulling the pud, tally-whacking. Have you got any favorite phrases you like to use, Thom?


THOM LANGFORD. Spanking the monkey.


CAROLE THERIAULT. We've been recording for not even 3 minutes.


GRAHAM CLULEY. Well, 3 minutes. I know this is a serious point. I've been reading Wired magazine, and they have investigated something called accountability apps, and that's what I want to talk to you about today. They tell the story of a chap called Hao Wei Lin, who was attending an Evangelical Baptist church in the deep US South. And he had a problem, which was that he was going to regular weekly one-on-one sessions with the church leader to see how his faith was going. His particular concern was that he was gay, and he thought that he might get kicked out of the church. And he was reassured (this is a happy ending, if you like) to be told that God still loved him in spite of his, quote, "struggle with same-sex attraction." And this is what the church leaders told him? Yes, that's right. And that they would welcome him into the group, which is a marvelous thing. Good times, right? But of course, God alone can't fix the, quote, "problem" of being gay. And so at his next one-on-one session with the church leader (not that kind of session), Hao Wei Lin was told to install an app on his smartphone. So the church leader says, "Remember what we were talking about last week? I think you should install this app here." It wasn't Pornhub or Grindr he was being told to install. He was being told to install an app called Covenant Eyes. Covenant Eyes is an app which monitors everything that users see and do on their smart devices.


THOM LANGFORD. Whoa.


CAROLE THERIAULT. Okay. No, no, no. What do you mean everything the user sees? It's not like a Google Glass.


GRAHAM CLULEY. Well, no, it's not plugged into your spectacles.


THOM LANGFORD. It's plugged into something else.


GRAHAM CLULEY. It takes screenshots, at least one per minute.


CAROLE THERIAULT. What?


GRAHAM CLULEY. Of your activity.


THOM LANGFORD. At least one per minute?


GRAHAM CLULEY. Yes. Yes.


THOM LANGFORD. You couldn't even get through to lunchtime without your battery dying, surely?


GRAHAM CLULEY. Well, it is apparently doing that. And then if you look at anything that the app considers questionable, it sends a report to the person that you have identified as your ally, your assistant.


CAROLE THERIAULT. Oh my God. This is not bossware. This is Godware, TM, Ontario.


GRAHAM CLULEY. Well, before you start claiming the trademark, there is already an app. So these apps are already— they've got names. They can either be called shameware.


CAROLE THERIAULT. Oh my gosh.


GRAHAM CLULEY. Or accountability.


CAROLE THERIAULT. Shameware?


GRAHAM CLULEY. Shameware. That's right. The idea is that if you share details of your porn habit with maybe a close friend, an ally, details of how often you're looking up rude things on the internet, then that may encourage you to do it less. And the concern here is that this church is telling its congregation to install these apps, and this gets reported to someone who's your spiritual elder.


CAROLE THERIAULT. Let me just put this in a different context. Imagine this was a cupcake app, right?


THOM LANGFORD. Yes.


CAROLE THERIAULT. So every time you eat a cupcake, it starts yelling at you, going, "You disgusting cake-eating fatty fat fat," right? And that's supposed to help, as I cry, sobbing, stuffing icing into my face.


GRAHAM CLULEY. I saw you with that lemon drizzle. I know what was going on there.


THOM LANGFORD. You harlot.


GRAHAM CLULEY. So the idea is that after an evening's porn perusal, the Covenant Eyes app tells your friend what you've been up to. So it gives them a full report of where you've been, what you've searched for. One of your buddies, right? Carole, you're my bud bud, right?


CAROLE THERIAULT. Right.


GRAHAM CLULEY. So you would get a report from me as to where I've been on the internet, what I've been searching— Well, what if I don't want that? Well, no, 'cause you've agreed to do this because you're my bud bud. You're helping me with my problem.


CAROLE THERIAULT. Oh, right. I'm like your mentor, your guiding light to salvation.


GRAHAM CLULEY. You're someone I trust, and you will receive blurred screenshots of whatever I've been looking at. And you can then call me up and say, hey, hey, how's it going? How you doing this morning? Hey, everything all right over there? You have a good evening?


THOM LANGFORD. Can you send me the picture with it not blurred?


GRAHAM CLULEY. Well, sometimes the blurring doesn't hide very much, does it? It isn't always difficult to make out what's going on. I remember, Carole, long, long ago when we worked at Sophos, we did a press release about a piece of malware called Bad Bunny. Bad Bunny.


CAROLE THERIAULT. Yes, I remember Bad Bunny. We talked about this before.


GRAHAM CLULEY. Have we? Well, some listeners may not know about Bad Bunny. I know. But so maybe—


CAROLE THERIAULT. Maybe they shouldn't.


GRAHAM CLULEY. Would you like to describe the Bad Bunny malware?


CAROLE THERIAULT. No, not at all.


GRAHAM CLULEY. No.


THOM LANGFORD. No, no, no, no.


GRAHAM CLULEY. So Bad Bunny was a piece of malware which displayed an image, didn't it?


CAROLE THERIAULT. As a payload, yeah.


GRAHAM CLULEY. As a payload of two people leapfrogging, but the person behind—


THOM LANGFORD. Missed.


GRAHAM CLULEY. No, the person behind was dressed up in a full-size bunny rabbit outfit.


THOM LANGFORD. Oh, cute.


GRAHAM CLULEY. Yeah. Yeah, it was very cute. It was very cute. There was nothing rude about it at all. It was all in the mind of the person watching, thinking that Easter was coming. So it wasn't anything like that. But we pixelated out the eyes, didn't we, of the bunny? That was the thing.


CAROLE THERIAULT. And the human involved.


THOM LANGFORD. The recipient.


GRAHAM CLULEY. And the recipient as well. That's right, yes. So I'm just saying that sometimes, you know, the blurred image, you can still get the gist of what's going on.


CAROLE THERIAULT. Graham, you know, you're getting lewd, lewd, lewd.


GRAHAM CLULEY. Look, I'm sorry. It's just because Thom was coming on the show.


THOM LANGFORD. It's what's in the news. Let's face it. We are merely holding a mirror up to society.


GRAHAM CLULEY. Can I say, I think I've done very well not reporting at all on the chess scandal, which has been going on for the last few weeks.


THOM LANGFORD. I'm surprised you haven't, actually.


GRAHAM CLULEY. But I—


THOM LANGFORD. I thought that'd be right up your street, or passage, or whatever you like to call it.


CAROLE THERIAULT. I haven't given it a moment's thought. I didn't even know about it.


GRAHAM CLULEY. Oh, you know. Well, you're missing out on a lot of good gossip. Anyway—


THOM LANGFORD. It's great.


GRAHAM CLULEY. Anyway, anyway.


THOM LANGFORD. Makes me want to learn Morse code.


GRAHAM CLULEY. Anyway, Covenant Eyes, this app tells your friends what you've been up to, gives them this report.


CAROLE THERIAULT. Covenant Eyes. It's terrific, even the name of it.


GRAHAM CLULEY. It's awful. Well, I went to check out the website of Covenant Eyes. Turns out around about 1.5 million people have installed it. They've got this really professional promotional video containing a sort of cut-price Poundland Tom Cruise who's describing his porn habit, and his cutesy wife, and how their marriage has improved since he installed this.


CAROLE THERIAULT. Until he's regularly shamed into not doing anything sexual.


THOM LANGFORD. Either that or his wife now finally gets to see what he was looking at.


GRAHAM CLULEY. Anyway.


CAROLE THERIAULT. Every night.


THOM LANGFORD. Yeah, she gets a little report. Oh, okay, that's all right. Let's try that one.


GRAHAM CLULEY. The reason why these apps apparently appeal to people is: why would anyone want to watch porn if they're going to have to talk to their parents or their church leader about it?


CAROLE THERIAULT. Isn't it rule 34 of the internet?


GRAHAM CLULEY. Oh, I don't know.


CAROLE THERIAULT. Maybe some people enjoy that.


THOM LANGFORD. Well, yeah, I know a few people who would probably watch more of it if they knew someone was watching them watch it.


GRAHAM CLULEY. Well, well, Thom, interesting you should say that, because I had you in mind. You can subscribe. You can say, I know someone, I have a friend who... You can subscribe for $16.99 per month. There is a 30-day money-back guarantee.


CAROLE THERIAULT. You don't get any data back though.


GRAHAM CLULEY. Well, this is the thing. I don't know. If you were to send an enormous amount of data, which I imagine you might, Thom, maybe there should be a platinum plan or something.


CAROLE THERIAULT. You know what's also interesting is how they would— like, would they pixelate images? Is that— is that like— they wouldn't want to affect—


GRAHAM CLULEY. Yes, they blur out images. They do blur out images. So it's not a way of getting—


CAROLE THERIAULT. So they might get it wrong, right? Like, Thom might be looking at just a huge donut, for example. For example, and it misconstrues that.


THOM LANGFORD. Chocolate donut.


GRAHAM CLULEY. Anyway, moving on. The app, it's not spyware. I don't think it's really spyware; it isn't secretly spyware. Godware, TM, Carole Theriault. It's quite brazen about admitting what it's aiming to do: to help people with their porn addiction.


THOM LANGFORD. I do have a serious question.


GRAHAM CLULEY. Yes, go ahead.


THOM LANGFORD. So, at some point, the images are captured and then blurred.


GRAHAM CLULEY. Yes, yes.


THOM LANGFORD. Where is that stored? Is there an unblurred version? Is that copyright? I could be looking at pictures of, I don't know, past girlfriends, past partners, etc.


GRAHAM CLULEY. Oh yeah, that's possible. Yeah.


THOM LANGFORD. So where and how is this data being secured? Is it obfuscated permanently? Is it reversible? Is there effectively now another storage area of pornography that could be used for nefarious purposes?


GRAHAM CLULEY. So it's interesting. The system claims to be able to distinguish between porn and non-porn images.


THOM LANGFORD. Right.


CAROLE THERIAULT. I had a professor who said that anything that was longer than it was wide was a phallic symbol. So, you know, just saying. I was like, a toaster?


THOM LANGFORD. Fridge?


GRAHAM CLULEY. Fridge. Anyway, so the images are uploaded to a server under the control of Covenant Eyes, and it claims that the images are blurred. Now, according to Wired's investigation, when it set itself up as a user, it received only slightly blurred images.


THOM LANGFORD. So—


CAROLE THERIAULT. What? Slightly?


THOM LANGFORD. I love how vague this is.


CAROLE THERIAULT. Do you know, slightly blurred is not always good. There's a coffee shop I go to, and across the road, I think someone has, like, basically their loo right in front of the window with some very ineffective plastic opaque coating.


GRAHAM CLULEY. Oh, so you can work out the general—


CAROLE THERIAULT. Yeah. When they're, you know, when they're finished and stuff and yanking up their trousers. Yes.


GRAHAM CLULEY. So let me tell you about some other accountability or shameware apps. There's one called Fortify. And what you can do with this: you can log information about when you last masturbated, where you were when it happened. And this one was the one which intrigued me: what device you used. I mean— device?


THOM LANGFORD. I was entirely manual today.


CAROLE THERIAULT. I'm with Thom. I think people must be, you know, installing this because they want to. I just wonder what the app creators collect as well, right?


GRAHAM CLULEY. Ah, yes.


THOM LANGFORD. Yeah, exactly. Exactly.


GRAHAM CLULEY. Well, in the case of Fortify, it asks you how challenging your urges were during the day. So you can choose from a sliding scale from very easy to very difficult. With different smiley faces.


CAROLE THERIAULT. Are you kidding me?


THOM LANGFORD. They should have like a vinegar strokes face, surely.


GRAHAM CLULEY. I don't even know what that means. And he's younger than me.


CAROLE THERIAULT. I have no idea what it means. It's not about youth, dude.


THOM LANGFORD. I'll tell you later.


CAROLE THERIAULT. Please don't.


GRAHAM CLULEY. There's also trophies and rewards, and it allows you to, quote, celebrate your victories. Now, what form the celebration takes if you've had, like, a one-week streak of not stroking...


CAROLE THERIAULT. I don't know.


GRAHAM CLULEY. There's even in this Fortify app an SOS button. So I imagine—


CAROLE THERIAULT. Oh my God. This is just fun. Wouldn't you do this with your partner? You could do this if you were living long distance, right? You could have this little relationship thing and they're like, oh, oh, oh.


THOM LANGFORD. What does SOS stand for? Is it save our sperm?


CAROLE THERIAULT. Oh my God.


GRAHAM CLULEY. Stop our spaffing. Anyway, so Wired looked into this Fortify app as well. And they found that the form you use to log your masturbation stats with Fortify unfortunately has a Facebook tracking pixel on it.


THOM LANGFORD. Oh my God.


CAROLE THERIAULT. What? Explain that to me.


THOM LANGFORD. Oh my God.


GRAHAM CLULEY. What that means is that Facebook is able to track you as an individual and how often you go to Fortify and potentially what you might be entering on it. What could possibly go wrong with Facebook?


CAROLE THERIAULT. Changing your ads.


GRAHAM CLULEY. Facebook knowing how often you're wanking off. But there's another problem. It turned out—


THOM LANGFORD. Do you like more disinfectant wipes?


CAROLE THERIAULT. Easy access underwear.


THOM LANGFORD. Kleenex man-sized tissues.


CAROLE THERIAULT. Snap-on, snap-off trousers.


GRAHAM CLULEY. It turned out there's a bug in Fortify, which means it also passes your account password.


CAROLE THERIAULT. Shut up.


GRAHAM CLULEY. Fortify account password in plain text.


CAROLE THERIAULT. No!


GRAHAM CLULEY. To Facebook as well.
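[Editor's note: for anyone wondering how a "tracking pixel" can leak a password, it's just a tiny image request whose URL carries data back to the tracker's server, so anything a buggy form stuffs into that URL travels to the third party. A minimal sketch, with an invented endpoint and field names, not Fortify's or Facebook's actual code:]

```python
from urllib.parse import urlencode

def pixel_url(event: str, fields: dict) -> str:
    """Build the URL a 1x1 'pixel' image would request.

    Everything in `fields` travels to the tracker's server as plain
    query parameters -- which is how a buggy form can end up shipping
    a password in clear text alongside the analytics event.
    """
    base = "https://tracker.example.com/tr"  # stand-in for a real pixel endpoint
    return base + "?" + urlencode({"ev": event, **fields})

# A form handler that naively forwards every submitted field to the pixel:
url = pixel_url("PageView", {"page": "/log-urge", "password": "hunter2"})
print(url)
```

The fix is equally simple: only ever pass an allow-listed set of non-sensitive fields to any third-party tracker, never the raw form contents.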


CAROLE THERIAULT. Graham, why did you go first this week? You could have just invited me to do my story first. How is anyone supposed to follow this?


GRAHAM CLULEY. So there's a— I think there's a problem with these kind of apps.


CAROLE THERIAULT. Do you? You know, as an expert, I'm listening very carefully. Okay, so you think this isn't very good.


GRAHAM CLULEY. So I think it's your choice whether you want to install an app like this and get your mate, get your bud bud like Carole, to call you up in the morning and say, "Hey, what did you get up to last night?" As if I don't know. But there's—


CAROLE THERIAULT. Got this detailed report.


THOM LANGFORD. From your bloodshot eyes and the— The burst blood vessels in your cheeks.


GRAHAM CLULEY. And the state of your palms. But it's definitely a problem if churches are telling their congregations, or indeed, you know, religious cults, anyone who's in a spiritual position of authority. Because apparently hundreds of people at this particular church have installed the app. And get this: if you volunteer for the church, it is mandatory. You have to agree to install the app before you're allowed to work for the church, or do any voluntary work.


CAROLE THERIAULT. The thing is, Graham, I think you're not thinking about this as a religious person.


GRAHAM CLULEY. Oh, okay.


CAROLE THERIAULT. So if you were Catholic, for instance, couldn't this app save you a lot of time at the confession?


THOM LANGFORD. I'll airdrop you my sins.


GRAHAM CLULEY. Right, exactly. This comes up in the Wired article. Does it? They say that the church elders like it.


CAROLE THERIAULT. Of course.


GRAHAM CLULEY. It makes it much easier to know what to talk about.


CAROLE THERIAULT. Exactly. You don't have to pussyfoot around.


THOM LANGFORD. There's huge amounts to laugh at here and there's huge amounts to ridicule here because the apps—


CAROLE THERIAULT. They haven't even started yet.


THOM LANGFORD. Yeah, exactly. Exactly. But Carole, I think you were hinting at a very valid point. Any kind of addiction is going to be destructive.


GRAHAM CLULEY. Anything in excess.


THOM LANGFORD. Yeah. Anything in excess is going to be damaging, et cetera, et cetera.


GRAHAM CLULEY. Binging on podcasts, cheese sandwiches, whatever it might be.


THOM LANGFORD. Exactly. But people do need help and all that sort of thing. The problem is when that help is not entirely focused on the individual and has ulterior motives. Yes. Or flawed frameworks or facilities in which it operates. And that's what I think we have here. Yes. For churches, it's about control. And from an app perspective, they're obviously very poorly built, you know, sending passwords in plain text. So it is actually very troubling.


CAROLE THERIAULT. What about pervy vicars, right?


GRAHAM CLULEY. Yes. And if you think about certain religious groups, Scientology, particular groups where they've collected information about their members in the past and used it as blackmail.


THOM LANGFORD. Yeah, very troubling on, on many counts, preying on people when they are at their lowest and potentially seeking help.


GRAHAM CLULEY. Now, I've probably gone on too long about this already, but I just want to mention one thing from the security point of view. The latest is that, following the Wired report, Google has removed Covenant Eyes from the Google Play store, because it was apparently exploiting Android's accessibility features in order to see what was going on on the screen. These are features which are built into the operating system to help people with poor eyesight, for instance: screen reading, those sorts of things. So there's a certain irony here that features which are normally used legitimately to help someone with poor eyesight are now being used by someone who has maybe ended up with poor eyesight because of their wanking problem.


THOM LANGFORD. It's still on iOS. I'm sorry, there's somebody come to my door.


CAROLE THERIAULT. Professional.


GRAHAM CLULEY. Amazon delivery.


THOM LANGFORD. That was typical. Right.


GRAHAM CLULEY. Is that the Kleenex from Facebook that's just arrived? Thom, what have you got for us this week?


THOM LANGFORD. Well, a little bit more serious, although it does still cover pornography because, well, like I said, it's a hobby.


CAROLE THERIAULT. What is going on this week?


THOM LANGFORD. No, no, no, no, no. You know, I'm taking a very serious and sober view of this. So we all know that AI is a big thing now. These AI image-generating tools like DALL-E and Stable Diffusion have certainly hit the media recently. In fact, you were talking about them fairly recently. Yeah. You type in something and up comes your image.


GRAHAM CLULEY. Yeah.


THOM LANGFORD. And they're powered by massive, massive datasets of images that are scraped from the internet. So, you know, you set your little algorithm off, it scrapes up every image it can, and every bit of context it can about that image, so that when you type in something like, I don't know, "Thom doing a podcast", it actually pulls together something that vaguely resembles Thom doing a podcast, or whatever that might be according to the datasets.


GRAHAM CLULEY. Up comes an image of a dumpster fire.


THOM LANGFORD. Yes, exactly.


GRAHAM CLULEY. Yeah. I've heard your podcast. Yeah.


CAROLE THERIAULT. Floating down a river. Yeah.


THOM LANGFORD. The problem is, what if one of those images is of you? There's actually no easy way for you to opt your image out of these AI datasets that are being used.


CAROLE THERIAULT. So what you're saying is that the images are scraped from the internet and you haven't been asked permission for that?


THOM LANGFORD. Yes.


CAROLE THERIAULT. Right.


THOM LANGFORD. To be used for potentially commercial purposes. So this is actually from a vice.com article. It talks about how, in one example, sensitive images can even end up powering these AI tools. For instance, there was one individual who had a particularly sensitive photograph taken of her by her medical practitioner for the purposes of a procedure she was undergoing. And this image, which was guaranteed to be used only for the purposes of the procedure and never shared elsewhere, was found in the dataset belonging to an AI company called LAION, L-A-I-O-N, which is used to train Stable Diffusion and Google's Imagen. So you've got a company that builds huge datasets, and those datasets are then shared with various other AI companies, which use them to generate their images.
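[Editor's note: scraped datasets like LAION's are, at heart, enormous tables pairing image URLs with the captions the crawler found next to them, and search tools for them do little more than match text against those rows. A toy sketch with invented rows and field names, nothing like the real billion-row scale:]

```python
# Toy model of a scraped image-caption dataset: each row pairs an
# image URL with the alt-text the crawler found alongside it.
dataset = [
    {"url": "https://pics.example.com/wedding/042.jpg",
     "caption": "Thom photographing a wedding"},
    {"url": "https://clinic.example.com/intake/8831.jpg",
     "caption": "pre-procedure photo"},
    {"url": "https://blog.example.com/cat.jpg",
     "caption": "my cat asleep on the keyboard"},
]

def find_matches(rows, term):
    """Return rows whose caption or URL mentions the search term."""
    term = term.lower()
    return [r for r in rows
            if term in r["caption"].lower() or term in r["url"].lower()]

hits = find_matches(dataset, "thom")
print(len(hits))  # 1
```

Note that the second row illustrates the problem discussed above: a sensitive medical image is just another URL in the table, with nothing in the data itself recording consent or provenance.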


GRAHAM CLULEY. But surely that sounds like a surgeon or a doctor has been careless with the privacy of her photograph because they should have—


THOM LANGFORD. Careless?


GRAHAM CLULEY. Did they post it up on the internet somewhere and leave it publicly accessible?


THOM LANGFORD. Well, exactly. This is where it gets very, very difficult, because the traceability of that is very, very difficult: companies are buying large datasets of images without knowing their provenance. They could have been stolen, they could have been illegally obtained one way or another and made available. And nobody is actually claiming any kind of responsibility for this as a result.


CAROLE THERIAULT. So you could imagine, for example, if you were, I don't know, say, a plastic surgeon, and you had all these close-up images of body parts: it's basically pseudonymized, effectively anonymized, because you don't have headshots that go along with it. So do you care?


THOM LANGFORD. Well, unless there is a headshot, right? You don't know, but—


CAROLE THERIAULT. Yeah, yeah, you don't know, exactly. But I guess if you're not recognizable, do you care? And I think the answer is yes, because we have no idea how much of this information will be used in the future.


THOM LANGFORD. Or if you're not recognizable in that particular crop of the picture, but there is a larger image of you that is identifiable. But there's other elements to this as well. So Motherboard, who did the research into this, also found that some of the worst images ever posted online are included in the dataset: ISIS executing people, real nudes that were hacked from celebrities' phones, all that sort of stuff. So material that is very clearly illegal or grossly offensive, that has no place on the internet or in the public domain, is being used to train these AIs.


CAROLE THERIAULT. Is that because it's just being scraped without any regard to what is in the content?


THOM LANGFORD. I think that's exactly it. And companies will probably say, "Hey, we are just gathering what's out there." But there's no accountability for the content as a result. You know, there is a moral imperative here. The LAION site doesn't even go into detail about the not-safe-for-work and violent images that appear in the dataset. It does say that it does not contain images that may be disturbing to viewers, which is untrue, but that links in the dataset can lead to images that are disturbing or discomforting depending on the filter or search method employed. The FAQ goes on to say: "We cannot act on data that are not under our control, for example past releases that circulate via torrents." That could potentially apply to something such as Scarlett Johansson's leaked nudes, which already existed on the internet. And basically it relinquishes control from the dataset creators.


GRAHAM CLULEY. So are LAION making money out of this? They're scraping this data.


THOM LANGFORD. There's gotta be some money being made. I think they're scraping the data and they're providing those datasets to companies like Google, et cetera.


GRAHAM CLULEY. Yeah. Yeah.


CAROLE THERIAULT. You'd expect Google to actually look at this and go, whoa, right? Let's not use this dataset.


THOM LANGFORD. What? Do no evil, Google? Sure.


CAROLE THERIAULT. I'm calling on them right now on a very, very influential show.


GRAHAM CLULEY. It sounds like Google's thing entirely. It's the way they've always operated, isn't it? Scoop up whatever they like.


THOM LANGFORD. And the article goes on to talk about how responsibility has to be on the part of the developers of the AI and machine-learning tools, and on the people who are actually creating these datasets. It shouldn't be on the individual whose photos are in there.


GRAHAM CLULEY. No.


THOM LANGFORD. It should be on the companies that are basically scraping the data, making money. They need to make sure that the data they're gathering is valid, is legal, etc.


CAROLE THERIAULT. Ethical.


THOM LANGFORD. Ethical, absolutely. Of course they won't.


GRAHAM CLULEY. No, they're never going to do that. They're never going to do that unless someone comes at them with a great big cricket bat.


CAROLE THERIAULT. Yeah, or a regulation.


THOM LANGFORD. Yeah. The best way to police this, to make this happen, is to put the responsibility on the people who are actually using these images. And the article goes on to quote somebody saying that algorithmic destruction, being forced to delete the models trained on the ill-gotten data, is, I think, an actual deterrent, because that's going to cost money and time.


GRAHAM CLULEY. Surely they've got a backup, haven't they?


THOM LANGFORD. You would like to think so.


GRAHAM CLULEY. Someone erases the awesome code.


THOM LANGFORD. There is, however, a little ray of hope for us.


GRAHAM CLULEY. Oh yes.


THOM LANGFORD. An artist and musician, Holly Herndon, has created a website that makes it easy for people to search whether their images have been used to train AI. This website is called Have I Been Trained? The link's in the show notes. Very simple: you type in words, your name, or you can upload a photo, etc. I gave it a try. I uploaded one of my publicity photos and found there's an awful lot of bald, bearded white men out in the world, is all I can say, who all look very, very similar to me.


GRAHAM CLULEY. They could be the stunt double in Host Unknown, the movie, couldn't they? You know what?


THOM LANGFORD. They could. Exactly. No.


GRAHAM CLULEY. If someone was filming a sex scene with you and didn't want it to be you, obviously, Thom, they could ask one of these people to be your body double.


THOM LANGFORD. They couldn't afford my moneymaker. Oh my.


GRAHAM CLULEY. Now, I've had a go at 'Have I Been Trained?' I haven't uploaded a photograph, but I entered my name.


THOM LANGFORD. Yeah.


GRAHAM CLULEY. To see what would happen.


CAROLE THERIAULT. I entered your name as well.


GRAHAM CLULEY. Oh, oh, did you? Now, I don't think they look particularly like me. There's one who looks like a jockey, a couple who look like a murderer.


THOM LANGFORD. Yeah, I had a whole bunch that looked very odd, I have to say, when I put my name in.


GRAHAM CLULEY. They all look vaguely—


CAROLE THERIAULT. they all have very crazy eyebrows though, Graham.


THOM LANGFORD. The AI algorithm starts with the eyebrows, works out from there.


GRAHAM CLULEY. Carole, what have you got for us this week?


CAROLE THERIAULT. Okay, so meet Patrick Hillmann. He is the chief communications officer at Binance, the world's largest crypto exchange with $25 billion in volume, says CSO Online in an article from about two days ago. And he starts his day as any other, you know, butt scratch, 5-mile run, a kale and goji berry smoothie, and a scroll through the daily deluge of email.


GRAHAM CLULEY. He's not running the Fortify app, clearly, or CovenantEyes.


CAROLE THERIAULT. I made 3 of those things up. So, so there he is reviewing his email and he spots messages from clients about a recent video call with investors. And there are like 6 of these emails. And one of them says, thanks for the investment opportunity. Another one says, I have some concerns about your investment advice. Another one complains that the video quality wasn't very good and one even asks outright, "Can you confirm the Zoom call we had on Thursday was you?" No way. Way!


GRAHAM CLULEY. Way!


CAROLE THERIAULT. So according to CSO Online, this is where Patrick Hillmann got that sinking feeling in his stomach that someone had deepfaked his image and voice well enough to hold a 20-minute investment Zoom call trying to convince his company's clients to turn over their bitcoin for a scammy investment.


THOM LANGFORD. It's happened. It's happened.


CAROLE THERIAULT. Right? So he says, quote, the client I was able to connect with shared with me links to faked LinkedIn and Telegram profiles claiming to be me, inviting them to various meetings to talk about, you know, different listing opportunities. Then the criminals used a convincing-looking holograph of me in Zoom calls to try and scam several reps of legitimate cryptocurrency projects.


GRAHAM CLULEY. Holograph. Is this guy a fan of Star Trek: The Next Generation? What?


CAROLE THERIAULT. Now, there's a few different approaches, right, on how deepfakes are created, but many deepfakes use generative adversarial networks, okay, or GANs. G-A-Ns. I don't know how you say the acronym. But this is basically where two machine learning models duke it out, right? One machine learning model trains on a dataset and then creates a video forgery, while the other attempts to detect the forgery. And the forger creates fakes until the other machine learning model can't detect them.


GRAHAM CLULEY. Okay. Yeah.


CAROLE THERIAULT. Right? And so of course, the larger the set of training data, the easier it is for the forger to create a believable deepfake. And this is a massive challenge, says Eric Horvitz of Microsoft. This is in a brand new paper on the subject, link in the show notes. So over time, the generator learns to fool the detector. And with this process, he says, at the foundation of deepfakes, neither pattern recognition techniques nor humans will be able to reliably recognize deepfakes. So thoughts?
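[The adversarial loop described here, a forger creating fakes until a detector can no longer tell them apart, can be sketched as a toy example: a one-parameter generator shifts random noise toward the real data distribution, while a logistic discriminator tries to tell real from fake. This is purely illustrative; every name and number below is invented for the sketch, and real deepfake GANs work on images and video, not 1-D numbers.]

```python
import numpy as np

# Toy 1-D GAN: the "forger" (generator) shifts noise by theta, the
# "detector" (discriminator) is a logistic classifier. Each is trained
# against the other, as in the GAN setup described in the episode.
rng = np.random.default_rng(0)
REAL_MEAN = 3.0          # the "real data" distribution: N(3, 1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

theta = 0.0              # generator parameter (starts far from REAL_MEAN)
w, b = 0.1, 0.0          # discriminator parameters
lr_d, lr_g, n = 0.05, 0.05, 64

for step in range(2000):
    real = rng.normal(REAL_MEAN, 1.0, n)
    fake = rng.normal(0.0, 1.0, n) + theta

    # Detector step: push d(real) towards 1 and d(fake) towards 0
    # (gradient descent on the usual logistic loss).
    d_real, d_fake = sigmoid(w * real + b), sigmoid(w * fake + b)
    grad_w = np.mean(-(1 - d_real) * real + d_fake * fake)
    grad_b = np.mean(-(1 - d_real) + d_fake)
    w -= lr_d * grad_w
    b -= lr_d * grad_b

    # Forger step: push d(fake) towards 1, i.e. fool the detector.
    d_fake = sigmoid(w * fake + b)
    grad_theta = np.mean(-(1 - d_fake) * w)
    theta -= lr_g * grad_theta

print(f"generator mean approx {theta:.2f} (target {REAL_MEAN})")
```

By the end of training the generator's output distribution has drifted close to the real one, which is exactly the point Carole makes: the more rounds (and data) the forger gets, the harder its fakes are to detect.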


THOM LANGFORD. Well, I'm very happy to be corrected if I'm wrong, but this is the first time I've heard of a video deepfake being used to run a con in reality.


CAROLE THERIAULT. Oh no, no, there was a big one. A bank, was it in Austria, Graham? Where they—


THOM LANGFORD. That was a voice one, though, wasn't it?


CAROLE THERIAULT. No, video. And— Oh, actually, no, they weren't using a— sorry, a quote, holograph. The guy was masked, actually. That's right.


GRAHAM CLULEY. Oh, the one involving—


CAROLE THERIAULT. The guy was masked.


THOM LANGFORD. It was basically old school deepfake.


GRAHAM CLULEY. Yeah, there was one involving the French Defence Minister. They created a set to make it look like it was inside the French—


THOM LANGFORD. But it was an actual physical mask, wasn't it?


CAROLE THERIAULT. Yeah, it was a physical mask, that's right.


GRAHAM CLULEY. Yeah, it was like Mission: Impossible style, you know?


THOM LANGFORD. Yeah, yeah, yeah.


GRAHAM CLULEY. That kind of thing.


THOM LANGFORD. But this, I think, is the first time that I've heard of this actually being used in the wild, as it were. Because a couple of years ago, end of 2020, you know, when you get those things of, what predictions have you got for the next year? And so I said back then that video deepfake would be used for, you know, running a con. And of course, 2021 went through and it didn't happen. But now it has. It's a little bit behind.


GRAHAM CLULEY. You're a soothsayer, Thom.


THOM LANGFORD. Well, something like that.


GRAHAM CLULEY. We'll ask you for the lottery numbers later.


THOM LANGFORD. Yeah, they won't be right for this Wednesday, but maybe for about 7 Wednesdays in advance.


GRAHAM CLULEY. So it's not that this chap from Binance actually went out a bit, had a few lagers, got a bit crazy, went on Zoom, and gave out some bad financial advice.


THOM LANGFORD. Do you know what? That was my initial thought. He went on a bender, got some gear up his hooter and he was off.


CAROLE THERIAULT. Although I suspect this argument will be used in the very near future when someone does something fucking stupid online, right?


THOM LANGFORD. Yeah, yeah. It wasn't me, it was a deepfake.


CAROLE THERIAULT. Exactly.


THOM LANGFORD. You were in my office.


GRAHAM CLULEY. That'd be Shaggy's next song. It wasn't me, it was a deepfake.


CAROLE THERIAULT. Now, obviously, or maybe not obviously, but obviously to me, 'cause I'm very, very smart. No, no, but obviously audio deepfakes would be easier to fake, wouldn't they? Yes, yes. Because you only have the audio to worry about, not having to match up the video as well. And all that kind of stuff. Like, so it could be fairly easy for me to, you know, someone like Graham, right? You're a podcaster, or Thom, and I could get so much audio from you that I could probably get some audio deepfake to convincingly have you sing I'm a Little Teapot, short and stout.


GRAHAM CLULEY. Yeah.


CAROLE THERIAULT. So the idea is like, how would we be able to tell?


GRAHAM CLULEY. Yeah.


CAROLE THERIAULT. I was just on the MIT deepfake site because they have a few tests there, link in the show notes again, and I was feeling quite smug because I was like, oh, of course that's not Trump, and oh, I can see that. No, no, no, no, no. But I got screwed on one. Like, I literally— yeah. So, so I was thinking about this, but according to The Conversation, researchers at the University of Florida developed a technique that measures the acoustic and fluid dynamic differences, I know it sounds complicated, between voice samples created organically by human speakers and those generated synthetically by computers.


GRAHAM CLULEY. Yes, that's going to be my suggestion of how to tackle the problem. Yes, it seems fairly straightforward.


THOM LANGFORD. Fluid dynamics, something, something differences.


CAROLE THERIAULT. Okay, so this group hypothesized that deepfake audio samples would not be constrained by the same anatomical limitations that us humans have. And if machines could detect that difference, could that not be very helpful in detecting fake messages or fake, you know, voice alerts? And they were right. The testing results not only confirmed their hypothesis, but revealed something super interesting. It was common for deepfake audio to result in vocal tracts with the same relative diameter and consistency as a drinking straw, in contrast to human vocal tracts. So they were much more limited and not as variable in shape.


GRAHAM CLULEY. Oh, okay.


CAROLE THERIAULT. Right? So by estimating the anatomy responsible for creating the observed speech, their findings suggest that it's very possible to identify whether the audio was generated by a person or a computer.
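[The "drinking straw" observation lends itself to a small worked example. Under the standard textbook simplification that the vocal tract is a uniform tube closed at one end, its resonances sit at Fn = (2n - 1) * c / (4 * L), so a measured formant frequency implies a tract length, and a length far outside the human range is a red flag. This is a hypothetical sketch of the general idea only, not the University of Florida team's actual method; the function names and the plausibility thresholds are invented for illustration.]

```python
# Sketch: infer a vocal tract length from a formant frequency using the
# quarter-wavelength resonator model, and flag anatomically implausible
# results (e.g. a "straw-sized" tract, as found in some deepfake audio).
SPEED_OF_SOUND = 343.0  # m/s in air

def tract_length_from_formant(f_hz: float, n: int = 1) -> float:
    """Invert Fn = (2n - 1) * c / (4 * L) to get tract length L in metres."""
    return (2 * n - 1) * SPEED_OF_SOUND / (4.0 * f_hz)

def plausibly_human(f1_hz: float, lo_m: float = 0.12, hi_m: float = 0.22) -> bool:
    """Adult vocal tracts are very roughly 12-22 cm long (assumed bounds);
    a first formant implying a tract far outside that range is suspicious."""
    return lo_m <= tract_length_from_formant(f1_hz) <= hi_m

print(tract_length_from_formant(504.4))  # approx 0.17 m, a typical adult tract
print(plausibly_human(504.4))            # True
print(plausibly_human(5000.0))           # False: implies a straw-sized tract
```

A real detector estimates these dimensions continuously across an utterance and looks at how (im)plausibly they vary, but the inversion step above is the core intuition.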


GRAHAM CLULEY. That's interesting. At the moment, at least, because that could be introduced later. So for instance, if I understand correctly, a very simple characteristic would be: here I am speaking quite closely to the microphone, right? And if I spoke like this the entire time, that would be unusual. But maybe in a normal human conversation, sometimes you would move further away.


CAROLE THERIAULT. Yeah, exactly like Thom's been doing this whole episode. Doesn't sit still for one second.


THOM LANGFORD. I never sit still.


CAROLE THERIAULT. Feels like he's on a unicycle.


GRAHAM CLULEY. Oh, the chair is creaking, creaking. As long as you're not on a pogo stick, that's all we care about.


THOM LANGFORD. Pogo stick.


GRAHAM CLULEY. Yes.


CAROLE THERIAULT. Graham. So, so these—


THOM LANGFORD. What?


CAROLE THERIAULT. These deepfakes, they may become, you know, the new ransomware of tomorrow, which is very scary for a lot of us. And we've been talking about it for a while, but now we're seeing examples of it. According to MIT, these are the things that you want to look for. We've talked about this before, but it's worth just going through really quickly. So you want to look at cheeks and foreheads. The wrinkles in these areas are often non-existent. So yay for wrinklies, and screw poor plastic surgery people!


THOM LANGFORD. Are we back to Graham's story?


GRAHAM CLULEY. Oh my goodness.


CAROLE THERIAULT. I'm not even— I'm ignoring that. Um, so shadows in the areas around the eyes, nose, and open mouth. Shadows would, uh, often be poorly formed. Glasses. So position and angle of any lighting glare in the lenses, right? Should shift correctly as the head moves.


GRAHAM CLULEY. No one's going to do this.


CAROLE THERIAULT. Yes, they will. They're gonna have to. 'Cause there's no technology at the moment that we can really reliably—


GRAHAM CLULEY. I've got another answer. I've got another answer, right? Deepfakes are clearly going to become a bigger and bigger problem, which means we should stop trusting people that we communicate with via computers. We should—


CAROLE THERIAULT. Yes, so don't listen to a word that we say, listeners.


THOM LANGFORD. Get back in the office. Get back in the office.


GRAHAM CLULEY. Yes, like Jacob Rees-Mogg would recommend. I think we have to start having face-to-face meetings again. And even then, you can't be sure if they're not a hologram. So carry a baguette with you.


THOM LANGFORD. Holograph.


GRAHAM CLULEY. Carry a baguette and whack them, so you check that they're solid as well. That is, for now, the best way to conduct any important meeting.


CAROLE THERIAULT. See, I hate this, because I actually don't really like looking at people very much, right? Like, and I—


GRAHAM CLULEY. I don't like looking at you either, Carole.


CAROLE THERIAULT. I have a very old television, and part of my love for my old television is it's not high-def. So I don't have to look at people's pores, you know, or war paint on like US newscasters. And now I've gotta study their faces just to make sure they're not lying to me and they're full of shit. Anyway, so deepfakes, they're a-coming.


GRAHAM CLULEY. Cybersecurity continues to be a hot topic. It's relevant for all of us, no matter what field we are in. And the Cyber Security Inside podcast is a fantastic resource to stay up to date on the latest news and trends, whether you're a security expert or just want to know more about the subject. The Cyber Security Inside podcast is hosted by Tom Garrison and Camille Morhardt, and they and industry guests make it easy to understand and learn more about today's most important security topics. Recent episodes have included the ethics of AI and machine consciousness, where we're headed with the cloud, how small businesses can get access to cybersecurity resources, ransomware viruses, and so much more. Every episode, you will walk away smarter about cybersecurity and have fun while you're at it. So what are you waiting for? Check out cybersecurityinside.com/smashing to listen to the latest episode. That's cybersecurityinside.com/smashing, or search for Cyber Security Inside wherever you listen to podcasts. And thanks to them for supporting the show.


CAROLE THERIAULT. October is Cybersecurity Awareness Month, and Bitwarden would like to remind everyone about key actions that the US Federal Agency for Cybersecurity recommends that you take. Number 1, use strong passwords. Bitwarden can generate and store strong passwords for you. And 2, enable multifactor authentication on all your accounts, including your password manager. And of course, it's recommended that you keep your software up to date and take steps to recognize and report phishing. Bitwarden supports security for all with fully featured free accounts available to everyone. This Cybersecurity Awareness Month, protect yourself and help protect loved ones by educating them on password security and starting up a free Bitwarden account today at bitwarden.com/SmashingSecurity. Smashing. That's bitwarden.com/smashing. And thanks to Bitwarden for sponsoring the show.


GRAHAM CLULEY. Kolide sends employees important, timely, and relevant security recommendations for their Linux, Mac, and Windows devices right inside Slack. Kolide is perfect for organizations that care deeply about compliance and security, but don't want to get there by locking down devices to the point where they become unusable. So instead of frustrating your employees, Kolide educates them about security and device management while directing them to fix important problems. Sign up today by visiting smashingsecurity.com/kolide. That's smashingsecurity.com/kolide. Enter your email when prompted, and you will receive a free Kolide goodie bag after your trial activates. You can try Kolide with all of its features on an unlimited number of devices for free, no credit card required. Try it out at smashingsecurity.com/kolide. That's smashingsecurity.com/kolide. And thanks to Kolide for supporting the show. And welcome back. Join us now for our favourite part of the show, the part of the show that we like to call Pick of the Week.


CAROLE THERIAULT. Pick of the Week. Pick of the Week.


GRAHAM CLULEY. Pick of the Week is the part of the show where everyone chooses something they like. Could be a funny story, a book they've read, a TV show, a record, a movie, a podcast, a website, an app, whatever they wish. Doesn't have to be security-related necessarily.


CAROLE THERIAULT. Better not be.


GRAHAM CLULEY. Well, my Pick of the Week this week is not security-related. My Pick of the Week this week is The Joy of Sets.


CAROLE THERIAULT. Of what?


GRAHAM CLULEY. The Joy of Sets. Are you familiar with this?


CAROLE THERIAULT. Did you design this entire show for Thom Langford?


GRAHAM CLULEY. So, yes, I did actually. You did?


CAROLE THERIAULT. So, Thom, be honoured, because he didn't think about me or what I think about any of this.


GRAHAM CLULEY. Oh, Carole, you may have misheard me. I said sets.


CAROLE THERIAULT. S-E-T. Like badger sets?


GRAHAM CLULEY. Not quite badger sets.


THOM LANGFORD. Okay.


GRAHAM CLULEY. As in the TV, television sets, because The Joy of Sets, link in the show notes, is an archive the BBC have put together of more than 100 empty sets from different shows which they've had over the years. There's no actors getting in the way. It's just the beautiful set of, for instance, the Liberator from Blake's 7.


THOM LANGFORD. I've just opened that one. And there's 7 people in it.


GRAHAM CLULEY. Oh, are there? You see?


THOM LANGFORD. But you're right. They're virtually all, every other one is virtually empty.


GRAHAM CLULEY. There you go. There you go. And so it's kind of curious, and it's a wonderful sort of— It's brilliant. And you know how you can use these, Thom, don't you? Because you love a green screen.


THOM LANGFORD. Yeah.


GRAHAM CLULEY. You could put yourself on the multicoloured swap shop in place of Noel Edmonds, Top of the Pops, Steptoe and Son, any of these things, Fawlty Towers. Anyway, I thought it's marvellous. I like it very much.


THOM LANGFORD. Is brilliant. I love it.


GRAHAM CLULEY. This is The Joy of Sets.


CAROLE THERIAULT. You see, I didn't grow up here, so a lot of these shows were before my time.


GRAHAM CLULEY. It's very nostalgic, but if you like old TV shows, Yes Minister, Grange Hill, Only Fools and Horses, you're going to like The Joy of Sets.


THOM LANGFORD. The Good Life.


CAROLE THERIAULT. Oh, Thom.


GRAHAM CLULEY. Oh, Barbara.


CAROLE THERIAULT. Oh, Thom.


GRAHAM CLULEY. Oh, she's so sexy.


THOM LANGFORD. Yes.


GRAHAM CLULEY. Wasn't she, eh?


CAROLE THERIAULT. Yeah, let's objectify them now.


GRAHAM CLULEY. Right. Okay. And that is why it is my pick of the week. Thom, what's your pick of the week?


THOM LANGFORD. My pick of the week is the Steam Deck.


GRAHAM CLULEY. What's that?


THOM LANGFORD. So there's a company, founded back in 1996, called Valve, Valve Software. And they produced a series of games called Half-Life. Initially based on the Quake engine, wildly successful games. They've spawned multiple sequels, huge amounts of community love, et cetera. Valve evolved from just a software company into a hardware company, into a services company. They launched, I think it was in 2002, 2003, something like that, the thing called Steam, which is a gaming platform where you buy your games online, you store them in their library, or your library. You have the software on your computer and you can download and delete and upload, and it will store all of your high scores, and it's where you download your patches, and it's very community-based. It's great, it's fantastic. Primarily for Windows PC, but there is some support for Mac games on there. So I'm a gamer from the days of yore, from the '90s. I cut my teeth on Doom and Doom 2 and Quake, et cetera. Had the regular LAN parties. And I used to go to Steam and, you know, download some old games that I used to like. They have plenty of new stuff there as well. The problem being, since I'm an Apple person all round, I was quite limited in what I could actually find to play, until the Steam Deck came along. Now, the Steam Deck is like a handheld console, slightly larger than the Nintendo Switch. So a big screen in the middle with, sort of, you know, connected joypads on the side. It's effectively a self-contained battery-powered computer. It runs SteamOS, which is a form of Linux, on an AMD processor. Wi-Fi, etc., etc. And it connects directly to Steam and you can download virtually, I think it's something like 90+% of Steam's catalogue of titles, and run them all from a battery-powered handheld, or you plug in a little USB-C hub and you put it up onto your TV or your screen and run them off there.
With a Bluetooth controller, or even, if you're like me, old school with Quake and stuff like that, a keyboard and mouse. And yep. And to top it all, you can also go into desktop mode and you've got a standard Linux desktop, which you can then download and run all the productivity tools on. So if you're out and about and then suddenly you're in a pinch and you need to make a Teams call or jump onto the internet for something, you can still do that. There's trackpads on either side that you can use as a mouse if you need, there's an on-screen keyboard. It's fabulous. I've had it about 4 days and I love it. I love it.


GRAHAM CLULEY. You love a gadget, don't you?


THOM LANGFORD. I love a gadget. I've downloaded the entire Myst series. Do you remember that game, Myst?


GRAHAM CLULEY. Oh yeah, Myst.


THOM LANGFORD. Yeah, the entire Myst series. I've got Quake, Quake II, Quake III Arena, Quake Champions. I've got a whole bunch of, you know, various other games on there I'm playing around with. And it's just so convenient. I don't have to buy a dirty great big gaming machine. It's just brilliant. I'd check it out. The link is in the show notes. Wonderful piece of kit.


GRAHAM CLULEY. Thom, I've got a question for you. When do you find time to masturbate?


THOM LANGFORD. I know, I know.


CAROLE THERIAULT. Oh my God.


THOM LANGFORD. Well, that's when the web browser comes in handy on it.


CAROLE THERIAULT. Can we?


GRAHAM CLULEY. This sounds like a very impressive game. So it plays the games well. It doesn't crash and things. It's very reliable.


CAROLE THERIAULT. It's not cheap.


THOM LANGFORD. It's about £500 for the top-end one. Yeah.


CAROLE THERIAULT. £350 for the low end.


THOM LANGFORD. Yeah.


GRAHAM CLULEY. But I guess the thing is that this means all of your games are now portable, and it's more portable than a laptop, for instance, and it's more set up for gaming.


THOM LANGFORD. It's perfect for somebody who's either trying to relive their youth a little bit, you know, and can access all these old games, or somebody who's just a casual, every now and then gamer, doesn't want to invest in a, you know, a big old gaming rig or anything like that.


GRAHAM CLULEY. Are there free games on Steam as well? Because I am a cheapskate, you see, having spent £400 or whatever it is on this device, I, I probably wouldn't be happy actually spending any money on the games.


THOM LANGFORD. There's loads of free games. And also some of the older games, the ones that you and I grew up with, are like £3, £4.


GRAHAM CLULEY. So, Thom, you've had this 4 days. That's what makes me a little bit nervous, because I think you do get very excited about your toys.


CAROLE THERIAULT. You don't have to buy it right now.


THOM LANGFORD. Just wait a few months.


GRAHAM CLULEY. Well, exactly. That's what I want. I want Thom to come back in a few months and tell us if he's still playing on this or whether the— what are they? The joystick knob. Whether that's—


THOM LANGFORD. Yeah, whether the knob's fallen off.


GRAHAM CLULEY. Whether your knob's fallen off. That kind of thing. That's what I want to know.


CAROLE THERIAULT. You've heard this episode is exclusive. Excruciating.


GRAHAM CLULEY. Carole, what's your pick of the week?


CAROLE THERIAULT. So, my pick of the week is a TV show, a series called Am I Being Unreasonable? And it's flipping great. It stars Daisy May Cooper. She sports this massive shearling taupe coat and these '70s shades and these bedazzled bootcut leggings. And she wanders around this English village, very grumpily and friendless. That's how it opens up. Graham, I got you to watch it, and I think you hoovered up the whole series as well.


GRAHAM CLULEY. I did, yeah. I wasn't sure after the first episode. I was a bit confused as to what was going on because it is very twisty-turny, very twisty-turny. But by about episode 3, I was hooked. And then episode 4, oh my goodness, the whole world has changed. What is going on now? This is on BBC iPlayer, Am I Being Unreasonable? And it's very good.


CAROLE THERIAULT. What's kind of cool about it is you kind of watch the first episode and you think you get the gist. You get a few little glimpses of what you think might be going on. And like for people like me, you're like, oh, I get it. I get it. I know what's happening. Da da da da da da da. I've got it. Got it. Got it. And she totally just veers. You'll get it categorically wrong.


GRAHAM CLULEY. Right? Multiple times.


CAROLE THERIAULT. Multiple times. And it's a bit like Murder, She Wrote, where it's really not obvious.


GRAHAM CLULEY. It's nothing like Murder, She Wrote.


CAROLE THERIAULT. Don't listen to it.


GRAHAM CLULEY. It's nothing like Murder, She Wrote. I'll tell you what it's a little bit like. It's a little—


THOM LANGFORD. Columbo.


GRAHAM CLULEY. It's a little— No. It's a little bit like Fleabag in terms of—


CAROLE THERIAULT. Yeah, it is.


GRAHAM CLULEY. It's funny, but there's also a lot of darkness there. And—


CAROLE THERIAULT. That's what I compare it to as well. Like, it's a comedy thriller, they call it, but it is noir. Interestingly, The Guardian only gave it 3 out of 5 stars, saying it doesn't cohere. And I don't agree. I thought it was really fresh, very funny, quirky, surprising, and very charming. I was like, again, BBC, very cute.


GRAHAM CLULEY. The one I was really impressed with was an actor called Lenny Rush, who plays her son. And I thought he stole just about every scene he was in. Because usually, when I see a child in a TV show, I kind of think, "Oh, God, they're going to be painful." He was so funny.


CAROLE THERIAULT. No, totally agree. But I think it's the relationship between the three, you know, between all of the actors that really makes it, because they all are strong. But somehow there's like a little fizz magic between all of them and how it works. 'Cause she's a tall lady, right? Daisy May Cooper is no wallflower. And her son is, of course, very small. So there's a kind of comedy effect of that, but then there's such a tenderness in the relationship. It's great.


THOM LANGFORD. I'm gonna look at that tonight, I think.


GRAHAM CLULEY. I'd recommend it as well. I'd recommend if you're not sure after the first episode, keep going.


THOM LANGFORD. Keep going.


CAROLE THERIAULT. Yeah, if you're exactly like Graham.


GRAHAM CLULEY. Just younger and better looking. And if you think you've worked it out, if you think, "Oh, I know what this is," it will be rewarding, I think.


CAROLE THERIAULT. That's called Am I Being Unreasonable? on BBC iPlayer. It may be other places. Well, I don't know. And that is my pick of the week.


GRAHAM CLULEY. Great pick of the week, Carole. And that just about wraps up the show for this week. Thom, I'm sure lots of our listeners would love to follow you online and find out what you're up to. What's the best way for folks to do that?


THOM LANGFORD. Oh, I'm on Twitter at @ThomLangford. That's Thom with an H. Uh, you can also get me on my, uh, other day job as a podcast host of Host Unknown at podcast.hostunknown.tv.


GRAHAM CLULEY. Terrific. And you can follow us on Twitter @smashinsecurity. No G, Twitter wouldn't allow us to have a G. And we also have a Smashing Security subreddit. And don't forget, to ensure you never miss another episode, please follow Smashing Security in your favorite podcast app, such as Apple Podcasts, Spotify, and Google Podcasts.


CAROLE THERIAULT. And deep, deep thank yous to our episode sponsors Bitwarden, Kolide, and the Cyber Security Inside podcast. And of course, to our wonderful Patreon community. It's thanks to them all that this show is free. For episode show notes, sponsorship info, guest list, and the entire back catalog of more than 290 episodes, check out smashingsecurity.com.


GRAHAM CLULEY. Until next time, cheerio. Bye-bye.


THOM LANGFORD. Bye.


GRAHAM CLULEY. I'm recording locally.


THOM LANGFORD. Yep, same here. I am recording locally. I can see the little red line going across.


GRAHAM CLULEY. What do you use, Audio Hijack?


THOM LANGFORD. Uh, GarageBand.


GRAHAM CLULEY. Oh, hello.


CAROLE THERIAULT. Who the fuck is that?


THOM LANGFORD. That was me.


GRAHAM CLULEY. Who's the amateur on the show?


CAROLE THERIAULT. Who's the amateur on the show?


THOM LANGFORD. The thing is, it comes up on my phone, on my—


CAROLE THERIAULT. Have you heard of this feature called Do Not Disturb?


THOM LANGFORD. Yeah, alright, alright, chill chick. It comes up on my bloody— Bloody— Oh, there it is. God, I can't remember.


CAROLE THERIAULT. This is what happens when you invite geriatrics on the show.


THOM LANGFORD. Computer! That's it.


GRAHAM CLULEY. Oh, that was the word. The word was computer. We were all wondering.


THOM LANGFORD. Bloody hell. It's what happens when I get invited on other people's podcasts. I start to lose all my wordy things.


CAROLE THERIAULT. Gorgeous listeners, and this is how we are starting this show. So wish us luck.

-- TRANSCRIPT ENDS --