A hospital gets hacked because of an ex-employee's grudge, robocalls are on the rise, and we share a scary story about the future of facial recognition.
All this and much much more is discussed in the latest edition of the "Smashing Security" podcast by computer security veterans Graham Cluley and Carole Theriault, joined this week by Michael Hucks.
Visit https://www.smashingsecurity.com/162 to check out this episode’s show notes and episode links.
Follow the show on Twitter at @SmashinSecurity, or on the Smashing Security subreddit, or visit our website for more episodes.
Remember: Subscribe on Castbox, Apple Podcasts, or your favourite podcast app, to catch all of the episodes as they go live. Thanks for listening!
Warning: This podcast may contain nuts, adult themes, and rude language.
Theme tune: "Vinyl Memories" by Mikael Manvelyan.
Assorted sound effects: AudioBlocks.
Special Guest: Michael Hucks.
Sponsored By:
- LastPass: LastPass Enterprise simplifies password management for companies of every size, with the right tools to secure your business with centralized control of employee passwords and apps.
- But, LastPass isn’t just for enterprises, it’s an equally great solution for business teams, families and single users.
- Go to lastpass.com/smashing to see why LastPass is the trusted enterprise password manager of over 33,000 businesses.
- DomainTools: DomainTools helps security analysts turn threat data into threat intelligence. Its solutions give organizations the ability to use and create a forensic map of criminal activity, assess threats and prevent future attacks.
- Learn more about their products at domaintools.com, or visit domaintools.com/smashing to enter their Capture The Flag competition and be in with a chance to win a $100 gift card.
Links:
- YOU Season 2 Trailer — YouTube.
- Hospital administrator sacked for using NHS computer to download over 10,000 records is spared jail — Daily Mail.
- Robocalls: Americans got 58.5 billion in 2019, up 22% from 2018 — USA Today.
- Microsoft and Google just can't agree on proposed ban on facial recognition — ZDNet.
- Clearview - Technology to help solve the hardest crimes.
- The Secretive Company That Might End Privacy as We Know It — New York Times.
- Clearview FAQ (PDF).
- Episode review: Columbo Double Shock — Graham got it wrong. It was Martin Landau, not Leonard Nimoy, who played the twins. And they weren't surgeons (but Nimoy did play an evil surgeon in a different Columbo episode that season).
- Eunoia: Words that Don't Translate.
- Dog wagging her tail every time she sees her owner — YouTube.
- She Said: Breaking the Sexual Harassment Story That Helped Ignite a Movement — Amazon.com.
- Harvey Weinstein Paid Off Sexual Harassment Accusers for Decades — New York Times.
- ‘She Said’ Recounts How Two Times Reporters Broke the Harvey Weinstein Story — New York Times.
- Smashing Security merchandise (t-shirts, mugs, stickers and stuff).
Privacy & Opt-Out: https://redcircle.com/privacy
Transcript
This transcript was generated automatically, and has not been manually verified. It may contain errors and omissions. In particular, speaker labels, proper nouns, and attributions may be incorrect. Treat it as a helpful guide rather than a verbatim record — for the real thing, give the episode a listen.
MICHAEL HUCKS. This is something I flippantly recorded. It is probably going to be the most famous video that I'll ever make in my life. It's a little... Settle that. How much is the dog getting?
GRAHAM CLULEY. What percentage is the dog getting? And did you ask the dog's permission before uploading it to the internet? What about facial recognition? There's a lot of similar looking dogs out there. Oh, I didn't even think to ask.
ANNOUNCER. Smashing Security. Episode 162: Robocalls, health hacks, and facial recognition fears. With Carole Theriault and Graham Cluley.
GRAHAM. Hello, hello, and welcome to Smashing Security episode 162. My name's Graham Cluley.
CAROLE. And I'm Carole Theriault.
GRAHAM. And we're joined this week by an old guest. Well, he's not old in years, but he was last on the show a couple of years ago. It's Michael Hucks from PC Matic. Hello, Michael.
MICHAEL. Hello, everyone. Good to be here.
GRAHAM. Thanks for coming back on the show.
MICHAEL. It's my pleasure. It's been way too long.
GRAHAM. So, Carole, a funny thing happened on the sofa last night.
CAROLE THERIAULT. Do I want to know?
GRAHAM. I'm not sure I should tell you or not, because maybe I shouldn't share all the details with you.
CAROLE. I mean, is it about me?
GRAHAM. This is the thing, it does touch upon you. So my wife and I, we were watching a TV show on the old Netflix, right? And the main female character in the show, right, I say to my wife... I shouldn't really be telling you this at all. Let's just move on.
CAROLE. Was she really cool, smart, sassy, fun?
GRAHAM. No, I said she was kind of irritating. And my wife laughed and she said, "That's funny, because she really reminds me of Carole."
CAROLE. And then you both died laughing. And you went, oh my God, you're right.
GRAHAM. No, no, no. I didn't say that, obviously. But I thought you'd be interested in hearing that.
CAROLE. You know, I'm actually hurt.
GRAHAM. And if anyone else wants to try this out. Oh, yeah. Who is it?
CAROLE. Who is it? Who is this person? Are we going to know who this mystery person is? Yeah, whoa. I can't believe that didn't occur to me. Yeah, yeah. Come on, come on.
GRAHAM. It's not anyone famous. Her name is Victoria Pedretti. And she is in the second season of You, which is all about a guy who stalks women in order that they fall in love with him.
MICHAEL. I am also watching that show right now. I'm on episode 10 of the first season, so no giveaways here but I'm in. So yeah, so this is the second series.
CAROLE. Whoa, okay, so you've watched it, so this is what, the girlfriend in You?
MICHAEL. I don't believe I've made it to this person being a part of the show yet. I'm still in season one.
GRAHAM. The main woman in series two, her character's name is Love Quinn, which is a fairly ridiculous name, especially when you find out that her brother is called Forty. I imagine her parents were fans of tennis or something, so you have Love and Forty. But anyway, so the actress, I'm sure she's a lovely actress, but I find her character extremely irritating. And like me. No, no, I didn't say it was like you. Do you agree, though? We don't have enough time to discuss this tittle tattle in detail. Tell us what's coming up on the show this week.
CAROLE. Well, yeah, no, yeah, here's a little insult and now go do your job. Yeah, what a shit sandwich this is. Play the music, play the music. Well, first, we should thank this week's sponsors, DomainTools and LastPass. Their support helps us give you this show for free. Now, Graham dives into the murky case of a hospital hacker. Mikey waxes lyrical about his absolute love for robocalls. And I'm sharing a crazy scary story about a secret facial recognition tool. All this and oh so much more coming up on this episode of Smashing Security.
GRAHAM. Now, chums, chums, I want to talk to you today about grudges. Have you ever had a grudge? I do now. You have? Yeah. Three years of my life. Taking a dislike to someone. Well, I'll tell you about someone who has a grudge. His name is Daniel Mooney. And he was an administrator at a hospital in Great Britain. And he lost his job three years ago because he'd been caught remotely accessing the internal network of the heart and lung department where he worked, of the Royal Stoke Hospital, from his home computer. And he was accessing that network, of course, without authorization.
CAROLE. Whoa. Yeah, so naughty. Well, that's just naughty. How the heck could he get in and do that?
GRAHAM. Well, you know, username and password, I guess.
CAROLE. Right.
GRAHAM. And it's the NHS network, and I imagine there weren't sufficient defences in place to prevent people from logging in from remote IP addresses. Well, the hospital came down on him hard and he lost his job and he was also cautioned by the police. And as part of his caution, he agreed that he would not access any of the hospital's IT systems in future and he would not even enter the hospital unless he was unwell or visiting a patient.
MICHAEL. Well, okay, yeah, it seems like... You mean he can't come to the cafe and grab a sandwich?
GRAHAM. Well,
GRAHAM. You know, he might get a job serving sandwiches or maybe he's a painter and decorator. It's his new job and he has a big commission at the hospital. Anyway, he's not allowed to come to the hospital unless he's got a broken leg or a splinter or something like that.
And not have any contact with hospital staff unless asked to by the HR department. So I guess HR were thinking, well, if he has any knowledge or if he knows any past.
MICHAEL. Yeah, but the HR department is not his HR department if he doesn't work there anymore.
GRAHAM. Well, this was the deal. This was he was given the caution and he agreed to these terms. Right. And so the police didn't take any further action.
MICHAEL. Okay. I have a lot of questions here, but fine.
GRAHAM. Well, you may have even more questions when you find out this, because it turned out that when he had accessed the network, he hadn't actually accessed any sensitive data. So what he got fired for, he hadn't actually done what he wanted to do. He had connected, but he hadn't accessed any sensitive data, right?
And he was really peeved about this, right? Imagine someone who's very annoyed. So he's annoyed that he's being treated like a criminal, even though he did nothing wrong in his mind.
Well, in his mind, he might have thought it was. I mean, it's very strange, this case, because a bit of me thinks, was he actually testing the systems to see if it was possible? Certainly, if I was in his shoes, maybe I would have used that kind of defense.
MICHAEL. Sure. Or was he doing work? If he wasn't accessing sensitive data, do they know why he was accessing the system?
GRAHAM. Right. Because sometimes you might log in from home to a corporate network and think, well, I can do a little bit of work. It's easier this than me driving into the office. And some organizations have got problems with that, quite understandably, and others are much more lax about it.
CAROLE. So he's feeling a bit righteous. He's thinking, okay, I did something a little bit, maybe naughty a tiny bit, but certainly not something that deserves me losing my job and all that. And not being allowed to go to the hospital unless he was unwell.
Yeah, because I really want a sandwich. I love those egg sandwiches. They have really good sandwiches.
GRAHAM. Actually, they do. I've been to some fantastic little shops in hospitals.
And wouldn't it also support the hospital trust more if you were to go and frequent those stores rather than drive on the high street? I don't know. But anyway, he, as a result of this, of being peeved, he launched an appeal against the police caution saying, you know, this is just, this is overkill.
And that appeal was unsuccessful. And you know what that meant? He's even more peeved. Now he's got the hump, right?
Of course he does. He looks like a camel. He looks like Quasimodo. He thought, and he believed he wasn't the only person who had been remotely accessing the hospital network.
CAROLE. Oh, so he thought, I mustn't be the only person, but I'm taking the fall, I'm the fall guy for everybody else.
GRAHAM. He's the fall guy. He's taken the brunt of all of this, whereas maybe other people should have been as well.
And the mistake he then made was to allow that grievance to grow inside him and take over all of his feelings. And in December 2017, months after Mooney had been dismissed, the hospital's head of cybersecurity noticed something a little bit strange.
They discovered that there was an unauthorized user with admin rights to the server. They thought, this is a little bit suspicious. Why is this extra user with all of these rights? Who could that be?
Was it Mooney? Well, yeah. As the police searched Mooney's home, they found two disk drives containing documents related to the disciplinary process that he'd been through. Remember, he got dismissed, right?
And he had all these internal documents, documents which hadn't been shared with him about what was going to happen with Mooney and what the process was and the communications between the managers. He'd managed to get hold of that.
And furthermore, he'd also accessed 600 staff-related documents, 150 management documents, and almost 9,000 medical images of heart scans, sort of cardiac-related stuff from his department. So was he just grabbing everything he could get his hands on?
Well, I don't know. I mean, it seems a strange thing to collect, doesn't it? I mean, some people have got foot fetishes, right? I was waiting for one of you to say yes. Look at that x-ray.
CAROLE. What would you call those people?
GRAHAM. People into x-ray. Well, you know. That's x-ray-ted. I'll see myself out.
CAROLE. So was he potentially, was he trying to maybe build an app and he needed that information? Was he that kind of guy?
GRAHAM. Or was it a porn site for x-ray fetishists? Who knows? We could speculate. Yes.
MICHAEL. That's the thing. Maybe he was thinking if he just comes in and grabs as much as he can possibly grab, just to get off the system quickly, and then he can scan through it in his own private home, rather than spending hours and hours and hours on the system.
GRAHAM. I don't know why he grabbed all this data, including the medical data. One theory I would have would be maybe he's grabbing all this data to go back to them and say, "Aha, look, you've got security issues. Aren't I a hero? Maybe you should reinstate me in your IT department because I can fix these kind of problems."
I mean, foolhardy as that was, particularly as he'd already had a police caution, maybe that was incentive to do it. He's obviously knitting with one needle, right?
Well, he admitted an offence this last week under the Computer Misuse Act. And I wonder if you agree with this or not — he avoided jail. He has not been jailed. Do you think he should have been jailed, bearing in mind he's been warned before?
CAROLE. He's stolen this stuff, but he hasn't done anything with it. He hasn't demanded ransoms. He didn't post them on the web somewhere, or we don't know, I suppose.
GRAHAM. Well, as far as we know, that didn't happen, but it was being stored on his computer at home instead.
CAROLE. Yeah, I don't think you should go to jail. Maybe not jail.
MICHAEL. I feel like he was warned. He did have an agreement that he wasn't going to do this. It's not like it just came out of nowhere.
CAROLE. He should have aggression management courses. You know, how not to hold a grudge, how to forgive and forget, how to make friends.
GRAHAM. I think there's a lot of people who need to know how not to hold a grudge.
CAROLE. Yeah, true. But some of us have a reason to, there, sunshine.
And I don't think 15 minutes is long enough.
GRAHAM. Instead of being given a jail sentence, he got a 12-month community order, which includes 160 hours unpaid work — not at the hospital — and he must pay £2,000 in prosecution costs as well.
You see, England is great. That would be unheard of in the States, right, Mikey?
CAROLE. I think so. Yeah, you'd go to jail. Really? Yeah, of course you would. People go to jail for crazy, tiny things.
MICHAEL. I'm actually doing this podcast from jail right now. Great Wi-Fi.
GRAHAM. Yeah, it's really not bad.
So this does raise a few questions. First of all, should the patients be notified that their personal heart scans have been breached in this way?
CAROLE. If the information is identifiable in any way, I would say yes. If someone is at risk because their information was downloaded without authorization, telling those people would be the right thing to do.
GRAHAM. Right. And the other thing which, of course, it raises is this whole issue of what should you do when someone leaves your employment, particularly if they leave under a cloud?
MICHAEL. Kill them. That's what they do in the States, isn't it? There's no other option. Isn't that what you do in America? Yeah, this is true. That's the only reason I'm still working here, really. I'm very afraid. If anything happens to me, you know where to look.
GRAHAM. Even if you don't leave under a cloud, passwords should be changed. But it's not easy if you're an organisation as sprawling as the National Health Service, which has got legacy systems, and they hardly have staff lolling around drinking martinis. It's not like they haven't got enough work to do already.
So make it part of your HR off-boarding process. Just like when people come into your company, you set them up with an account and give them a computer, there needs to be some sort of tick list of "this person's leaving." But it's not always easy, particularly when people are getting fired from the personnel point of view. You have to speak to an IT guy to remove someone else's passwords, and you don't want them blabbing if they haven't quite left the building yet.
CAROLE. Do you know what I find annoying, though, about this? This is the NHS, and the NHS (certainly in my anecdotal experience, and many others') has a reputation for having a pretty solid checklist so that they don't leave scissors inside you or cut off the wrong leg. There are a lot of procedures and papers that need to be signed and agreed with the patient at every single stage to make sure those things don't happen. Now, sure, they might happen occasionally, but it's rare.
So surely that kind of system, wouldn't that be good to do for their IT systems? I don't know why they can't port that over to make sure — how could they sit there and go, "Oh, wow, we have an admin guy here that we have no idea has access to the systems. How long has he been there?" How does that happen?
MICHAEL. It's probably — I mean, I'd imagine it has something to do with an issue of priority. How many IT guys leave on a daily basis where they have to go through this thing? It's probably not nearly as much as how many times they have to make sure they're not leaving scissors inside of someone.
GRAHAM. So another thing you can do is use technology, of course, because obviously humans make mistakes, but you should have layers of protection, such as checking whether it's an external IP address which is accessing your internal system and maybe blocking those or going an extra step.
CAROLE. Yeah, because no one has any remote workers these days.
GRAHAM. Well, what I'm saying is that certain users may not be able to access certain systems from outside. Certainly ones which have sensitive medical information on them as well — you may question whether that's really necessary.
CAROLE. What's crazy about this story is he gets caught once and then, as you said, gets the hump and then goes for it again. He must really have thought he was untouchable, or that no one was smart enough to spot him.
MICHAEL. I want to hear the story from this guy's perspective. Daniel, if you're listening, come on the podcast. I'm sure he's an avid listener.
CAROLE. Who isn't? Yeah, you should get him on the show, Graham. I wouldn't mind having someone who also has a grudge, so I could talk about my grudge with him. He's obviously an expert in grudges, and he could give me some pointers on how I can channel this negative energy I feel.
GRAHAM. Michael, what's your story for us this week?
MICHAEL. My story this week is about robocalls. I have a personal grudge against this. I was telling Carole the other day, I get more robocalls on a daily basis than I get real calls. And I would say I get a fair amount.
GRAHAM. What's that? Does that mean one a day?
MICHAEL. I think I get on a bad day, I get six or seven robocalls in a day.
GRAHAM. Because I get none of these calls at all, right? No, nor do I. I never get any. Maybe it's because I'm British, I don't know, and they just have a common decency not to ring. So is it a robot which is doing the calling and then you get to speak to a human, or is it a recorded voice saying, "Hello, Michael, I wonder if you..."? Well, what actually happens? What is a robocall?
MICHAEL. It's all of those things. And I think it really depends on who it is. I mean, I've had ones before that are just a straight up robot voice that asked me to either press one to be connected with someone or press two to be removed from the call list. Of course that is after a full minute of this thing going on and on about the new law that it wants to tell me about or whatever it is. And then I tried for a while, I would actually wait for the thing to finish its message, and then I would press two to be removed from the thing, and then sometimes not even an hour later, I get a call from the exact same number with the exact same message. And it's just relentless.
GRAHAM. Are you suggesting that opting out of spam sometimes doesn't work? Are you suggesting that unsubscribing? They actually ignore that. Well, what a shock.
CAROLE. And you know what annoys me about that? In kind of trying to unsubscribe, you're also validating your phone number, the fact that it's active and that someone's answered the call. So suddenly you're a live prize.
MICHAEL. Yes. And I've heard, I mean, I don't know how much of this is just speculation, but I've heard that even just answering the phone call at all puts you on some kind of list that says, "Oh, even if they're not responding the way we want them to, this person will answer the call." And so you could even get called more. There was a thing in the past where if you wanted to opt out of real telemarketing calls, you could be added to the do not call list, and there was a law here in the United States that you had to be added to this list. The thing is with these robocalls, I mean, most of these are scams. I don't think they really care about the law, clearly. And so that's kind of the problem now: it's not only gotten worse, it's gotten a lot worse. According to a 2018 report by global communications platform First Orion, spam phone calls accounted for 29.2% of all mobile phone calls in the US in 2018, up from only 3.7% in 2017.
GRAHAM. That's astonishing. So almost a third of all mobile calls in 2018 were spam or robocalls of some nature. It's one in three, right? And it's grown that fast. It could be over half by now, couldn't it? I mean, does it not get to the point where you just think, well, I don't actually need a phone number, because I can communicate with all of my pals via instant messaging services, whichever one you choose to use? And you can even call them that way as well. I wonder if it's possible to live without a phone number.
MICHAEL. Imagine you're a small business owner, or even just a freelancer or a person who does contract work. You're probably getting phone calls fairly regularly from numbers that you don't recognize, and these are new people who've gotten your number somehow, and you want to be able to answer the phone. And this is the problem now: people are just not answering phone calls anymore from numbers they don't recognize, which I totally understand. But what if it is the hospital calling to say that one of your family members is in... I guess they'll leave a message. I don't know. Just to add to the kind of insanity of the statistics of this, we do have some statistics from 2019. According to YouMail, which is a tracking company, about 5 billion robocalls were placed in November of 2019 alone, which is more than 160 million phone calls a day, averaging 15.3 calls per American for the month. So when I read this, I suddenly started feeling a little bit better about my six or seven robocalls a day. I was like, "Okay, maybe that's not so bad." I mean, somebody in America is averaging more than 15 of these calls. I can't even imagine. It makes your phone unusable.
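[Editor's note: a quick sanity check on those YouMail figures, using an assumed US population of about 330 million; the population figure and the 30-day month are assumptions for illustration.]

```python
# Sanity-check the November 2019 robocall statistics quoted above.
calls_november = 5_000_000_000   # robocalls placed in November 2019 (YouMail)
days_in_november = 30
us_population = 330_000_000      # assumed, for illustration

per_day = calls_november / days_in_november
per_person_per_month = calls_november / us_population

print(round(per_day / 1_000_000, 1))   # millions of calls per day, "more than 160 million"
print(round(per_person_per_month, 1))  # calls per American over the month, roughly 15
```

The per-person figure is a monthly average, which is why it can sit alongside individual experiences of several calls per day.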
CAROLE. Well look, I have to tell you, now that you've been on the show again, reminding everybody that you exist, right, and that you pick up, you might go up.
MICHAEL. 555— actually I want to read Carole's number back to you real quick.
CAROLE. I have a question: have you ever received any robocall with political messages in it? Because you guys are up for an election this year, does that ever happen to you?
MICHAEL. I have not. I feel like it'd just be a bad move, because everyone is so annoyed with the robocalls. Even if there was some political candidate that I liked, if they started blowing my phone up every day with "Don't forget to vote for me," I'd be like, okay, this guy is definitely not getting my vote. Like, screw this guy. That's the problem.
GRAHAM. If there was a candidate that any of us liked. We have the same problem over here in the UK. But there is this thing, isn't there, where someone could do what's called a Joe job, where they start a campaign, a robocall campaign, promoting the opponent.
CAROLE. I was just going to say that. Right. I love it, yeah. You knew I had the same idea in my head. Not bad. It's a devious crowd. Don't trust you. Just like that character on the TV show. Don't trust him.
GRAHAM. Wasn't there a law that was brought in that they would be charged for every robocall or something like this?
MICHAEL. Yeah. President Trump just signed this anti-robocall bill into law, which is supposed to allow officials to fine companies $10,000 for each illegally placed call.
CAROLE. And $5 billion a month. Ka-ching!
MICHAEL. Yeah, exactly. And who's getting that money, by the way? I think I should get paid for every single one of the calls that I've had to put up with. But maybe there's more time for that later. But the thing I'm wondering about with that is if they're spoofing numbers and they're using this voiceover IP, we don't really know where these things are coming from or who's doing it. Is this going to be that effective to try and charge the people? Are they charging the companies or are they charging the actual phone service providers? I'm not sure which one it is.
CAROLE. And it makes sense that people just do not pick up their phone unless they recognize the number. I mean, honestly, I'm guilty of that. If I don't recognize the number, I don't pick up.
GRAHAM. Sometimes, Carole, if I do recognize the number, I don't pick up.
CAROLE. You know what? You're not being very nice to me today. Wait, wait, wait. First you say you don't like me, and now you say you screen my calls.
GRAHAM. You're just getting a bit sensitive.
CAROLE. I'm not going to be calling you anymore. Don't you even worry about it. I'll get my robot to call. You can call me anytime, Mikey.
GRAHAM. Carole, what's your story for us this week?
CAROLE. So we're talking facial recognition. Now, I don't know if you guys saw in the press, but the big boys, Google and Microsoft, can't seem to agree on how to approach this issue of facial recognition. You've got Google CEO Sundar Pichai. He's expressed support for Europe's proposal to temporarily ban facial recognition. But Microsoft's top lawyer, Brad Smith, has cautioned against using a meat cleaver for what should be a surgical operation. So he wants a more soft touch approach. So while these two big dudes are duking it out in their public forum here, a little seemingly insignificant mouse entered the space and created an ethical quagmire that takes total advantage of the lack of regulation in this space.
GRAHAM. Okay, sounds interesting. All right, go on.
CAROLE. It was the New York Times that did this big exposé on this. And it's kind of stuff that makes my teeth rattle a bit. And I want to know if it makes yours rattle or if you think, oh, God, calm down. I don't even like you. I'm so irritated by your story. Okay. So the story starts with a Mr. Hoan Ton-That. That's his name. Hoan Ton-That. Yeah. And he's an Australian-born techie and one-time model, right? So a little bit of a looker.
GRAHAM. He got one modeling gig? Or does "one-time model" mean he turned up for a modeling gig and they said, "You're not that attractive, we're never going to hire you again"?
CAROLE. Yeah, I should have put quotes around that. That was the New York Times' word.
MICHAEL. He looks great but he just is terrible to work with.
CAROLE. Okay, but anyway, this guy Hoan Ton-That moved to San Francisco to make it big in the tech world. Now, during his rise to power he created an obscure game, and he also created a really useful app that lets people put Donald Trump's piss-yellow wig onto their pics. That was one of his creations.
GRAHAM. I don't think it is actually a wig. I mean, it is fascinating. It's wisps. Wisps. Yes. A collection of wisps. I think it's all the more fascinating because it's not a wig. If it was a wig, you'd want your money back. Especially if it was made out of piss, like you're suggesting.
MICHAEL. You've got to pay good money to have that kind of style.
CAROLE. But then, okay, so he's created these little games. But then Mr. Hoan Ton-That got together with a Mr. Schwartz. Now, Mr. Schwartz, yep, he worked alongside Rudy Giuliani in the 90s, okay? And these two hatched a plan to create a facial recognition tool, which they called Clearview AI, okay?
GRAHAM. So Ton-That and Schwartz, not Giuliani. Ton-That and Schwartz, yep. Okay, they're producing a facial recognition thing.
CAROLE. Ton-That was going to be the developer, make the thing work, and Mr. Schwartz is going to sell it because he had a lot of contacts.
So in 2016, they recruit a couple of engineers. One helps them design a program that automatically collects images of people's faces from across the Internet, such as employment sites like LinkedIn, news sites, education sites, social networks, including Facebook, YouTube, Twitter, Instagram, and Venmo.
Effectively these guys were scraping the web and building a massive ginormous database under Clearview AI's control. Now they also hired another engineer and this guy was hired to perfect the facial recognition algorithm.
They describe this system now as quote "state of the art neural net" and basically it converts all the images into mathematical formulas and vectors based on the facial geometry. So how small a person's eyes are, Graham, or whatever.
GRAHAM. Or how big their feet are, for all.
CAROLE. But not how nice their personalities are.
GRAHAM. No, hard to tell, hard to tell, isn't it?
CAROLE. And then Clearview created this vast directory that clustered photos with similar vectors. So basically everyone with tiny eyes, Graham, would be put into a little neighborhood, or everyone with big feet, Carole, would be put in their own neighborhood, right?
So when a user uploads a photo of a face into the Clearview AI app, the system converts that face and then shows all the scraped photos stored in that neighborhood, that is, all the pictures with similar vectors, along with links to the sites those images came from.
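[Editor's note: the "vectors and neighborhoods" idea Carole describes is standard embedding-based similarity search. Clearview's actual algorithm is not public, so this is a minimal, hypothetical Python sketch with invented data: each face becomes a fixed-length vector, and a query is matched by cosine similarity against the stored vectors.]

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def find_matches(query, database, threshold=0.9):
    """Return (index, score) pairs for stored vectors similar to the query,
    best match first. The threshold is arbitrary, for illustration only."""
    scored = [(i, cosine_similarity(query, vec)) for i, vec in enumerate(database)]
    matches = [(i, s) for i, s in scored if s >= threshold]
    return sorted(matches, key=lambda m: m[1], reverse=True)

# Toy "database" of three faces as 4-dimensional embeddings (invented data).
db = [
    [1.0, 0.0, 0.0, 0.0],   # face A
    [0.9, 0.1, 0.0, 0.0],   # face B, similar to A
    [0.0, 0.0, 1.0, 0.0],   # face C, very different
]
query = [1.0, 0.05, 0.0, 0.0]

print(find_matches(query, db))  # faces A and B match; face C does not
```

Real systems use learned embeddings (a neural network) and approximate nearest-neighbor indexes to search billions of vectors quickly, but the matching step is the same idea.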
GRAHAM. It is surprising sometimes because there are people who can look very much like you. I remember working at a place once where I had a lookalike and the slightly disturbing thing was that my lookalike...
Is this the Polish guy? No, not the Polish guy. No, the lookalike I'm thinking of was actually a woman, a woman who looked like me and it was rather peculiar that she was extremely fetching.
CAROLE. I'm sure, yes. Did she have your very bushy eyebrows?
GRAHAM. If you're just going to make this a very personal podcast, get off your soapbox, mister. Tell us more about facial recognition.
CAROLE. So by the end of 2017, a year on, the company had what the New York Times describes as a formidable facial recognition tool, which they called Smart Checker. Now, this database is, get this, 3 billion images strong.
It's right about 75% of the time, it claims. And the one cool thing about it, apparently, is that the algorithm doesn't require photos of people looking directly at the camera. You could be looking down or covering part of your face, and still it can all work.
GRAHAM. Well, I'm not surprised at all that some enterprising technology company has gone and scooped up gajillions of facial images from all the places that they can be grabbed, because why wouldn't they? And people have given their data so willingly. So I'm not surprised about that at all, I'm afraid.
MICHAEL. I'm not surprised either. I think I have more of a question than anything. What is the ultimate plan for using this?
I mean, obviously, I can see a million ways this could be useful. But it's not quite scary until I know why this is happening. And I feel like it's going to be scary when I figure out the answer.
CAROLE. My next question to you guys, because they were wondering the same thing, right? They're like, who's our first customer going to be? Can you guess who it might have been?
GRAHAM. An obvious choice would be intelligence agencies, perhaps, if they wanted to identify people. So nation states who want to keep track of their citizens may want access to that kind of algorithm and that kind of database so that they can identify from CCTV who people are.
CAROLE. Yeah. Well, you're not far off. The first people, the first customer, according to Clearview, was the Indiana State Police.
And this is a typical example of how the software is used, right? So they solved a case within 20 minutes of using the app. So the case was two men had gotten into a fight in a park and one shot the other in the stomach.
A bystander recorded the crime on a phone. So the police had a still of the gunman's face and they ran that still through the Clearview app. They immediately got a match.
The man appeared in a video that someone had posted on social media and his name was included in a caption on the video. He did not have a driver's license and hadn't been arrested as an adult. So he wasn't in any government databases.
That's what the Indiana State Police captain said at the time. And then the man was arrested and charged. So there are numerous stories like this.
Clearview is actively marketing this to police departments. And they are also spreading the word amongst themselves saying, guys, you should get this. It's incredible.
600 law enforcement agencies have apparently started using this app in the past year. The FBI, the Department of Homeland Security and Canadian law enforcement authorities are all trying it out, according to The New York Times.
GRAHAM. So where do we opt out of this?
CAROLE. Great question.
CAROLE. Well, you can opt out by saying things like, I don't want to share my pictures with anybody on your social media apps and everywhere. But if they've scraped it, it's in the database.
GRAHAM. But do they even have the rights to scrape that image? You may have given your permission to the social network but the social network hasn't got a deal with this facial recognition company, do they?
CAROLE. Correct. That is one of the big issues here. They have scraped all these images onto their own databases and put them into a nice, I'm sure, easy-to-use UI that allows you to toggle all the things you want, like "within this area", da-da-da-da-da. So The Times went and asked people, right? And Facebook was like, well, we're going to look into this, because it's a big no-no to image-scrape.
And also they may get their knickers in a twist about this because they're not getting any kickback on this. They're not getting any of the traffic or any of the money. So they may not like this, particularly when they hear the words three billion images. So the other problem, Graham, which you also alluded to earlier, is doppelgangers. The bigger the database, the more likely you are to find people with very similar, if not virtually identical, facial symmetry and facial characteristics.
GRAHAM. And I remember that episode of Columbo where Leonard Nimoy was playing twin surgeons. And one of them was evil and one of them wasn't. And it was all a case of which Leonard Nimoy, which Mr. Spock, had committed the murder.
My wife, on the other end of the sofa, says this woman is like Carole: she loves marmalade. My wife loves marmalade. She's like Paddington Bear.
CAROLE. Now, the other thing about this is that in the olden days, if you did something wrong and they were trying to find you, and they had a witness look at databases of people, all the pictures they'd be looking at were of felons or people who had been arrested for crimes, right? And that was done for privacy. If you've done something naughty, your face goes into this database.
And now everyone's face is in that database, whether you've done something or not, just because you've stepped outside, or someone's taken a picture of you, or you posted your own picture online. You know, when you come back to that argument between, you know, Google and Microsoft, I do think regulation is needed. It's the Wild West out there. We need regulation.
GRAHAM. It is a worry because, I mean, if George Clooney, for instance, was to rob a bank, I don't want the police knocking on my door thinking that it's me, that I did it, because of some error. And it's not just the facial recognition, but also the name similarity.
CAROLE. Let me just tell you one more thing before I bow out here, right? Our journo, the journo from The New York Times, started looking into this way back in November, right, to do some digging. And listen to his words here. Quote: "When I began looking into the company in November, its website was a bare page showing a non-existent Manhattan address as its place of business." And he goes on: "For a month, people affiliated with the company would not return my emails or phone calls. While the company was dodging me, it was also monitoring me. At my request, a number of police officers had run my photo through the Clearview app. They soon received phone calls from Clearview AI reps asking if they were talking to the media. A sign that Clearview has the ability, and in this case the appetite, to monitor whom law enforcement is searching for."
Holy cow. So, okay, and then remember how the cops are using this app: by feeding the monster. They are putting in new pictures of new suspects all the time. Regulation time, I say. Can anyone use this? Can I use it? Very good question. At the moment, they see this becoming ubiquitous in no time. So won't that be fun?
GRAHAM. I could always set up my own country and my own police.
CAROLE. I had a really good quote on that somewhere. Yeah, the final words in the New York Times article: "Police officers and Clearview's investors predict that the app will eventually be available to the public." What could go wrong, guys?
GRAHAM. So many things. I could see a cute girl in a bar, take her photo, upload it to the app, and it would tell me her name and where she lived. Well, no, nothing is going to go wrong.
CAROLE. Okay, this is how we get around this. I do have a solution. It's time to hit the 3D printers and start making a number of realistic-looking rubber masks, so that when you leave the house you have a different face each time. Think cosplay, but every day.
GRAHAM. Could get a bit sweaty under that.
CAROLE. You might get a little sweaty, but don't you use that special deodorant, Graham?
GRAHAM. Yes. The face deodorant? I've been using it on my face, but my armpits are still pretty good, I have to say.
CAROLE. What is that called again?
GRAHAM. Nuud. You should try it. A pick of the week... a former pick of the week. N-U-U-D. I'm not a big sweaty person. Let other people be the judge of that. You never know. I don't think you should. I'm mouthy, not sweaty.
This week's Smashing Security podcast is sponsored by DomainTools. DomainTools helps security analysts turn threat data into threat intelligence. Its solutions give organizations the ability to use and create a forensic map of criminal activity, assess threats, and prevent future attacks. Find out more about their cool products at DomainTools.com.
Now, they've got something very cool that I think you're going to like: a capture-the-flag competition, especially for Smashing Security listeners. You can win a $100 Amazon gift card. If you want to join in all the fun, visit domaintools.com/smashing to enter the competition, and may the best geeky listener win.
CAROLE. Hey, Graham.
GRAHAM. Yes?
CAROLE. There are people out there with companies a little bit bigger than ours. And one of the issues that they face is visibility and oversight. And when it comes to cybersecurity, that is super important. So listeners, listen up. If you do not have a password manager in your organization, please check out LastPass Enterprise. They offer centralized admin oversight and control, shared access, and automated user management. All this stuff makes your life easier. Plus, you can even use LastPass's single sign-on to protect all your cloud apps and give seamless access to employees. So check it out at lastpass.com forward slash smashing. Let me try that again, folks: check it out at lastpass.com forward slash smashing.
GRAHAM. And welcome back, and you join us on our favourite part of the show, the part of the show that we like to call Pick of the Week. Pick of the Week! Pick of the Week! Pick of the Week!
Pick of the Week is the part of the show where everyone chooses something they like. It could be a funny story, a book that they've read, a TV show, a movie, a record, a podcast, a website or an app. Whatever they wish. It doesn't have to be security related, necessarily. Better not be! Well, my pick of the week is not security related this week. Good. Instead, my pick of the week is a website, quite a fun website with an almost unpronounceable name. The website is called eunoia.world. Eunoia is a Greek word. It's spelt E-U-N-O-I-A, dot world. I'll put a link in the show notes. It means a well mind, or beautiful thinking, in Greek. Oh, lovely. And that website is a website of words that do not translate: 500-plus untranslatable words in over 70 languages. So if you like a bit of schadenfreude and you want to drop that into your conversation, or if you've ever thought, wouldn't it be wonderful if that word existed, but it doesn't, well, maybe it does exist in Finnish or some other language.
I'm going to have a little quiz, right? I am going to tell you three words and give you options as to what those words mean. All right, we'll have a bit of fun. Woo, I love a quiz! Okay. The first word, the first word is spafololalia. Based on that pronunciation: spafololalia. Does that mean, is it a jungle of traffic signs? Is it flirtatious talk that leads nowhere? Or is it an ungrudging and overtly expressed pride and happiness at other people's success? Spafololalia. Number three.
MICHAEL. I'm going with B, number two.
GRAHAM. Mike, you are correct. Yay. It's flirtatious talk that leads nowhere, which... What country is that from? Oh, I should have made a note of that. Oh, no, actually, I do know. That one is actually English. That one is from the crazy language of English. And presumably it doesn't exist in other languages, so, yeah, you can chuck that. Okay, so another one. Next one: solkatt. Solkatt. Is solkatt the glimmer that reflects the sunshine off a wristwatch? Is it the mark left on the table by a cold glass? Or is solkatt a person of integrity and honour? I'll tell you it's a Swedish word, if that helps.
MICHAEL. I feel I should have to go first since I got it correct last time. I'm going to say that that is the glimmer off of a wristwatch because it seems there would be words for those other things. So that's my guess.
CAROLE. Okay, okay. Yeah. Okay, I was heading to try that way, but I'll go number two just to, you know.
GRAHAM. Oh, the mark left on the table by a cold glass? Yep, yep, yep. Bing, bing, bing for Mike, who's now two-nil up. So Mike was correct: it is the glimmer off a wristwatch. And the final one, the final one is kusukusu. Kusukusu. Oh, that's beautiful. Is that K-U-S-U, K-U-S-U? Kusukusu. Oh, cute. It's Japanese. Is it the Japanese for "not bad" or "meh"? Is it a reason for being, the thing that gets you up in the morning? Or is it the suppressed giggling and tittering of a group of women?
CAROLE. I certainly would... I'm going to go with "gets you up in the morning".
MICHAEL. I'm going with choice three. Which was what? Suppressed giggles, yes.
GRAHAM. Mike, you are incredible at this. It's 3-0 to Mike.
MICHAEL. It's just me. It's such a cute word, I think I'm going to have to try and bring it into conversation. Although I don't remember the last time I was talking about the suppressed giggles of a group of women. But if I ever do, it's kusukusu from now on.
GRAHAM. Okay. Well, my website, Eunoia World, we'll link to it in the show notes so you can find it for yourself and have as much fun as I did. And that is my pick of the week.
CAROLE. Cute. Cute. Although I got zero. That's because I'm holding a grudge, isn't it? Yeah.
MICHAEL. Sorry, it's interfering with your translation skills.
CAROLE. Are you sorry? Are you? Are you?
GRAHAM. Mike, what's your pick of the week? Move on, quick.
MICHAEL. My pick of the week is, well, actually a personal one for me. Something interesting happened to me a little over a week ago, when I uploaded a little video of my cute little doggie. She and I were taking a nap on the couch. Actually, I'd had this video on my phone for about a month before I just decided to post it onto Reddit. And I woke up the next morning and it had exploded with gajillions of upvotes. And all of a sudden...
CAROLE. What's a gajillion mean?
MICHAEL. It was within five hours. It had 80,000 upvotes. What? 80,000? 80,000. And actually a few hours later, it was the number one highest upvoted post on Reddit. Across every subreddit, it was the number one highest upvoted post. Wow. Wow. And you just uploaded this to the cute dog subreddit or something.
CAROLE. It must be the most amazing video ever. Okay, we got to see this video. We have to watch it.
MICHAEL. I will put the link there. Let's watch it now.
GRAHAM. Okay, so there's a dog. There's a dog and there's some kind of blanket. The dog's under the blanket lying on you. His tail's sticking outside. And every time you show the dog's head, the dog's tail wags.
That's very cute. Every time it sees you, it wags its tail.
MICHAEL. Yeah, and then it stops whenever the blanket goes. I mean, it's cute. I think I was a little surprised.
CAROLE. 80,000 views?
MICHAEL. Well, it ended up, I mean, right now I think it's at like 130-something thousand. But that had happened within like five hours. And the interesting thing that happened because of this is that a few agencies started reaching out to me, wanting to buy the rights to license the video.
CAROLE. What? How long, okay, how long? So you post this up, it goes viral, like a week later they call you?
MICHAEL. This was within like six or seven hours of me posting it. So I guess there are people who are just watching this all the time, you know, these different subreddits and what's getting uploaded, and what's getting upvoted, I guess, more importantly. I didn't really know how this worked; I'd obviously never had any experience with it. One person asked if they could use it and said they would give me credit, and I kind of jokingly said, well, what's your offer? And then all of a sudden all these offers started rolling in. I was like, oh wait, there could be actual money in this.
So over the next three or four days I went back and forth with a few companies, and then I ended up selling the rights to this video. Dead serious. I don't know if I'm allowed to share the exact terms, but over time I will get a percentage of whatever revenue this video makes. I don't even know how it makes revenue exactly, but I'm waiting all the time for my check to come in the mail.
CAROLE. It just must be so frustrating. I mean, you're like a musician, right? You work hard at your craft. You go out and you schlep and you market everything. And then you take a cute little video of 20 seconds of you and your dog hanging out and her being cute.
MICHAEL. I am literally a video producer. There are videos that I've worked on for months and months at a time, slaving over the editing, putting hours and hours into something. This is something I flippantly recorded, and it is probably going to be the most famous video I'll ever make in my life.
GRAHAM. How much is the dog getting? What percentage is the dog getting? And did you ask the dog's permission before uploading it to the internet? What about facial recognition? There's a lot of similar looking dogs out there.
MICHAEL. Oh, I didn't even think to ask, how rude of me. I did decide though that a portion of whatever earnings would be spent, I'm going to take the dog to the store, I'm going to let her pick out some toys, maybe get her nails done, you know, give her a day at the spa. So we'll see, it's got to make money first though.
GRAHAM. Well, that's an amazing story and thank you for sharing it as your pick.
CAROLE. Yeah, and do share your affiliated link with us so that any listener that wants to throw a penny or two your way.
MICHAEL. Everyone, please go watch the video a thousand times and that would be great.
GRAHAM. Mike, we don't put this in the podcast, but the title of the YouTube video has got a spelling mistake. It says dog waging her tail.
MICHAEL. I have informed the company of this and they decided to do nothing about it. So, Carole, what's your pick of the week?
CAROLE. So, my pick of the week is a book. It is called She Said. And it's written by New York Times journalists Jodi Kantor and Megan Twohey. So, this book basically explains all the steps they went through on exposing Harvey Weinstein after decades of being basically a misogynistic, controlling pig. And I followed that whole story. So they covered it in the paper, and they covered it on their podcast, and I was listening to everything. And so when the book came out, I snapped it up and hoovered it down. And it's really interesting if you are the kind of person that likes to know more about how an investigative journalist team would chase such a story, especially when none of the victims want to talk about it or want to come forward.
GRAHAM. That was the thing because he was so powerful and people were worried that their careers would be put in jeopardy if they said anything bad.
CAROLE. But not only that, he actually set goons on them to follow these two journalists at one point, right? Like, you know, he's got a lot of money and a lot of clout. And the thing is, when I was reading it, I'm thinking: okay, if this had happened to me, if he had been, you know, one of those saints in my life, and these two journalists had called me up and said, look, we want to share your story, would I? You know? Because look what's going on. His criminal cases are basically teetering at best at the moment, because he's got a pretty powerful team. Did you see him walking in with his walker?
GRAHAM. Oh yeah, yeah, yeah, yeah. My goodness, he does look pretty rough there. Are we sure he's not wearing one of those 3D printed masks to try and stuff? With lots of little hair too.
Exactly. I don't think he'll be able to sell very many Harvey Weinstein rubber masks. I don't think people want to disguise themselves as him.
MICHAEL. That's what it all comes full circle.
CAROLE. Based on reading this book and following the story, I have just become a New York Times subscriber because I've been basically gulping down loads of their content. So I'm adding it to my official news subscriptions.
So there you go. So my pick of the week this week is take a read of She Said. It's really fascinating about how they were able to nail down all the facts and got the ball rolling on the Me Too front.
GRAHAM. Very interesting. Well, on that literary note, we've just about wrapped up the show for this week.
Mike, I'm sure lots of our listeners would love to follow you online or find out more about what you're up to. What's the best way for chaps to do that?
MICHAEL. I would point them to that YouTube link. All they need to know is, that's just what my dog looks like. And here's my blanket that I sleep with sometimes.
But yeah, start there. And then maybe some kind of facial recognition with the dog, you'll find me. You'll find me. It's 2020. It's 2020. You'll figure it out.
GRAHAM. And you can follow us on Twitter at @SmashinSecurity. No G. Twitter wouldn't allow us to have a G.
And you can also continue the discussion with us on Reddit. Go and find us on the Smashing Security subreddit. And don't forget to subscribe to Smashing Security in your favourite podcast app, such as CastBox. Go and find us up there and you'll never miss another episode.
CAROLE. Yes, thank you to all of you for listening to us this week, supporting us on Patreon and giving us wonderful reviews. Also, a big shout out to this week's Smashing Security sponsors, Domain Tools and LastPass.
Their support helps us give you this show for free. Check out smashingsecurity.com for past episodes, sponsorship details and info on how to get in touch with us.
GRAHAM. Until next time, cheerio. Bye bye. See you later. Adios.
CAROLE. Well, there you go, my gentle friends. Well, one friend. And the other guy.
GRAHAM. And the other guy. It wasn't me. I just said I didn't like her.
CAROLE. I'll just go drink my sorrows away. Thanks, Graham.
-- TRANSCRIPT ENDS --