Anna Delaney: Hello, welcome to the ISMG Editors' Panel. I'm Anna Delaney, and this is a weekly sit-down with fellow ISMG editors on what's happening in the world of cybersecurity. This week I'm joined by Mathew Schwartz, executive editor of DataBreachToday and Europe; Suparna Goswami, associate editor at ISMG Asia; and Tony Morbin, executive news editor for the EU. Great to see you all. There are lots of beautiful colors on display this week. Suparna, why don't we start with you? Because that's just ...

Suparna Goswami: Yes, the festival of colors. The background, as you can make out, is of Holi, which is being celebrated in some parts of India today. And a few states celebrated yesterday. So I celebrated yesterday; I'll share the pics with you once our call is over.

Anna Delaney: Looking forward to it, looking forward to it. So what does the day entail?

Suparna Goswami: So for the entire day, we just put colors on each other with water guns. We wet each other with colorful water. I'll show you a pic, just a second.

Anna Delaney: Amazing. Suparna, why don't you dress like that sometimes? Matt, more color and excitement in your background.

Mathew Schwartz: Yeah, we've got a bit of street art here in my hometown now of Dundee in Scotland. This is a street art wall that has been legally decorated, I should note, by Scrap Antics, a local group, and I just love the riot of colors. It's not as great as Holi, but we make do with what we have here.

Anna Delaney: Very complimentary. I love it. Tony, new office for you.

Tony Morbin: Yeah, I've virtually crossed back over the Atlantic to the U.S., and I'm sitting in the White House here in the Oval Office.

Anna Delaney: Makes me think we might be talking about something U.S.-related later. Well, this is not my office, obviously. This is the Coliseum, the home of the English National Opera in London. It's an absolutely stunning building, and its future in London is unfortunately uncertain at the moment. They may move it up north due to funding issues. But I think talks are underway to keep it in the capital. And I hope they do, because it's one of only two opera houses in the city. Matt, onto serious business.
Biden's U.S. cybersecurity strategy was unveiled earlier this week, which stirred some excitement in the industry, the general consensus being that it's a positive step. There's talk of disrupting and dismantling threat actors, and ransomware is now considered a national security threat. So it seems to be using the right vocabulary. What's your initial reaction to what was laid out?

Mathew Schwartz: Well, there's a lot to like here. As you know, this is the Biden administration's cybersecurity strategy. It's the first one we've seen in five years. And if you look back over the past five years, obviously all has not been quiet on the cybersecurity front. And I'm sure we're going to get into some of the nuances of the National Cybersecurity Strategy. For me, one of the big things that jumps out is what could be a decade-long effort to try to transfer more liability onto commercial software providers. That is part of a push to help us just improve the baseline. And improving the baseline is something we've been hearing about since Joe Biden became president. There has been this multi-pronged strategy of attempting to disrupt ransomware and other kinds of cybercrime via law enforcement. There were diplomatic efforts, which I think are kind of on pause at the moment, where officials of Western countries were trying to get Russia to crack down more on cybercrime from inside its borders. There is also a focus on business resilience, and getting organizations to the point where their defenses were so good that they could just better repel all sorts of attacks. And so we've seen a number of efforts by the administration. And it's great to see them now codified into this National Cybersecurity Strategy, which draws a line in the sand, I think, in terms of where we've come from, where we need to get to, and what the administration is saying it wants to see happen to help us get there. Now, some of this is contingent on working with Congress, for example. That's a big X factor. But already, over the last couple of years, we've seen some really notable results.
And so one of the pieces I wrote recently about the new National Cybersecurity Strategy looks at what it means for ransomware, because not everything is about ransomware when it comes to cybercrime, but it certainly helps illustrate the cutting edge of what attackers are doing. And also there have been remarkable gains made by law enforcement. For example, with the Hive takedown that we saw - Dutch, German and U.S. collaboration between law enforcement agencies - they infiltrated Hive in the middle of 2022, identified tons of victims and passed them free decryptors. There was about $130 million in ransoms demanded by Hive that the FBI says they were able to prevent. And I think that is a nice small illustration of what's now being done to disrupt ransomware groups and other threat actors. And Anne Neuberger, the deputy national security adviser for cyber and emerging technology at the White House, said Hive and some of the other things they've been doing show how they're disrupting and dismantling threat actors and also elevating the work on ransomware - from being just a cybercrime threat to a national security threat. We've seen this happening in other countries as well. Here in Britain, for example, there has been a very clear pivot or escalation to treating ransomware as a national security threat. Now, that might just sound like words. But in speaking with experts in the U.S., they say that by reclassifying ransomware as a national security threat, it gives President Biden and, by extension, all of the executive departments and the various agencies more tools, levers of power, in order to disrupt ransomware. Everything up to military power is on the table now, whereas previously it may not have been. Now, that doesn't mean we're going to see the U.S. Army deploying to take down ransomware actors. But I think it does reflect how many more kinds of disruption and tools - be they intelligence, possibly military units being used to gather intelligence or to disrupt ransomware - are being brought to bear. We spoke with retired Air Force General Gregory Touhill. He's been on our stages before. He's now director of the CERT division at Carnegie Mellon University's Software Engineering Institute.
He says elevating it to this national security threat doesn't make it mutually exclusive with treating it as a criminal activity. It just means that we're bringing additional political, diplomatic, economic and military instruments of power to bear. So we're seeing lots of activity. Already, the administration has been disrupting the use of virtual currencies for laundering ransom payments. And not just the U.S. here, of course. There's a coalition of over 30 countries working to disrupt ransomware and, by extension, other kinds of cybercrime that are laundering money using cryptocurrency, for example, or where there are potential safe havens for criminals who are launching these types of attacks. So I think the National Cybersecurity Strategy unveiled, again, by the Biden administration last week is great for showing how we can be playing offense and the additional kinds of strategies that are going to be brought to bear. Will it help? I'm sure it will, because you've already been seeing better disruptions, better efforts to bring down cybercrime. Again, it's nice to see that codified. And it's nice to see this memo, if you will, from the White House about how it wants to see public and private strategies taking us to the next step of disrupting cybercrime.

Anna Delaney: Great stuff! Matt, what details were missing for you, or at least what questions remain for you as to how this all plays out?

Mathew Schwartz: Well, I think Tony's going to be getting into the liability piece. And that's a big, open question for me. I give law enforcement a lot of credit for how they have been combating increasingly innovative ransomware groups by bringing their own kinds of innovation to their law enforcement strategies, helping victims much more than they have before. So I like the fact that we've already seen that working. It doesn't need to be blue-sky thinking. It has been delivering results. Whether or not we can get Congress to do anything meaningful, though, in terms of transferring some of the responsibility onto the private sector is a huge X factor for me. I love the White House for saying this is what it wants. But I think we'll see if it happens.
You can write down everything you want to see. But again, that doesn't mean everyone else is going to play ball. So that's one of the big questions for me.

Anna Delaney: Yeah. And Tony, we'll move on to you in a moment. But Suparna, I just wanted to ask you: What's the press in India, the media in India, saying about the strategy? Are they making much chatter about it?

Suparna Goswami: Oh, yes, there is. So one thing is, of course, whatever is being done in the U.S. is usually a big thing here. But yes, this is making news for the right reasons. And we are having the summit tomorrow and the day after tomorrow. I'm sure we'll get to discuss this particular topic a lot more with the CISOs there - what they want to see. But, yes, on SBOMs, there has been a lot of discussion; nothing has been finalized. People are talking about the practicality of it. But Matt, I would love to ask you: How much are you expecting, outside of the U.K. and U.S. of course, countries taking inspiration from this and making ransomware - like the U.S. has - a national security threat?

Mathew Schwartz: Interesting! Yes, that's a great question. I think we will see more governments outside the U.S., the U.K., elevating ransomware to this national security threat, because of all of the disruption and damage it's causing. We see that across so many sectors, but especially the healthcare sector. Ransomware groups are wanton in their disregard for public safety and public health. And I think it's only appropriate that these additional tools of power be brought to bear to disrupt it. So yes, very much, I think we'll see not just the Five Eyes countries - the U.S., Canada, the U.K., New Zealand and Australia - but also many more Western governments, hopefully India, I don't know, hopefully, elevating it to this national security imperative, really, because of all the disruption and damage that it's doing. Hopefully, Suparna, that's my answer.

Suparna Goswami: The only thing here in India is, and I'll just take a few seconds here, ransomware is big, of course - you talk to CISOs off the record, and they will talk about it, but nothing gets public. So we do not know how much actually ...
In the U.S., we keep hearing that this organization has been attacked, there has been a ransomware attack here and there. In India, we hardly get to hear anything. So that's probably one of the reasons that we don't get news as such on ransomware, because people do not really talk about it.

Mathew Schwartz: Mandatory reporting would be great. A lot of the security experts I talk to say that it does disproportionately hit the U.S. and a handful of Western European nations, I think because of their proclivity to pay, and possibly the valuations of companies in general being larger. So I think ransomware attackers are more naturally attracted to those areas, and also, again, the propensity of those victims to pay. But like you say, we just don't know how bad the problem is, including in India.

Anna Delaney: This has been a great opener. I mean, sticking with the strategy, Tony, the interesting thing about this strategy, which differs from other versions before it, is that it urges far greater mandates on private industry. And it comes coincidentally in the same week when we've learned that the LastPass hack was down to an engineer's failure to update software on their personal computer. So what do you want to highlight with regards to this particular aspect?

Tony Morbin: Well, as you say, and as Matt was saying earlier, you know, it is a fully integrated strategy, and offense is a large part of it, but obviously, so is defense. And I think a significant aspect that was announced in the strategy is addressing the issue of end users currently bearing too great a burden for mitigating cyber risks. So it notes how a single person's momentary lapse in judgment, use of an outdated password or an errant click on a suspicious link can potentially be leveraged to have national security consequences. Exactly as you said, you know, LastPass is a great example of our vulnerability. So the recent hack of security provider LastPass was made possible because one of the company's DevOps engineers failed to update Plex on their home computer. Attackers targeting that home computer with keylogger malware exploited a three-year-old, now-patched flaw in Plex and were able to achieve code execution.
Then they obtained credentials and breached the cloud storage environment to steal partially encrypted password vault data and customer information. So obviously hugely embarrassing and consequential for LastPass, but down to an unpatched home laptop. Now, this policy is putting more responsibility on the vendors, but it's not about beating up the vendors or condemning lax security procedures. It is true, you know, we often hear analogies along the lines of "if aircraft manufacturers built planes that kept crashing, no one would buy them." So why are information technology providers allowed to produce products and services that continue to be vulnerable? And I'll put my hands up - I've actually said that myself. I'm still of the opinion that manufacturers should be required to take more responsibility for the security of their products by building in security from the outset. But to be fair to the security vendors, unlike civilian aircraft manufacturers, they are under constant attack from both state and criminal actors. Plus, offensive cyber capability is widely available. So perhaps a better analogy might be that you wouldn't build a military aircraft that was easy to shoot down. So we need to reframe our thinking to understand that we're operating in a high-risk environment. And we can't eliminate the risk, as the benefits of digital communication are just so great and increasingly integral to modern life. But we can and must mitigate those risks and take the burden off the individual. And that's where the strategy is coming in. It's saying we need to ask more of the most capable and best-positioned actors - basically, to make our digital ecosystem inherently defensible and resilient, and, as it's coming from the U.S., aligned with U.S. values, although further on they also talk about sharing with partners. It wants to hold those that own and operate the systems that hold our data responsible for their security, along with the technology providers that build those systems. And it's saying that where there are market failures, industry and government need to work together to protect the most vulnerable and defend the shared digital ecosystem.
And it does further on say that, you know, it feels the market hasn't delivered. So to facilitate this, the strategy is calling for incentivizing the creation of a more resilient and defensible system, using both market forces and public programs to reward security and resilience, build a robust and diverse cyber workforce, embrace security and resilience by design, and strategically coordinate research and development investments in cybersecurity. So as a result, we can expect what the report calls generational investments in renewing infrastructure, including modernizing cryptographic techniques. As Matt also said, it's going to be realigning foreign and domestic policy priorities as well, so there'll be the whole offensive side as well. Through one lens, cybersecurity has very much now become the new health and safety: We can and we should expect everything to be done to keep us safe when we're online. And just as there are critics who decry regulatory burdens - here in the U.K., they call it health and safety gone mad - there will be those that argue too much burden is being transferred to the vendors. But the fact is that cybersecurity is now an essential and integral part of a functioning modern society. And we've all got to play our part in achieving and maintaining it. And the new strategy includes both carrot and stick, with, you know, lots of funding programs, and also the likelihood of more regulation as well. As I said, carrot and stick. You do it right, there's all this money going to be available - I don't know how much yet. But there are also the regulations. Again, we don't know exactly what they'll be. And some things are still waiting to be found out, such as how they're actually going to address cyber insurance, which they are now saying they're going to address.

Anna Delaney: Some are saying that maybe tech companies will be less transparent about vulnerabilities. And what have you heard from the vendor community? We often talk to CISOs who say they're trying to get answers from them, and they don't receive, you know, the information that they'd like. How do you think this might play out?
Tony Morbin: I think the strategy has got the two drivers, you know, fear and greed. You know, if you provide the money, they'll provide it. If you actually, through regulation, say you must, then I'm afraid they must.

Anna Delaney: Good language! Let's hope so. This is great. Well, Suparna, moving on to a different area, one of your favorites: fraud. You've been spending time focusing on check fraud in the past couple of months. What is the state of check fraud today? And can you share any highlights from your interviews?

Suparna Goswami: Yes. And as you mentioned, I have been speaking with people on check fraud. So the big headline in the fraud world these days is that check fraud is back. Well, to be honest, technically it never went away, but it has not dominated the news headlines in the past two decades. Now, according to the U.S. Treasury Department, check fraud increased 84% in 2022 in comparison to 2021, and even the Financial Crimes Enforcement Network, in the last week of February, issued an alert and said it is collaborating with the United States Postal Inspection Service and has identified red flags that will help financial institutions detect and prevent suspicious activity around check fraud. But definitely this is not going to be enough. Now, the question is why has check fraud become such a huge problem all of a sudden? First, let us understand that it is no longer a manual theft where thieves are just stealing checks from mailboxes. Like most crimes, it too has evolved and has gotten a little more sophisticated. Now criminals are using platforms like Telegram, which has become a one-stop shop for criminals, from buying your stolen checks to hiring people - known as walkers - to deposit them. So there are literally ads on Telegram that say: I'll pay you this much; this check needs to be deposited. And there are walkers who actually get interviewed. And we are seeing criminals selling the checks on Telegram along with sensitive information on the victims, like your Social Security numbers, balances in accounts, etc.
Now, why are banks not able to deal with this problem, which I thought is not as sophisticated as other problems that banks deal with on a regular basis? For the better part of the past 20 years, check fraud has been extremely stable and very, very predictable. But now it has changed. I spoke with Karen Boyer, who is from M&T Bank, and she said that check fraud is getting into a space which is kind of new for banks - banks have not seen that before. I'll give you an example. Historically, banks have verified a check by calling the person to confirm whether they have written a check of this amount to this person. The problem now is that the name stays the same, but the account number is changed. So if I'm a banker, and I call you and verify - have you written this check to XYZ for this amount? - it does not raise a red flag, because you will say, yes, I have; this all looks fine. So as a banker, it is fine, it is a legitimate check. But the account number is changed. So this is a typical case of identity theft, with synthetic identities being fabricated and made to match these lucrative checks that are coming through, to bypass the authentication. Now, though banks are trying to do their best, the technology is still not out there, because up until now, there has not been much demand from the banks. As I said, it has been really low and predictable. But hopefully that's changing now. I've been speaking to vendors. There are many vendors who are now specifically looking at check fraud. And we are beginning to see a pickup in activity around innovation in the space. So hopefully something will be out soon. But there's still time.

Anna Delaney: What are the particular gaps those vendors are hoping to address, in terms of how we can solve this and at what point of the process?

Suparna Goswami: So many companies, or at least the vendors, are trying to get some intelligence about stolen checks, and then have their tools consult a data repository. Now, getting a data repository is a challenge for banks, because banks are not allowed to share that data with each other. So that's a huge issue, and banks are working towards it.
But the data repository will govern whether a check is stolen or not. Then image analysis - they're working on that as well, better image analysis tools. But again, if a check is legitimate, image analysis will not really work. So they're trying to find better ways of detection. But as I said, the industry is still looking for a solution which can detect check fraud in an efficient manner. But it's a fabulous space. I did not follow that space, to be very honest, last year. I was hearing constantly about check fraud. And my thinking was, again, it's just mail theft from mailboxes - what's the technical aspect to cover in that? But you speak to any banker, and suddenly check fraud is just dominating. And they are losing a lot of money on that. So I see a lot of vendor activity in this space going forward.

Anna Delaney: As you say, it's one of the oldest crimes in finance, and it's interesting to see how it's been brought up to date. But let's see what happens and track the movement. Thank you, Suparna. Okay, finally, as it's International Women's Day, or even week, who is the most inspirational woman making waves in the industry right now for you?

Suparna Goswami: It could be the practitioners we speak with?

Anna Delaney: Anyone.

Suparna Goswami: So I have two of them. One is Mel Migriño. She is from the Philippines. Up until the end of last year, she was the CISO of Meralco, which is one of the largest power companies in the Philippines. And she is now the chairman and president of the Women in Security Alliance Philippines. Now, I turn to her for OT security, zero trust, supply chain. She's one person who I've seen, you know, grow a lot in her space. And the other woman would be Shivangi Nadkarni. She's from India, and she's my go-to person for privacy. A great person to speak with. She has been a longtime friend of ISMG. And we can go to her on anything to do with data protection and privacy. So I think these two women, for me, are doing great in their own fields.

Anna Delaney: Excellent choices. Matt?

Mathew Schwartz: I have a few, so I don't want to step on anybody else's toes in terms of who they might be lauding.
But I've seen Lindy Cameron, the CEO of the National Cyber Security Centre here in the U.K., speak on multiple occasions, and it's always very reassuring to hear and see the level of insight that she has into what's happening and the advice that she's promulgating to others. Jen Easterly at CISA, also the same. It's good to know that there are good people bringing their expertise to bear to help make us safer. And finally, Anne Neuberger. I mentioned her before; she was part of the development of the new National Cybersecurity Strategy. She advises Biden on cyber and emerging technology. She used to be at the National Security Agency, also looking at emerging technology, such as quantum-resistant cryptography. She led the NSA's election security effort. So again, someone where you look at the skills and expertise that she's bringing to bear and you think: Thank goodness. We need this. And we have it in the form of these women.

Anna Delaney: We are in good hands, it seems. Tony?

Tony Morbin: Clearly, Matt and I are in tune today, because I was also going to say Lindy Cameron, but I was going to mention somebody else as well: Betty Webb, a woman I interviewed who was one of the code breakers at Bletchley Park in World War II - not because she's currently a mover and shaker at 99; you know, you can't really expect that. But her role was so constrained among the code breakers, and I contrast that with Lindy Cameron now heading up our National Cyber Security Centre, actually leading from the front as the most prominent person in cybersecurity in the U.K. I guess if I was specifically looking at promoting the role of women in cybersecurity, I'd probably say Jane Frankland, but I'd love to then call out other practitioners, because there are ones that I'd otherwise be missing. So I have named three. So that will be fine.

Anna Delaney: There are so many, and yes, absolutely not forgetting the women who have paved the way for all of these women who are in top positions at the moment. Well, I'm going to just nominate all my female colleagues at ISMG. Suparna, of course, you are included. I mean, we have a tremendous team of very hard-working women. And they work hard at what they do, and they are inspiring and very supportive. So thank you to you all. And thanks so much for watching. Until next time.