
Webinar

The Perils Of WhatsApp To Companies

We discuss WhatsApp and the potential risks it presents to businesses in leaking corporate data and breaching GDPR, with YUDU CEO Richard Stephenson

Hosted by: Jim Preen - Crisis Management Director at YUDU Sentinel
Expert guest(s): Richard Stephenson - CEO of YUDU Sentinel

Date: 20 November, 2019

Host profiles


Jim Preen

YUDU Sentinel

Crisis Management Director

Jim Preen is Crisis Management Director at YUDU Sentinel. He designs and delivers crisis simulations for clients using the Sentinel platform, along with providing expert guidance on all aspects of crisis communications.

Formerly, he was a journalist working at ABC News (US), covering stories including the Gulf War, the Bosnian conflict and the Concorde crash. He won two Emmys for his work.


Richard Stephenson

YUDU Sentinel

CEO

Richard is the CEO of crisis management software provider YUDU Sentinel. Richard has run public listed companies, mid-market private equity investments and tech start-ups.

His professional skills include digital strategy, crisis management, risk and digital document publishing.

Video Transcript

Jim Preen: I'm now going to move on and invite Richard to join us, because he has a cautionary tale for us, I think, about a major retailer who used WhatsApp, and the story will hopefully lead us straight into a lot of what we're going to be talking about today. Richard, over to you.

Richard Stephenson: Thanks Jim, great pleasure to do this online, and thanks everybody for joining us. Stories are good, we love stories. We tell stories, they engage us. I think it's a lot better for us to explain how things work in the real world. This story is about a major retailer, a British retailer that has a very large store on one of the most prestigious streets in London.

They had come across a number of WhatsApp groups, and when they decided to do a survey of the company, they found that there were 36 live WhatsApp groups running. That was the first time they knew that, but they then dug down and investigated who was setting up these groups, who was actually managing them and how they were being run, and the study found that these groups were having discussions about matters which were quite confidential to the company.

A lot of this was about what stock policies they'd have, what buying lines they might have, and a lot of things that you'd say should be information kept purely internal. The big light-bulb moment about having to change things came when they discovered that a lot of the people in these WhatsApp groups had actually left the company. They realized that they were in fact leaking information out to people who'd left, and some of those people were now working for other stores on the same high street, up the road.

Those people hadn't removed themselves from the WhatsApp groups, and they were still in there. So the directors started a project, saying, "Well, we can't just say don't use it, because people have got used to it", and they hunted around for a replacement solution before they actually made sure WhatsApp was completely banned. That's just a story about one particular retailer, but the same thing happens pretty well across all businesses.

That little poll that we did beforehand shows that a lot of people aren't using it, but amongst the millennials there is a belief that things like WhatsApp should be used in business. A number of surveys done recently, particularly by the Chartered Institute of Personnel and Development, have shown that as many as 80% of the companies surveyed are using WhatsApp in some form inside their business. We are talking today about crisis communications, but some of the points I'm going to raise here resonate with the overall use of it as a business tool.

Obviously the title is The Perils of WhatsApp. The corporate oversight issue, which was the one I was talking about with this large retailer, is that they did not have any idea what was actually being said in these discussions, or in fact how many discussions were running. The point about any communication system that you sanction within an organization is that you need to have control over what is actually being discussed and which groups are set up. Take corporate email, for example, which still does, if you like, the heavy lifting of corporate communications.

Because it's a sanctioned system, every employer has the right to go back and look at the discussions on that email, so there is an audit trail of what's actually happened, and you commit the company to those things. A WhatsApp group, by contrast, is set up from private accounts, and therefore there is no control over who the members of that group are; pretty well anybody, either the person who set it up or whoever they've appointed as an administrator on that account, can invite other people to join.

That could be somebody inside the company or somebody outside it, but more importantly, the issue is it's not connected to your HR database. Point number three is that there is no control over joiners and leavers. Anybody who's been through the processes of ISO 27001, which is increasingly being adopted as a measure of good practice, particularly on, if you like, control of data and cyber, knows the protocols you have to go through for joiners and leavers: for leavers especially, you have to remove them from all the access they have to the various parts of your IT systems.

Those are very, very important processes and protocols to go through. Where you have sanctioned the use of a WhatsApp group, there is no visibility, and no ability for the company to remove you from that group. That's a very important point when it comes to conforming, if you like, to company policies. Because they're private accounts, there is no usable audit trail, and because it's [unintelligible 00:06:21], there is no company record of what's actually been said.
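[Editor's note: an illustrative sketch, not from the webinar, of the joiners-and-leavers reconciliation that an ISO 27001-style leaver protocol implies. The function and data names are hypothetical; the point is that this check needs a link between group membership and the HR roster, which private WhatsApp groups do not have.]

```python
def audit_group_members(group_members, hr_active_staff):
    """Return members of a chat group who are no longer active employees."""
    return sorted(set(group_members) - set(hr_active_staff))

hr_active_staff = ["alice", "bob", "carol"]        # current HR roster
group_members = ["alice", "bob", "dave", "erin"]   # one chat group's members

# 'dave' and 'erin' have left the company but are still in the group
print(audit_group_members(group_members, hr_active_staff))  # ['dave', 'erin']
```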

What you've got is a situation where WhatsApp only keeps the data, the conversation, on their servers for 30 days, in an encrypted form, and then it's removed; anything that you have on your phone is in fact in the cache of your phone, all right, for that [unintelligible 00:06:48], but it's not actually stored. If you want to store the conversations, then you have to store them on another type of backup like iCloud or Google Drive, but that's your decision and it's nothing to do with the company.
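[Editor's note: the 30-day figure can be pictured as a simple server-side retention purge. This is the editor's illustration of the behaviour described, not WhatsApp's actual code; anything past the window is simply gone unless the user backed it up elsewhere.]

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)

def purge_expired(messages, now):
    """Keep only messages younger than the retention window."""
    return [m for m in messages if now - m["sent"] < RETENTION]

now = datetime(2019, 11, 20)
messages = [
    {"text": "old instruction", "sent": datetime(2019, 10, 1)},   # 50 days old
    {"text": "recent update",   "sent": datetime(2019, 11, 10)},  # 10 days old
]
print([m["text"] for m in purge_expired(messages, now)])  # ['recent update']
```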

Jim Preen: So the data is deleted after 30 days unless it's stored elsewhere, you remain part of the group that-

Richard Stephenson: Yes, you remain part of the group. What of course is also not covered is that there's no audit of users who have deleted messages. The ability to delete messages actually has an impact on emergencies and crises, because if you run a retrospective on a crisis, looking at what actually happened, you need a complete audit trail to say: what were the instructions that I sent out to this person?

Now, if that ended up in an investigation by a regulator, you need to be able to prove what actions you actually took and what instructions you sent out. By using WhatsApp to send instructions, you don't have any record. So for example, the Financial Conduct Authority will require a report of a major incident from anybody that is covered by their regulations. Constructing that report will be very difficult when you potentially have large chunks of information missing.

Even worse, if somebody did actually die in a crisis and you have to present yourself to a coroner's court, you do not have the right level of data or documentation. So sanctioning WhatsApp is a real, major problem. On the unencrypted backups point: I talked a little while ago about the fact that WhatsApp gives you the option to back up to iCloud or to Google Drive. But remember that those are unencrypted backups.

While everybody believes that you have end-to-end encryption, the backups themselves are harvested from the conversations and they are not encrypted. The security implication for the company is that your ability to comply with your GDPR requirements is potentially in breach there.

Jim Preen: I think this is actually where we want to go next, is on privacy and GDPR.

Richard Stephenson: Yes. Well, obviously we are now in the GDPR world, and some people believe that GDPR doesn't really have the teeth or the impact that everybody thought it would, but it certainly isn't the millennium bug that never happened. There are in fact companies being fined.

Jim Preen: British Airways is a good example.

Richard Stephenson: British Airways is one [unintelligible 00:09:55], and there are other big headlines, but Jim, if you look beyond those, there are companies being fined £20,000, £30,000 and such on a regular basis, small companies that are not making the big news. Basically, we're now seeing the teeth of the ICO.

Jim Preen: These smaller companies, they will have lost client data or--

Richard Stephenson: Yes, and the British Airways one is an interesting one. It's not so much that they lost the data; it's the question, in British Airways' case, about the competence of the IT staff. That's a new thing, because we've generally talked about GDPR being about the data protection officer and the processes. But in the case of British Airways, we had a situation where they ran a promotion on their site, and the website and the [inaudible 00:10:52] was actually hacked.

So they were fined for the fact that they hadn't done their proper testing and due diligence, and the [unintelligible 00:11:00] was increased on the type of code they put on the website. Now that, of course, sent a chill through all the IT directors, thinking about the things they do on their websites--

Jim Preen: Because it was a third party that--

Richard Stephenson: Yes, but they actually used code that was vulnerable on the website and didn't manage to check it. Therefore they were fined for that error. But if we look at what we've got up on the screen there, and we talk about WhatsApp Business and standard WhatsApp: there is a WhatsApp Business product, and therefore you'd say, "Surely that's fine." But if you want a business-type product, you need one where you've got administration control from the center.

WhatsApp Business deals a little differently with how data is handled. If you look at the terms and conditions of both of those products, the standard one and WhatsApp Business: in standard WhatsApp, you have WhatsApp acting as both what's called the data controller and the data processor. The way GDPR works is that the controller is the party who says, "I am deciding what data I'm going to put on there, and I am going to be controlling that and talking to the client", whereas the processor is just the repository: "I'm going to make sure it's kept safe and runs on my servers correctly."

But in the case of standard WhatsApp, they are both the controller and the processor. By being controller, they are completely at liberty to use the data that you've put up on their servers for whatever purposes they wish. They're in complete control. Now, when you join any social media app, the first thing that pops up is a request for access to your address book, to move your contacts directory up. Much of that data will be company contacts, many of them client contacts, and those are all covered under GDPR. So as soon as you click that and it moves up to WhatsApp, you've actually breached GDPR.

Jim Preen: Richard, can I just interrupt you, Brandon is calling from Australia. He's a bit concerned about some of our acronyms, the first one being GDPR. Can you in very short order, just explain what GDPR is?

Richard Stephenson: Yes, sure. I don't know whether it's heading your way in Australia, but across the whole of the European Union we have something called the General Data Protection Regulation. It has been adopted across all the member countries, and it is a toughening up of the way in which companies can use data. It's part of a general trend around the world: there's been almost an unwritten contract, particularly with social media companies, that says, "I'm going to give you this fabulous free service, and in return you're going to give me your data for me to sell you stuff."

The looseness of that arrangement has largely been tightened up by GDPR. We have had it since May 2018, and it has meant that we've had to tighten up a lot of the way we operate. The regulation is more or less centered on the individual, who has rights like the right to be forgotten. In other words, if you're holding data on an individual, that individual can demand that you remove it. There are lots of capabilities companies have to adopt in their IT in order to comply with GDPR.

Jim Preen: Just one other thing very quickly, Richard. The other thing is that if you do suffer a cyber attack or a data breach, you have to report it. It's pretty hard to cover this stuff up these days, but it is-- you're obliged to report it.

Richard Stephenson: That's the teeth of the thing: you have 72 hours to report it to the ICO, and you can be fined up to 4% of your global turnover. That's a pretty frightening number. Generally speaking, though, if you follow the processes and do all the normal housekeeping to protect the data, your fine is likely to be very small. But if you don't, and you don't report it, then it escalates very fast.
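[Editor's note: to put a number on the "up to 4%" Richard mentions, the upper tier of GDPR fines (Article 83(5)) is the greater of EUR 20 million or 4% of worldwide annual turnover. A quick sketch of that arithmetic:]

```python
def max_gdpr_fine(global_turnover_eur):
    """Upper-tier GDPR cap: the greater of EUR 20m or 4% of annual worldwide turnover."""
    return max(20_000_000, 0.04 * global_turnover_eur)

print(max_gdpr_fine(2_500_000_000))  # 100000000.0 -- 4% of EUR 2.5bn turnover
print(max_gdpr_fine(100_000_000))    # 20000000 -- the EUR 20m floor applies
```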

Jim Preen: Just one thing, Richard, Brandon says they don't have-- I don't know what they have in Australia, but the GDPR is relatively similar to what they have in the States, is that right?

Richard Stephenson: Yes and it's likely to get a bit tougher. I think some parts of the regulations in the States are very tough indeed like lots of things. I think the States are certainly passing laws which are going to probably equal or exceed GDPR in the future. I would expect Australia to be-- take a similar view. Remember, Australia has taken an extremely strong view about WhatsApp and some of the other social media sites demanding, for example, that they do have access to WhatsApp. That's one of the countries that really wants to get control of that conversation.

I think we can only see Australia following suit in time. Going back to WhatsApp Business just very quickly, not to get too nerdish: the Business product merely means that you are the controller of the data you've brought in from your contacts database, and they are just the processor. But if you read the terms and conditions, it says very clearly that you have to have full legal authority from everybody in your contacts book before you share it with WhatsApp.

The reality is, nobody does have that. The get-out for WhatsApp is that they have it in writing, but in the practical world nobody does it. Now, the rules are that the data protection officer of each company must be able to recover that data, and because it's a private account, the DPO has no view of it. So that's a big leakage to WhatsApp. The other thing, if you decide to sanction it within the company, is that the user data is actually being shared with Facebook. Not the encrypted messages, but the user data. We're going to be shortly seeing adverts appearing against the profile information in WhatsApp.

You have to remember that Facebook paid a ton of money for WhatsApp, which was not a profit-making business. The only way Facebook justifies that is data: it's data on you, and they have loads of users. Facebook will be using it. They may not see the message content initially, but they will know that you, Jim, are using the system, and they will connect that with other elements.

Jim Preen: Then they will start selling me stuff won't they, I know they will.

Richard Stephenson: Yes, there's a very interesting story. A person I know has actually written a rather good article on this. He met a work colleague, they shared their details with each other on WhatsApp, and she sent two or three messages on WhatsApp. About a week later, he received on his Facebook account a suggestion to connect to the same lady. WhatsApp and Facebook are supposed to be ring-fenced from each other; that clearly isn't the case.

Facebook has to monetize these things properly, and that's the issue. So the data about you is going to be shared with Facebook. We know that. Let's look at another issue.

Jim Preen: I think what we're going to look at next was slightly hinted at with the Australian experience: concerns about WhatsApp being banned in certain countries. If you were using it in a crisis situation and suddenly it was banned, that could cause you some problems. Richard, go ahead.

Richard Stephenson: This is just the problem of being such a popular platform: you face the fact that it could be turned off at any moment. If you think about the success of WhatsApp, WhatsApp in India, for example, is now up to 400 million users, which is just fantastic.

Jim Preen: The whole thing is over a billion users, it's huge.

Richard Stephenson: Yes it is, but it's 400 million in India. When, for example, Indonesia had a lot of riots, they immediately disabled the ability for people to share photos and images et cetera, because they realized that was fueling the [unintelligible 00:20:00]. Now, if you happen to be in Indonesia or have a company there, then your ability to use it in the emergency for exactly what you want to do, which was to try and report the situation, immediately fails. In some countries, like the UAE for example, or Russia or China or [unintelligible 00:20:18] public places, it is just not feasible.

Now, we're used to sitting in the UK, which is nice and friendly, but we've got to remember that if we're going to use it for a crisis, we may have executives sitting in other countries and they actually-- you can't actually--

Jim Preen: And they just flick a switch and it goes.

Richard Stephenson: Flick a switch and it goes off. That's one of the other concerns. I think probably the other big thing that happened, is that the marketing of WhatsApp has been that it's end to end encryption, which sounds fabulous.

Jim Preen: Yes it does, particularly for crisis.

Richard Stephenson: Absolutely, it's fantastic. I think encryption is such an important part of the security we have to apply to our data now. Everything we do, for example, is what we call encrypted at rest: everything on the servers is encrypted, so if someone did hack the server, they're only dealing with encrypted data. And you always encrypt over the wire, meaning that when things are being sent to devices they're encrypted in transit, just like what WhatsApp is doing. Fine.

But when you want to read it, Jim, you can't read encrypted data, so the phone, the operating system, converts the encrypted data into readable text. At that stage, the message is vulnerable. Hence the stories about Trump et cetera and some of the people engaged around his circle: they felt that WhatsApp was a secure way of communicating, but they were forgetting that the end recipient was actually copying all the messages, which were then produced in court. People forget that point.
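[Editor's note: a toy illustration of the point above, using a deliberately simplified XOR keystream, not a real cipher and not WhatsApp's protocol. End-to-end encryption protects the message on the wire and on the server, but the receiving device must decrypt it to display it, and at that moment plaintext exists on the endpoint, where spyware or a copying recipient can capture it.]

```python
import hashlib

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Derive a keystream from the key; XOR is symmetric, so the same
    # function both encrypts and decrypts. For illustration only.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key = b"shared-secret"
ciphertext = xor_stream(key, b"evacuate via stairwell B")  # what the server sees
plaintext = xor_stream(key, ciphertext)                    # what the phone must produce to display it
print(plaintext)  # b'evacuate via stairwell B' -- readable, hence vulnerable on-device
```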

But more importantly, there is the ability to hack, not necessarily WhatsApp itself, but the operating system, to read those messages [crosstalk].

Jim Preen: It's the Israeli company?

Richard Stephenson: Yes, NSO Group it was, which is a company that prides itself on selling a military-grade version, and they sell it to what they claim to be only people who are--

Jim Preen: Military-grade version of what, so as to be clear?

Richard Stephenson: Military-grade hacking software. In fact, the Israeli government classifies that piece of software as a weapon. They sold it to a number of actors out there, who then managed to use it to access what I think is now about 1,400 accounts of very important people. Going back to India again, there is strong suspicion there, and the Modi government has been accused of having this software and spying on the main opposition and human rights people in India. Even if they haven't, the suspicion now is that WhatsApp is vulnerable: who else has actually been using the software?

Jim Preen: Just to be clear about this as well, so it's not hacking the encrypted material, it's just looking at what is on somebody's phone.

Richard Stephenson: Yes. In the case of this software, this was delivered-- you see, WhatsApp is a messaging system, but it's also a voice system, and it uses voice over IP as its method of talking; that's why it costs nothing to talk on WhatsApp. But that's just computer software, and it's able to deliver packages of malware. They packaged it so that if anybody called the phone, it put this malware onto your phone, into your operating system, which would then view what you're talking about and send that back to their servers.

That meant that suddenly, where we were all thinking that WhatsApp is nice and secure, we find that this software has a very clever way of getting in. Now people are nervous about those types of conversations. Even in a crisis, we have to say that a vulnerability of the people we're talking to could be that there is some spyware on their phones that is showing someone else what you're talking about and discussing.

That probably only impacts major crises where major decisions are being made, but we have to be careful when we've got a ubiquitous piece of software. The problem with that ubiquity is that it encourages really smart hackers to invest the time to develop very smart attack software. There is no payback for hacking a small little program; a big platform like WhatsApp offers a big payback for hackers, and that's one of the problems with being a big platform.

Jim Preen: That it's worth people--

Richard Stephenson: Yes and that was the problem with Microsoft for so many years. Microsoft was the leading player and therefore had so many bits of malware attacking it. Apple got away fairly well, but not necessarily because they were that much better-

Jim Preen: Because they were smaller.

Richard Stephenson: -because they were much smaller and that's the nature of it, so we have to remember that.

Jim Preen: Good. I think you've got another story for us and this is not so much from the technical perspective, but from the HR, Human Resources perspective. Tell us about this one Richard.

Richard Stephenson: One thing that's happened is that we've seen these chat tools starting to pervade businesses, and as we've said, largely in an uncontrolled way. The blurring of the line between what's actually in the business and what's actually your private area is now a problem. It became particularly clear with a trader in the City of London; this actually went through the courts. An investment banker who shared information with a friend over WhatsApp was fined £37,000 by the Financial Conduct Authority.

That is because it's so easy. The barrier between using WhatsApp for talking about my business and then going to my private life and sharing things there is almost nothing. He was so excited about potentially having a deal, he was saying it was going to pay off his mortgage, and in fact he shared the details of the deal with friends outside the company. There's no physical barrier; it's almost as if a fence has been taken down and replaced with nice little cones, and you're told, don't cross that particular line, but there's no fence there anymore.

The point is that between business use and private use, it's very easy for people to stray across the line. That's the problem of having a tool that was actually built for the family-and-friends world inside a business. This is not the first time; there have been many cases where the issues have been taken to court. In some cases there's been discrimination, and in many cases what's happened on WhatsApp in a business is exactly what has happened in the rest of social media: bullying and sexual harassment, et cetera.

There is a concern, particularly within a crisis, when people are under a lot of pressure and a lot of tension, that what's actually being communicated down the line to people is inappropriate, and the first thought is often that whatever's been communicated can then be removed. Actually, as a company, we need to deal with that, and not deal with hearsay, and therefore we do need an audit trail.

When these stories come out about people sharing inappropriate material: the Met Police has had a number of court cases against them where groups of officers were sharing images and things which were totally inappropriate. It's just too easy, because there is no corporate oversight, and because they know nobody is looking at it, that encourages that type of behavior. You don't use your corporate email for that, because you know that somebody could actually trawl through it and bring that information up, so you are careful about what you say.

Jim Preen: Whereas with WhatsApp, it can't be seen; there's no corporate oversight.

Richard Stephenson: Absolutely, it encourages the wrong type of behavior.

Jim Preen: I guess we all learned years ago with email that when it's so easy to fire off an email, you should think twice before doing it, and most people now do. It's even easier with a messaging service, whether it be Messenger or WhatsApp or whatever, just to bang something out like that and then live to regret it later, as your guy did with the fine. He didn't pay off his mortgage, I guess.

Richard Stephenson: Absolutely.

Jim Preen: [laughs] All right, we're going to move on now. Here's the thing: you've been a bit combative with WhatsApp, but clearly there's a role for chat in a crisis, isn't there?

Richard Stephenson: Well, if you look back, if you get up in a helicopter and look at the trends of communication: as I said, email is the backbone of corporate communications at the moment, and then we've also got SMS, and sending an SMS message is something people find pretty useful. But chat is different. Chat has actually become part of our lives, and that is the immediacy of having the response and being able to share information. That's the upside.

No matter what we say, it's actually a very, very useful tool, and I tell you, in a crisis that immediacy could be vital and could actually save lives. You can use SMS, for example, as we do in mass notification, to tell people that there is a crisis, but if you're trying to resolve the crisis amongst the team, a collaborative solution where people are suggesting ideas and sharing data is a very good way to do that.

So we've seen a fabulous tool arriving, but the issue is that you cannot compromise the legal requirements within the company. We see it as a very good tool, and obviously it's mobile. Certainly in the UK, I think about 96% of people in business are now using smartphones, with only a few people holding out for the old ones.

Jim Preen: Nokia bricks.

Richard Stephenson: Nokia bricks and stuff, and they're generally people who don't want to get hacked. Everyone else has got a very powerful computer in their pocket and they're using it all the time, and because they're on it all the time, that enables us to use chat tools in a very collaborative way. On the immediacy of being able to get an answer: I was talking to a crisis person, and he was talking about his own personal crisis, which wasn't exactly really a crisis, but he had a Lancia Stratos.

Jim Preen: What's that?

Richard Stephenson: Very fast rally car.

Jim Preen: Okay. I don't think everybody-- I certainly wasn't aware of the car.

Richard Stephenson: A brilliant rally car, but I would say you were taking a bit of a risk doing what he did, which was to take it out on a tour through the Norwegian fjords and mountains et cetera. Of course, the thing broke down, and the only person who knew how to fix this Lancia Stratos was sitting somewhere in the UK, so they could use WhatsApp to take a photograph, share it, and be told what to do. There was no way a Norwegian mechanic would be able to do it, because it's a car you would never normally come across.

That immediacy of getting the car going again was an example of how you can fix something through collaborative work with someone.

Jim Preen: Which is where chat came in.

Richard Stephenson: Chat, and you could share the image and the person could say, "All right, tighten this bolt, do this." That's an example of how you can solve things by collaboration. The other thing is, it's become pretty familiar to us now, and that's really important for adoption. We're all used to it; WhatsApp's adoption is over a billion users, and then you can add Facebook Messenger and all the other such tools on top. From a crisis point of view, getting the images and the videos matters: if you're in a crisis command role, you will be wanting to say, "I want to have that image of what's going on", and see the actual problem itself.

You may have spotters on-site who can send you back images or share videos, and you may have documents you want to share immediately about how to fix things: installation manuals, how to turn off the gas main, et cetera. These things make it a very, very useful tool in a crisis. That's the plus side of what a product like WhatsApp can bring to a crisis. We think chat is really good in a crisis.

Jim Preen: Let's move on a little bit. What about the future of the tech? You're always looking at where all this is going, so where do you see it going? How do you see crisis communication changing in the next few years?

Richard Stephenson: Well, the phrase I'm using here is multi-pipe, though it could be multichannel communications. This particularly affects how we communicate around the world. SMS, for example, which for crisis communications is the main way that companies tend to push out notifications, like a text to people, runs through the telecom networks, and most of those telephone networks around the world are run by or managed by government agencies.

They are very precious about them, they set the rules, and they have various ways of working. It's a bit of a train wreck out there, getting a message through all these different organizations and making sure you're sending it according to the various rules set up in each particular country. So you can't rely totally on SMS, or indeed on any single channel, around the world; you need to use the internet as well.

We're seeing that rolling out: not just SMS, but also sending out by messaging, where people get the same sort of push notifications you would get on WhatsApp, and also email. In crisis communications, the biggest threat we now face, really for everybody, is cyber-attacks. In cyber-attacks, generally speaking, the attackers have been in your system for a hundred days or so, and the triggering of a cyber-attack is usually timed to cause maximum mayhem: just before a public holiday or just before the weekend.
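The multi-pipe idea described here, independent channels like SMS, push messaging and email fanned out in parallel so no single compromised or unreliable pipe blocks the message, can be sketched roughly as below. The channel names and send functions are hypothetical placeholders, not any real gateway API:

```python
# Hypothetical sketch of multi-pipe crisis notification: fan a message
# out over several independent channels so one failed or compromised
# pipe (e.g. SMS in a given country) doesn't stop the message getting out.

def send_sms(recipient, message):
    print(f"SMS to {recipient}: {message}")
    return True  # pretend the telecom gateway accepted it

def send_push(recipient, message):
    print(f"Push to {recipient}: {message}")
    return True

def send_email(recipient, message):
    print(f"Email to {recipient}: {message}")
    return True

# Ordered list of pipes; each should be independent of corporate infrastructure.
CHANNELS = [("sms", send_sms), ("push", send_push), ("email", send_email)]

def notify(recipient, message, require_all=True):
    """Send over every pipe; return the names of the channels that succeeded."""
    delivered = []
    for name, send in CHANNELS:
        try:
            if send(recipient, message):
                delivered.append(name)
                if not require_all:
                    break  # first successful pipe is enough
        except Exception:
            continue  # a failed pipe must never block the others
    return delivered
```

The point of the `try`/`continue` is the one made above: in a crisis you assume any individual channel, including your own corporate systems, may be down or compromised, so delivery on the remaining pipes must not depend on it.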

You can assume that your communication system will be compromised, so you have to be very careful about using your own systems to say, "How are we going to fix it?" We've had many cases of cyber-attacks and ransomware attacks where people coordinated the fix over their corporate email, and the hackers were simply reading the emails and responding accordingly.

So you have to have multi-pipe communications, but independent communications. Then there's better tools at the edge. We've somewhat talked about that, but we see that crisis communications obviously starts off along the command and control model: the pyramid, the top-down, gold-silver-bronze model. But as Andy, who's on the call and used to be at PwC, said, one of his models is to look at this pyramid and say that actually, very soon that pyramid inverts itself, so that in fact the people at the center are supporting the bronze teams who are fixing the problem. That's what we describe as the edge.

For you to fix the problem, the center has told you the problem exists through the communication tools we have, but at the edge you need the collaboration tools we've talked about, that ability to chat. A group of people, the fixers, now have the ability to communicate amongst themselves about the best way to solve the problem, and most of the knowledge of how to solve it isn't at the center; it's actually at the edge.

So better tools at the edge is the way we see it, and collaboration is definitely improving. We know that most crisis studies have said that what went wrong was a failure of communication and a failure of collaboration. What we do see is that you've got to present not just the information, say, that there is a missile heading from North Korea to Hawaii as you sit there; you also need to know what to do about it. Not just decide you're going to break out the drinks cabinet, but know where your shelter is, where your information is, and collaborate with a group of people to actually survive those situations. Collaboration is really part of it.

Then there's the ad hoc side, the agile nature. There has been a tremendous improvement around the world in business continuity work, driven by financial services and other areas and other people at risk, and there is now a real understanding that we do need to think very carefully about how we can survive: driven by a lot of terrorism in Europe, by extreme weather, and by cyber-attacks in particular. But whatever you have as your playbook, the crisis you're expecting is actually going to be different from the one you get.

So you need to be able to say, "This is what we're going to be using, these are the checklists you will need, but actually we're going to do number one, number five, number six, or we've just put a new one in there." Agility is very important to get the right response.

Jim Preen: Document handling.

Richard Stephenson: Document handling, checklists and reporting back systems, et cetera, all of that exchange is really important as you go through the crisis.

Jim Preen: It's starting to make just the SMS element of this seem quite old-fashioned?

Richard Stephenson: It's very thin; it's just a very small amount of information you can put across. It tells you something, but actually, people like me who run businesses don't particularly like somebody telling us there's a major problem at 10 o'clock at night, just before we go to sleep, when there's absolutely nothing we can do about it.

But if they tell us something and say what the fix is and what the actions are, with more information, richer information, we sleep better because we know things are being handled. What we need is richer information, particularly in a crisis. One of the other things is that it's not sufficient to have disjointed processes in terms of audit trails. We're great believers, passionate and almost compulsive, about making sure that everything is in fact audited. That's not just so that we can learn from events [unintelligible 00:40:14] going forward, but also so that, if we need to produce evidence, we can do so.

Those people familiar with the police will know that any incident will have a commander, and next to the commander will be a scribe who writes down what we actually knew at the time and what actions that person took, so you are always able to produce evidence of the decisions you made. In business, in most crises, we don't have the time for that. Whatever systems are adopted in the future have to have a full audit trail of everything that's been [unintelligible 00:40:54].
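The scribe role described here, a timestamped, append-only record of what was known and what was decided, can be sketched minimally as follows. The structure and field names are illustrative, not any particular product's schema:

```python
# Minimal sketch of the "scribe": an append-only audit trail that
# timestamps every decision together with the information known at the
# time, so evidence can be produced later at an inquiry.
from datetime import datetime, timezone

class AuditLog:
    def __init__(self):
        self._entries = []  # append-only; entries are never edited or removed

    def record(self, actor, known_at_time, action):
        """Log one decision: who made it, what was known, what was done."""
        entry = {
            "when": datetime.now(timezone.utc).isoformat(),
            "actor": actor,
            "known": known_at_time,
            "action": action,
        }
        self._entries.append(entry)
        return entry

    def evidence(self):
        # Return a read-only copy for post-incident review or an inquiry.
        return tuple(self._entries)
```

The key design choice is that nothing is ever mutated: the value of the record as evidence depends on it showing what was believed at the time, not what turned out to be true later.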

Jim Preen: Certainly from my background in crisis management, logging your actions and decisions, and the reasons why, is part of crisis management protocol for a very good reason: if you do end up in the corridors of an inquiry, you need to be able to explain why you took certain actions. It will also help you through a crisis, similar to logging information as it comes in. It's all good crisis management practice.

Richard Stephenson: Privacy has been spoken about before. Privacy concerns are just going to ramp up: cyber-attacks are going to steal more and more information, and people are going to get more concerned. If people stop using tools because they're worried about privacy, then in fact we are going to become less resilient. We want to make sure that everything that happens in crisis communications is entirely protected, private and secure.

Jim Preen: Time is getting on a bit; we've got a few minutes left. Don't forget to put your questions in. If you have any questions for us, we'd love to see those. Let's move on now. I think it will come as no surprise to you to learn that, while we're very, very careful not to turn this into advertising, we do offer crisis communication software. In designing our own software, we've learned quite a lot about what people want and what they're looking for. Do you want to talk about that?

Richard Stephenson: I think it's worth sharing the journey. Basically, any operation like ours is about looking for a problem to solve. A lot of technology people come up with a technology before thinking, "Am I really solving something important?" You always start from the purpose: what problem is being solved. So our eyes were very much on how chat was taking off, for example, how people were using it and how useful it was. We then spoke to a lot of people about it, and so many people were reluctantly putting their hands up saying, "We're using it."

I interviewed a head of compliance and risk at a major bank, a very famous bank, and he admitted that he was using it. I use the word admitted because he was making almost a confession about it: he knew it wasn't correct, but he just didn't have a better solution.

What we found was that there is actually a real demand for a good compliance solution. That's what we learned on the journey: people were using it because it was so useful, but it wasn't compliant. For those of you familiar with the UK, we have a National Health Service here, and about a year and a bit ago they approved it for use within the National Health Service, which was heavily criticized by certain people, not including me. The issue was, they just said, "Look, I know we shouldn't, but it works and it could save some lives. This is only a temporary measure." They know they have to fix it. Our journey was to say, "People know that a proper solution has got to come along." We set about looking at that, and we responded to the feedback by asking how we could build something which would have all the encryption you needed, the end-to-end encryption, but with the difference that the conversations sit on servers the corporation actually has eyes on.

Could we build something like that that would work well? We also have this GDPR thing, and one of the stories we have: we do things connecting areas of London, for example Westminster, Victoria and Trafalgar Square. These are areas where we connect 150 businesses so they can all talk to each other using Sentinel. Some of them are retailers, and some of them like to share a picture of a potential shoplifter or somebody like that.

The problem we faced with GDPR is that if somebody shares an image that in fact is not correct, or anything else, then the person in it has a right to be forgotten. We then had to ask whether we could build in corporate controls where you can look at things that have been inappropriately shared; it could be a document that has got intellectual property in it and shouldn't be shared. We had to put in those protocols. We had a whole series of checklists on our big whiteboard when we designed this system, to say we had to deal with all these problems.

Then it's also got to work not just on a mobile device, whether an Android or an iOS device, but also in the office environment on the desktop. That's really important because a lot of people will be accessing it from whichever device they want. It's early days for us. We have it operating very successfully with clients, and they actually love using it. What we also had to do, without breaching any of the design protections of WhatsApp et cetera, was make it as familiar as possible for the end-user.

It was actually the same sort of experience, and they could use it with the same ease. That's how you can replace a product: if you've got something with the same ease of use. From a crisis management point of view, which is where Sentinel is focused, we now think that as a tool to complement instant conferencing, where you put people together without a PIN, to complement SMS and mass notification, and to complement documents, adding chat into the mix gives you a collaborative tool, which is the best way of working.

The feedback told us we should do it; when we looked at it, the question was whether we could. The answer was yes. Over the last six to nine months our developers have been building it, and we launched it in beta in October. It's relatively new but very promising.

Jim Preen: We're almost at the end here. There's a quick question from Francis, and I certainly don't know the answer to this, but Richard, maybe you do. A point about GDPR: I believe the right to be forgotten is not an absolute right. I don't know. Do you know about this?

Richard Stephenson: Definitely. There are certain situations where you have the right to hold that information. For example, if you're providing a service, and in order to deliver that service you need to have details on a person, then you have every right to have that information. However, if someone ends that service, they do have a right to ask for their information to be completely removed. It is a right enshrined in the articles of GDPR that it can be requested.

The problem for companies and businesses is that it's a considerable overhead if everybody starts asking to be forgotten, and you have to have protocols in place. That's a lot of cost to the organization. A lot of software, particularly legacy software, was never written with that in mind: how to pick out an individual person and remove all traces of them from the system. Very often data has leaked out to other places or been passed on to third parties, so on the practical side, the right to be forgotten can be very hard to fulfil.
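The overhead described here, fanning an erasure request out to every system that may hold a person's data and recording which ones actually complied, can be sketched as below. All the store names and delete functions are hypothetical, purely to illustrate why legacy systems make this hard:

```python
# Hedged sketch of handling a right-to-be-forgotten request: personal data
# typically lives in several systems (and may have been passed to third
# parties), so the request must be fanned out to every known holder and
# each outcome recorded for follow-up.

def erase_user(user_id, data_stores):
    """Ask every known data store to delete user_id; report per-store results."""
    results = {}
    for store_name, delete_fn in data_stores.items():
        try:
            results[store_name] = delete_fn(user_id)  # True if fully erased
        except Exception:
            results[store_name] = False  # failed stores must be chased manually
    return results

# Example stores; a legacy system written before GDPR may have no delete path.
stores = {
    "chat_server": lambda uid: True,
    "image_archive": lambda uid: True,
    "legacy_crm": lambda uid: False,  # no way to pick out one individual
}
```

Recording the per-store outcome matters as much as the deletion itself: it is the evidence that the organization acted on the request, and it flags the systems, like the legacy one above, where manual intervention is still needed.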

Jim Preen: Great. Richard, thank you very much indeed. I hope that's been an informative and interesting discussion for you. Just before we go, I'm being reminded to tell you about an event we're running on the 20th of November in central London. We're running a breakfast briefing; there'll be some yummy food on offer as well, but we'll be looking at trends in business continuity, including communication technology. I know I'm running a crisis simulation during the breakfast seminar. Richard, I think you're going to be presenting the technology.

Richard Stephenson: Yes. No, it's very exciting. We're going to be unveiling some very interesting tech at this meeting.

Jim Preen: Okay, good. There will be a panel discussion as well. It's in central London, in St. James's, on the 20th of November, fairly early in the morning. If you are interested in signing up, please just drop us an email and we'll show you where to sign up. That, I think, is where we are going to sign off.

Thank you very much indeed for listening to us, and we'll be running another webinar next month. If you have some ideas about webinars you would like us to run, please let us know. Thank you very much for listening. Goodbye from Richard, and goodbye from me, Jim Preen. Bye-bye now.