Keri Pearlson

Christopher Reichert, MOT ’04, welcomes Keri Pearlson, Executive Director of Cybersecurity at MIT Sloan. They discuss the research MIT Sloan is doing in cybersecurity, including an effort to drive cybersecurity behaviors. Learn more at

Christopher Reichert: Welcome to Sloanies Talking with Sloanies, a candid conversation with alumni and faculty about the MIT Sloan experience, and how it influences what they're doing today. So, what does it mean to be a Sloanie? Over the course of this podcast you'll hear from guests who are making a difference in their community, including our own very important one here at MIT Sloan.

Christopher Reichert: I'm your host, Christopher Reichert. I'm with Keri Pearlson, Executive Director of the Cybersecurity Program at MIT Sloan. Welcome Keri.

Keri Pearlson: Thank you.

Christopher Reichert: Today we're going to do it a little differently than we've done before. You're our first non-alum on the podcast, so welcome.

Keri Pearlson: Thank you.

Christopher Reichert: Before I get to your work here at Sloan, I just want to give a quick bio of Keri for our listeners. Keri holds a doctorate in business administration, in MIS, from Harvard Business School, which we won't hold against her. And an MS in industrial engineering, and a BS in mathematics from Stanford, so she's no slouch. She's held positions in academia and industry, including Babson College, UT Austin, where I almost went for my undergrad, Gartner's Research Board, CSC, and AT&T. And, she began her career at Hughes Aircraft Company as an Assistant Analyst. Finally, she was the Founding President of the Austin Society for Information Management (SIM), and was named the 2014 National SIM Leader of the Year, so welcome.

Keri Pearlson: Thank you, good to be here.

Christopher Reichert: Well, so where to begin? The topic of cybersecurity is so broad, and cuts across so many different areas of people's lives. I guess we want to focus on the research that you guys are doing at Sloan, but not so much the technical side, although I'm sure we'll dip into that a bit. But really on the cybersecurity culture that you'd like to see more companies adopt, and on how that affects managers and organizations, and how they operate. It's ironic, I think, given that back in the early days of the internet we all thought that self-regulating trust, and openness, and transparency would really herald a new utopian environment where we would not have the issues that we're facing today, and obviously that was rather naïve, right? I guess the question is, how secure are we today?

Keri Pearlson: It's funny you ask it that way. That is the holy grail question today in cybersecurity. Let me just take a step back and tell you a little bit about who we are, because that might structure some of the questions and the way we have our discussion. We're a research consortium at Sloan called Cybersecurity at MIT Sloan. We were founded by Professor Stuart Madnick. Many folks who are alumni will know him.

Christopher Reichert: Absolutely.

Keri Pearlson: He's been at MIT for … he likes to say longer than recorded history, although I'm sure it's not that long. He is one of the founders, one of the earliest researchers, in information security. Our research at what we call CAMS, Cybersecurity at MIT Sloan, looks at the organizational, managerial, governance, the business side of cybersecurity. MIT, of course, is a technical school. There are a lot of places you can look at the technologies, and those are really important questions to answer. But we think equally important are the people questions. All the things around, “How do you know how secure you are? How do you build a culture of cybersecurity? How do you manage the dark web? How do you know your supply chain is secure?”

Those aren't technical questions. There might be a technical solution, but research shows that anywhere from 50 to 90% of the breaches that a company experiences are people-oriented. It's somebody clicking on a phishing email, or somebody leaving the keys under the mat. No matter how secure your house is, if you leave the keys under the mat, you've opened up a vulnerability. There are a lot of opportunities for the “bad guys” to get in, if you will. Just because people do either overt things they know they shouldn't do, or more likely, inadvertent things that they didn't know they shouldn't do.

Christopher Reichert: Right.

Keri Pearlson: That's sort of the foundation of our research. What we've noticed over the past couple years is of course an increasing interest not only in, “How do we keep ourselves more secure?” But, Boards of Directors want to know the exact question you just asked me, “How do we know how secure we are? How secure are we?” In many cases we can tell them numbers. We can tell them how many people failed a phishing email, and we can tell them how many endpoints we have, and how secure they are. But just like in the old days of IT, those are technical answers; they don't necessarily tell you how secure you are. Those are answers that we can calculate, and we do. We use those as measures of an answer to, “How secure are we?” But the real question is, “How do we know how secure we are?” And that's one of the questions we're researching right now.

Christopher Reichert: Yes, I can tell you from firsthand experience as a former CIO at a fairly high-profile organization that the board was definitely asking that. I would give some of those statistics. "Here's how many denial-of-service attacks we rejected, and here's how many times we updated our security protocol on the exterior." But, what can we do to increase security as individuals? It's surely not crazier passwords?

Keri Pearlson: Right, right.

Christopher Reichert: Here's my password, is that crazy enough for you?

Keri Pearlson: The wisdom on passwords is, the longer the better. Of course, you can't remember long passwords, but you could remember a sentence. If you have a sentence, I'm sure as a CIO you coached your employees, your team, to do that. But, the longer the better, and if it's a sentence you remember, "I work at MIT." Or something like that, you get all those extra words, extra letters, extra characters…
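Keri's "longer the better" advice can be made concrete with a back-of-the-envelope entropy calculation. The sketch below, in Python, assumes a 95-character printable-ASCII alphabet and a Diceware-style 7,776-word list; both figures are illustrative assumptions, not something from the conversation:

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy in a string of `length` symbols drawn
    uniformly at random from an alphabet of `alphabet_size`."""
    return length * math.log2(alphabet_size)

# An 8-character password over ~95 printable ASCII characters:
print(entropy_bits(95, 8))     # ≈ 52.6 bits

# A 4-word passphrase from a 7,776-word (Diceware-style) list
# is already comparable, and far easier to remember:
print(entropy_bits(7776, 4))   # ≈ 51.7 bits

# Add two more words and it comfortably wins:
print(entropy_bits(7776, 6))   # ≈ 77.5 bits
```

The caveat is that a sentence like "I work at MIT" is not drawn uniformly at random, so its real entropy is lower; the arithmetic only illustrates why extra length pays off so much faster than extra symbol complexity.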

Christopher Reichert: Zeroes and ones, and all that, right?

Keri Pearlson: And you can remember it. But, the answer to your question is really at the heart of some of our research. One of the projects I'd love to talk to you about is our culture project. It happens to be one I'm leading, so I know it best of all the things on our agenda. But, we have a research project, if you will, where we're trying to drive cyber secure behaviors. Those are the behaviors you do because they're part of your job, and the things you do just because you're a good member of our community. They're things that aren't necessarily written in your job description, but they're the right thing to do.

Christopher Reichert: Just common sense, but also almost a peer-pressure kind of common sense.

Keri Pearlson: Well, and a, “We feel some responsibility for this organization because this is the organization we're a part of.”

Christopher Reichert: Right.

Keri Pearlson: We want to drive—we call it “in role” and “out of role” behaviors—but we want to drive cyber secure behaviors. We think that most organizations, most managers, think the way to do that is we train you. We send you through a training class. Everybody has that. You go to training, you learn how to set your passwords, and what you're supposed to do and not supposed to do. You do it once a year because it's compliance training, and then you quickly forget most of it. You remember maybe something that you do during the year, but next year you go, click that button, take that compliance training, and the university can report back that 95% of the people in the university have taken the compliance training.

Christopher Reichert: Right.

Keri Pearlson: Again, not how secure are we, and not necessarily driving behavior. Our model takes a different approach. We think that what drives your behavior is what you value. If you personally think cybersecurity is important, and building security into our organization is important, you're probably going to have more secure behaviors. What we look at, and what our model postulates, or hypothesizes, is that the behaviors of the managers drive a culture, which we define as the values, attitudes, and beliefs that drive cyber secure behavior. Our research is looking at different companies, and how they build in, or change, values, attitudes, and beliefs. And we believe that happens at three levels. Of course, at the senior management level. If you as the CIO, or even the president of the organization, do things that are cyber secure, you're going to be sending a message that that's important to you. You make it a priority, you talk about it in your daily work in the company. You as the president, or the COO, any senior leader, particularly a non-technology leader, a general manager, send a message through whatever your means of communication are. You're going to be sending a message that it's important to you, and that's going to show the value that you have, which research has shown drives the values, and the attitudes, and beliefs of others in the organization.

You can do things like start a meeting with a security moment. You can say, "Let's just take one minute, we know where the bathrooms are, we know how to turn off our cellphones. But, let's also talk about how to be secure. What's one thing we can do to be more secure?" That kind of behavior drives values, attitudes, and beliefs.

Christopher Reichert: I'm trying to imagine some of those things in the previous organization. I imagine people would have looked at me like I had antlers or something. Like, "Why are you wasting your time, why are you making us all paranoid?"

Keri Pearlson: Well, I guess you could look at it that way, but cybersecurity is in the press so much these days and it gets down to peoples’ individual concerns. I know when for example, the Equifax breach happened, my girlfriends called me up and said, "What do I need to do? You work in cybersecurity, what do I need to do?" And they're not cybersecurity researchers, they're my friends, they're my social friends. They are as far from IT leadership as you could possibly be, but it's on their mind.

Christopher Reichert: Yes.

Keri Pearlson: I think today it's a little different world than maybe it was even five years ago, or certainly 10 years ago. Where everybody's wondering about that, number one. Number two, we're all carrying these IT computers in our pocket. We have iPhones, we have Samsung phones, we have Google phones. We have computers in our pockets, so we're all a little bit more technologically savvy than we were a while ago, so we're all thinking about technology differently.

Christopher Reichert: It's up to the technology leaders in organizations, as well as the senior leadership, to take that on as, it may not be a profit center, but it certainly will mitigate being a loss center, right?

Keri Pearlson: Well, I think that's true. I think people think about technology differently, and the other piece of our research, which we're just starting now, is to look at the maturity of your culture. So if you think about it, everybody thinks their technology will keep them safe. We can put in firewalls, we can build technology barriers, and you and I as technology people know that that's good, only to the extent that it's good.

Christopher Reichert: Right.

Keri Pearlson: We don't know where the next attack is coming from, and the bad guys are ... as Stuart [Madnick] likes to say, "The bad guys are getting badder faster than the good guys are getting better." We don't know where the next attack's coming from. The technology can only go so far.

The next level of maturity is the CIO, the technology leadership. We think they're going to tell us what to do, and they're going to keep us secure. That's true again to a point, but the next attack could be something that they didn't even know was on the horizon. Then it's managers. People think their managers will tell them what to do, and help us be secure, and that's the example I gave a minute ago of executives building it into their meetings, or their security moments.

Christopher Reichert: So it's the culture side?

Keri Pearlson: The culture side. But, to me the holy grail is where each of us take it personally as our responsibility to be more secure. In an industrial environment, everybody takes safety very seriously.

Christopher Reichert: Physical safety, seems obvious, right? Construction sites, oil factories…

Keri Pearlson: Right. It's not necessarily your job to be safe, but it's part of the culture, it's part of what you have to do. That's what you do as a good citizen, and to keep yourself secure. Why don't we have that same kind of thinking in cybersecurity?

Christopher Reichert: That's true.

Keri Pearlson: That's really where we're trying to get with our research. We think it's everybody's individual responsibility.

Christopher Reichert: Yes, certainly. So construction for example, certainly ahead of that. They have their signs up for the public to see, "X number of days without an accident."

Keri Pearlson: Right.

Christopher Reichert: It's really built into every time you come on and off the site, you're aware that safety is an important aspect.

Keri Pearlson: Well, and you see those posters, "My child is expecting me home tonight."

Christopher Reichert: Yes.

Keri Pearlson: I'm not going to do anything that gets in the way of being ... and, that really gets at the personal side of it too. Personally, “I want to be safe and get home. I've got a family that I care about.” But in cybersecurity we don't quite see the same mental leap, because first of all it's hidden.

We actually have seen this happen, we've seen phishing emails sent out, we've seen executives who know it's a phishing email click on it, and we've asked them “Why?” You know what their answer is? “I wanted to see what would happen.” Well you know, in an industrial controlled environment, if you saw a sign that said, "Don't touch this boiler or chiller," people wouldn't go up and touch it to see what happens.

Christopher Reichert: Right.

Keri Pearlson: You don't do it. You know the physical response, but when it comes to technology it's kind of hidden, so people aren't sure really what's happening.

Christopher Reichert: And that might be a bad consequence of having too much faith in technology.

Keri Pearlson: Exactly.

Christopher Reichert: Thinking that, "Well, I've got an antivirus." Or, "I've got something in my browser that might protect me."

Keri Pearlson: Right, so if you're of the mindset that the technology is keeping me safe, you wouldn't have any problem clicking on a phishing email that might do something bad. But if you feel it's your responsibility not to, you're more likely to report that phishing email to the security professionals.

Christopher Reichert: Aren't you challenging the basic business model of bungee jumping? That big cord behind you?

Keri Pearlson: That's right. How much do you trust that big cord behind you?

Christopher Reichert: One of the things that I think has come to the fore is that it's not just a 400-pound guy sitting on his bed somewhere, as a famous tweeter once said, right?

Keri Pearlson: Right.

Christopher Reichert: There are state sponsored security breaches or attacks. I thought it was very interesting that one of the things that you're looking into is using the knowledge of the attacker’s business model. I was particularly intrigued by the Porter Value Chain Model. I found that fascinating, just to think of the dark web attackers not just as that 400-pound basement person, but actually as very sophisticated, very organized, highly-funded organizations out there. Tell me how that fits in, and how does Porter’s Value Chain fall into that?

Keri Pearlson: Yes, sure, thanks for asking. This is another research stream we have going in our consortium. We're looking at the business of the dark web, if you will. Or, the ecosystem of the dark web. What we've noticed is exactly what you said, that the hacker of today isn't necessarily a hooded, tattooed, pierced, fringe individual sitting in a dark room trying to break into your company. It might be state sponsored, or it might just be some business person with bad ethics. Somebody who just wants to steal money, or somebody who wants to do mischief, and they don't have to be a hacker anymore. There's a whole marketplace out there on the dark web of services, and you, or me, or any other business person could go out on the dark web with the right credentials and the right knowledge, and pull together all the components that are necessary to build an attack vector. If you don't even know exactly what you're doing, there's a service that'll help you build what you need to do, and tell you what pieces you need to have.

They're not necessarily well-funded state agencies, or governments outside of our government, or any government. This is the ecosystem. Once you start to look at the dark web as a well-organized, well-structured ecosystem, and we think Michael Porter's Value Chain Model is a nice way to do that, then all of a sudden you start to see a different way to protect yourself. One way might be that we start seeing activity around a particular service, in which case we might be able to make some assumptions about the kind of attack vector that could be coming. Another might be that we disrupt some of these services if we see activity around them, and that would give law enforcement, or the other companies that help try to keep us safe, some indication of where to put their resources.

Christopher Reichert: When you say “service” you're not talking about banking services, you're talking literally about port-level services on…?

Keri Pearlson: No, we're talking business as a service.

Christopher Reichert: Okay.

Keri Pearlson: Anything as a service.

Christopher Reichert: Got it.

Keri Pearlson: Not all of these services were designed for bad guys. For example, there are services that take Bitcoin. That's not necessarily designed for a hacker, but in the hands of a mal-intended person, it could be used as part of an attack vector. A ransomware attack, for example. Or sending out phishing emails. Companies send those out all the time to try to figure out if their company is safe. But in the hands of somebody trying to launch an attack vector, it could be knocking on the door to see where the keys are under the mat, where the person is who's not vigilant. These services are well-formed components of the Value Chain, and that's why we like the Value Chain Model.

Christopher Reichert: That's something that's always confused me: the internet, and the whole technology under it, is a human construct. It's defined by protocols, and you know very much about it. I guess this is the confusing part for me. Phone numbers, for example, are assigned by an authority of some sort, a telephone company, Google, whatever it may be. Same thing for IP addresses, somebody knows who assigned that IP address to somebody else. Now, I know there are all sorts of ways of hiding yourself, but is one of the ways to approach actually finding the source of the problem to unpack some of the way that the whole internet is structured?

Keri Pearlson: That's a really good question. I think that would give one avenue of perhaps defense, or building defenses and protection. I'm not sure that it's easy to unpack the internet the way you're talking about. It seems to me that part of the value, if you will, or the positive side is that it's so distributed.

Christopher Reichert: By its very nature, sure.

Keri Pearlson: By its very nature. And actually, my brain is going to another place, which is one of our projects on Blockchain, which is sort of the same concept of distributed ledger in that case.

Christopher Reichert: Trust, right?

Keri Pearlson: But, distributed information. To actually change something when it's that distributed is really hard, perhaps impossible. Perhaps technology in the future will be able to do that, but at this point in our evolution of technology, I just personally don't know of any solutions for it.

Christopher Reichert: I thought maybe as IPv6 came out, that would be like, "let's add the security layer in there from the get-go," like a fingerprint of some sort.

Keri Pearlson: But, I think we do ourselves a disservice when we think that technology is going to keep us safe.

Christopher Reichert: Well there you go, I fell back into the trap.

Keri Pearlson: Yes, see, there you go. I think it helps, and we're seeing a lot of advances in AI and ML also, trying to unpack the information that we're seeing, and try to understand where the next vulnerability might be. But, I think no matter how good we get, we have to keep all the doors safe. The bad guys only have to find one open door.

Christopher Reichert: That's right.

Keri Pearlson: It just seems like an almost impossible task, to keep all the doors safe.

Christopher Reichert: I've gone down the technical path, my mistake.

Keri Pearlson: No, not at all.

Christopher Reichert: No, no. But I want to bring it back to the culture side of things. What's the kind of work that you're doing with organizations, and how do you recruit them, how do you choose good possible candidates for it? In other words, are there ones with “bad culture” perhaps? I mean I'm going to have to say that with air quotes. And how do you measure success in the research, or even in the findings?

Keri Pearlson: The Culture Project started about three years ago, two and a half years ago. CAMS has an annual conference, and in July two years ago, we had a workshop where we brought together all of the companies that support us, and we said, "How do you keep your people secure?" That was just the question we asked in the workshop. We broke into small groups, and everybody talked amongst themselves, then we reported back. We realized that there were some patterns in what we were hearing from the hundred or so people in the room. And so, that launched the project. Then, we reached back out to the companies who are part of our membership base, and we've done some case studies on them, we've published some papers on them. Many of these are available on our CAMS website. Some are publicly available, so if any of your listeners are interested they can go to our public website, and take a look at some of these papers.

Christopher Reichert: Give us the URL on that.

Keri Pearlson: Sure. CAMS stands for Cybersecurity at MIT Sloan, C-A-M-S.

Christopher Reichert: Great.

Keri Pearlson: We started out with, I would say, leading practices. We don't really know what's best, but we see some leading practices. I can give you some examples from that if you'd like, but we've documented several companies that have some interesting leading practices. We then created a survey, and the survey has been given out to 150 to 200 different companies. We're in the process of analyzing that data right now to see if we can validate some of our hypotheses, and we are still collecting data. Right now we're really interested in global issues: how does local culture play in? US culture is different from Brazilian culture, which is different from Italian culture. We think local culture plays into how you're going to view cybersecurity. We think things like regulation play in too, local or otherwise. If you're a bank, you're going to be subject to different kinds of financial services laws than, say, a consumer products company. If you're operating under GDPR, you're going to be much more aware of privacy issues than, well, pretty much everyone is today. But, it's going to be different for different companies based on that.

So those are going to drive the values, attitudes, and beliefs also. But, really interesting is your question about, “How do you know you're successful?” We've asked that of every company we've talked to, and nobody has a good answer. In fact, our case studies, which we use for teaching, end with that. “What should we use to measure? How do we know? What is the right way to know if we've been successful?” We can't tell you we're successful just because we haven't had a breach.

Again, just because nobody fails the phishing exercise doesn't mean you're more secure. We don't know the answer to that question yet; that's another research stream we have. We would love to have people fund us in that by the way. If somebody who's listening is interested in answering that question, they should get in touch, because we really want to figure out that answer. How do we answer the question, “How secure are we?”

Christopher Reichert: How do people get more involved in learning? Would they approach you to have their company be part of the research? Or are there things that professionals can do to become better at it themselves? Whether it's the culture, or the technology side?

Keri Pearlson: Yes, that's a really good question, and thank you for giving me an opportunity for the commercial. CAMS is like every other research center at MIT, we're self-funded. People join us, companies join us, so we have a number of companies that are sponsors of our research. Any of your listeners who are interested can go to our website to get information, or reach out to us. We also do sponsored research like any other part of MIT, so if somebody has a particular question they want answered, we are totally open to trying to see if any of our research answers that, or if they're interested in sponsoring research that would answer that. Of course, we love gifts too, so anybody wanting to donate money, we'll take that.

But individually, how can you raise your awareness? We want to solve that problem too. We're partnered with Sloan Exec Ed Programs, and there are two programs that we are teaching from our research. One is an in-person program we offer three times a year, and one is an online program offered, I think, more than three times a year. You can do the online program from wherever, and we have people all over the world who do that. We also have Sloan graduates, and Sloan students, who regularly just hang out with us in one way or another. If somebody's really interested in getting involved, they can reach out and I can talk to them about that also.

Christopher Reichert: Excellent, wonderful. There's so much to cover. I was thinking about IoT security and endpoint issues. With so many devices being smart and removed from the premises, or put into circumstances that are impossible to predict, you know? A phone could be lost, a computer could be lost, it could be breached in many different ways, at an airport, or wherever the case may be. But thinking on an industrial level, things are designed one way and then put into environments that are remote from the provider, and so again, control is an issue. What's the research that you're doing on endpoint cybersecurity and the IoT component? Since I think that's really going to be a huge growth area for industry.

Keri Pearlson: We have another research stream; we have about 35 projects in various forms. If your listeners are wondering why we're covering such a broad range, it's because we have so many different projects that we're doing. We have a few priorities, but we mostly look at the strategic, governance, managerial, and organizational issues around cyber. And endpoint security is another one, because there are technical solutions, but the question really becomes a managerial question: “How do we keep our endpoints secure?” Most organizations put a defense in depth strategy in place, where they have multiple levels of security, if you will. One level, or maybe multiple levels, is around policy and people; some levels are the firewalls and the technologies that we normally think about for security.

Securing the perimeter is another piece of that defense in depth strategy, and endpoints of course are important. Stuart likes to talk about his toothbrush. I think his wife has a toothbrush that's connected to the internet. Now, you may wonder why you would want to connect your toothbrush to the internet. But, there could be benefits to your dentist being able to monitor your habits and help you have better oral health because your toothbrush is connected. We're pretty much seeing everything connected at some level, so then the question becomes how do we make sure they're secure. It's really nice that they're all connected, we all see the benefits. But, the minute it's connected, it opens up a vulnerability. The particular research you're referring to is looking at white lists versus black lists. The space on these IoT devices is much smaller. The storage area, the amount of code you could put on there, is limited.

Christopher Reichert: Right, the wrong chip, or whatever it is, right.

Keri Pearlson: Exactly, so that project is looking at “How do we white list?” Instead of saying, “What shouldn't be accessing us,” let's just make a small list of what should access us.
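The white-list idea Keri describes can be sketched in a few lines of Python; the addresses and names below are illustrative assumptions, not from any real device or from the CAMS prototypes:

```python
# Default-deny filtering for a constrained IoT device: rather than
# enumerating everything that should NOT reach us (a black list that
# never fits in limited storage), keep a short list of what may.
ALLOWED_PEERS = {"10.0.0.5", "10.0.0.6"}  # e.g. known controllers

def allow_connection(peer_ip: str) -> bool:
    # Anything not explicitly listed is refused.
    return peer_ip in ALLOWED_PEERS

print(allow_connection("10.0.0.5"))      # True: a listed controller
print(allow_connection("203.0.113.44"))  # False: everyone else
```

The design choice is the default: a black list fails open when a previously unknown attacker shows up, while a white list fails closed.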

Christopher Reichert: Right.

Keri Pearlson: That's a management question, you know? How do you want to manage that? Of course there's a technology component, and our researcher is building prototypes to show how to use white lists versus black lists. But, the other piece of this is Blockchain. People seem to think the Blockchain is the answer to a lot of this, that you can use the Blockchain as a means for securing endpoints. We have mixed thoughts on that right now. We do think the Blockchain technology itself, like other distributed technologies, has some play in the security world. But, it's the system around it that gets in the way, and I think one of our researchers wrote a paper about the 70 ways that Blockchain's been violated. I think that's in one of the popular press magazines these days. But, it's not secure by itself, because it sits within a system.

Christopher Reichert: Right.

Keri Pearlson: Then, all of a sudden, you've got to be making sure that all the aspects of the system are secure. Those are two of the avenues we're exploring with our endpoint security. The other piece that plays into this is what I mentioned earlier, IT versus OT, Information Technology versus Operational Technology. We started out with a critical infrastructure focus, so most of the organizations we worked with were part of energy, or oil and gas, or telecommunications, critical infrastructure companies. In those organizations the Operational Technology people have a very different approach to information security than the IT people, the stuff you and I used to do in offices. They already think about security in a different way, and there are lessons from physical safety, we've talked about a few, that apply to the cyber world. But, we also have another research stream looking at cyber safety in OT worlds, and we have quite a few projects looking at energy delivery systems, keeping operators up to speed, and looking at the whole thing as a system using System Dynamics, of course, a popular approach from MIT, to try to keep the whole operational system and industrial control system safe.

Christopher Reichert: I was thinking about the way Tesla pushes out updates to vehicles, automated, right? In some ways the end user doesn't have control. I suppose they may have given control to Tesla when they first got the car, but I can see how that would be a huge risk if you're relying on ... But, then that also relies on all sorts of other systems: GPS to keep you in your lane, distance and radar, and there are all sorts of ways that could be a shenanigans parade.

Keri Pearlson: Yes. Well, autonomous vehicles are another avenue; we have a researcher looking at that. We actually haven't dived into Tesla per se, we've been looking at unmanned vehicles used in mining, the military, and other applications. But it's the same technology, the same idea.

Christopher Reichert: Right, agriculture.

Keri Pearlson: Agriculture, yes. Once these devices are unmanned or computer controlled, they can be controlled to do bad stuff just as they can be controlled to do good stuff. To avoid pedestrians or hit pedestrians, it's a matter of code. It's a little scary.

Christopher Reichert: Yes.

Keri Pearlson: We want to make sure that we bring the MIT brainpower to answering those kinds of questions. But yes, it is an issue when you start to look at distributed updates, though you see that even in the IT world. I'm sure you get regular updates for your laptop. I do on mine. We just click and say “yes” because we assume they're the right ones.

Christopher Reichert: Right.

Keri Pearlson: It would be really insidious if somebody sent me an email that said, "By the way here's an update to your laptop," and I clicked on it, and it was actually some sort of malware.

Christopher Reichert: Right. That's the sort of phishing thing that we talked about earlier. Are there things that people can do? I know there are the tests that you do, right? That people pass or fail. But are there some obvious tips that we don't know about that you can pass on?

Keri Pearlson: Well I'm not sure you don't know about them, but let me tell you a few tips that we regularly tell people. First of all, phishing is usually when you receive an email like, "I have a great-uncle in Nigeria who just died and left me 10 million dollars, and I want to give you a piece of that, send me your bank account information." I think most of us know that's fake.

Christopher Reichert: I love the stories by the way, of the people who taunt them for a long time. That's a dangerous game!

Keri Pearlson: Yes. But, there are people who unfortunately do fall for that, and they're usually elderly, or uneducated, or-

Christopher Reichert: Or if you're doing it quickly, you're just kind of clearing it.

Keri Pearlson: They're doing it quickly. Obviously there are things at one end of the spectrum that we all recognize. There are things at the other end of the spectrum that we would have a really hard time recognizing, and there are stories, you can do Google searches on some of these, where you get an email that looks like it's from somebody you know. Usually we call these spear phishing, because the attackers know something about you, so they can write the email in a way that makes it sound plausible. We've heard stories and documented examples of executives who were out of touch, and I'm making air quotes here, "sending" an email to somebody: "Please wire 10 million dollars and don't tell anybody because I'm working on this secret deal in China, and I need this money tomorrow," and it turns out that was spear phishing.

Those are two ends of the spectrum, and of course there's everything you could imagine in between, and things you can't imagine. We can't even think of them all.

So, how do you stay safe? First of all, anything that looks suspicious, investigate. Don't just quickly look at your email and answer things. Don't quickly click on links in your email that might be suspicious. Err on the side of suspicion in your emails, only because right now the technology can't even keep up with all of the bad emails that do get through to your system.

Christopher Reichert: Very clever, right?

Keri Pearlson: That's number one. Number two, if there's a link in your email, hover over it. Most email systems will tell you where that link's going before you click on it.

Christopher Reichert: It might look like Amazon, but it's something else.

Keri Pearlson: And look carefully at it. It might say "…", something else entirely.

Christopher Reichert: Yes, right.

Keri Pearlson: It looks like it's Microsoft, it looks like it's Google, it looks like it's Amazon, but if you look carefully, it isn't. We regularly send emails amongst ourselves saying, "I got this one, do you think this one's real or not?" We jokingly try to catch each other. I mean, none of us are clicking on them, we use it as discussion amongst ourselves.

Christopher Reichert: Right.

Keri Pearlson: But, hovering over it. And then as a last resort, send it to your IT people. Almost every company has an IT mailbox you can forward spam to, so they can investigate it before you click on it. Or, call the person it's from and see if they really sent you that email. I get this one email regularly from a former colleague from my SIM chapter. His email for some reason is consistently hacked, and I keep getting these emails. The last one said, "I need some help, I need a debit card from Walmart. I need $100 worth of stuff, could you help me? Could you go to Walmart? I'll pay you back when you bring it to me."

Or something along those lines. I called him and I'm like, "Is this-" ... and he didn't know. He didn't even know how to go after and stop that kind of thing.

Christopher Reichert: That's right. Yes, it's not obvious, because it's all silent behind the scenes.

Keri Pearlson: Yes, exactly that. Silence is what we all worry about. Silence and fear. Fear, uncertainty, and doubt.

Christopher Reichert: Right.

Keri Pearlson: Don't let that get in the way. If you see something that looks suspicious, it's that old, "See something, say something." If you see something, ask the person, bring it to the surface. Don't just assume that it's good or bad.

Christopher Reichert: Right.

Keri Pearlson: The more we talk about it, the more aware we become.

Christopher Reichert: Absolutely. Well before we go I have two questions for you.

Keri Pearlson: Sure.

Christopher Reichert: What's your personal definition of success? And not so much work ... it could be work, it could be anything else.

Keri Pearlson: My personal definition, you mean success for me or for research?

Christopher Reichert: For you, yes. How do you feel successful?

Keri Pearlson:  How do I feel successful?

Christopher Reichert: Yes.

Keri Pearlson: I feel successful when I see that “aha” moment in somebody's eyes, that they get this new idea that we're talking about, and they want to engage in discussion about how to make it even better, or how to implement it in what they're doing. I think I'm a teacher at heart.

So, bringing these new ideas to the surface and having them communicated in a way that people understand, ideas they hadn't thought of, perhaps outside the box, but that make a difference in the way they view the world or go about their daily business. For me personally, that's success.

Christopher Reichert: Excellent. And, what's the last thing you really geeked out about? Or maybe given your work, you-

Keri Pearlson: Besides this discussion?

Christopher Reichert: ... anti-geek out.

Keri Pearlson: I geek out regularly.

Christopher Reichert: Yes, that's right, so maybe the opposite of that.

Keri Pearlson: The last thing I geeked out about? Well personally, I have a textbook that I've written; its 20th anniversary is this year.

Christopher Reichert: Congratulations!

Keri Pearlson: Thank you! We're updating the seventh edition, and I geek out regularly about that. An ironic story is, when I wrote the first one back in 2000, I was not doing this cybersecurity research. I was a researcher in IT leadership generally, and the book is about managing and using IT, or information systems. It's written for MBAs who need to be knowledgeable participants in IT discussions, because of course back in 2000, you might remember this, business people just thought that we were taking care of all the technology for them, and we'd make decisions that impacted their business opportunities.

This textbook was originally written based on a course I taught at the University of Texas at the time. But over the years it's evolved, and in this last edition the security chapter was super important to me, because now I'm in cybersecurity. I was geeking out with one of my co-authors about, “what do we put in here, how do we really talk about cybersecurity?” It's more than just awareness and training. I really believe it has to be part of the ether, and part of the environment, and what we just talked about earlier, the culture. I geeked out about that.

Christopher Reichert: The good news is I think that people are getting more sophisticated in the conversations they're having. It goes far beyond just passwords and firewalls; as you say, it really needs to be put into the DNA of an organization's culture.

Keri Pearlson: Sounds good.

Christopher Reichert: Well I'd like to thank Keri Pearlson, the Executive Director of the Cybersecurity Program at MIT Sloan for joining us on Ideas That Matter.

Keri Pearlson: Thank you.

Christopher Reichert: Thank you. I'm your host, Christopher Reichert, and until next time.

Sloanies Talking with Sloanies is produced by the Office of External Relations at MIT Sloan School of Management. You can subscribe to this podcast by visiting our website, or wherever you find your favorite podcasts. Support for this podcast comes in part from the Sloan Annual Fund, which provides essential, flexible funding to ensure that our community can pursue excellence. Make your gift today. To support this show, or if you have an idea for a topic or a guest you think we should feature, drop us a note.