Red teaming is a technique that will help you avoid strategic blunders and make better decisions. Author Bryce Hoffman combines military rigor with devil’s advocacy to help businesses overcome cognitive biases and avoid huge missteps. His time at the U.S. Army’s Red Team Leader Program at the University of Foreign Military and Cultural Studies became the core of his new book, Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything.
Bryce is an author, speaker, and consultant. His 2012 book American Icon: Alan Mulally and the Fight to Save Ford Motor Company became a manual for CEOs looking to transform their corporate cultures. Before he launched his consulting practice in 2014, he was an award-winning financial journalist.
In this episode, Bryce introduces the concept of red teaming and explains how militaries and companies around the world use the strategy to stress-test their decisions. We discuss how red teaming can help you uncover blind spots in your thinking and overcome cognitive biases that might otherwise lead you into a business quagmire. Bryce also talks about how red teaming translates across cultures, and whether it can be done effectively by an individual.
Red teaming is an incisive problem-solving tool that will undoubtedly make you reexamine how your organization currently makes decisions.
If you enjoy the show, please drop by iTunes and leave a review while you are still feeling the love! Reviews help others discover this podcast and I greatly appreciate them!
On Today’s Episode We’ll Learn:
- How one specific scene in a zombie movie piqued Bryce’s curiosity about red teaming.
- Why red teaming applies to business just as easily as it applies to military decision making.
- The impact cognitive biases have on our decisions and why it’s easier for other people to point them out.
- What essential qualities a leader must have to make the most of red teaming.
- Whether or not there is a cultural barrier to red teaming.
- How companies can justify the cost of red teaming by weighing it against the strategic missteps it helps them avoid.
Key Resources for Bryce Hoffman and the Psychology of Red Teaming:
- Connect with Bryce: Website
- Amazon: Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything
- Kindle: Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything
- Audible: Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything
- Amazon: American Icon: Alan Mulally and the Fight to Save Ford Motor Company
- The Art of War by Sun Tzu
- Red Team Leader Program at the University of Foreign Military and Cultural Studies
- World War Z
- Thinking, Fast and Slow by Daniel Kahneman
- Dr. Gary Klein
- “Success is a lousy teacher. It seduces smart people into thinking they can’t lose.” – Bill Gates
- Polaroid
- Maslow’s hierarchy of needs
- Col. Gregory Fontenot, U.S. Army Retired
- Dale Carnegie & Associates
Share the Love:
If you like The Brainfluence Podcast…
- Never miss an episode by subscribing via iTunes, Stitcher or by RSS
- Help improve the show by leaving a Rating & Review in iTunes (Here’s How)
- Join the discussion for this episode in the comments section below
Full Episode Transcript:
Welcome to The Brainfluence podcast with Roger Dooley, author, speaker, and educator on neuromarketing and the psychology of persuasion. Every week, we talk with thought leaders that will help you improve your influence with factual evidence and concrete research. Introducing your host, Roger Dooley.
Roger Dooley: Welcome to the Brainfluence Podcast, I’m Roger Dooley. We’ve heard countless times that business is war. Years ago, Sun Tzu’s The Art of War was required reading for aspiring business leaders. Today, any number of special forces veterans are willing to teach you how to apply their training and techniques to business and life. I wasn’t sure that today’s guest would be a fit with the Brainfluence Podcast audience, but then I realized that his insights are as much about psychology and how people make decisions as about military strategy. Bryce Hoffman is an author, speaker, and consultant who wrote the bestseller American Icon: Alan Mulally and the Fight to Save Ford Motor Company. This book wasn’t just about business history. It also taught readers how to apply the strategies that saved the auto giant to their own businesses.
More recently, Bryce was the first civilian to graduate from the US Army’s Red Team Leader Program at the University of Foreign Military and Cultural Studies. The knowledge he gained there, and in subsequent research, led to his new book, Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything. Welcome to the show, Bryce.
Bryce Hoffman: Hi, Roger. Thank you for having me.
Roger Dooley: Well, Bryce, I think we’ve got a good place to begin here. Very early in your book, you talked about how this whole project was inspired by a zombie movie, which I have to admit is one of the more unique inspirations I’ve run across. Why don’t you explain how that happened?
Bryce Hoffman: Sure, Roger, as long as you allow me to preface it by saying that I’m not actually a zombie movie fan, which was part of the point. In 2013, several of my friends, whose opinions I value, suggested that I watch the movie World War Z. They called it, “The thinking man’s zombie movie.” I didn’t value their opinions enough to actually go and watch it. However, a few months later I found myself home sick one day, flipping through the new releases on Amazon On-Demand, when it popped up and I started watching it. Roger, I can’t tell you much about the movie except for one scene, which really just struck me like a zombie sneaking up behind you and whacking you on the back of the head.
There’s a scene early on in the movie where the hero, Brad Pitt, is sent to Israel to find out why Israel is the only country in the world that has not succumbed to this zombie plague that’s swept across the rest of the Earth. When he gets there, what he finds is a Mossad officer who explains to him that Israeli Intelligence has a concept called the Tenth Man Doctrine. He explains that this concept was developed after the 1973 Yom Kippur War. It was based on the intelligence failures that happened in the lead up to that war. Those intelligence failures led the Israelis to come up with a system that required … If all of the members of their security council agreed on something, then one person, the tenth man, had to argue that the opposite was true.
While they got the same fantastical information that the rest of the world got about this fast-moving zombie plague from India, they chose not to ignore it, because the tenth man argued, “We should at least seal our borders.” It really doesn’t matter, Roger. In the movie, a few minutes later, Israel falls to the zombie plague too, but it mattered a lot to me. As you mentioned, I had written this book about Ford Motor Company’s turnaround. I had ended up quitting my 20-plus year career as a financial journalist to start helping companies implement Alan Mulally’s management system. I was looking for a way to make it even stronger, to make it more self-critical, to help companies that were successful, like Ford had become with Alan’s help, figure out how to stay successful.
I started looking to find out if this Tenth Man Doctrine was really true. Roger, what I found is that the answer is, kind of, “Well, maybe.” The Israelis, it turned out, didn’t know anything about a Tenth Man Doctrine, but they did have a group that functioned exactly like the movie described. They called it ipcha mistabra, which, they explained to me, is Aramaic for, “On the contrary, it appears the opposite is true.”
Roger Dooley: Probably Tenth Man Doctrine was a little bit more palatable to audiences.
Bryce Hoffman: Yeah. It does roll off the tongue a little easier. The group, which is an office within Israel’s Directorate of Military Intelligence, does exactly that. Their job is to take whatever the prevailing view of the organization is, on strategy, on tactics, on what Israel’s adversaries are doing, and argue that the opposite is true, or that the prevailing wisdom is wrong. They’re not judged by how right they are, because their job is not to come up with a better plan. Their job is to make the existing plan better. When I learned that this was really true, I thought, “Gosh. Businesses need this every bit as much as the Israel Defense Forces do.”
The Israelis told me an interesting thing. They kept referring to this as, “Our version of red teaming.” I used to cover the high-tech industry, and I had heard red teams referred to in cybersecurity circles as kind of the white-hat hackers who try to break into networks to find hidden vulnerabilities and patch them before the black-hat hackers do. That was clearly not what the Israelis were talking about. They said, “Well, if you want to know more about red teaming, your country actually has the best red teaming program in the world”: the Army’s program at Fort Leavenworth, Kansas. That’s where I went to learn about red teaming.
Roger Dooley: Mm-hmm (affirmative). Well, it seems like this sort of red team concept has been around for a long time. War games have been played for years and years where somebody simulates being an enemy to try to probe an army’s weaknesses. Air Force and Navy pilots have trained against pilots who simulate opponents and try to come up with strategies using something similar to the opponent’s equipment and techniques and so on. How did the formal concept of red team training arise?
Bryce Hoffman: Roger, you bring up an excellent point. Just as I had heard about red teaming in the context of cybersecurity, there is, as you say, this existing concept of threat-emulation red teams that take on the role of the adversary in war games. What I’m talking about, what my book is about, and what I’m referring to when I speak about red teams is something very different. It’s what is formally known in military and intelligence circles as Decision Support Red Teaming. It’s, essentially, a system that draws together a whole array of tools from different disciplines, ranging from cognitive psychology, to intelligence analysis, to just kind of critical thinking 101, and brings them to bear on strategic questions about planning and decision making. It’s really designed to help organizations stress-test their strategies and make better decisions. That’s the type of red teaming I’m talking about.
Roger Dooley: Mm-hmm (affirmative). You know, one of the things I didn’t expect to see in your book, Bryce, was a discussion of Kahneman’s System One and System Two thinking. I’m sure most of our listeners are familiar with them, since a key element of neuromarketing and behavioral marketing is that people think they make rational, logical decisions (that would be Kahneman’s System Two), but in fact these decisions are often made mostly with System One thinking: emotional, intuitive, unconscious. Explain how Kahneman’s work influenced your own.
Bryce Hoffman: Kahneman’s work is really foundational to, particularly, the US Army’s view of red teaming and the system that they developed. As is the work of Dr. Gary Klein on decision making, which many of your listeners are probably familiar with too. The reason I say that is because what red teaming, at the end of the day, is really designed to do is overcome the sort of biases and heuristics that Dr. Kahneman has identified in his work with Amos Tversky over the years, along with others who have followed them in the field. One point that Kahneman makes is that it’s much easier to see a minefield when someone else is about to step in it than when you are. That’s kind of the point of red teaming: the idea that you don’t know what you don’t know.
If you really want to find out, the only way to do it is to have somebody else look at it and help you see what you’re missing. That’s really what red teaming is about: being that other set of eyes that looks at a strategy, that looks at a plan, and says, “Oh, you know, is this really true, or is this based on your personal bias, or a blind spot?” Or, “Did you notice this omission here?” It was born out of the fact that agencies like the CIA and groups like the US Army really fell victim to a lot of the kinds of biases and blind spots that Kahneman talks about before 9/11 and in the wars in Iraq and Afghanistan.
We all know, I think, at this point, the story about how there were all of these indicators that terrorists were preparing to hijack planes and fly them into the World Trade Center and Pentagon that were kind of lying right in front of US intelligence agencies, and yet they failed to connect the dots, because they weren’t looking for people to fly planes into buildings. They were looking for them to blow planes up in the air, because that’s what terrorists had always done. That’s a classic example of confirmation bias, right? Or some of these other things that Kahneman talks about. Similarly, the US military, in planning the invasion of Iraq, made all sorts of assumptions based on its success in the Balkans and in the first Persian Gulf War that led to catastrophic problems in Iraq, many of which we’re still dealing with today.
If you talk to people at the Pentagon and in the intelligence community, they’ll tell you that because the United States had these easy victories, you know, these one-sided wars in the Persian Gulf and in the Balkans in the 1990s, we had really come to believe, I mean, the prevailing view, Roger, was that America’s technological superiority and mastery of information would guarantee victory abroad and security at home for the foreseeable future. On 9/11, we realized that just wasn’t true. There was a concerted effort made to learn from these mistakes and to try to come up with systems for countering these things.
It’s interesting, though, because Dr. Kahneman has not been very optimistic about our ability to overcome our biases and heuristics. Again, that’s where red teaming comes in: it recognizes that, yes, it may be difficult for us to see our own biases and heuristics, but if we work with other people, they can spot them much more easily than we can. That’s why I say that red teaming is not a different planning system; it’s a system for making your existing plans better.
Roger Dooley: Right. Well, I think one of the staples of behavioral science is that people are quite able to recognize the biased behavior of others while they feel they, themselves, are not subject to it, because obviously they understand all this bias stuff and there’s no way they would be influenced by it, but they would readily agree that most other people are.
Bryce Hoffman: Yeah. I mean, the past few decades of business history are just filled with cautionary tales about companies that fell victim to these various sorts of biases and heuristics that Kahneman warns us about. For instance, look at Blockbuster Video. Netflix offered to sell itself to Blockbuster for $50 million, not once, but several times, and if you look at interviews given by the CEO and CFO of Netflix, they talk about how they were laughed out of Blockbuster’s office when they made these offers, because Blockbuster thought Netflix was worthless. It turned out that it was Blockbuster that was worthless, because it couldn’t see that its own business model, precisely because it was still working, was going to be trounced by this new model. Again, it goes to this other point that I think is really salient and is very much related to these biases, which is a quote I really like from Bill Gates: success is a lousy teacher.
The more successful companies are, the harder a time they have seeing that the way they’re doing things may not be the best way to go forward. That’s why you get Ubers emerging that are able to just trounce established companies and beat them at their own game.
Roger Dooley: Right. I think the other problem with success is that changing the way you do things dramatically may actually reduce your success in the short-term. You sort of have to obsolete your own products. If you’ve got a wildly profitable product, there’s a real tendency not to want to obsolete that, for obvious reasons. Particularly if you have shareholders to answer to and so on.
Bryce Hoffman: Absolutely. Another great example in that regard, Roger, is Polaroid. Most people forget that in the early 1990s, Polaroid was actually the market leader in consumer digital cameras. They had the best consumer version of digital cameras at the time and some of the best technology in the pipeline. You know, one of the biases that Kahneman warns us about is loss-aversion. People are more likely, Kahneman has demonstrated, to act to prevent the loss of something than they are to gain something that could be much better. Polaroid’s senior executives had a meeting in which they discussed their strategy, film versus digital. They looked at the margins on their film business. Yeah, the business was shrinking, but the margins were like almost 70% on film. They were only like 30% on digital cameras. They had this discussion amongst themselves about like, “Golly, why would we want to do something that’s going to further erode the part of our business that has these phenomenally great margins? We should double down on that.” They actually made the decision to effectively kill their own digital camera business in order to support their film business.
And yet, these were not stupid people. Objectively, if someone had walked in the room and said, “Hey guys, did you think about the fact that other people might still pursue this whole digital photography thing, and it may not matter whether we decide to pull the plug on our program?” They probably would have said, “Oh gosh, you’re right. Why did we even have this discussion?” But it’s very hard to do when everyone in the room is suffering from the same biases.
Roger Dooley: Yeah. Of course, I think that in the real world of corporate decision making there are often other factors, like if you’ve got a CEO who’s making an eight-figure salary and maybe looking at an eight-figure bonus if he makes it through the next 12 months, there may be a greater tendency to focus on that, as opposed to what’s really right for the company in the long run. While we’re talking about CEOs, how often in poor decision making processes is the problem a leader, say a general in the Army, or a CEO, or even some other executive, who just doesn’t tolerate dissent? I mean, Steve Jobs was a classic example of a leader who ignored market research and even a lot of the input from his own team, but who often managed to make the right choices. In my experience at least, there are a lot of executives who act that way but simply aren’t close to being Steve Jobs. I’m assuming that’s quite often a problem, just a leader who does not encourage dissent.
Bryce Hoffman: Absolutely. One of the things, one of the first things that they taught us in the US Army’s Red Team Leader Program is that you can’t red team in the fear bunker. That really gets to the heart of what you’re talking about, Roger. Red Teaming only works in an organization where the leadership is willing to accept the possibility that there may be a better way to do something. That there may be something that they’re missing. If that exists, then red teaming can make a good leader into a great leader, and can make a good company into a great company. Red teaming can’t take a bad leader and make them into a good leader.
You know, I make this very clear in my book that red teaming is not for every organization. It’s for organizations that want to do better, that believe in continuous improvement, that believe that no matter how successful their plan is today, it could be even more successful tomorrow. For the rest of the companies I say, “Don’t red team.” You know? You can just sit back and wait for your competitors to do it to you, because one way or another you will be red teamed.
Roger Dooley: Right. Are there cultural differences that inhibit red teaming? I mean, some Asian and European cultures seem to be very hierarchical, where questioning or disagreeing with a higher-ranked individual would be impolite at best and maybe suicidal at worst. I’m guessing in North Korea Kim Jong Un would have difficulty getting volunteers for a red team. Even in less extreme environments, there’s often a real deference to authority. Is that a problem?
Bryce Hoffman: You know, it’s both a problem and an opportunity, Roger. One of the most successful red team engagements I had took place in a Japanese corporation. The reason is that a lot of the tools red teaming relies on are what the Army calls ‘liberating structures’. They’re basically groupthink mitigation techniques that use things like weighted anonymous feedback and other methods to allow people to seek the truth in an organization without repercussion. I was working with a major Japanese company last year that was looking at a potential acquisition deal. One of the things that we did as part of that was to break the acquisition strategy down into the assumptions it was based on. That’s really what red teaming’s core is all about, Roger: recognizing that any plan we have, any strategy we have, any decision we make is based on a set of assumptions, some of which are stated, some of which are unstated.
I worked with them to break the plan down into the assumptions it was based on, and then we had all of the senior executives, and the one-level-down executives who were involved in this deal, look at those assumptions. We gave them all three-by-five cards and the exact same pen, and said, “Write down, in block letters, the assumption on this list that you think is most likely to prove untrue in execution.” We gathered up the index cards and tallied the answers. What we found was that out of about two dozen assumptions, there were three that the vast majority of people cast their votes for.
We were able to say, “Now folks, let’s look at these three things, because a lot of you sitting in this room say you’re very concerned about them. What is it about these things that causes concern?” Then we were able to have a very robust and open discussion about these three elements of the strategy, and it ended with the deal being modified to address those concerns. After that session, Roger, I had two different executives come up to me, one very senior and one more of a middle-level executive. The middle-level executive said to me, “I really appreciate this, because I’ve been feeling what I wrote on my card since the day this deal was presented to me, and it’s been bothering me. I knew that I could never say anything about it, because this deal originated above me in the hierarchy, so I couldn’t criticize it. This technique that you just showed us has allowed me to give voice to a concern that I would never have been able to give voice to before.” And he said, “I plan to use this with my teams going forward.”
Interestingly enough, though, one of the senior members of the team came up and told me, “This is so valuable, because I know that my people think these things, and I tell them all the time, ‘I want to hear the honest truth. I want to hear your honest opinions about these deals.’ They don’t give them to me, because in our culture, that’s very difficult.” He also said that he found this valuable because it allowed people to speak truth to power, something that is often lacking in these hierarchical organizations.
Roger Dooley: Mm-hmm (affirmative). Well, that makes a huge amount of sense. By anonymizing it, you can get around some of those barriers in the hierarchy. I had the same experience years ago in Mexico, where I was managing a project and found that my American staff members were more than willing to give me feedback, sometimes more than I wanted, but the employees who were Mexican, culturally, just found it very difficult not to be quite deferential. Doing it anonymously probably helps a lot. Bryce, you mentioned that the Army has an engineering mindset. Define a problem, identify a solution as quickly as possible, and then figure out how to implement it. I’m sure many tech and manufacturing businesses would agree with that approach. Often, they’re run by engineers. I’m an engineer by education. I always felt that the problem-solving approach that I learned in undergrad school really served me pretty well in business too. You make the point that this engineering mindset isn’t always a good thing. Explain why that is.
Bryce Hoffman: For the very reason you said: there’s a tendency when you approach problems that way to coalesce around a solution as quickly as possible. You do some real, rigorous analysis, which is good, but as soon as a solution appears, you rally around it as rapidly as possible. When people start coalescing around a solution, they stop asking questions about it. Again, that’s Kahneman 101, right? Red teaming recognizes the analytical value of that approach, but it complements it by adding a critical thinking element that asks, “Okay, is this really true, and is it likely to be true under all circumstances?”
Let me give you an example of what I mean by that. My instructor at the University of Foreign Military and Cultural Studies at Fort Leavenworth was literally the colonel who planned the invasion of Iraq, the person in charge of planning it. He told our class a story about a bomb. The US has a bomb, and I don’t know what the technical explanation of it is, but it’s a bomb that explodes in the air and releases a cloud of carbon filaments, carbon-fiber tape, that drifts down gently on the breeze, lands on high-tension lines, and shorts them out. It’s a very effective way to rapidly take out a country’s electrical grid.
We used this bomb very effectively in the Balkan Wars on Serbia. In a matter of a couple of days, it plunged the country into darkness and deprived their command and control centers of power, or forced them to rely on generators, which forced them to draw up a whole new logistics plan, et cetera, et cetera. The war planning team thought it would be great to use this wonderful bomb in Iraq too, because it doesn’t even kill anybody. It just goes off in the air and plunges the country into darkness. Now, the thing he told our class is that once that tactic was put on the table, people rapidly started saying what was great about it. “This is wonderful. We’ll have all these desirable effects that we want to accomplish.” The other thing he said is that there was not a single officer in that room who wasn’t familiar with Maslow’s Hierarchy of Needs, because the Army teaches officers Maslow’s hierarchy. Maslow’s Hierarchy teaches us that people don’t really care about whether they live in a democracy or a dictatorship if they don’t have running water and refrigeration and things like that.
While depriving people of those things in Central Europe, in a relatively temperate climate, may not be the easiest thing on them, depriving them of those things in the desert in the middle of summer can be really problematic. Despite the fact that everyone in that room was aware that Maslow’s Hierarchy would say this was going to have a very negative impact on how our actions were viewed by the populace in Iraq, we decided to go ahead and do it, because people had rapidly coalesced around the idea that this was a great way to debilitate the command and control systems of the Iraqi military. We are still, today, dealing with the consequences of that decision in Iraq.
Roger Dooley: Shifting gears a little, Bryce, you defined different ways that businesses can implement red teaming, like having people dedicated full-time, or hiring outside consultants, or forming ad hoc teams. Any use of red teaming, I guess by definition, is going to add cost. Of course, there could be much bigger cost savings or revenue increases that result from it, but most small companies, and even a lot of large companies, struggle with resources constantly. When everybody is bailing water out of the boat to keep it from sinking, having someone else stand back and evaluate whether water is the real problem or bailing is the right solution just seems like an impossible luxury. I’m sure you get that objection from time to time. How do you counter it?
Bryce Hoffman: Well, Roger, let me break that comment down into a couple of points, because I think there are a couple of different answers there. One is about the bailing-water component of this. The first rule of red teaming, which I talk about in the book, is, as they taught us in the military, you don’t red team when the enemy’s in the water. Red teaming can never be allowed to get in the way of action when action is required. If you’re sinking, you don’t want to have someone standing back and asking, “Well, should we really be bailing water?” That’s not the point of red teaming. You don’t red team every decision, and you don’t red team every plan or strategy. It’s something that you use strategically, when it’s merited and when time and resources allow. That’s the first point I would make.
The second point is the question of how you justify the cost. That is the most difficult challenge for red teaming worldwide. You know, as I mention in my book, red teaming has been so successful in the US military and intelligence agencies that it was rapidly adopted by allied militaries around the world. I’ll tell you a kind of funny story. When I was working on my book, I was talking to the brigadier who is in charge of the red teaming program for the Ministry of Defence in the United Kingdom. I told him, “You know, I’m having breakfast with Danny Kahneman next week. Is there any question you want me to put to him?”
He said, “Oh, yes. Absolutely.” I thought he’d have some deep question, like how do you overcome availability bias, or something like that. His question was, “Could you please ask Dr. Kahneman how you could economically account for the value of the critical analysis that red teaming provides?” I said, “Well, with all due respect, brigadier, that’s not really Dr. Kahneman’s area of expertise. I mean, that’s a finance question.” His response was, “Oh, I know, but he’s probably one of the most brilliant people alive today. I’ve never been able to answer that bloody question, so perhaps he could.” That really is the challenge.
As the first director of the US Army’s Red Team Program, Colonel Greg Fontenot, put it to me, “What is the sound of no dogs barking? How do you prove … How do you calculate the value of a system that prevents you from making a colossally bad decision if you never make that bad decision? How do you calculate the value of bringing in a red team facilitator who helps you avoid making a strategic blunder that could bankrupt your company if that blunder is never made, so you never see it?” I think back to a lot of the red teaming engagements that I’ve done, which, out of necessity, have had to be confidential, but one that I can talk about is my work with Dale Carnegie & Associates, the global company that runs Dale Carnegie around the world. I was brought in by their CEO, who had recently been hired himself to lead a turnaround for the organization. He asked me to lead them through a red teaming exercise on their turnaround plan.
As a result of that engagement, we worked with him and his senior leadership team to identify some significant points in the plan that really did not address some of the fundamental things they were trying to fix, and they ended up making major changes to the turnaround plan. We had a discussion afterwards, which is, you know, that the revised plan was presented to the board of directors after the red teaming exercise. They never saw the original plan. If you never saw the original plan, how do you know the value that red teaming brought to it, because the original was never presented? You’re right, that is a tough thing. It’s a decision every company, every organization, has to make for itself about what it can afford to spend on red teaming.
Fortunately, there is an array of options. There are probably very few organizations in this world that could afford to establish a permanent standing red team unit of trained red teamers, but they exist, and they can be very powerful. They have their own challenges too; each model of red teaming has unique benefits and drawbacks. A lot of organizations might opt to have one person trained as a red teamer who can then put together an ad hoc red team of other people from the organization and lead them through it, or they can bring in an outside red team facilitator to lead them through the same type of thing. A lot of the work that I do is around that.
Even if an organization can’t afford to do that much, there are things that you can do as a leader, as a manager, to make better decisions and to make your strategies and plans better using some of the tools and techniques I talk about in my book. I’ll give you one example. I talk about an investment banking firm that has figured out how to use red teaming on a much more informal basis. It doesn’t add any cost at all. I describe the process that they use, which they call Hall Brawl. Put simply, it’s basically an informal devil’s advocacy analysis. The way it works is this. The senior partner who started using it told me he realized that the way most investment firms make their decisions is to have a partner develop an investment strategy, present that strategy to the rest of the team, talk it through, and then vote on whether it’s worth pursuing or not.
He said, “You know, I had this realization that all the guys on my team know how to make money. That’s why I hired them. Rather than having them come in and having a discussion about how we can make money off of the investment proposals that they’ve put together, why not use a red teaming technique, like devil’s advocacy? Have them send out the proposal to everyone ahead of time so we can all look at it and see how we can make money off of it; we don’t need to discuss that. Then we use the time we’d normally spend on that discussion to talk about how we could lose money on the deal. Everyone comes in and suggests, ‘Well, here’s how I think we could lose money on this deal.’ If the deal still sounds like a good deal after we’ve had that discussion,” he said, “then we give it the green light, write the check, and we make the investment.” Their track record has been significantly better than their peers’ since they started using this system.
That’s just one example. There are a lot of others I talk about in the book of how companies can use these tools right now, without hiring anyone else, without bringing in any outside consultants, without doing anything else, just to think a little bit more critically about their strategies and plans. The other thing, Roger, is something I tell everyone I talk to about red teaming. I say, “If nothing else, if you’re a manager, and you don’t even have to be running an organization, if you just want to make better decisions yourself, take the time to find someone you trust who can be your devil’s advocate and explain to them your strategy or your plan. Don’t explain the rationale behind it. Just explain it to them, or better yet, let them look at it. Ask them to look at it with a critical eye and tell you the ways they think it could go wrong, or the things you might be missing, or the things it looks to them like you might have forgotten about.”
If you just do that, and you do it sincerely with someone who will tell you the truth, that can save you from a world of trouble right there.
Roger Dooley: Mm-hmm (affirmative). Is it possible, Bryce, for a person to red team himself or herself? It may be there are too many cognitive biases to overcome, but say I’m an entrepreneur, and I’m the only person in the company who really has a strategic overview, can I clear my desk, clear my mind, and red team myself?
Bryce Hoffman: You can use some of these tools to help you make better decisions. I just think it’s important to recognize that looking at things you’ve done yourself with a critical eye is never going to be as powerful as having somebody else look at them. That said, let me give you an example of how you can do what you were just talking about. As I mentioned earlier, if there’s kind of a core concept in red teaming, it’s this idea that all strategies, all decisions we make are based on a set of assumptions, both stated and unstated. Red teaming is about breaking those strategies and plans and decisions down into the assumptions they’re based on, and then challenging those assumptions to see if they’re really true and likely to remain true under all circumstances. It’s this element of deliberate challenge that makes red teaming so different from other management systems and decision making tools that people are familiar with.
My advice to individuals who want to use these techniques to make better decisions by themselves is this: if you are going to make a big decision in your business, and I don’t mean a decision to buy another three boxes of printer paper, but a big decision, sit down and, you remember how we used to diagram sentences in elementary school? Do that with your strategy, with your plan, with your decision. Write it out on a piece of paper, and then look at all of the assumptions. Try to identify all of the stated and unstated assumptions that plan or strategy or decision is based on. Once you’ve identified them, make a list.
Then, go down that list and ask yourself a series of questions. There’s a whole series of them that I give in the book. Just to give you an example: Is this true? Is this objectively true, or is it true based on my personal perspective? Is it likely to remain true under all circumstances? If it isn’t true, if it proves untrue, what are the consequences going to be for this strategy, plan, or decision? There’s a whole bunch of other questions that you can ask. Just go down the list and ask those questions. If you do that honestly, and that’s the key, you’ve got to be honest with yourself about it. It’s not easy, but if you do that honestly, it will show you where the weak spots in your strategy or plan are, and it may also show you ways that you can address them.
Roger Dooley: You know, I’ve found the power of writing to be really significant. In other words, I can think about a problem for hours and beat my head against it and not meet with any success, but when I go to, say, write an email to somebody and describe my problem, and as I start thinking through it, and what I’ve tried so far to solve it, and what things I might try, it really tends to clarify it. Often, I don’t even finish the email because I realize that, “Okay, now I have enough information to make a decision or solve a problem.” We’re just about out of time here, Bryce. Let me remind our audience that we’re speaking with Bryce Hoffman, author of the new book, Red Teaming: How Your Business Can Conquer the Competition by Challenging Everything. Bryce, how can our listeners find you and your book online?
Bryce Hoffman: Well, you can buy my book starting May 16th wherever books are sold, or you can go to my website, which is www.brycehoffman.com, all one word, or you can go to my practice website, which is www.redteamthinking.com, also all one word.
Roger Dooley: Great, well we will link to those places and any other resources we talked about on the show notes page at rogerdooley.com/podcast. You’ll find a handy PDF text version of our conversation there too. Bryce, thanks for being on the show.
Bryce Hoffman: Thanks for having me, Roger.
Thank you for joining me for this episode of The Brainfluence podcast. To continue the discussion and to find your own path to brainy success, please visit us at RogerDooley.com.