Like most Americans, I remember exactly where I was twenty years ago. It was just after 7 a.m. in California when a friend woke me up with a phone call, urgently insisting that I turn on my television.
“Dude, they just destroyed the World Trade Center!” he exclaimed.
“Who?” I said, suspicious because my friend was a notorious prankster.
“Right,” I said, thinking about the failed 1993 attempt to blow up the iconic buildings. “How?”
“They flew two planes into them – commercial jets – and one of the towers just collapsed,” he said. “Oh! And they attacked the Pentagon, too. We’re at war!”
It took him another five minutes to convince me to turn on the TV. It wasn’t just that I didn’t trust my friend; it was also that I could not believe what he was saying was actually true.
My own failure of imagination reflected a larger failure of imagination on the part of America’s intelligence community.
There were many consequences of that failure of imagination. As the 9/11 Commission would later reveal, it helped open the door for al-Qaeda and allowed its heinous plot to come to fruition. It led America and her allies to invade and occupy first Afghanistan, then Iraq. And it sparked the creation of a revolutionary new methodology for making critical and contrarian thinking part of the planning and decision-making process at both the CIA and in the U.S. Army: decision-support red teaming.
The birth of red teaming
As I described in my 2017 book, Red Teaming, that process began shortly after midnight on September 12, 2001, when then-CIA Director George Tenet activated a group called the “Red Cell” and charged it with using critical and contrarian thinking tools to directly challenge the Agency’s own thinking.
While the work of the CIA’s Red Cell remains cloaked in secrecy, the Agency has publicly credited it with preventing a number of major terrorist attacks against Americans at home and abroad over the last two decades. And it has inspired similar initiatives in other U.S. and allied intelligence agencies.
The Army took a couple more years to launch its red teaming initiative – one sparked not by the intelligence failures that allowed the terrorists to attack New York and Washington, but rather by the planning failures that helped turn America’s easy victories in Afghanistan and Iraq into unwinnable counterinsurgency operations. As the wheels began to come off America’s nation-building experiments in late 2003, newly appointed Army Chief of Staff Gen. Peter Schoomaker set up a lessons-learned team at the Pentagon to figure out what went wrong and how to avoid similar mistakes in the future.
That team put a large share of the blame on the Army’s failure to challenge its own assumptions, to listen to alternative perspectives, and to consider the second- and third-order effects of its decisions. To counter these shortcomings and combat groupthink, Gen. Schoomaker ordered the establishment of a new school at the Command and General Staff College at Fort Leavenworth. Its mission was to train “red teams” to stress-test Army strategies, identify unseen threats and missed opportunities, challenge the conventional wisdom of the organization, surface alternative perspectives, and help senior leaders make better decisions.
Porting it to business
When I learned about this amazing program, I immediately recognized its potential value to business as well. So, I called up the Pentagon and asked if I could attend the school. In June 2015, I became the first – and only – civilian from outside government to graduate from the Army’s Red Team Leader Course. I then spent another year figuring out how to port what I learned there to business before writing my book.
Since its publication, my team and I have continued to evolve these applied critical thinking tools and groupthink mitigation techniques. It did not take long for us to discover some of the shortcomings of the Army’s formal red teaming process – challenges that limited its applicability and made it difficult to spread throughout organizations.
Some of those issues help explain why red teaming did not do more to prevent the disastrous denouements in Iraq and Afghanistan. So does the Trump Administration’s resistance to red teaming. (It turns out that people who are convinced they are always right see no need to listen to alternative perspectives or disconfirming evidence.)
But the core principles of red teaming have proven more valuable than ever.
We have moved away from the formal red teaming methodology, which required a dedicated team of highly trained analysts, to a lighter, more ad-hoc approach. We call it “Red Team Thinking” because it utilizes many of the original red teaming tools (in modified forms) as well as new ones based on the same cognitive science and applied critical thinking approach.
Major corporations such as Verizon and Kimberly-Clark have successfully employed these techniques to help meet the challenges of the present pandemic, as well as those presented by the increasingly complex and competitive nature of business today. Government agencies such as the Centers for Disease Control and Prevention, the National Park Service, and U.S. Forest Service have also used Red Team Thinking to help respond to global health threats and the challenges presented by climate change. We have even taught this new approach to the U.S. military and intelligence agencies.
It is one of the few positives to have come out of the terrible events of September 11, 2001 – one that I hope will continue to protect this nation and others from seeing another day like that ever again.