In 1960, shortly after being elected, John F. Kennedy was approached by the CIA to approve a plan to train, equip and support an invasion of communist Cuba by Cuban exiles opposed to the government of Fidel Castro. The invasion that ensued at the Bay of Pigs became one of the major US foreign policy fiascos of the 20th century. It entrenched Castro as Cuba's leader, and communism persists there to this day.
A decade later, Irving Janis, a social psychologist at Yale, coined the term 'groupthink' to describe the failure in decision-making that led to this disastrous outcome for the US. Why did some of the smartest people in the US government set the country on a course that would end in such spectacular failure? Janis found that while a number of Kennedy's advisors harboured doubts about the course of action being proposed, they kept them to themselves. They all knew that Kennedy wanted to overthrow Fidel Castro, and this knowledge conditioned their behaviour. As president and commander-in-chief, Kennedy had a distorting influence on the debate, with his advisors consciously or unconsciously aligning with positions likely to curry favour. Moreover, a 'consensus' culture meant that his advisors not only felt pressure to conform but also mutually reinforced one another's assumptions and ignored information that did not fit the intended outcome, a pattern known as confirmation bias.
The irony is that Kennedy did try to inject an opposing view, inviting Democratic Senator William Fulbright to present an opposing perspective on the plans. However, despite clearly articulating weaknesses in the proposed plan, Fulbright was an outsider. His interventions simply galvanised Kennedy's inner circle, who closed ranks and effectively shut down the debate.
Industry-Think and Fads
Groupthink does not occur solely in political decision-making. Whenever a flawed consensus opinion in a company or an industry goes unchallenged, that is often an indication of groupthink at work. Consider the 2008 financial crisis. The conventional wisdom was that combining low-grade debt instruments associated with sub-prime housing in the US could lower the risk sufficiently for them to be sold as investment-grade products. There was little consideration of the systemic risk being generated, not only within the banks dealing in these instruments but also among the regulators, whose very function was to protect the economy. The Bank of England said that there was "some latent risk in relation to individual banks getting into difficulty." That must go down as one of the greatest understatements in the Bank of England's more than 300-year history.
So how do you go about avoiding groupthink?
1. Assemble a diverse group
If the cause of groupthink lies in the shared perspectives, backgrounds and expectations of the group of decision-makers, then the most obvious antidote is to ensure the diversity of the group. This is not only about ethnic, gender or social diversity, but also about diversity of background, outlook and experience. A group of accountants will take a monetary and financial view of a problem, so educational and professional diversity matters too. When the participants bring different perspectives, the range of options considered will be broader and the quality of the decision better. [4]
However, group diversity on its own will not prevent groupthink. As the Bay of Pigs example shows, there is a strong human urge to conform, so members of the group who have misgivings may find it difficult to push for diverging or dissenting options. This brings us to the next antidote: deliberately exploring unpopular options.
2. Deliberately explore unpopular options
In 1973, Arab states led by Syria and Egypt launched a surprise attack on Israel on the feast of Yom Kippur, resulting in a war that brought Israel closer to defeat than it had ever been. Although overwhelming evidence was available indicating that an attack could be imminent [5], it was ignored because it did not conform to the preconception within Israeli intelligence that, given the heavy defeat suffered by the Arab states in the 1967 war, they would be unwilling to risk another.
Following this near-catastrophe, Israeli military intelligence (AMAN) established the Devil's Advocate office, tasked with providing alternative explanations of available intelligence and challenging the established assessments. Crucially, the Devil's Advocate office was staffed by highly regarded officers and analysts, and so could not easily be dismissed or ignored. It was intended to act as a safeguard against institutional groupthink and to prevent the kind of blind spot that nearly proved fatal to Israel in 1973.
The lesson from Israel's Devil's Advocate office is that dissenting opinions should not only be proffered in decision-making, but should come from a position of credibility, authenticity and authority. There is no point in having a Chief Sustainability Officer or a Chief Risk Officer on a company's board if that person's view is always in the minority and ignored. A contrarian opinion is only really useful if it is offered in a culture that both allows and encourages dissent. If the dissenting view is constantly raised by the same individuals, it is easy to ignore and will have next to no impact on the quality of the decision.
3. Apply Divergent Thinking
Having a culture that encourages diversity of opinion is valuable not only at board level, but within every team in the business. One technique that can help is divergent thinking. Unlike convergent thinking, the step-by-step logical process that takes you to a solution, divergent thinking is designed to generate creative ideas by exploring many possible solutions and options. Brainstorming is an approach often associated with divergent thinking; the aim is to identify as many starting points as possible, to minimise the chance of overlooking what may be an effective way of solving the problem. Once these options are identified, the task shifts to whittling them down to the most appropriate for the problem at hand, using traditional analytical techniques.
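A minimal sketch of this two-phase process is shown below. The option names, criteria and scores are purely hypothetical assumptions for illustration; the point is the shape of the approach: capture a wide list of ideas without judging them, then converge on a shortlist using explicit, agreed criteria.

```python
# A minimal sketch of divergent-then-convergent thinking.
# All option names, criteria and scores below are hypothetical, for illustration only.

# Divergent phase: record every idea raised in the brainstorm, however unlikely.
candidates = [
    "Reduce price by 10%",
    "Bundle with premium support",
    "Target a new customer segment",
    "Redesign the onboarding flow",
    "Partner with a reseller",
]

# Convergent phase: score each option against weighted criteria (higher is better).
criteria_weights = {"impact": 0.5, "feasibility": 0.3, "speed": 0.2}
scores = {
    "Reduce price by 10%":           {"impact": 6, "feasibility": 3, "speed": 9},
    "Bundle with premium support":   {"impact": 7, "feasibility": 6, "speed": 6},
    "Target a new customer segment": {"impact": 9, "feasibility": 4, "speed": 3},
    "Redesign the onboarding flow":  {"impact": 8, "feasibility": 5, "speed": 5},
    "Partner with a reseller":       {"impact": 5, "feasibility": 8, "speed": 4},
}

def weighted_score(option: str) -> float:
    """Combine an option's scores into a single weighted total."""
    return sum(criteria_weights[c] * scores[option][c] for c in criteria_weights)

# Whittle the long list down to a shortlist for detailed analysis.
shortlist = sorted(candidates, key=weighted_score, reverse=True)[:3]
print(shortlist)
```

The separation matters: judging ideas while they are still being generated is exactly what narrows the field prematurely and invites groupthink.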
4. Beware the pitfalls of DataThink
You would be hard-pressed to find a modern management handbook, consultant or tech company that advocates anything other than a data-driven approach. This is because, properly used, data on how a company's customers behave, what they prefer and where revenue and profits are generated can bring scientific rigour to the practice of management. However, although metrics can indeed be a powerful means of bringing objectivity into decision-making, it is easy to be blinded by numbers.
After all, the banks in 2008 did not lack for numbers and data. Indeed, the financial instruments that proved to be the downfall of many venerable institutions were created specifically to meet certain numeric criteria. The problem was that everyone was looking at the same data and drawing the same conclusions. Over-reliance on data is subject to numerous forms of bias which, if not robustly challenged, result in a form of groupthink. Most data sets can be selected, analysed and presented to support radically different interpretations. As Kennedy found out, confirmation bias occurs when the person carrying out the analysis seeks to prove a given outcome (e.g. that their actions have resulted in improved customer engagement) rather than dispassionately testing different hypotheses. Selection bias occurs when the dataset itself is chosen subjectively, again to support a given interpretation.
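As a concrete illustration of how the same numbers can tell opposite stories, the short sketch below uses entirely hypothetical engagement figures (an assumption for illustration, not real data): the full comparison shows a decline, while a cherry-picked slice of the very same dataset appears to show success.

```python
# A minimal sketch of selection/confirmation bias on hypothetical data:
# the same dataset supports opposite conclusions depending on which slice is reported.

# Monthly customer-engagement scores before and after a (hypothetical) product change.
engagement = {
    "Jan": 62, "Feb": 64, "Mar": 61, "Apr": 58,   # before the change
    "May": 55, "Jun": 57, "Jul": 53, "Aug": 60,   # after the change
}

before = [engagement[m] for m in ("Jan", "Feb", "Mar", "Apr")]
after = [engagement[m] for m in ("May", "Jun", "Jul", "Aug")]

# Honest comparison: average engagement actually fell after the change.
print(sum(before) / len(before))   # 61.25
print(sum(after) / len(after))     # 56.25

# Biased comparison: cherry-pick the two months that "prove" success.
print(engagement["Jul"], "->", engagement["Aug"])  # 53 -> 60, "engagement is up 13%!"
```

Both outputs are arithmetically correct; only the choice of what to include differs, which is why the selection of data needs to be challenged as robustly as the analysis itself.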
Further Reading
- Psychology Today, “Preventing Groupthink”
- Wilful Blindness: Bay of Pigs
- Hill, A. (2018). Why groupthink never went away. FT.com
- Skapinker, M. (2009, May 26). Diversity fails to end boardroom ‘groupthink’.
- Doron Geller, Israel Military Intelligence: Intelligence During Yom Kippur War (1973)
- Bank of England ‘groupthink’ during credit crisis, CNBC, 7 Jan 2015
- Types of cognitive biases you need to be aware of as a researcher