“The person who agrees with you all the time is not helping you think better. They’re helping you think less.” — Annie Duke
It seems obvious that we want people around us who will challenge our thinking. Yet in practice, most groups — companies, families, friend groups, investment committees — actively suppress disagreement. Dissent is seen as disloyalty or obstruction. Harmony is prized over accuracy.
This is a catastrophic mistake.
Disagreement, when it is honest and well-structured, is the most powerful diagnostic tool available for testing the quality of decisions. A bad idea exposed to genuine challenge looks very different from a bad idea that everyone nods along with. The challenge is learning to cultivate and receive disagreement in a way that actually improves outcomes rather than just creating conflict.
Groupthink happens when the desire for harmony overrides the motivation for accurate assessment. It’s not a failure of intelligence — it’s a failure of social dynamics. Smart, capable people in cohesive groups can make catastrophically bad decisions because no one felt empowered or willing to raise doubts.
The symptoms of groupthink are easy to recognize in retrospect: an illusion of unanimity, self-censorship of private doubts, direct pressure on anyone who dissents, and an unexamined confidence that the group cannot be wrong.
The causes are harder to see in real time. They include social pressure to conform, fear of damaging relationships, deference to high-status group members, and the simple human desire to be liked rather than to be right.
History is full of groupthink disasters: the Bay of Pigs invasion, the Challenger launch decision, countless financial bubbles and corporate collapses. In each case, there were people who had doubts and didn’t raise them, or raised them and were overruled by social pressure.
The common factor was not bad information — it was a group culture that made dissent too costly.
The antidote to groupthink is not simply to “encourage people to speak up.” That rarely works, because the social pressures that suppress dissent are stronger than general encouragement. What works is structural — building disagreement directly into the decision-making process.
Formally appointing a devil’s advocate — someone whose explicit job is to argue against the group’s preferred position — accomplishes several things: it makes dissent legitimate rather than disloyal, it surfaces objections that individuals would not volunteer on their own, and it separates the argument from the person making it.
The devil’s advocate works best when the role rotates — so no single person becomes the permanent dissenter, which would allow the group to discount their objections reflexively.
A pre-mortem (also discussed in Chapter 6) is another powerful structural tool. Before committing to a decision, the group imagines that it is one year in the future and the project has failed catastrophically. Each person then explains what went wrong.
This reframes the exercise from “are there any problems with this plan?” (social pressure toward yes) to “we know it failed — what were the causes?” This small reframe dramatically increases the number and quality of risks identified, because the premise of failure removes the social pressure to be optimistic.
A red team is a dedicated group whose job is to attack the primary group’s plan, looking for flaws, vulnerabilities, and unconsidered risks. Red teaming is used extensively in military and intelligence contexts, and increasingly in business.
The key is that the red team must have real power — their objections must be taken seriously, not dismissed. A red team that is always overruled by the primary team provides theater, not protection.
Even with structural supports for disagreement, individual courage matters. Someone has to be willing to be the first person to say “I’m not sure this is right” in a room full of enthusiastic people.
This is genuinely difficult. It requires tolerating the social discomfort of breaking consensus, accepting the risk of being publicly wrong, and trusting that the relationship can survive honest disagreement.
How you express disagreement matters as much as whether you express it. Compare “This plan won’t work” with “I want this plan to succeed, so can we pressure-test the assumption it depends on?” The first positions you as an opponent; the second positions you as a collaborator.
The goal is not to win an argument — it’s to improve the group’s thinking. Framing dissent as collaborative inquiry rather than opposition makes it far more likely to be heard and integrated.
Beyond managing disagreement when it arises, Duke argues for actively seeking it out. This is the most powerful and least common form of the practice.
When you’ve formed a view on something important, deliberately look for the strongest available version of the opposing argument. This is called steelmanning, the opposite of strawmanning: rather than engaging the weakest version of the counterargument, which you can easily dismiss, you engage the strongest.
If you can identify the best argument against your position and your position still holds, you have much higher confidence that it’s correct. If steelmanning reveals a fatal flaw in your view, you’ve saved yourself from a potentially costly mistake.
Think about a decision you’re currently leaning toward. What is the strongest argument against it? Have you genuinely considered that argument, or only the weak versions you can easily dismiss? Who would make the strongest case against your position, and have you actually sought them out?