What a Mentalist Sees That You Don’t in Your Meetings
Let me tell you something I notice in every engagement.
When I walk into a room—an offsite, a strategy session, an executive committee—I see things most people in the room don’t. Not because I have a “gift.” Because thirty years of mentalism trained me to watch what everyone else ignores: the invisible micro-decisions that quietly steer everything that follows.
And what I see on stage today, I’ve seen in business too—during twelve years in leadership. Same mechanisms. Different setting.
One clarification: I don’t “read” people… at least not in the way you’re imagining. Although—yes—I am a mentalist. Let’s say practice has trained me to spot, fast, where attention shifts, where the frame gets set, where the group converges… and where an idea quietly dies.
Here are four invisible dynamics that show up in your meetings—whether you notice them or not.
Scene 1 — The first person to speak wins (without realizing it)
Monday morning. A framing meeting. The leader opens with:
“I think we should probably lean toward option A.”
They’re not making the decision. Just sharing a view.
But in many cases, it’s already over.
The anchor is set. For the next forty minutes, the team will debate—around option A. Other options still exist on paper. They’re on the slides. Someone may even think about them. But psychologically, they’ve lost altitude.
This is the anchoring effect. Tversky and Kahneman showed decades ago that a first piece of information—even an arbitrary one—can pull judgments in its direction simply because it arrives first.
On stage, I can reproduce this in seconds. The room makes a choice it believes is free. Then I show how the choice was quietly shaped by an initial frame no one noticed.
The silence in the room at that moment is worth more than an hour of explanation.
Scene 2 — When a team confuses consensus with alignment
Thursday. A strategic offsite. Roundtable discussion. Everyone nods. The leader concludes:
“Looks like we’re aligned.”
Aligned on what, exactly?
Three people heard the same plan in three different ways. Two others disagreed but stayed quiet because the room seemed confident. You leave with a sense of unity… and a few weeks later you realize everyone executed a slightly different version of “the plan.”
This is where teams confuse "no objections" with real alignment.
Classic work on social conformity (including Solomon Asch’s experiments) shows how strongly group unanimity can pull individual responses—sometimes even against obvious evidence—simply because the group creates the feeling of “everyone agrees.”
And there’s a version of this you’ve probably lived in real life: the elevator effect.
You step into an elevator where everyone is facing the side wall instead of the door. You don’t know why. You feel it’s odd. And yet your body hesitates… then often ends up aligning with the group.
Not because you’re weak. Because social alignment is a deeply human reflex.
In companies, the danger is that apparent unanimity creates artificial confidence: the organization moves fast—but in a direction that was never truly validated the same way by everyone.
THE MENTALIST’S EYE
My job isn’t to judge these dynamics. It’s to make them visible. As a former CTO/GM, I learned that decisions are rarely “purely factual”—they’re shaped by group dynamics you can learn to steer.
Scene 3 — The exec team that decides using yesterday morning’s data
Tuesday afternoon. Executive committee. A market analysis is presented. The numbers look solid. The recommendation is clear. Everyone approves.
And no one asks the uncomfortable question:
“What if these numbers are mostly confirming what we already believed?”
Confirmation bias is one of the most dangerous biases because it’s one of the most invisible. We don’t consciously try to be wrong. We try to feel safe. We gather “rigorous” elements—often without noticing they’re tilted toward a conclusion we already consider reasonable.
As a CTO and GM, I’ve seen multi-million-dollar investments approved on this basis—analyses that looked impeccable, yet were unconsciously built to confirm a decision already made.
The problem wasn’t data quality. It was the quality of the lens applied to the data.
Scene 4 — The project that dies because “it’s not the right time”
Friday. Innovation meeting. A strong idea is on the table. Technically feasible. Strategically relevant. Then someone says:
“The idea is good—but it’s not the right time.”
Everyone nods. The idea disappears. What just happened often has little to do with timing. It’s usually risk aversion dressed up as prudence.
Kahneman and Tversky showed that losses typically feel more painful than equivalent gains feel rewarding. The result: killing a new idea is almost always more comfortable than defending it.
This mechanism is especially sneaky because it sounds reasonable: “not now,” “not a priority,” “not mature,” “not the moment”… when the real decision is sometimes: “I’d rather not expose myself.”
What you don’t see costs more than what you do
These scenes aren’t extreme cases. They’re ordinary Tuesdays. They happen in every organization, at every level, with very smart people.
Intelligence doesn’t protect you from bias.
It just gives bias more sophisticated arguments.
The real leadership skill in 2026 isn’t having the right answers. It’s monitoring the quality of your own reasoning—in real time. Asking yourself:
Am I actually thinking… or am I just following an automatic pattern?
Three simple questions before any important decision
Before: What assumption am I trying to confirm without testing?
During: Are we converging because we’re aligned—or because no one wants to disagree?
After: What shaped this decision besides the facts?
These are the mechanisms I make visible on stage. Not to impress.
So that next time anchoring appears, consensus gets manufactured, or risk aversion hides behind “prudence,” someone in the room sees it—and says it.
YOUR LEVER (Conclusion Card)
Monitoring the quality of your reasoning is a leadership reflex.
A meeting isn’t just an exchange of information—it’s a set of invisible dynamics steering decisions.
If you want your next offsite to make these dynamics visible, my Signature Keynotes show how I turn events into moments of strategic clarity.
References
To learn more, here are the scientific studies cited in this article.
Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131. [The foundational article on cognitive biases: we make judgments using mental shortcuts (representativeness, availability, anchoring) that are efficient but systematically biased.]
Asch, S. E. (1955). Opinions and Social Pressure. Scientific American, 193(5), 31–35. [The classic conformity experiment: when facing a unanimous group giving an obviously wrong answer, roughly one third of participants go along with the group.]
Cialdini, R. B. (2021). Influence, New and Expanded: The Psychology of Persuasion. Harper Business. [Original edition: 1984.] [The definitive work on the mechanisms of social influence, including social proof: we follow others' behavior to guide our own, especially under uncertainty.]
Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263–291. [Prospect theory: we don't process gains and losses symmetrically — losing €100 affects us roughly twice as much as gaining €100.]