The Personal Cost of Sycophancy
Breaking the Echo Chamber: Part 2
This is Part 2 of the Breaking the Echo Chamber series:
Part 1: The Seduction of Yes-Men Culture
Part 2: The Personal Cost of Sycophancy [you are here]
Part 3: The Organizational Cancer
Part 4: Breaking Free
Happy New Year, everyone! Wishing all of you a healthy, happy, and prosperous 2025.
By month nine, the VP of Engineering I’d been watching had stopped contributing to technical architecture reviews. Not because she’d been excluded; she still had her seat at the table and commanded nominal respect. But her primary function had shifted from evaluating technical decisions to managing the room’s reaction to them. When a junior engineer raised concerns about the scalability of a proposed system, the VP didn’t engage with the technical merits at all. She reframed the concern as “premature optimization” and redirected the conversation toward the CEO’s (completely arbitrary and ephemeral) timeline. Deftly done, like the practiced flick of a conductor’s baton.
I knew the VP had real expertise. She’d built systems at scale, understood the tradeoffs, could hold her own in any technical discussion. But she’d stopped using that expertise for technical judgment. Instead, she deployed it for political calculation: how to make the CEO’s preferred decision look technically sound, how to neutralize objections without appearing to suppress them, how to maintain the illusion of rigorous technical process while ensuring the predetermined outcome.
This is what yes-men culture costs at the individual level. Not just compromised integrity or political gamesmanship, but the systematic atrophy of the very capabilities that made someone valuable in the first place.
The Erosion of Expertise
Professional capability erodes through disuse. When you stop exercising judgment, you lose the ability to exercise it well. The VP’s technical skills hadn’t disappeared. She could still write code, still understand architecture diagrams, still follow technical arguments. But her technical *judgment* had weakened, grown soft. The muscle memory of evaluating technical tradeoffs, weighing risks, and making decisions based on engineering principles rather than political considerations had atrophied month by month.
By month nine, she’d lost the ability to distinguish between good technical arguments and politically convenient ones. When someone raised a concern, her first thought wasn’t “is this valid?” but “how does this affect the CEO’s timeline?” That wasn’t a conscious choice anymore. It was reflexive, automatic, the result of months of conditioning. Like Pavlov’s dogs, but for organizational dysfunction.
What makes this particularly pernicious is that the person experiencing it often can’t see it happening. The VP didn’t think her judgment had deteriorated. She thought she’d gotten better at “seeing the bigger picture,” that she’d finally come to “understand how the business really works.” She could no longer see that “seeing the bigger picture” actually meant ignoring technical reality, or that “understanding the business” meant anticipating this specific CEO’s preferences. The transformation was complete.
The Psychology of Compromise
The cognitive dissonance of maintaining two versions of reality extracts ongoing cost. You know the technical decision is wrong. You also know you’re going to support it publicly. You develop elaborate internal narratives to reconcile these positions, but the reconciliation is never complete. There’s always a gap between what you know and what you say, and that gap requires constant energy to maintain. Exhausting, corrosive, secretly shameful energy.
I watched the VP develop increasingly complex rationalizations. “We need to fix that in the next iteration. Along with delivering everything else that was already planned.” “The business pressure justifies the technical debt.” “I’m choosing my battles. This isn’t the hill I’m going to die on.” Each rationalization was technically true but profoundly dishonest. Every iteration, the technical debt got pushed forward again. Every battle, she chose not to fight. After months of this pattern, the truth became clear to everyone: she never fought any battles, not for budget, not for quality, not for people. The “strategic” silence had become a permanent posture.
This takes a psychological toll that’s hard to articulate to someone who hasn’t experienced it. You start monitoring yourself constantly—what can you say in meetings, what needs to stay internal, how to phrase disagreements so they don’t register as challenges. That level of self-surveillance is exhausting. It creates a baseline anxiety that affects everything, a constant low-grade dread that something you say might cross an invisible line.
But it’s more than just stress or burnout. What happens when you repeatedly act against your own judgment and values is something psychologists call “moral injury”: the damage to your sense of self that comes from sustained ethical compromise. You know the right answer. You say something different. You do this again and again, and each time you erode your own sense of integrity a little further. It’s not just that you’re compromising your principles in the moment; you’re reshaping who you are. The person who could have stood up for what’s right becomes the person who learned not to.
Private conversations became the only place she could be honest. The VP would pull me aside after meetings to explain what she actually thought about the decisions that had just been made. These conversations were revealing not because they showed her private disagreement (that was obvious to anyone paying attention) but because they showed how much energy she was expending to maintain the gap between her public and private positions. That energy could have been used for actual technical leadership, for mentoring engineers, for solving hard problems. Instead, it was consumed entirely by the performance of alignment.
The Illusion of Security
The cruel part is how much the VP had convinced herself she was being strategic. She had a clear narrative, repeated often enough that she believed it: she was maintaining influence so she could guide decisions on the issues that really mattered. The problem was that “issues that really mattered” kept getting defined downward, so gradually that she never noticed, like the frog in slowly boiling water. First it was major architectural decisions. Then it was just infrastructure choices. Eventually it became “keeping the team from getting fired.”
This is the trap of incremental compromise. Each decision to stay silent feels justified because you’re preserving your position to fight another day. But another day never comes, and your position becomes increasingly dependent on not fighting. Ever.
By month twelve, the VP’s professional identity had fully shifted. She no longer saw herself as a technical leader who happened to work in a political environment. She saw herself as a political operator who happened to have a technical background, a subtle but fundamental reframing. Her value wasn’t her engineering judgment; it was her ability to navigate the CEO’s moods, manage the team’s expectations, and translate objectively bad decisions into language that sounded reasonable.
This felt like success to her, though. She had access, perceived influence, a seat at the table where important decisions happened. What she couldn’t see: her influence was entirely contingent on not exercising it. The moment she pushed back on something the CEO actually cared about, that access would evaporate like morning fog. She wasn’t powerful. She was useful because she was silent. There’s a difference, and it’s not subtle.
The Actual Costs
Skills become non-transferable. The VP’s primary competency had become knowing how this specific CEO made decisions and how to work around them. Her value was tightly coupled to the CEO. That’s not a skill that travels. Outside this organization, she’d need to rebuild her technical credibility from scratch. Except her technical judgment had deteriorated enough that “from scratch” might not be an exaggeration. She’d need to relearn how to make decisions based on engineering principles rather than political calculation, and that’s harder than it sounds after a year of conditioning.
Professional network calcifies. Her relationships increasingly consisted of other people in similar positions: other insiders, other enablers, other people who operated the same way she did. This created a self-reinforcing bubble where her behavior seemed normal because everyone around her was doing the same thing. She’d lost her connection to the broader professional community, the people who valued technical judgment over political alignment. They’d stopped reaching out. She’d stopped maintaining those relationships because they weren’t useful in her current role.
Reputation becomes fixed. Word travels in tech, especially in specific domains and geographies. The VP was starting to be known not for her technical work but for her ability to manage up. That’s career poison. It limits your options to other organizations looking for that specific skill set, which is to say, other dysfunctional organizations where enablement is valued over expertise. She was becoming unemployable anywhere that actually valued engineering judgment.
Financial dependency locks in. The compensation was good. Stock was vesting. Retention bonuses were scheduled. Each month that passed made leaving more expensive in real terms. By year two, she’d need to take a significant pay cut to go anywhere else, and her lifestyle had adjusted to the current compensation. The golden handcuffs had worked exactly as designed. The mortgage, the kids’ private school, the lifestyle expectations, all of it created dependency that made the cage feel increasingly inescapable, however comfortable.
The Trapped Insider
By month eighteen, the VP wasn’t strategically positioning herself anymore. She was trapped, plain and simple. Her skills had deteriorated to the point where recovering them would take years. Her professional network consisted primarily of other enablers. Her reputation was that of a political operator rather than a technical leader. And her financial obligations made leaving prohibitively expensive unless she was willing to dramatically downgrade a lifestyle that had already adjusted to the compensation, which felt impossible, possibly catastrophic.
More importantly, she’d lost the ability to see the trap she was in. When I pointed out that she hadn’t challenged a technical decision in six months, her response was to explain why each individual decision hadn’t been worth fighting over. She couldn’t see the pattern anymore. Or maybe she could see it but couldn’t afford to admit it, even to herself, because admitting it would require confronting the choices she’d made and the position she’d put herself in. Most people aren’t that honest with themselves.
The question “how did I get here?” never has a satisfying answer because the journey happens in such small increments that no single step feels like the wrong one. It’s only when you look back over the full distance that the path becomes visible. And by then, you’re too far from the starting point to easily find your way back.
This is what yes-men culture costs at the individual level. Not just compromised integrity but the systematic destruction of professional capability, judgment, and agency. The people it captures often don’t recognize they’ve been captured until it’s too late to escape without significant cost.
But these individual costs aren’t isolated tragedies. They’re the mechanism through which organizational dysfunction perpetuates itself. When competent people become enablers, they don’t just fail to stop bad decisions, they become the machinery that makes those decisions possible and sustainable. That’s what we’ll examine in Part 3.
Next in the series: Part 3: The Organizational Cancer