Every growing engineering organization eventually reaches the same inflection point: people start asking what it takes to get promoted, and no one has a clear answer. That is when leadership decides to build a career framework. The intention is good. The execution is where most teams stumble.
After working with dozens of engineering teams on their competency frameworks, we have seen a pattern. The frameworks that engineers trust share a few specific qualities, and the ones they dismiss all make the same mistakes.
Why do vague criteria undermine the entire framework?
The most common failure mode is criteria that sound meaningful but cannot be observed or measured. Phrases like "demonstrates technical excellence" or "shows leadership" feel authoritative on paper, but they fall apart the moment two managers try to agree on whether a specific engineer meets them.
Vague criteria create three problems simultaneously:
- Inconsistent application. Manager A interprets "shows leadership" as mentoring juniors. Manager B interprets it as driving technical decisions. Both are reasonable, but the engineer gets different signals depending on who they report to.
- Eroded trust. When engineers see peers promoted under criteria they cannot replicate or understand, they stop believing the framework reflects reality.
- Calibration gridlock. Without shared definitions, calibration sessions devolve into debates about whose interpretation is correct rather than discussions about performance.
How do you write behavioral anchors that hold up?
The fix is behavioral anchors: specific, observable descriptions of what a skill looks like at each level. Compare "communicates effectively" with "proactively shares context with cross-functional stakeholders before decisions are finalized, reducing downstream rework." The first invites interpretation; the second can be checked against evidence.
Good behavioral anchors follow three rules:
- They describe actions, not traits. Instead of "is a strong communicator," write "documents technical decisions in RFCs and solicits feedback from affected teams before implementation."
- They specify scope and impact. A senior engineer's scope is different from a staff engineer's. The anchor should make that boundary visible: "within their team" versus "across the engineering organization."
- They are falsifiable. If two reasonable managers cannot look at the same evidence and agree on whether the behavior occurred, the anchor needs to be rewritten.
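One way to keep anchors honest against these three rules is to treat them as structured data rather than prose, so that the action, the scope, and the level are explicit fields a calibration session can point at. The sketch below is a minimal illustration, not a standard schema; the field names and the `anchors_for` helper are hypothetical, and the example anchors are taken from the rules above.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Anchor:
    skill: str     # the competency this anchor belongs to
    level: str     # the career level it applies to
    behavior: str  # an observable action, not a trait
    scope: str     # where the impact should be visible

# Illustrative anchors for a single skill at two levels. Note that the
# behavior describes an action and the scope makes the level boundary visible.
ANCHORS = [
    Anchor(
        skill="communication",
        level="senior",
        behavior=("documents technical decisions in RFCs and solicits "
                  "feedback from affected teams before implementation"),
        scope="within their team",
    ),
    Anchor(
        skill="communication",
        level="staff",
        behavior=("proactively shares context with cross-functional "
                  "stakeholders before decisions are finalized"),
        scope="across the engineering organization",
    ),
]

def anchors_for(skill: str, level: str) -> list[Anchor]:
    """Return the anchors a manager would evaluate for one skill and level."""
    return [a for a in ANCHORS if a.skill == skill and a.level == level]
```

The payoff of this shape is the falsifiability rule: if an anchor cannot be written as a single observable `behavior` with a concrete `scope`, it is probably a trait in disguise and needs rewriting.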
Why should engineers be involved in writing the framework?
Frameworks written entirely by HR or leadership often miss the nuance of what great engineering actually looks like day to day. Engineers notice when the criteria were clearly written by someone who has never debugged a production incident at 2 AM.
The most effective approach is a small working group of three to five engineers spanning different levels and specialties. Their job is not to write the framework from scratch but to pressure-test every anchor against reality. Ask them: "Can you name a specific person who clearly meets this? Can you name someone who clearly does not?" If the group cannot agree, the anchor is not ready.
What does the review process look like?
Share the draft framework widely before finalizing. Run it through a mock calibration using real (anonymized) cases. This surfaces ambiguities that seemed fine in isolation but collapse under the weight of actual decisions. Tools like Harmny's competency framework builder make it straightforward to iterate on levels and skills without losing version history.
How do you calibrate effectively once the framework exists?
A framework is only as good as its application. Calibration is where theory meets practice, and it is the moment that determines whether engineers trust the system. Three practices make the biggest difference:
- Require evidence, not opinions. Every rating should be supported by specific examples tied to behavioral anchors. "I feel like they are ready" is not evidence.
- Normalize distribution conversations. If every engineer on a team is rated "exceeds expectations," either the bar is wrong or the ratings are inflated. Neither is acceptable.
- Document decisions and reasoning. When engineers can see why a decision was made, even one they disagree with, they are far more likely to respect the process. Transparent documentation through structured reviews builds that record over time.
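The evidence rule above can also be made mechanical: a rating that is not backed by at least one specific example tied to a behavioral anchor should not reach the calibration discussion at all. The following is a hedged sketch of that gate, with hypothetical names (`Evidence`, `Rating`, `is_calibration_ready`) rather than any particular tool's API.

```python
from dataclasses import dataclass, field

@dataclass
class Evidence:
    anchor: str   # which behavioral anchor this example supports
    example: str  # a specific, observable incident

@dataclass
class Rating:
    engineer: str
    rating: str
    evidence: list[Evidence] = field(default_factory=list)

def is_calibration_ready(r: Rating) -> bool:
    """A rating with no anchor-tied examples is an opinion, not evidence."""
    return len(r.evidence) > 0
```

In practice this check would sit in whatever review tooling the team uses; the point is simply that "I feel like they are ready" fails the gate, while a rating carrying concrete examples passes it.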
The long game
A career framework is not a document you publish and forget. It is a living system that needs regular recalibration as your organization evolves. Schedule a review every six months. Look at where calibration conversations stalled, which anchors generated the most debate, and where engineers felt the criteria did not match their actual work.
The goal is not perfection. It is a shared language that makes growth conversations productive rather than political. When engineers trust the framework, they stop worrying about optics and start focusing on the work that actually matters.