Adverse Selection in Technical Debt: Why Bad Code Drives Out Good


Gresham’s Law: “Bad money drives out good.”

When two currencies circulate and one is debased (lower quality), people hoard the good currency and spend the bad. Eventually, only bad money circulates.

The same thing happens with code. And it explains why codebases rot.

Here’s how bad code drives out good code:

Product manager: "We need this feature by Friday."
Time available:  3 days
Time for good code: 5 days
Time for hacky code: 2 days

Developer ships hacky code.
Feature ships on time: ✓
Tests pass: ✓ (if there are tests)
Users are happy: ✓ (for now)
Manager is happy: ✓

No visible downside to cutting corners.

Code review:
  - Reviewer doesn't have context on "right" design
  - Deadline pressure means "ship it"
  - Hacky code looks similar to good code
  - Approves the PR

Management sees:
  - Features shipped
  - Deadlines met
  - "Productivity" metrics look great

Developer A: Writes clean, well-tested code. Takes 5 days.
Developer B: Ships fast, hacky code. Takes 2 days.

Who gets promoted?
  - B looks "more productive"
  - B ships more features
  - B gets the raise

Developer A notices.
Developer A has options:
  1. Keep writing good code, look slow, get passed over
  2. Adapt to the system, ship fast, get promoted
  3. Leave for a company that values quality

Options 2 and 3 both remove good code from the system.

Who stays?
  - Developers who write fast, hacky code (rewarded)
  - Developers who don't know the difference (unaware)
  - Developers who can't leave (stuck)

Who leaves?
  - Developers who care about quality (frustrated)
  - Senior developers who see the trajectory (experienced)
  
The talent pool degrades.

Hacky code makes good code harder to write:
  - No consistent patterns to follow
  - Good code looks "out of place"
  - Integration with hacky code requires compromises
  - "When in Rome..."

More hacky code → harder to write good code → more hacky code

This is Gresham’s Law for software. Bad code drives out good code through adverse selection.

Adverse selection occurs when information asymmetry causes “bad” participants to dominate a market. The classic illustration is George Akerlof’s used-car market:

Buyers can't tell good cars from lemons.
Buyers assume average quality and pay average price.
Sellers of good cars can't get fair price → leave market.
Only lemons remain.
Market collapses.

Managers can't tell good code from bad code.
Managers assume average quality and reward speed.
Writers of good code can't get recognition → adapt or leave.
Only hacky code remains.
Codebase collapses (slowly, then suddenly).

Here, the hidden information is code quality. The market is the internal labor market for developers. The collapse is technical bankruptcy.
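The lemons spiral can be run as a toy loop. This sketch is my illustration with arbitrary car values, not part of the original analysis:

```python
# Toy Akerlof-style unraveling: buyers offer the average quality of cars
# still on the market; sellers whose car is worth more than that offer
# exit; repeat. The quality values are arbitrary illustrative numbers.

qualities = [10, 30, 50, 70, 90]  # hypothetical car values (higher = better)

market = list(qualities)
while True:
    price = sum(market) / len(market)            # buyers pay average quality
    staying = [q for q in market if q <= price]  # better-than-average cars exit
    if staying == market:
        break
    market = staying

print(market)  # only the worst lemon is left
```

Each round, the best remaining cars leave, which drags the average offer down and pushes out the next tier, until only the worst lemon trades.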

Early stage:     High velocity (shipping fast)
Middle stage:    Velocity slows (accumulated debt)
Late stage:      Velocity crashes (everything is hard)

But at each stage, the fastest coders are still the ones shipping hacky code.

Track who leaves:
  - Senior engineers: "This codebase is a mess"
  - Quality-focused devs: "Nobody cares about doing it right"
  - New hires: "This isn't what I signed up for"

Track who stays:
  - Developers who created the mess
  - Developers who don't notice
  - Developers with no options

Every 2-3 years:
  "We should rewrite this from scratch"
  "The codebase is unmaintainable"
  "It'll be different this time"

The rewrite happens. Same incentives. Same outcome.

Who gets celebrated?
  - Developers who ship features fast
  - Developers who fix urgent production issues
  - "10x developers" who crank out code

Who doesn't?
  - Developers who prevent issues
  - Developers who refactor quietly
  - Developers who say "this needs more time"

Hero culture rewards the behaviors that create the messes heroes fix.

Information asymmetry drives adverse selection. Reduce asymmetry.

Quality metrics:
  - Test coverage
  - Cyclomatic complexity
  - Dependency health
  - Code review thoroughness
  - Bug escape rate per author
  - Time to understand (for new devs)

Make these visible:
  - Dashboards
  - PR annotations
  - Performance reviews

When quality is visible, it can be rewarded.
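If you want a single number to put on a dashboard, a weighted blend is one option. A minimal sketch, assuming hypothetical metric names, 0.0-1.0 normalization, and weights:

```python
# Hypothetical sketch of one visible quality score. The metric names,
# normalization (each 0.0-1.0, higher = better), and weights are
# assumptions, not a standard.

def quality_score(metrics: dict) -> float:
    """Weighted blend of normalized quality metrics."""
    weights = {
        "test_coverage": 0.30,      # fraction of lines covered by tests
        "complexity_health": 0.25,  # share of functions under a complexity cap
        "dependency_health": 0.20,  # share of deps current and unflagged
        "review_depth": 0.25,       # share of PRs with substantive review
    }
    return round(sum(w * metrics.get(k, 0.0) for k, w in weights.items()), 3)

score = quality_score({
    "test_coverage": 0.82,
    "complexity_health": 0.70,
    "dependency_health": 0.90,
    "review_depth": 0.55,
})
print(score)
```

The exact weights matter less than the fact that the number exists, moves, and shows up in the same places the velocity numbers do.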

Old incentives:
  Ship features fast → Promotion
  Write good code → Invisible

New incentives:
  Ship features fast AND sustainably → Promotion
  Reduce technical debt → Recognized
  Prevent bugs → Celebrated
  Mentor others on quality → Valued

What you measure is what you get.

In economics, signaling is how informed parties communicate quality.

Without signaling:
  Manager can't tell good from bad code
  Manager rewards speed
  Bad code wins

With signaling (rigorous code review):
  Code review reveals quality
  Good code is distinguishable
  Good code can be rewarded

But code review only works if:

  - Reviewers have time and context
  - Reviews block bad code (not rubber-stamp)
  - Quality concerns are valued, not dismissed

Bad code review culture:
  "LGTM" in 5 minutes
  "Ship it, we're behind"
  Reviewers don't understand the code

Good code review culture:
  Thorough review expected
  Quality concerns block merge
  Authors want good reviews (signaling quality)

Adverse selection accelerates under pressure. Reduce pressure.

No slack:
  Every sprint at 100% capacity
  No time for "doing it right"
  Technical debt accumulates

With slack:
  80% capacity planned
  20% for quality, refactoring, learning
  Good code becomes possible

Google’s “20% time” wasn’t just about innovation—it also reduced adverse selection pressure.

Traditional career ladder:
  Junior → Senior → Lead → Manager
  
Measured by:
  Features shipped
  People managed
  Scope of ownership

Quality-aware ladder:
  Junior → Senior → Staff → Principal
  
Measured by:
  Technical impact
  Code quality
  System health
  Mentorship of quality practices

If senior individual contributors are rewarded for quality, quality becomes aspirational.

Interview for:
  - Code quality awareness
  - Refactoring experience
  - Testing philosophy
  - "Tell me about a time you pushed back on shipping"

Avoid:
  - "How fast can you solve this?"
  - Pure algorithm grinding
  - No discussion of maintainability

The bar you set in hiring determines the equilibrium quality level.

Breaking adverse selection requires investment. Here’s how to justify it:

Technical debt compounds:
  Year 1: 10% of codebase is problematic
  Year 2: 20% (new debt + spread)
  Year 3: 35%
  Year 4: 55%
  Year 5: 75%

Velocity impact:
  Year 1: 100% velocity
  Year 2: 90% velocity (10% fighting debt)
  Year 3: 75% velocity
  Year 4: 55% velocity
  Year 5: 30% velocity

Intervention at Year 2:
  Investment: 20% of capacity for 6 months
  Result: Debt reduced from 20% to 10%
  Velocity preserved: 90% instead of declining

Without intervention:
  Years 3-5 velocity: 75% → 55% → 30%
  Average: 53%

With intervention:
  Years 3-5 velocity: 85% → 80% → 75%
  Average: 80%

Velocity preserved: 27 percentage points higher on average

For a 20-person team at $200K/person:
  20% more velocity = $800K/year in effective capacity
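The cost-of-inaction math above can be checked in a few lines. A sketch using the article's assumed figures, which are illustrations, not measurements:

```python
# The section's arithmetic, made explicit. Every figure here is an
# illustrative assumption, not a measurement.

without_fix = [75, 55, 30]  # years 3-5 velocity (%), debt left to compound
with_fix = [85, 80, 75]     # years 3-5 velocity (%), after the year-2 cleanup

avg_without = sum(without_fix) / len(without_fix)  # ~53%
avg_with = sum(with_fix) / len(with_fix)           # 80%
gain_points = avg_with - avg_without               # ~27 percentage points

team_size, cost_per_dev = 20, 200_000
extra_capacity = team_size * cost_per_dev * 0.20   # 20% more velocity, in dollars

print(round(avg_without), round(avg_with), round(gain_points), int(extra_capacity))
```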

Adverse selection causes talent loss:
  Quality developers leave → replacement cost $50K each
  Remaining developers less productive → 20% velocity loss
  New hires ramp slowly in bad codebase → 50% first-year productivity

If quality intervention retains 2 senior devs:
  Retention value: 2 × $50K = $100K
  Productivity preserved: 2 × $200K × 20% = $80K
  
  Annual value: $180K
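Same check for the retention math, with the assumed figures labeled:

```python
# The retention arithmetic above, in runnable form. The dollar figures
# ($50K replacement cost, $200K loaded cost, 20% productivity loss) are
# the assumptions stated in the text.

retained_devs = 2
replacement_cost = 50_000   # hiring + ramp-up cost per avoided departure
loaded_cost = 200_000       # fully loaded annual cost per developer
productivity_loss = 0.20    # velocity hit if these developers had left

retention_value = retained_devs * replacement_cost                        # $100K
productivity_preserved = retained_devs * loaded_cost * productivity_loss  # $80K
annual_value = retention_value + productivity_preserved                   # $180K
print(round(annual_value))
```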

Catch adverse selection early:

Signal → Meaning

  “We don’t have time for tests” → Quality is being sacrificed
  Senior devs leaving for “better engineering culture” → Adverse selection in progress
  Features ship fast but bugs increase → Quality is invisible but declining
  “We should rewrite” conversations → Debt has compounded
  New hires take longer to onboard → Codebase complexity is high
  “Hero” deployments and fixes → Celebrating symptoms, not prevention
  PRs approved in < 10 minutes → Code review isn’t working
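The last signal lends itself to automation. A hypothetical sketch; the PR record shape (`opened_at` / `approved_at`) is assumed for illustration, not any real code-host API:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: surface rubber-stamp reviews from PR timestamps.
# The dict shape here is an assumption, not a real code-host API.

def rubber_stamp_prs(prs, threshold=timedelta(minutes=10)):
    """Return PRs approved less than `threshold` after being opened."""
    return [pr for pr in prs if pr["approved_at"] - pr["opened_at"] < threshold]

prs = [
    {"id": 101, "opened_at": datetime(2024, 5, 1, 9, 0),
     "approved_at": datetime(2024, 5, 1, 9, 4)},    # 4-minute "LGTM"
    {"id": 102, "opened_at": datetime(2024, 5, 1, 10, 0),
     "approved_at": datetime(2024, 5, 1, 11, 30)},  # 90-minute review
]
flagged = rubber_stamp_prs(prs)
print([pr["id"] for pr in flagged])  # the 4-minute approval is flagged
```

A weekly count of flagged PRs is a cheap proxy for whether review is signaling quality or rubber-stamping it.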

Bad code drives out good code through adverse selection:

Information asymmetry: Managers can't see code quality
Result: Speed is rewarded, quality is invisible
Consequence: Quality-focused developers adapt or leave
Equilibrium: Only hacky code remains

The cycle:

Deadline pressure → Hacky code ships → Looks like success
→ Quality developers unrewarded → Adapt or leave
→ Codebase quality drops → More hacky code normalized
→ Harder to write good code → Repeat

Breaking the cycle:

Strategy → How it helps

  Make quality visible → Reduces information asymmetry
  Fix incentives → Rewards quality alongside speed
  Rigorous code review → Enables signaling of quality
  Slack time → Makes good code possible
  Career paths for quality → Makes quality aspirational
  Hire for quality → Sets the equilibrium bar

Gresham’s Law isn’t inevitable. But breaking it requires deliberate intervention.

If you don’t actively select for quality, you’ll passively select against it.

And eventually, only bad code will circulate.