
Standard: Outcomes are reviewed and used to guide future investments

Purpose and Strategic Importance

This standard ensures that outcomes from completed work are regularly reviewed and used to guide future investments—bridging the gap between delivery and strategy. It closes feedback loops, fosters continuous improvement, and sharpens focus on real impact.

Aligned to our "Outcome-Driven Development" policy, it ensures that data informs decisions, preventing resources from being wasted on work that doesn't deliver tangible value. Without it, prioritisation drifts and opportunities for meaningful impact are missed.

Strategic Impact

  • Stronger alignment between delivery and business strategy
  • Faster course correction based on measurable results
  • Increased return on investment across the portfolio
  • Clearer focus on outcomes rather than output
  • Higher confidence in planning, investment, and forecasting

Risks of Not Having This Standard

  • Repeated investment in low-impact features or services
  • Strategic decisions made without evidence of value
  • Inability to course-correct based on real-world impact
  • Erosion of trust in engineering’s ability to deliver value
  • Fragmented or duplicated effort across teams

CMMI Maturity Model

Level 1 – Initial

  • People & Culture: Teams focus on delivery and close the book after release. There is little interest or capacity for measuring actual outcomes.
  • Process & Governance: No process exists to review or learn from what has been delivered. Post-release activities are rarely prioritised.
  • Technology & Tools: Outcome data is not collected, or is fragmented across systems. There is no structured way to interpret results.
  • Measurement & Metrics: Success is assumed based on completion, not impact. No tracking of value delivered.

Level 2 – Managed

  • People & Culture: Some teams review delivery success informally. Lessons may be shared but are not routinely applied.
  • Process & Governance: Retrospectives or reviews occasionally consider value delivered, but feedback is rarely used to guide future work.
  • Technology & Tools: Limited metrics on outcomes are available. Reporting is manual or patchy.
  • Measurement & Metrics: Some attempts are made to measure value, but they are not integrated into governance or investment planning.

Level 3 – Defined

  • People & Culture: Teams actively engage in outcome reviews. Delivery and product decisions are informed by results.
  • Process & Governance: Post-delivery reviews are built into the workflow. Learnings influence prioritisation and roadmap decisions.
  • Technology & Tools: Tooling supports consistent outcome measurement, including business and user metrics.
  • Measurement & Metrics: Value metrics are defined up front and measured against expectations after release.

Level 4 – Quantitatively Managed

  • People & Culture: Teams discuss outcome trends and seek to improve portfolio effectiveness over time.
  • Process & Governance: Strategic reviews include delivery performance and value trends to influence portfolio shaping.
  • Technology & Tools: Dashboards highlight outcome metrics, supporting learning loops and investment refinement.
  • Measurement & Metrics: Trends and benchmarks are used to guide funding, planning, and capability investment.

Level 5 – Optimising

  • People & Culture: Teams are encouraged to challenge priorities based on outcome learning and data.
  • Process & Governance: Investment cycles continuously adapt based on real-world feedback and system behaviour.
  • Technology & Tools: Outcome data integrates with planning systems to enable real-time steering of the roadmap.
  • Measurement & Metrics: A continuous loop exists from delivery through impact to investment, maximising strategic alignment.

Key Measures

  • % of completed work with outcome review documented
  • Number of investment decisions influenced by outcome data
  • Impact of outcome-driven learning on roadmap changes
  • Measurable value delivered vs. forecast value
  • Trends in feature success or customer adoption post-release
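As an illustration, the first and fourth measures above could be computed from delivery records along these lines. This is a minimal sketch, not an existing system: the record fields (`outcome_review`, `forecast_value`, `measured_value`) and the helper names are assumptions about how such data might be tracked.

```python
# Hypothetical delivery records: each completed item notes whether an
# outcome review was documented, plus forecast and measured value.
completed_work = [
    {"name": "feature-a", "outcome_review": True,  "forecast_value": 100, "measured_value": 120},
    {"name": "feature-b", "outcome_review": False, "forecast_value": 80,  "measured_value": 30},
    {"name": "feature-c", "outcome_review": True,  "forecast_value": 50,  "measured_value": 55},
]

def review_coverage(items):
    """% of completed work with an outcome review documented."""
    return 100 * sum(i["outcome_review"] for i in items) / len(items)

def value_vs_forecast(items):
    """Ratio of measurable value delivered to forecast value."""
    delivered = sum(i["measured_value"] for i in items)
    forecast = sum(i["forecast_value"] for i in items)
    return delivered / forecast

print(f"Outcome review coverage: {review_coverage(completed_work):.0f}%")
print(f"Value delivered vs forecast: {value_vs_forecast(completed_work):.2f}")
```

Even a simple roll-up like this makes the feedback loop concrete: a coverage figure below 100% flags items shipped without review, and a value ratio well under 1.0 prompts a conversation before the next investment decision.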

Associated Policies

  • Outcome-Driven Development

Associated Practices

  • Static Application Security Testing (SAST)
  • Integration Testing
  • Feature Toggles (Flags)
