Architecture Definition Framework Rating™
- Sunil Dutt Jha

- Feb 25
- 3 min read

A 3–5 Day Framework Integrity Assessment Across P1–P6
Powered by ICMG Enterprise Anatomy™
Most enterprises say they have an architecture framework. They reference standards. They run architecture boards. They publish templates. They certify architects.
Yet change cost rises. Impact analysis slows. Regulatory updates require reconstruction. Platform consolidation inflates budgets.
The real question is not whether a framework exists.
The real question is: Does your framework define enterprise anatomy — or does it define documentation?
The Trigger
If your architecture framework governs 10–20 active programs and:
- Cost per change is rising
- Regulatory updates trigger interpretation debates
- Consolidation exposes hidden rule conflicts
- Architecture reviews feel procedural
The problem may not be delivery. The problem may be the framework itself.
What Actually Breaks
A framework defines how architecture is interpreted across projects.
If it does not explicitly bind:
- P1 Strategy intent
- P2 Sequencing invariants
- P3 System Rule ownership
- P4 Component constraints
then every governed program inherits ambiguity.
Ambiguity does not create one failure. It multiplies across the portfolio.
Portfolio Exposure Snapshot
Assume:
- 15 active programs
- Average program size: $12M
- Total governed capital: $180M
If P1–P4 are weak inside the framework, amplification is automatic.
Rule Ownership Fragmentation (P3 Risk)
When rule authority is not structurally defined, each program:
- Re-implements logic
- Duplicates thresholds
- Reinterprets overrides
Conservative duplication band: 10–15%.
On $12M per program:
$1.2M structural duplication.
Across 15 programs:
$18M exposure.
Not visible in one budget line. Distributed across all of them.
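For readers who want to sanity-check the arithmetic, the duplication exposure above works out as follows. This is a back-of-the-envelope sketch using only the figures quoted in this section; the 10% rate is the conservative end of the stated band, not measured data.

```python
# Rule-duplication exposure, using the illustrative figures from the text.
programs = 15
avg_program_size = 12_000_000        # $12M per program
duplication_rate = 0.10              # conservative end of the 10-15% band

per_program = avg_program_size * duplication_rate  # structural duplication per program
portfolio = per_program * programs                 # exposure across all programs

print(f"Per program: ${per_program:,.0f}")   # $1,200,000
print(f"Portfolio:   ${portfolio:,.0f}")     # $18,000,000
```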
Sequencing Drift (P2 Risk)
When sequencing is not structurally bound, projects discover:
- Timing conflicts
- Trigger misalignment
- Dependency gaps
Impact analysis doubles. Release cycles stretch 20–30%.
If each program carries 200 lifecycle changes, even $8,000 of incremental effort per change compounds quickly:
$1.6M per program. $24M across the portfolio.
This is not delivery failure. It is framework-level ambiguity.
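The sequencing-drift exposure can be sketched the same way. Again, these are the illustrative numbers from the text, not observed data:

```python
# Sequencing-drift exposure per program and across the portfolio.
programs = 15
changes_per_program = 200            # lifecycle changes per program
incremental_effort = 8_000           # incremental cost per change from drift

per_program = changes_per_program * incremental_effort  # $1.6M per program
portfolio = per_program * programs                      # $24M across 15 programs

print(f"Per program: ${per_program:,}")  # $1,600,000
print(f"Portfolio:   ${portfolio:,}")    # $24,000,000
```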
Consolidation Reconstruction
At the consolidation stage: Platform merge. Cloud migration. Regulatory redesign.
If P1–P4 were never structurally embedded: Reconstruction begins.
Observed escalation: 15–25%.
On $180M:
$27M–$45M.
This is not modernization cost. It is framework debt.
Exit Risk Multiplier
If P1–P4 are not institutionalized:
Leadership rotation destabilizes all governed programs.
Observed inflation: 8–12%.
On $180M:
$14M+ exposure.
A framework that collapses when architects move was never structural.
It was personality-driven.
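The consolidation and exit-risk figures above follow from the same governed-capital base. The percentage bands are the "observed" ranges quoted in the text, used here purely as illustrative assumptions:

```python
# Consolidation-reconstruction and exit-risk exposure on governed capital.
governed_capital = 180_000_000       # 15 programs x $12M

consolidation_band = (0.15, 0.25)    # reconstruction escalation: 15-25%
exit_band = (0.08, 0.12)             # leadership-rotation inflation: 8-12%

consolidation_low = governed_capital * consolidation_band[0]   # $27M
consolidation_high = governed_capital * consolidation_band[1]  # $45M
exit_low = governed_capital * exit_band[0]                     # $14.4M ("$14M+")

print(f"Consolidation exposure: ${consolidation_low:,.0f}-${consolidation_high:,.0f}")
print(f"Exit-risk exposure (low end): ${exit_low:,.0f}")
```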
The Real Cost of a Weak Framework
A framework that defines deliverables instead of anatomy does not fail during project delivery. It fails during operations, when the system is live and change begins.
Consider a $10M CRM platform.
Year 1: Delivery is successful. Dashboards green. Capabilities mapped. Microservices deployed.
Years 2–5: 350 lifecycle changes occur:
- Rule modifications
- Regulatory updates
- Pricing adjustments
- Channel behavior changes
- Integration shifts
If the framework never made P2 sequencing invariants, P3 rule authority, and P4 component constraints visible, then every change becomes reconstruction.
If cost per change increases from $25,000 to $40,000 due to rule rediscovery, duplicated logic, and sequencing confusion:
Additional cost per change = $15,000
Across 350 changes = $5.25M
That is not scope expansion. That is framework-induced amplification.
Now add:
- Upgrade cycle inflation (15–20%)
- Rework from rule fragmentation (20–30%)
- Impact analysis delay (2× time multiplier)
Over 5–7 years, a $10M platform easily becomes a $20–30M lifecycle commitment.
Not because technology failed. Because the definition framework never produced one visible anatomy.
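The CRM amplification arithmetic above can be checked in a few lines. All figures are the illustrative ones quoted in this section:

```python
# Framework-induced change-cost amplification for the $10M CRM example.
baseline_cost_per_change = 25_000    # cost per change with a sound framework
amplified_cost_per_change = 40_000   # cost after rule rediscovery and confusion
lifecycle_changes = 350              # changes over years 2-5

delta = amplified_cost_per_change - baseline_cost_per_change  # $15,000 extra
amplification = delta * lifecycle_changes                     # $5.25M total

print(f"Extra cost per change: ${delta:,}")
print(f"Total amplification:   ${amplification:,}")  # $5,250,000
```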
Why Anatomy Matters in Operations
Architecture is not created for go-live. It exists to manage:
- Real-time decision scenarios
- Regulatory interpretation
- Pricing adjustments
- Channel inconsistencies
- Risk overrides
- Upgrade sequencing
- Platform evolution
A document-centric framework produces artifacts for approval. An anatomy-based framework produces a living decision model.
What We Measure (3–5 Days)
- P1 – Decision authority clarity
- P2 – Sequencing invariants
- P3 – Rule ownership integrity
- P4 – Boundary enforcement
- P5 – Traceability discipline
- P6 – Operational inheritance
We test structural binding. Not documentation volume.
What You Receive
- Framework Integrity Score (0–100)
- P1–P6 Binding Map
- Bias Exposure (P5-heavy vs P1–P4 grounded)
- Portfolio Amplification Estimate
- 10 Structural Weakness Indicators
- Executive Financial Exposure Summary
This is not a maturity model. It is a capital exposure instrument.
What This Is NOT
Not a TOGAF audit. Not a documentation review. Not a governance checklist. Not a template validation. It tests whether your framework reduces capital risk — or multiplies it.
Pricing
Typical governed portfolios: $100M–$300M.
The 3–5 Day Rating is positioned at less than 1% of governed capital exposure.
It costs a fraction of the inflation it prevents.
Why It Matters
Does your framework reduce complexity? Or does it formalize ambiguity across programs?
An anatomy-based framework produces a living decision model. That model reduces:
- Interpretation
- Change surface expansion
- Cross-team escalation
- Exit risk multiplier
Architecture Definition Framework Rating™ tests whether your framework produces artifacts, or an operational anatomy.
Weak framework → 10–30% lifecycle amplification. Strong framework → controlled change economics.
Across a 10-project portfolio, that difference is not theoretical. It is eight figures.