Strategic Thinking Exercise Series - Week 2

Jun 30, 2025 11:11 pm



This is part 2 of our 9-part tabletop exercise series. If you're new, start with part 1 and read chronologically. Each week we spotlight three exercises from our new PRISM Strategic Tabletop Exercise Guide Deck: 27 decision-making tools that teams can run in real-time, in real rooms, on real challenges.





The Hidden Trap of Premature Solutions

Last week, we learned that speed reveals but doesn't cure fuzzy thinking. Your team can now speed test decisions and loop rapidly through feedback cycles. But here's the trap that catches even disciplined teams: solving the wrong problem efficiently.


This challenge persists across organisations. McKinsey's survey of 2,207 executives found that only 28% believe the quality of strategic decisions in their companies is generally good, with 60% reporting that bad choices are about as frequent as good ones. More revealing: when researchers dug deeper, 73% of failed strategic decisions traced back to problem definition errors, not execution failures.


A 2021 Frontiers in Psychology study examining COVID-19 policy decisions found that time pressure consistently led to narrow problem framing—viewing complex challenges through overly simplified lenses. Instead of framing pandemic response as broad societal well-being (including economy, mental health, and social justice), most policymakers framed it narrowly as "avoiding coronavirus deaths," leading to unintended consequences across multiple dimensions.


Consider the classic corporate example: "We need to increase customer satisfaction scores." Sounds clear, measurable, actionable. But satisfaction scores measure sentiment, not behaviour. Teams optimise for surveys while customers quietly defect to competitors offering better value propositions. The problem wasn't poorly executed customer satisfaction initiatives—it was solving for the wrong outcome entirely.


The cognitive trap mechanism: Once teams commit cognitive resources to a problem frame, they become invested in solutions that fit that frame rather than questioning whether the frame accurately captures the strategic challenge. Behavioural economists call this "solution aversion"—the tendency to evaluate problems based on whether we like the implied solutions rather than whether we've correctly diagnosed the underlying issue.


The Precision Sequence Architecture

This week, we sharpen the strategic lens before solutions multiply. Three systematic lenses—Decision Framing, Outcome Tree, and Cynefin Framework—align intent, consequences, and terrain. Think of them as macro focus, ripple analysis, and domain diagnosis on the same strategic camera.


Why this specific progression works: Frame first (anchor effort to measurable purpose), map consequences second (reveal hidden stakes and system dynamics), diagnose complexity third (choose appropriate methods for Week 3's option generation). Each exercise exposes different failure modes while building toward domain-appropriate methodology selection.


The meta-principle: Precision prevents waste. Every hour spent getting the decision definition right saves dozens of hours later by steering effort away from solutions that treat symptoms rather than root causes. Through structured decision clarification, systemic consequence mapping, and domain-appropriate methodology selection, this sequence eliminates the most expensive strategic errors before they consume resources.



🎲 Decision Framing – Strategic Anchoring




What: Five-step systematic framework for clarifying strategic decisions through explicit definition, objective setting, alternative generation, constraint identification, and systematic evaluation.


Why: Most strategic failures trace back to fuzzy decision definitions where teams agree on actions but have entirely different theories about objectives, alternatives, and success criteria. This framework forces clarity before resource allocation.


When to deploy: Strategy sessions, project charters, performance reviews, budget allocation discussions, any initiative with unclear boundaries or competing stakeholder interpretations.


The Five-Step Strategic Framework

Step 1: Define the Decision. Clearly state the specific decision that needs to be made. Avoid compound decisions—break complex choices into discrete decision points. Use the present tense and active voice.

Examples:

  • Poor: "We need to improve our customer situation"
  • Better: "Should we invest $2M in customer service automation platform for Q2 deployment?"


Step 2: List Objectives and Criteria. Identify what success looks like and how it will be measured. Separate must-have objectives from nice-to-have criteria. Weight criteria by strategic importance.

Framework:

  • Primary objectives (non-negotiable outcomes)
  • Secondary objectives (valuable but tradeable)
  • Success metrics (measurable indicators)
  • Timeline constraints (decision and implementation deadlines)


Step 3: Outline Alternatives. Generate distinct options that could address the decision. Aim for 4-6 alternatives spanning different approaches, not minor variations of the same concept.

Alternative categories:

  • Status quo (do nothing)
  • Incremental improvement (optimise current approach)
  • Significant change (new methodology or system)
  • Transformational shift (fundamental redesign)


Step 4: Consider Constraints. Identify limitations that eliminate specific alternatives or impose requirements on the implementation approach.

Constraint types:

  • Resource limitations (budget, personnel, time)
  • Technical constraints (system capabilities, integration requirements)
  • Regulatory requirements (compliance, legal, governance)
  • Organisational constraints (culture, capability, change capacity)


Step 5: Evaluate Options. Systematically assess each alternative against objectives and criteria using a structured methodology rather than intuitive preference.
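A spreadsheet works fine for this step, but making the arithmetic explicit helps. Here is a minimal weighted-scoring sketch in Python; the criteria, weights, and scores are illustrative placeholders, not figures from any real decision, and the 100-point weights anticipate the allocation method described under the facilitation notes below.

```python
# Illustrative weighted-scoring matrix for Step 5 (placeholder numbers only).
criteria_weights = {      # 100 points allocated across criteria by the team
    "roi": 40,
    "time_to_deploy": 30,
    "user_disruption": 30,
}

# Each alternative is scored 1-5 against every criterion (5 = best).
alternatives = {
    "status_quo":   {"roi": 1, "time_to_deploy": 5, "user_disruption": 5},
    "incremental":  {"roi": 3, "time_to_deploy": 4, "user_disruption": 4},
    "new_platform": {"roi": 5, "time_to_deploy": 2, "user_disruption": 2},
}

def weighted_score(scores: dict) -> float:
    """Weight-averaged score on the original 1-5 scale."""
    total = sum(criteria_weights.values())
    return sum(criteria_weights[c] * s for c, s in scores.items()) / total

for name, scores in sorted(alternatives.items(),
                           key=lambda item: weighted_score(item[1]),
                           reverse=True):
    print(f"{name:13s} {weighted_score(scores):.2f}")
```

The value is not the arithmetic but the forcing function: weights and scores become visible artefacts the team can challenge, instead of implicit preferences hiding inside an intuitive ranking.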


Why This Five-Step Progression Works

The sequence forces teams through systematic thinking rather than intuitive leaps. Step 1 prevents compound decision confusion. Step 2 surfaces hidden objective conflicts before they derail implementation. Step 3 ensures sufficient alternatives to avoid false choice scenarios. Step 4 identifies real versus perceived limitations. Step 5 creates explicit evaluation discipline rather than political decision-making.


The strategic insight: Most teams jump from Step 1 (decision definition) directly to Step 5 (evaluation) without systematically considering objectives, alternatives, and constraints. This creates evaluation chaos where teams argue about unstated assumptions rather than explicit trade-offs.


Professional Facilitation Methodology

Individual preparation phase: Before group session, each participant completes Steps 1-3 independently. This prevents groupthink and reveals assumptions about decision scope and alternatives.


Criteria weighting exercise: Use 100-point allocation method where team assigns points to each objective based on strategic importance. Forces explicit trade-off discussions and reveals hidden priority conflicts.


Constraint validation protocol: Challenge each constraint with "Is this truly unchangeable or just currently unchanged?" Many perceived constraints are actually choices that can be revisited with sufficient strategic justification.


Alternative stress-testing: For each option, ask "What would have to be true for this to be the best choice?" This reveals hidden assumptions and helps identify conditions that favour different alternatives.


Connection to Week 1: Remember your Eisenhower Matrix distinction between urgent and important? Decision Framing's objective-setting step answers "important to what end?" while the constraint identification connects to your Speed Test's optimal risk/speed balance. Your Week 1 compressed decisions now get systematic strategic anchoring.


Advanced Business Applications

SaaS platform expansion decision:

Step 1 - Decision: "Should we build workflow automation features into our existing CRM platform?"

Step 2 - Objectives: Primary (increase enterprise customer LTV by 25%), Secondary (reduce customer churn, improve competitive positioning). Criteria: ROI > 20%, implementation within 6 months, minimal disruption to current users.

Step 3 - Alternatives: Build in-house, acquire automation startup, partner with existing provider, outsource development, abandon expansion.

Step 4 - Constraints: $3M budget cap, engineering team at capacity, Q4 customer conference deadline, API integration requirements.

Step 5 - Evaluation: In-house build scores highest on strategic control but lowest on timeline feasibility. Partnership scores highest on speed-to-market but creates vendor dependency risk.


Organisational restructuring decision:

Step 1 - Decision: "How should we organise our product development teams to improve time-to-market?"

Step 2 - Objectives: Primary (reduce feature delivery time by 40%), Secondary (improve cross-team communication, increase developer satisfaction). Criteria: implement without layoffs, maintain code quality standards, and preserve institutional knowledge.

Step 3 - Alternatives: Functional teams by expertise, cross-functional squads by product area, matrix organisation, outsource non-core development, hybrid model.

Step 4 - Constraints: Union agreements limit reorganisation scope, key personnel are unwilling to relocate, customer commitments require continuity, and regulatory compliance requires separation of duties.

Step 5 - Evaluation: Cross-functional squads optimise for speed but make it harder to sustain specialised expertise. The matrix model preserves expertise but creates coordination complexity.


Common Framing Failure Modes

Decision complexity: Trying to frame compound decisions as single choices. "Should we expand to new markets and reorganise our sales team?" requires separate decision frames for market expansion and organisational design.


Objective inflation: Listing too many primary objectives, creating impossible trade-offs. More than three primary objectives usually indicates unclear strategic priorities requiring executive alignment before decision framing.


Alternative anchoring: Generating variations of the preferred solution rather than truly different approaches. All alternatives requiring the same skills, budget, or timeline suggest insufficient creative thinking.


Constraint assumption: Accepting limitations that could be changed with sufficient strategic justification. "We can't hire" might be true for the current budget, but false if the business case justifies investment.


Evaluation bias: Using criteria that favour predetermined preferences. Teams often unconsciously weight criteria to justify intuitive choices rather than systematic assessment.


System Integration Insights

Connection to Week 1 Speed Test: Your optimal speed/risk scenario becomes a constraint for decision framing. High-risk scenarios need more conservative alternatives; low-risk scenarios can accept experimental options.


Bridge to Outcome Tree: Your defined decision and selected alternative become the root node of consequence mapping. Complex decisions with many alternatives typically generate complex outcome trees requiring careful analysis.


Diagnostic value: Teams that struggle with decision definition usually struggle with strategy execution. If you can't frame the choice clearly, you can't implement the solution systematically.


Evaluation preparation: Well-framed decisions with clear criteria and alternatives set up Week 4's systematic evaluation exercises. Poor framing creates evaluation chaos where teams argue about unstated assumptions rather than explicit trade-offs.


🎲 Outcome Tree – Ripple View




What: Visual mapping of desired and feared outcomes branching three layers deep from your framed decision, revealing system dynamics and unintended consequences before they manifest.


Why: Second-order effects (negative press, culture shifts, competitor responses) kill more projects than budget overruns. Third-order effects (how those responses interact) determine whether strategic initiatives create lasting competitive advantage or temporary tactical gains.


When to deploy: Before significant investments, organisational changes, market entries, partnership decisions, any decision with broad stakeholder impact or system-wide implications.


The Three-Layer Ripple Methodology

Layer 1 - Immediate Effects: Direct, predictable consequences of your framed decision. These usually happen within days or weeks and are largely under your control.


Layer 2 - System Response: How other actors (customers, competitors, employees, partners) react to Layer 1 effects. These emerge over weeks to months and are influenced but not controlled by your actions.


Layer 3 - Interaction Effects: How Layer 2 responses interact with each other and create emergent properties. These develop over months to years and often surprise teams who only planned for Layers 1 and 2.
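If the team wants to capture the map digitally as well as on the wall, the three layers translate directly into a small tree structure. A minimal sketch in Python follows; the outcome labels are invented placeholders, and layer 0 here stands for the framed decision itself.

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    """One node in the Outcome Tree."""
    label: str
    layer: int        # 0 = framed decision, 1 = immediate, 2 = system response, 3 = interaction
    desired: bool     # True = green sticky note, False = red
    children: list = field(default_factory=list)

# Placeholder branch rooted in a framed decision.
root = Outcome("Deploy customer-service automation", layer=0, desired=True, children=[
    Outcome("Routine tickets resolved automatically", layer=1, desired=True, children=[
        Outcome("Agents fear redundancy", layer=2, desired=False, children=[
            Outcome("Experienced staff leave before handover", layer=3, desired=False),
        ]),
    ]),
])

def tally_sentiment(node, tally=None):
    """Count green (desired) vs red (feared) outcomes, skipping the decision node."""
    tally = tally if tally is not None else {"green": 0, "red": 0}
    if node.layer > 0:
        tally["green" if node.desired else "red"] += 1
    for child in node.children:
        tally_sentiment(child, tally)
    return tally

print(tally_sentiment(root))   # {'green': 1, 'red': 2}
```

The same structure makes the diagnostic checks later in this section (red-cluster density, layer counts) trivial to compute.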


Advanced Facilitation Techniques

Individual generation protocol: Each participant spends 5 minutes generating outcomes independently using sticky notes—green for desired, red for feared. This prevents groupthink and captures diverse perspectives on system dynamics.


Clustering and categorisation: Group similar outcomes into themes (financial, operational, cultural, competitive, regulatory). This reveals which domains have dense consequence clusters requiring special attention.


Causal arrow mapping: Draw connections between outcomes across layers. Use solid arrows for direct causation, dashed arrows for probable influence. This exposes reinforcing loops where effects amplify each other.


Probability and impact weighting: Score each outcome cluster 1-5 for likelihood and potential impact. Focus mitigation planning on high-probability/high-impact combinations while monitoring low-probability/high-impact scenarios.
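Tabulating those scores takes only a few lines. Below is a minimal sketch in Python, with invented cluster names and scores, that ranks clusters by the likelihood-times-impact product and flags low-probability/high-impact scenarios for monitoring:

```python
# Illustrative outcome clusters scored 1-5 for likelihood and impact.
clusters = [
    {"name": "employee resistance",  "likelihood": 4, "impact": 4},
    {"name": "vendor lock-in",       "likelihood": 3, "impact": 5},
    {"name": "regulatory attention", "likelihood": 1, "impact": 5},
    {"name": "training overrun",     "likelihood": 4, "impact": 2},
]

for c in clusters:
    c["priority"] = c["likelihood"] * c["impact"]

# Mitigate high-probability / high-impact clusters first; put
# low-probability / high-impact clusters on a monitoring list.
for c in sorted(clusters, key=lambda c: c["priority"], reverse=True):
    tag = "  <- monitor" if c["likelihood"] <= 2 and c["impact"] >= 4 else ""
    print(f'{c["name"]:20s} priority={c["priority"]:2d}{tag}')
```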


Professional Application Examples

Enterprise software acquisition:

  • Layer 1: Integration costs, training requirements, license savings
  • Layer 2: Vendor dependency concerns, employee resistance, competitor platform switching
  • Layer 3: Industry standardisation pressure, talent market effects, innovation pace impacts


Outcome Tree insight: Dense red clusters in Layer 2 around employee adoption revealed the need for change management investment that wasn't in the original budget planning.


Market expansion strategy:

  • Layer 1: Revenue opportunity, operational complexity, capital requirements
  • Layer 2: Competitive response, regulatory attention, brand perception shifts
  • Layer 3: Market structure evolution, partnership ecosystem effects, customer expectation changes


Strategic revelation: Layer 3 analysis showed expansion would accelerate industry consolidation, potentially creating regulatory scrutiny affecting the entire business model.


Diagnostic Pattern Recognition

Red cluster warnings: Dense concentrations of feared outcomes in specific domains signal where your frame needs risk-first approaches. More red than green outcomes suggest the frame might be too aggressive for current organisational capability.


Reinforcing loop identification: When feared outcomes in Layer 2 create conditions that amplify other feared outcomes in Layer 3, you've found systemic risk that requires fundamental frame adjustment, not just tactical mitigation.


Emergence detection: When Layer 3 outcomes seem disconnected from your original frame, you're entering complex systems territory where intended consequences become less predictable and unintended consequences become more likely.


System Dynamics Integration

Balancing loops: Outcomes that create natural limiting factors (success leading to resource constraints, growth creating coordination challenges). These require scaling solutions built into the original frame.


Reinforcing loops: Outcomes that accelerate other outcomes (market success attracting competitors, talent attraction improving capability). These create exponential rather than linear effects, requiring different resource planning.


Delay effects: Outcomes that create consequences with significant time lags (training investments improving productivity 12 months later, partnership decisions affecting competitive position 18 months out). These demand different measurements and patience than immediate-effect decisions.


Integration With Strategic Planning

Resource allocation insight: Outcome Trees often reveal that success scenarios require different resource profiles than initially planned. Layer 2 competitive responses might demand marketing investment; Layer 3 regulatory attention might require legal capability.


Timeline calibration: Complex Outcome Trees with many interdependencies signal longer implementation timelines than simple cause-effect chains. Use this insight to set realistic expectations and intermediate milestones.


Success redefinition: Sometimes, the Outcome Tree analysis reveals that your original success criteria capture only Layer 1 effects while Layer 2 and 3 effects determine actual strategic value. This can lead to frame revision before proceeding.


🎲 Cynefin Framework – Diagnosis




What: Systematic categorisation of problems as Simple, Complicated, Complex, or Chaotic based on cause-and-effect relationship clarity, determining optimal methodology for approach and resource allocation.


Why: The problem domain determines the optimal methodology. Linear project plans fail in Complex domains where emergent solutions can't be predicted. Experimentation wastes time and resources in Simple domains where best practices already exist.


When to deploy: Strategic planning, crisis response, innovation projects, organisational design, technology selection, any situation requiring methodology choice between analysis, experimentation, or immediate action.


The Four-Domain Diagnostic Framework

Simple Domain - Best Practice Application:

  • Cause and effect relationships are apparent to everyone
  • Solutions are well-known and proven
  • Examples: Payroll processing, basic compliance, established operational procedures
  • Methodology: Sense → Categorise → Respond with established procedures


Complicated Domain - Expert Analysis Required:

  • Cause and effect relationships exist but require analysis or expertise to discern
  • Multiple correct answers exist, but expertise can identify the best one
  • Examples: Engineering design, financial modelling, technical architecture decisions
  • Methodology: Sense → Analyse → Respond with expert recommendations


Complex Domain - Emergent Practice Development:

  • Cause and effect relationships are only clear in retrospect
  • Solutions emerge through experimentation and adaptation
  • Examples: Culture change, market innovation, ecosystem development
  • Methodology: Probe → Sense → Respond through iterative learning


Chaotic Domain - Crisis Management:

  • No clear cause-and-effect relationships
  • Immediate action is required to stabilise the situation
  • Examples: System failures, market crashes, organisational crises
  • Methodology: Act → Sense → Respond to establish stability, then move to another domain


Professional Diagnostic Methodology

Four-question assessment protocol:

  1. Are cause-and-effect relationships immediately apparent to most stakeholders?
  2. Can expert analysis reliably predict outcomes?
  3. Do similar situations typically require different approaches?
  4. Is immediate action required to prevent system collapse?
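One way to turn the four answers into a starting hypothesis is a simple rule chain. The ordering below (crisis first, then obviousness, then expert predictability) is our reading of the protocol rather than canonical Cynefin guidance, and the sketch is meant to seed discussion, not replace it.

```python
def suggest_domain(apparent: bool, expert_predictable: bool,
                   varies_each_time: bool, crisis: bool) -> str:
    """First-pass Cynefin placement from the four assessment questions.

    The arguments mirror questions 1-4 above; the team's discussion,
    not this heuristic, makes the final call.
    """
    if crisis:                                       # Q4: act first to stabilise
        return "Chaotic"
    if apparent:                                     # Q1: cause and effect obvious
        return "Simple"
    if expert_predictable and not varies_each_time:  # Q2 and Q3
        return "Complicated"
    return "Complex"                                 # patterns clear only in retrospect

# Each participant answers independently, then disagreements get discussed.
print(suggest_domain(apparent=False, expert_predictable=False,
                     varies_each_time=True, crisis=False))   # -> Complex
```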


Team consensus building: Have each member independently place the problem on the Cynefin framework, then discuss disagreements. Disagreements often reveal that different aspects of the challenge belong in different domains, requiring hybrid approaches.


Boundary condition analysis: Many strategic challenges span multiple domains. Use the framework to decompose complex challenges into domain-specific components requiring different methodological approaches.


Advanced Business Applications

Digital transformation initiative:

  • Technical infrastructure changes: Complicated (expert analysis)
  • Cultural adoption patterns: Complex (experimentation)
  • Regulatory compliance elements: Simple (best practices)
  • Crisis management during outages: Chaotic (immediate response)


Domain insight: Treating the entire initiative as Complicated led to over-planning in Complex areas and under-preparation for Chaotic scenarios. The hybrid approach applied an appropriate methodology to each component.


Product development challenge:

  • Manufacturing processes: Simple (established procedures)
  • Performance optimisation: Complicated (engineering analysis)
  • Market fit discovery: Complex (customer experimentation)
  • Competitive response management: Complex (adaptive strategy)


Strategic revelation: The initial frame assumed a Complicated domain (hire experts, create detailed plans). Cynefin diagnosis revealed primarily Complex territory requiring probe-sense-respond experimentation rather than analyse-then-execute planning.


Methodology Selection Consequences

Simple domain misdiagnosis: Applying experimentation to Simple problems wastes resources and delays obvious solutions. Teams spend months "testing" approaches when established best practices already exist.


Complicated domain confusion: Treating Complicated problems as Simple leads to poor decisions when expertise is actually required. Treating them as Complex leads to unnecessary experimentation when analysis can provide answers.


Complex domain over-analysis: The most common and expensive error. Teams try to analyse their way to solutions in Complex domains, creating detailed plans for unpredictable territory. This delays learning and action while providing false confidence.


Chaotic domain procrastination: Trying to analyse or experiment during crises when immediate action is required to prevent system collapse. This can turn manageable crises into organisational disasters.


Domain Migration Strategies

Chaos to Simple: After immediate crisis response, establish procedures to prevent recurrence. Move from reactive response to proactive best practice application.


Complex to Complicated: As experimentation reveals patterns, codify successful approaches into expert knowledge. What starts as an emergent practice becomes an analysable methodology.


Complicated to Simple: As expert solutions prove consistently effective, systematise them into standard procedures that don't require specialised knowledge.


Dynamic domain assessment: Some challenges migrate between domains based on external conditions. Monitor domain shifts and adjust methodology accordingly, rather than maintaining approaches that no longer fit the current reality.


Integration With Previous Exercises

Decision Framing connection: If framing discussions took longer than 15 minutes, you're likely in Complex territory, where the problem definition requires experimentation rather than analysis.


Outcome Tree correlation: Dense, interconnected Outcome Trees with many reinforcing loops typically indicate Complex domains where small changes can have large, unpredictable consequences.


Methodology bridge to Week 3: Complex domains demand robust option generation because you can't predict which approach will work. Complicated domains need fewer options but deeper analysis. Simple domains need efficient execution of known solutions.







The Precision Transformation System

Before state: Teams rushing toward solutions based on assumptions, optimising for metrics that don't matter, applying inappropriate methodologies to strategic challenges.


After state: Anchored intent (measurable strategic direction), mapped consequences (system-aware planning), domain-appropriate methods (right tool for right context), ready for systematic option generation.


The compound effect: These three exercises reveal strategic misalignment before it becomes expensive. If your decision frame points to simple execution but your outcome tree shows complex interdependencies, you've caught a fundamental strategic error before resources are committed.


Measurable Transformation Indicators

Frame quality metrics: Team members can independently recite the decision frame and explain each element (target: 100% consistency).


Consequence awareness: Percentage of primary outcomes identified in advance versus discovered during implementation (target: 80% identification rate).


Domain accuracy: Alignment between predicted methodology requirements and actual approaches needed during execution (target: 90% methodology match).
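Tracked over a few decision cycles, these three indicators reduce to simple ratios. A minimal sketch in Python, with counts invented purely for illustration:

```python
# Hypothetical counts from a post-implementation review.
frame_recitals_consistent = 6     # members who restated the decision frame identically
team_size = 6

outcomes_identified_upfront = 12  # primary outcomes mapped in the Outcome Tree
outcomes_total_observed = 15      # including those discovered during delivery

methodology_matches = 9           # workstreams where the predicted domain held
workstreams_total = 10

print(f"Frame quality:         {frame_recitals_consistent / team_size:.0%}  (target 100%)")
print(f"Consequence awareness: {outcomes_identified_upfront / outcomes_total_observed:.0%}  (target 80%)")
print(f"Domain accuracy:       {methodology_matches / workstreams_total:.0%}  (target 90%)")
```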


Common Integration Failure Modes

Frame-tree misalignment: Decision frames that seem simple but generate complex outcome trees indicate scope mismatch. Either narrow the frame or expand the methodology expectations.


Tree-domain contradiction: Complex outcome trees requiring Simple domain methodologies suggest either over-analysis of straightforward decisions or under-appreciation of strategic complexity.


Domain-execution gap: Cynefin diagnosis pointing to Complex domains, but the team is proceeding with Complicated domain project plans. This predicts implementation surprises and timeline overruns.


Your Week 2 Implementation Challenge

Run this precision sequence on a real decision your team is facing. Choose something substantial—a strategic initiative, significant investment, or organisational change with meaningful stakes and stakeholder impact.


Execution Protocol

Session structure: 90 minutes total

  • Decision Framing: 35 minutes (5 decision definition, 10 objectives/criteria, 10 alternatives, 5 constraints, 5 preliminary evaluation)
  • Outcome Tree: 30 minutes (5 individual, 15 mapping, 10 analysis)
  • Cynefin Diagnosis: 25 minutes (15 assessment and discussion, 10 methodology selection)


Success Criteria Checklist

Decision Framing validation:

  • Decision clearly stated as specific choice requiring resolution
  • Primary and secondary objectives identified with success metrics
  • 4-6 distinct alternatives generated spanning different approaches
  • Resource, technical, regulatory, and organisational constraints documented
  • Evaluation criteria weighted by strategic importance with team consensus


Outcome Tree completion:

  • All three layers populated with specific outcomes
  • Causal relationships mapped between layers
  • Risk/opportunity balance assessed and documented
  • Reinforcing loops identified and mitigation strategies considered


Cynefin Domain clarity:

  • Domain placement agreed upon by team with clear reasoning
  • Methodology implications understood and accepted
  • Resource and timeline expectations calibrated to domain requirements
  • Success measurement approach matches domain characteristics


Diagnostic Signals and Responses

Positive indicators:

  • Decision clarity: When team debates focus on objective weighting rather than decision definition, you're achieving strategic alignment.
  • Tree diagnostic: Balanced green/red distribution suggests realistic optimism. More red notes indicate a need for risk-first approaches.
  • Domain signal: If Cynefin diagnosis feels obvious after mapping the Outcome Tree, you've achieved clarity about both problem and approach.


Warning signals:

  • Rapid framing: If decision definition completes in under 15 minutes, challenge assumptions about scope and alternatives more aggressively.
  • Simple trees: If Outcome Tree has fewer than 15 outcomes across all layers, broaden perspective on system impacts.
  • Domain disagreement: If team can't agree on domain placement after discussion, decompose the challenge into components that might belong in different domains.


The Strategic Bridge to Creative Explosion

Precision achieved. Problem domain diagnosed. Strategic intent anchored to measurable outcomes. System consequences mapped and understood. Now you need sufficient creative breadth to avoid premature convergence on familiar solutions.


The logic of progression: Week 1 taught rapid decision-making within calculated risk parameters. Week 2 taught precision problem definition before solution generation. Week 3 will teach a systematic option explosion that respects both speed requirements and strategic precision.


Domain-specific preparation: If your Cynefin diagnosis points to Complex domains, prepare mentally for Week 3's creative explosion. Complex problems need diverse options because you can't analyse your way to the answer. Simple domains need fewer but more thoroughly analysed options. Complicated domains need expert-generated alternatives that can be rigorously evaluated.


The creative imperative: Research from Stanford's d.school shows that teams who generate fewer than 20 solution concepts before evaluation are 65% more likely to select familiar approaches that provide incremental rather than breakthrough results. Week 3 provides three systematic methods for reaching this creative threshold while maintaining strategic discipline.


Next Tuesday's focus: From a narrowed lens to a systematic option explosion—three exercises that respect introvert voices while generating creative volume that survives contact with reality. We'll explore Brainstorm Better (structured ideation), 1-2-4-All (inclusive creativity), and Alternative Futures (scenario-based innovation).


Because the first idea that fits your frame is rarely the best idea that could fit your frame.










Ready to master systematic strategic thinking? Get the complete PRISM Strategic Tabletop Exercise Guide with all 27 exercises, facilitation guides, and advanced techniques. Transform your team's decision-making in real-time, in real rooms, on real challenges.


Get the full stack PRISM Tabletop Exercise set here for AUD 40, and unlock all 27 exercises.
