
A Named Methodology

Discover Before You Build

DX Discovery is a four-phase research methodology that helps platform teams understand developer needs, uncover genuine friction, and build with evidence rather than assumption.

Explore the Methodology

Why This Matters

Developer Experience shapes whether engineers adopt the platforms you build. But most platform teams design for capability, not experience. The result: underutilised platforms and frustrated engineers.

The Problem Space

The Challenges of Getting DX Right

Platform teams face a specific set of research and decision-making challenges. These are not generic product management problems; they are specific to the context of building for developers.

Building on Assumptions

Platform teams are experts in their own tools and forget what it felt like to be a newcomer. Their experience is the worst proxy for their users'.

Loudest Voice Wins

Without structured research, roadmaps are shaped by the most vocal stakeholders — not the most representative developer needs.

Low Adoption Mystery

Teams build platforms, launch them, and watch adoption stagnate — without knowing whether the problem is awareness, onboarding, or fundamental misfit.

Solving Symptoms

Without research, teams fix the bugs developers complain about, missing the deeper workflow issues that are the actual source of friction.

Too Late to Change

Discovery happens after launch, when design decisions are baked in and changing direction is expensive. Research is retrospective when it should be generative.

Fragmented Evidence

Support tickets, Slack messages, and one-off conversations exist but are never synthesised into a coherent picture of developer needs and pain points.

The Solution

How DX Discovery Helps

DX Discovery is not a silver bullet. It is a structured approach to the messiness of understanding developer needs — one that gives platform teams the confidence to make decisions and the evidence to defend them.

Build Confidence

Discovery gives platform teams the evidence base to defend prioritisation decisions. Instead of 'we think developers need this', you have 'here's what 18 developer interviews told us'.

Reduce Product Risk

The biggest risk in platform engineering is building the right thing in the wrong way — or the wrong thing entirely. Discovery de-risks both before significant investment is made.

Enable Strategic Alignment

Research findings give platform teams a shared language with stakeholders. Evidence-based insight is far more persuasive than intuition when negotiating for roadmap space.

The Framework

The DX Discovery Methodology

DX Discovery is a four-phase research methodology. Each phase builds on the previous one — moving from context and landscape through research, synthesis, and into experimentation. The result is an evidence-backed foundation for platform decisions.

Phase 1: Landscape

Before you research, you need to understand the terrain. Phase 1 is about mapping the developer ecosystem: who are the users, what platforms and tools do they interact with, where do the team's assumptions live, and what is in scope for this discovery?

The four flight levels model is a particularly useful lens in this phase. It helps the research team identify which level of the system the discovery question lives at — and whether they are looking at the right level at all.

The Four Flight Levels

  • Level 1 (Operational): day-to-day developer tasks and tools
  • Level 2 (Coordination): how teams work together and hand off work
  • Level 3 (Portfolio): prioritisation and sequencing across teams
  • Level 4 (Strategy): business and technical direction

Phase 2: Research

Phase 2 is where you gather evidence. DX Discovery uses a mixed-method approach — combining qualitative depth with quantitative breadth. The balance depends on what you know and what you need to know.

Watch Out: The False Consensus Trap

Platform teams are experts in their own tools. That expertise is also a blind spot. Because they understand the system deeply, they've forgotten what it feels like to encounter it as a newcomer. This is why platform teams are the worst proxy for their own users — and why structured research with actual developers is non-negotiable.

Qualitative Methods

  • Developer interviews
  • Contextual inquiry (observe developers working)
  • Usability testing
  • Diary studies
  • Expert interviews

Quantitative Methods

  • Usage analytics and telemetry
  • Support ticket analysis
  • Developer surveys (NPS, CSAT)
  • Funnel and drop-off analysis
  • A/B testing of developer flows
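As a concrete illustration of one quantitative method, funnel and drop-off analysis can be sketched as a small script over telemetry events. This is a minimal sketch, not part of the methodology itself: the funnel steps, event shape, and developer IDs below are all hypothetical.

```python
# Hypothetical onboarding funnel for an internal platform: the ordered
# steps a developer must complete. Step names are illustrative.
FUNNEL = ["signed_up", "cli_installed", "first_deploy", "second_deploy"]

def funnel_dropoff(events):
    """events: iterable of (developer_id, step_name) telemetry records.
    Returns {step: (developers_reaching_step, conversion_from_previous_step)}."""
    reached = {step: set() for step in FUNNEL}
    for dev, step in events:
        if step in reached:
            reached[step].add(dev)
    report, prev = {}, None
    for step in FUNNEL:
        n = len(reached[step])
        # Conversion rate relative to the previous step (1.0 for the first step).
        rate = n / len(reached[prev]) if prev and reached[prev] else 1.0
        report[step] = (n, round(rate, 2))
        prev = step
    return report

# Toy telemetry: four developers sign up, one makes it to a second deploy.
events = [
    ("a", "signed_up"), ("b", "signed_up"), ("c", "signed_up"), ("d", "signed_up"),
    ("a", "cli_installed"), ("b", "cli_installed"), ("c", "cli_installed"),
    ("a", "first_deploy"), ("b", "first_deploy"),
    ("a", "second_deploy"),
]
print(funnel_dropoff(events))
```

The output makes the drop-off visible per step, which is exactly the kind of quantitative signal that tells you where to point qualitative follow-up research.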

Early Exploration Research

When you don't know what you don't know

Use when: the problem space is unclear, the team has strong assumptions to challenge, or you're scoping a new discovery from scratch. Favour qualitative depth and open-ended methods.

Later Exploitation Research

When you need to validate and prioritise

Use when: you have strong hypotheses from qualitative research and need to test them at scale. Favour quantitative methods to validate frequency and impact.

Phase 3: Synthesis

Synthesis is where raw data becomes insight. It is the most cognitively demanding phase — and the most underinvested. Teams rush from research to solutions, skipping the step where the real learning happens.

The DX Advantage

In DX discoveries, developers are often more articulate about their workflow than typical product users. They can describe the exact moment friction occurs, what they expected, and what they did instead. This specificity makes synthesis richer and insights more actionable.

Phase 4: Experiment

Discovery doesn't end with a report — it ends with a decision about what to test. Phase 4 frames experiments that validate the most important hypotheses before full product investment is made.

Minimum Viable Evidence

The goal of an experiment is to validate or invalidate a hypothesis at minimum cost. Design experiments that answer the key question without building the full solution.

Define Success Before You Start

Every experiment needs a predetermined success criterion. Decide what 'validated' looks like before you begin — not after you see the results.

Timebox Ruthlessly

Experiments without deadlines drift into ongoing projects. Set a fixed window — one or two weeks — after which you commit to a decision: continue, pivot, or stop.
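The three principles above can be made concrete by writing the experiment down as a structured record before it starts. This is a hedged sketch under assumed names: the hypothesis, metric, and 70% threshold below are hypothetical, not prescribed by the methodology.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Experiment:
    hypothesis: str
    metric: str
    success_threshold: float   # decided BEFORE the experiment runs
    start: date
    timebox_days: int = 14     # fixed window; forces a decision at the end

    @property
    def deadline(self) -> date:
        return self.start + timedelta(days=self.timebox_days)

    def decide(self, observed: float) -> str:
        """Go/no-go against the predetermined criterion, not a post-hoc reading."""
        return "validated" if observed >= self.success_threshold else "invalidated"

# Illustrative example: all values here are made up for the sketch.
exp = Experiment(
    hypothesis="A templated CI config cuts pipeline setup below 30 minutes",
    metric="share_of_pilot_devs_finishing_setup_under_30_min",
    success_threshold=0.7,
    start=date(2024, 3, 4),
)
print(exp.deadline, exp.decide(observed=0.78))
```

Writing the threshold and deadline into the record before any data arrives is the point: it removes the temptation to reinterpret results after the fact.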

Inputs

  • Prioritised insight list from synthesis
  • Defined hypotheses with measurable success criteria
  • Stakeholder alignment on what to test
  • Developer cohort willing to participate
  • Baseline measurements for comparison

Outputs

  • Validated or invalidated hypotheses
  • Evidence-backed product recommendations
  • Prioritised backlog items with research provenance
  • Documented experiment results for future reference
  • Clear go/no-go decisions on proposed solutions

After Discovery

Post-Discovery Considerations

Discovery doesn't end when the synthesis deck is shared. Sustaining the value of research requires deliberate effort after the project closes. These considerations are open-ended by design — the right answer depends on your organisation.

A Different Perspective

AI-Powered Discovery

AI is changing both what platform teams discover and how they discover it. Developers are adopting AI coding tools rapidly, and this creates a new class of DX challenge: the experience of AI-assisted development is itself worth studying. Simultaneously, AI tools are becoming useful research accelerators within the discovery process.

The risk with AI-assisted discovery is over-automating the human parts. Transcription and thematic coding can be accelerated with AI. But the interpretation of context, the recognition of what's unsaid, and the judgment about what matters — these still require human insight. Use AI to speed up the mechanics. Preserve the human for the meaning.
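To make the "mechanics versus meaning" split concrete, here is a deliberately minimal sketch of the mechanical half of thematic coding: tagging interview snippets against a human-authored codebook. The themes and keywords are hypothetical, and real workflows would typically use an LLM or proper NLP tokenisation rather than naive substring matching; the codebook itself and the interpretation of the tagged results remain human work.

```python
# Hypothetical codebook written by the research team; theme names and
# keywords are illustrative only.
CODEBOOK = {
    "onboarding_friction": {"setup", "install", "getting started"},
    "docs_gaps": {"docs", "documentation", "example", "undocumented"},
    "ci_slowness": {"pipeline", "build", "slow"},
}

def tag_snippet(snippet: str) -> list[str]:
    """Return every theme whose keywords appear in the snippet.
    Naive substring matching: a stand-in for the automatable step."""
    text = snippet.lower()
    return sorted(theme for theme, kws in CODEBOOK.items()
                  if any(kw in text for kw in kws))

print(tag_snippet("The install docs had no working example for our setup"))
```

Even in this toy form the division of labour is visible: the machine applies the codebook at scale; deciding what belongs in the codebook, and what a tagged cluster actually means, stays with the researchers.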

A Note on AI Coding Tools

If your developer audience is using AI coding assistants, your platform's DX in that context is a distinct research question. How does your API, SDK, or tool behave in an AI-assisted workflow? Is your documentation designed for both human and AI consumption? These questions are worth adding to your discovery scope.

A Personal Note

DX Discovery is the methodology I wish I'd had earlier. Too many platform decisions I've been part of were made on intuition — confident intuition, backed by smart people, but intuition nonetheless. What surprised me when I first ran a structured discovery was not how much we learned, but how wrong some of our most confident assumptions turned out to be. The developers who struggled most were not the ones we expected. The friction points that mattered most were not the ones loudest on Slack. Research doesn't slow you down. It redirects your speed toward the right things.

Jacob Lueg Tiedemann


Product Builder

Where to Go Next

FAQ

Frequently Asked Questions

About DX Discovery

DX Discovery is a four-phase research methodology for understanding developer experience. It combines qualitative and quantitative research methods to uncover genuine developer needs, identify friction points in existing platforms or tools, and build the evidence base needed to make confident product decisions. It is designed specifically for the context of platform teams and internal developer products.

A full DX Discovery typically takes 6–10 weeks, depending on scope, team size, and organisational access. Phase 1 (landscape) takes 1–2 weeks, Phase 2 (research) takes 2–4 weeks, Phase 3 (synthesis) takes 1–2 weeks, and Phase 4 (experimentation framing) takes 1–2 weeks. Focused discoveries — scoped to a specific tool or workflow — can be completed in 3–4 weeks.

DX Discoveries are best run by a small cross-functional team: a researcher or DX practitioner to design and facilitate the research, a product manager or platform lead to connect findings to strategy, and an engineer or tech lead to interpret technical context. The team should have enough separation from day-to-day platform work to approach findings with curiosity rather than defensiveness.

Methodology

DX Discovery uses a mixed-method approach. Qualitative methods include developer interviews, contextual inquiry (observing developers while they work), usability testing, and diary studies. Quantitative methods include usage analytics, support ticket analysis, survey data, and telemetry from developer tools. The balance between qual and quant depends on the phase: early exploration favours qualitative depth; later validation benefits from quantitative breadth.

The false consensus trap occurs when a team assumes their own experience of a tool or platform is representative of all developer experience. Because platform teams are experts in their own tools, they are the worst proxy for their users. They've developed workarounds, have insider knowledge, and have long forgotten what it felt like to be a newcomer. DX Discovery explicitly guards against this by centring developer voices through structured research rather than assumptions.

The four flight levels model is a framework used in Phase 1 of DX Discovery to map the developer landscape at different levels of abstraction. Level 1 (operational) covers day-to-day developer tasks and tools. Level 2 (coordination) covers how teams work together and hand off work. Level 3 (portfolio) covers how the organisation prioritises and sequences work. Level 4 (strategy) covers the business and technical direction. Understanding which level a problem lives at helps prioritise where discovery research should focus.

Applying the Methodology

DX Discovery is valuable at several points: before building a new platform or major feature (to build on evidence rather than assumptions), when adoption of an existing platform is lower than expected, when support ticket volume is high and the root cause is unclear, when a platform team has grown rapidly and lost touch with developer needs, or when a significant architectural change is planned and you want to understand the human impact. In general: run discovery before you build, not after you've committed.

AI is both a subject of DX Discovery (understanding how developers use AI tools and where friction exists) and a tool within the discovery process itself. AI tools can accelerate interview analysis, help identify patterns across large transcript sets, and support synthesis. However, AI doesn't replace the need for genuine human contact with developers — the insight comes from understanding context and motivation, which requires qualitative depth that AI currently cannot fully replicate.