Product Management · Intermediate

Discovery Interview Prep

Design a customer discovery interview plan with the right methodology, questions, and bias guardrails — calibrated to your research goal, segment, and constraints via 4 adaptive questions.

15 minutes
By community
#discovery #customer-interviews #mom-test #jtbd #qualitative-research #bias-avoidance

You have a two-week deadline, 5 interviews scheduled, and you're about to ask "Would you use this if we built it?" — which produces exactly zero usable insight. Good interviews aren't about scripts. They're about matching methodology to goal, then asking only about past behavior.

Who it's for: PMs running customer discovery, UX researchers designing interview studies, founders validating early problem hypotheses, PMMs investigating competitive switching, Heads of CS running churn investigations, anyone about to bias 10 interviews

Example

"I need to interview 5 enterprise customers about why they churned in the last 90 days" → Q1 retention/churn + Q2 recent churners + Q3 limited access + Q4 switch interviews → Full interview guide with 5 questions, 5 biases to avoid, 5 success criteria, and a recruiting plan

CLAUDE.md Template

New here? 3-minute setup guide → | Already set up? Copy the template below.

# Discovery Interview Prep

Guide product managers through preparing for customer discovery interviews by asking adaptive questions about **research goals**, **customer segments**, **constraints**, and **methodologies**. Output: a tailored interview plan with methodology, question framework, and success criteria.

This is not a script generator—it's a strategic prep process that ensures interviews yield actionable insights rather than confirmation bias or surface-level feedback.

## The Discovery Interview Prep Flow

1. Gather product/problem context
2. Define research goals (what you're trying to learn)
3. Identify target customer segment and access constraints
4. Recommend interview methodology (JTBD, problem validation, switch interviews, etc.)
5. Generate interview framework with questions, biases to avoid, and success metrics

## What This Is NOT

- **Not a user testing script** — Discovery = learning problems; testing = validating solutions
- **Not a sales demo** — Don't pitch; listen and learn
- **Not surveys at scale** — Deep qualitative interviews (5–10 people)

## Application (4 Adaptive Questions)

### Step 0: Gather Context

**For an existing or planned product:**
- Problem hypothesis or product concept
- Target customer segment (if known)
- Existing research (support tickets, churn data, user feedback)
- Product website or positioning materials
- Key assumptions you're trying to validate

**For investigating an existing problem:**
- Customer complaints, support tickets, churn reasons
- Hypotheses about why customers leave or struggle
- Competitive alternatives customers switch to

### Question 1: Research Goal

1. **Problem validation** — Confirm problem exists and is painful enough (new product ideas)
2. **Jobs-to-be-Done discovery** — Understand what customers are trying to accomplish and why current solutions fail
3. **Retention/churn investigation** — Figure out why customers leave or don't activate
4. **Feature prioritization** — Validate which problems/features matter most

### Question 2: Target Customer Segment

1. **People who experience the problem regularly** — E.g., freelancers who invoice 5+ clients/month
2. **People who've tried to solve it** — E.g., users who tried 2+ competitors and churned
3. **People in target segment (regardless of awareness)** — Uncover latent needs
4. **People who've recently experienced the problem** — Fresh memory, last 30 days

### Question 3: Constraints

1. **Limited access** — 5–10 customers, 2-week timeline
2. **Existing customer base** — 100+ active customers, easy recruitment
3. **Cold outreach required** — No existing customers; recruit via LinkedIn, ads, communities
4. **Internal stakeholders only** — Sales/support as proxy (pragmatic but less ideal)

### Question 4: Interview Methodology

Recommended options, adapted to Q1–Q3:

1. **Problem validation interviews (Mom Test style)** — Past behavior, not hypotheticals. "Tell me about the last time you [experienced problem]." (Best for: validating if problem is real and painful)
2. **Jobs-to-be-Done (JTBD) interviews** — Focus on accomplishment, not wants. "What were you trying to get done? What alternatives did you consider?" (Best for: understanding motivations, switching behavior)
3. **Switch interviews** — Customers who recently switched. "What prompted you to look? What pushed you away from the old solution, and what pulled you to the new one?" (Best for: competitive positioning, unmet needs)
4. **Timeline/journey mapping interviews** — Walk through experience chronologically (Best for: uncovering full context)
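
The Q1 → Q4 defaults above can be sketched as a simple lookup. This is an illustrative sketch only — the goal keys, function name, and methodology strings are assumptions, not part of any fixed API, and in practice Q2–Q3 answers should also shape the choice:

```python
# Illustrative mapping from Q1 research goal to a default methodology.
# Keys and strings are assumptions for this sketch, not a fixed schema.
METHODOLOGY_BY_GOAL = {
    "problem_validation": "Problem validation interviews (Mom Test style)",
    "jtbd_discovery": "Jobs-to-be-Done (JTBD) interviews",
    "churn_investigation": "Switch interviews",
    "feature_prioritization": "Timeline/journey mapping interviews",
}

def recommend_methodology(goal: str) -> str:
    """Return the default methodology for a Q1 research goal."""
    if goal not in METHODOLOGY_BY_GOAL:
        raise ValueError(f"unknown research goal: {goal!r}")
    return METHODOLOGY_BY_GOAL[goal]

print(recommend_methodology("churn_investigation"))  # Switch interviews
```

This mirrors the worked example above: a churn investigation with recent churners lands on switch interviews by default.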

## Generated Interview Plan Template

```markdown
# Discovery Interview Plan

**Research Goal:** [From Q1]
**Target Segment:** [From Q2]
**Constraints:** [From Q3]
**Methodology:** [From Q4]

## Interview Framework

### Opening (5 minutes)
- Build rapport: "I'm researching [topic]. This isn't a sales call."
- Set expectations: "No right answers. Critical feedback helps most."
- Get consent: "Okay if I take notes / record?"

### Core Questions (30–40 minutes)

Example for Mom Test-style problem validation:

1. **"Tell me about the last time you [experienced problem]."** — Specific, recent behavior
   - Follow-up: "What were you trying to accomplish? What made it hard? What did you try?"
   - Avoid: "Would you use a tool that solves this?" (leading, hypothetical)

2. **"How do you currently handle [problem]?"** — Reveals workarounds, alternatives, pain intensity
   - Follow-up: "How much time/money does that take? What's frustrating?"
   - Avoid: "Don't you think that's inefficient?" (leading)

3. **"Walk me through step-by-step."** — Uncovers details, edge cases
   - Follow-up: "What happened next? Where did you get stuck?"
   - Avoid: "Was it hard?" (yes/no)

4. **"Have you tried other solutions?"** — Competitive landscape, unmet needs
   - Follow-up: "What did you like/dislike? Why did you stop?"
   - Avoid: "Would you pay for a better solution?" (hypothetical)

5. **"If you had a magic wand, what would change?"** — Ideal outcomes (treat with skepticism)
   - Follow-up: "Why does that matter? What would that enable?"
   - Avoid: Taking feature requests literally

### Closing (5 minutes)
- Summarize: "Just to recap, I heard [key insights]. Did I get that right?"
- Ask for referrals: "Know anyone else who experiences this?"
- Thank them.

## Biases to Avoid

1. **Confirmation bias** — Don't ask "Don't you think X is a problem?" → "Tell me about your experience with X."
2. **Leading questions** — Don't ask "Would you use this?" → "What have you tried? Why did it work/fail?"
3. **Hypothetical questions** — Don't ask "If we built Y, would you pay?" → "What do you currently pay for? Why?"
4. **Pitching disguised as research** — Don't say "We're building Z to solve X" → "I'm researching X."
5. **Yes/no questions** — Don't ask "Is invoicing hard?" → "Walk me through your invoicing process."

## Success Criteria

✅ You hear specific stories, not generic complaints
✅ You uncover past behavior, not hypothetical wishes
✅ You identify patterns across 3+ interviews
✅ You're surprised by something (if everything confirms, you're leading)
✅ You can quote customers verbatim

## Interview Logistics

**Recruiting:**
- Limited access: reach out to 20–30 to get 5–10 interviews (33% response rate typical)
- Existing customers: email 50 with $50 gift card incentive

**Scheduling:**
- 45–60 minutes (30–40 min conversation + buffer)
- Record with consent or take detailed notes
- Max 2–3 per day (synthesis time needed)

**Synthesis:**
- Write key insights immediately after each interview
- After 5 interviews, look for patterns
```
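
The recruiting math above (contact roughly 3× your target; ~33% response rate is typical for cold outreach) can be sketched as a quick calculation. The function name and default rate are illustrative — the rate is an estimate from this playbook, not a law:

```python
import math

def outreach_needed(target_interviews: int, response_rate: float = 0.33) -> int:
    """People to contact to land the target interview count, assuming the
    ~33% cold-outreach response rate cited above (an estimate)."""
    if not 0 < response_rate <= 1:
        raise ValueError("response_rate must be in (0, 1]")
    return math.ceil(target_interviews / response_rate)

print(outreach_needed(5))   # 16 contacts for 5 interviews
print(outreach_needed(10))  # 31 contacts for 10 interviews
```

If your channel converts worse (warm intros vs. cold LinkedIn), lower `response_rate` and recruit accordingly.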

## Common Pitfalls

1. **Asking what customers want** — "What features do you want?" gives feature requests, not problems. Fix: ask about past behavior.
2. **Pitching instead of listening** — 20 min explaining your idea → polite obligation. Fix: don't mention solution until last 5 min, if at all.
3. **Interviewing wrong people** — Friends, family, non-sufferers → polite feedback, not real insights. Fix: interview regular, recent experiencers.
4. **Stopping at 1–2 interviews** — Confirmation bias. Fix: interview 5–10 minimum; look for patterns.
5. **Not recording insights** — Memory fades, patterns missed. Fix: record with consent or detailed notes; synthesize immediately.
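
The "patterns across 3+ interviews" check can be sketched as a small tally over tagged interview notes. The tag names and data shape below are illustrative assumptions; the threshold mirrors the 3+ rule above:

```python
from collections import Counter

def recurring_patterns(interview_tags, threshold=3):
    """Pain points tagged in at least `threshold` separate interviews.
    Each inner list is one interview's tags (deduplicated per interview)."""
    counts = Counter(tag for tags in interview_tags for tag in set(tags))
    return sorted(tag for tag, n in counts.items() if n >= threshold)

# Hypothetical tags from four interviews:
notes = [
    ["manual-invoicing", "late-payments"],
    ["manual-invoicing", "tax-confusion"],
    ["manual-invoicing", "late-payments"],
    ["late-payments"],
]
print(recurring_patterns(notes))  # ['late-payments', 'manual-invoicing']
```

Anything below the threshold ("tax-confusion" here) is a candidate to probe in later interviews, not yet a pattern.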

## References

- Rob Fitzpatrick, *The Mom Test* (2013) — How to ask good questions without biasing answers
- Clayton Christensen, *Jobs to Be Done* — Methodology for understanding motivations
- Teresa Torres, *Continuous Discovery Habits* (2021) — Opportunity solution tree interviews
README.md

What This Does

Walks through 4 adaptive questions — research goal, target segment, constraints, methodology — and produces a tailored interview plan: opening/core/closing questions, biases to avoid, success criteria, and recruiting logistics. Matches methodology (Mom Test, JTBD, switch interviews, journey mapping) to your specific situation.

Not a script generator. A strategic prep process.


Quick Start

mkdir -p ~/Documents/DiscoveryPrep
mv ~/Downloads/CLAUDE.md ~/Documents/DiscoveryPrep/
cd ~/Documents/DiscoveryPrep
claude

Provide your problem hypothesis, target segment (if known), existing research (support tickets, churn data), and constraints. Claude runs the 4 questions and delivers the plan.


Four Methodology Options

  • Problem validation (Mom Test style) — Past behavior for new product ideas
  • Jobs-to-be-Done (JTBD) — Motivations and switching behavior for strategy
  • Switch interviews — Push/pull analysis for competitive positioning
  • Timeline/journey mapping — Chronological walkthrough for full context

The Five Biases to Avoid

  1. Confirmation bias — "Don't you think X is a problem?" → "Tell me about your experience with X."
  2. Leading questions — "Would you use this?" → "What have you tried? Why did it work/fail?"
  3. Hypothetical questions — "If we built Y, would you pay?" → "What do you currently pay for? Why?"
  4. Pitching disguised as research — "We're building Z to solve X" → "I'm researching X."
  5. Yes/no questions — "Is invoicing hard?" → "Walk me through your invoicing process."

Success Criteria

  • Specific stories, not generic complaints ("Last Tuesday, I spent 3 hours..." vs. "Invoicing is annoying")
  • Past behavior, not hypothetical wishes
  • Patterns across 3+ interviews (same pain points emerge independently)
  • Surprises (if everything confirms assumptions, you're leading)
  • Verbatim customer quotes

Tips & Best Practices

  • Record with consent. Memory fades; verbatim quotes are gold.
  • Max 2–3 interviews per day. Synthesis time matters as much as interview time.
  • Reach out to 3× your target. 33% response rate is typical for cold outreach.
  • Don't pitch. Don't mention your solution until the last 5 minutes, if at all.
  • Interview 5–10 minimum. Stop at saturation (same patterns across 3+), not at convenience.

Common Pitfalls

  • Asking what customers want (gets feature requests, not problems)
  • Pitching instead of listening (polite obligation, not honest feedback)
  • Interviewing friends/family/non-sufferers (polite feedback, no insight)
  • Stopping at 1–2 interviews (confirmation bias, tiny sample)
  • Not recording or synthesizing immediately (details lost, patterns missed)
