
How to interview customers: the complete guide for startup founders

Interviewing customers is a little bit like taking vitamins or brushing your teeth. We all know it's good for us, but we don't like doing it. Here's how to change that.

Last updated: April 2026 | Originally published: November 2022 

Interviewing customers is a bit like taking vitamins. Everyone knows it's good for them. Almost nobody actually does it.

Customer interviews feel uncomfortable. Confronting. You're not sure what to ask, you're not sure what you'll hear, and without a clear sense of when and how to run these conversations, you end up with opinions instead of insights. But opinions won't help you build anything real.

CB Insights analysed 431 VC-backed companies that shut down since 2023. Their top root cause of failure, cited in 43% of cases: poor product-market fit. Building something customers didn't actually need.

Customer interviews are the cheapest insurance policy in your startup toolkit. One honest conversation with the right person can tell you something that six months of building cannot.

This guide covers exactly when to use them, how to run them well, what the research actually says about getting reliable insight – plus the mistakes that cause most founders to walk away more confused than when they started. ⬇️

The founders who move fastest talk to customers first

Customer interviewing is the practice of talking directly to potential customers before and during product development to validate whether a real problem exists and whether people will pay to solve it. Done well, a single conversation can surface insights that months of building cannot, saving founders time, money, and runway.

One good conversation with the right person can literally save your business.

Get clear on who your customer is, what problems keep them up at night, and how motivated they actually are to solve them – and you'll quickly know which of your key assumptions are wrong before you've built anything. 

You get a free reality check. You save months of runway. You know what to build and what to leave out.

The problem isn't lack of motivation for customer insight. It's that no one teaches you how to run a useful customer conversation. The natural instinct (share your idea, gauge the reaction, look for enthusiasm) produces the wrong kind of data.

Rob Fitzpatrick called this out in The Mom Test: people don't give you bad feedback because they're dishonest. They give it because they want to be kind. Your warmest contacts are your least reliable sources. They'll tell you what you want to hear, and that's far more dangerous than hearing nothing at all.

Customer interviewing has a methodology: here's what it is

Customer development is a methodology created by Steve Blank, formalised in his 2003 book The Four Steps to the Epiphany. It treats every startup's early assumptions as hypotheses to be tested through direct customer conversations before scaling. Rob Fitzpatrick later made the practice accessible in The Mom Test, which teaches founders how to ask questions that produce honest, actionable insight rather than polite validation.

Know the theory, run better interviews. Here's the two-decade intellectual lineage behind the method.

The Customer Development model

Steve Blank formalised the practice in The Four Steps to the Epiphany (2003), introducing what he called the Customer Development model. 

His core insight was deceptively simple: startups are not small versions of large companies. Large companies execute known business models. Startups search for one. 

That search requires getting out of the building and systematically testing hypotheses against reality before committing resources to building. As Blank put it: "No business plan survives first contact with customers."

His four-step model – Customer Discovery, Customer Validation, Customer Creation, Company Building – established the sequence most successful startup teams follow today.

The Mom Test

Rob Fitzpatrick then made the methodology accessible with The Mom Test, translating Blank's academic framework into practical conversational technique. 

The book's core insight: the problem isn't that founders talk to customers, it's how. Founders ask the wrong questions in the wrong way and end up with data that's misleading.

Fitzpatrick identified three types of bad data that routinely derail early-stage companies: 

  • Compliments ("that's a great idea!")
  • Hypothetical fluff ("I would definitely use that")
  • And wishlists ("it would be cool if it also did X"). 

None of these are a business signal. All of them feel like one.

The methodology that came out of this body of work is now used by every major accelerator (Y Combinator, Techstars, Seedcamp) as foundational curriculum. 

Customer discovery sits at an interesting intersection. It's not quite market research – you're not surveying for preferences. It's not quite UX research – you don't have a product yet. Understanding the difference is key: Market research vs UX research →

When to interview customers

Startups that skip customer interviews before building risk spending months developing a product nobody needs. CB Insights found that 43% of startup failures trace back to poor product-market fit – the direct result of building before validating. Customer interviews replace assumptions with evidence before a single line of code is written.

There are three types of risk every startup faces: financial, product, and market. Since product-market fit is the thing most likely to kill you, let's focus on the last two – and crucially, how they shape what kind of interviewing you should be doing.

High market risk: you need discovery

Market risk shows up in three questions about your customers:

  • Do they actually want this?
  • Will they pay for it?
  • Are there enough of them?

Ventures with high market risk are either entering a new category or solving an existing problem in a new way. 

If you can't answer those three questions with confidence, customer interviews are your primary tool – specifically discovery interviews, focused on understanding the problem space before you've defined the solution.

Discovery interviews happen before you have a product. Their goal is to understand how people experience a problem today, not to gather feedback on your solution. You're testing assumptions, not pitching features.

One thing founders tend to get wrong here: they start doing discovery interviews after they've already built something. At that point, you're not doing discovery, you're doing user testing. It's a valuable activity, but it answers a different question. ⚠️

One thing that complicates customer interviews in practice: it's rarely clear who owns them. Is this a product management job? A design job? A founder job? The answer has real consequences for what you learn and what gets built: Product design vs product management →

High product risk: you need validation

Product risk is different. It's the risk that you know what to build but you're not sure you can actually deliver it with the resources available. 

If your main selling point is a significantly better or cheaper solution than what already exists, your risk is execution – not demand.

Here, customer interviews shift from problem discovery to commitment testing. You're not asking ❌ "does this problem exist?" You're asking ✅ "will you pay to solve it, right now, at this price?" The goal is securing specific commitments (pilots, letters of intent, pre-orders), not gathering more opinions.

Knowing which type of risk dominates shapes how you spend your time. Most early-stage founders have both, but one tends to dominate, and misdiagnosing it is expensive.

The right time to interview customers depends on the type of risk your startup faces. If you're unsure whether a market exists for your idea, interview before you build – this is customer discovery. If you have a product but need to test whether people will pay, interview to validate commitment. Most early-stage founders need both, in that order.

How to run a customer interview

An effective customer interview follows four principles: talk about the customer's life rather than your idea; ask about specific past behaviours rather than future hypotheticals; keep the interview separate from any sales pitch; and actively seek out bad news rather than confirmation. 

Once you understand the method, customer interviewing becomes one of the most rewarding things you'll do as a founder. Most of the discomfort comes from not knowing what you're doing. Here's what a genuinely useful interview looks like. ⬇️

Rule 1: Talk about their life, not your idea

Your goal in a customer interview is not to validate your idea. It's to understand how people behave and what genuinely motivates them.

What matters is understanding your customer's reality – what a problem actually costs them, how they're currently solving it (or not), and whether it registers as a problem worth solving at all.

The test Rob Fitzpatrick uses: could your mom answer these questions honestly without lying to protect your feelings? Questions about the customer's life (not your product) pass the test. Questions about whether your idea is good do not.

Fitzpatrick's recommended opening: "What's the hardest part of doing [the thing you help with]?" Let the answer take you somewhere. You'll learn more from one well-placed follow-up than from ten leading questions.

Rule 2: Ask about specifics in the past, not generics about the future

People are optimistic about what they'll do in the future. They also tell you what they think you want to hear. Neither of those things is useful.

Speculative questions – ❌ "Would you use something like this?" or ❌ "How much would you pay for that?" – produce unreliable answers almost every time. 

Instead, ask about specific situations that have already happened:

✅ "Walk me through the last time you encountered this problem."

✅ "What did you do? Did you search for a solution?"

✅ "Have you ever paid for something to fix this?"

The specifics tell you far more than the generics. If someone can't remember a specific recent instance of the problem, that's data. If they paid for a solution that didn't work, that's very different data. The details reveal what the generics obscure.

Airbnb learned this the hard way. Growth didn't change when Brian Chesky and Joe Gebbia asked hosts if they liked Airbnb as an idea. It changed when they went to New York in 2009, met hosts in person, and dug into a specific, concrete problem: listing quality. After personally photographing around 40 apartments, revenue in the New York market roughly doubled within a month. The insight didn't come from a survey. It came from observing a specific friction up close.

Drew Houston didn't discover Dropbox's core insight by surveying people about file storage. He discovered it by living the problem himself – forgetting his USB drive on a bus and being unable to work for hours. He started building during that journey. 

Later, when activation was broken, the team recruited people from Craigslist for $40 each and watched them attempt a simple task: go from a Dropbox invite email to sharing a file. Zero of five succeeded. That usability testing session produced a list of 80+ friction points they worked on fixing. The insight didn't come from asking hypothetical questions. It came from watching what people did, and couldn't do.

Rule 3: An interview is not a pitch, but when you pitch, demand a clear signal

A good interview is about listening, not talking. Keep the two modes completely separate. Interviews are for learning. Pitches are for selling.

This is hard when you're excited about what you're building. The temptation to share your vision, to see the person's face light up, is powerful and dangerous. The moment you start pitching, you've stopped learning.

When you do pitch, don't settle for "that's interesting" or "keep me updated." Those are not signals. Ask directly: "If this solves the problem you just described, I can take your order right now – would you like to proceed?" Or at minimum: "Would you be willing to introduce me to the decision-maker in your company?"

A clear "no" is worth more than a warm "maybe." Warm maybes feel like progress. They are not – a business cannot be built on compliments.

Rule 4: Actively pursue bad news

The most valuable thing a customer interview can do is tell you something you're wrong about, while it's still cheap to change course.

Founders who treat interviews as validation exercises consistently over-invest in the wrong direction. Founders who treat them as hypothesis-testing engines iterate faster, waste less money, and arrive at genuine product-market fit sooner. 👏

Before each round of interviews, write down the three most important things you want to learn. What assumptions, if wrong, would require you to fundamentally rethink what you're building? Those are the questions worth prioritising. Let the answers evolve as you learn.

The structure of a useful 30-minute interview

A customer interview can be completed in 30 minutes using a simple structure: spend the first three minutes setting expectations and clarifying you are not selling; the next 15–20 minutes going deep on one specific past incident where the problem occurred; five to seven minutes quantifying the impact; and the final few minutes asking for a concrete next step or commitment.

Interviews don't fail because of what's asked. They fail because of structure. Here's a framework that consistently yields useful signal:

Minutes 0–3: Set the frame

Make it explicit that you're not selling anything and that the most useful thing the person can do is tell you what's hard, not what's nice. 

💬 "I'm not here to pitch you, I'm trying to understand a problem and I'd love your honest reaction. There's no right answer."

Minutes 3–20: Go deep on one specific incident

Ask for one real situation where they encountered the problem you're exploring. 

💬 "Can you walk me through the last time this came up?" 

Then dig: "What happened next? What did you try? What did you have to work around? Who else was involved?"

Minutes 20–27: Quantify the impact

  • How much time did this cost? 
  • Was there a financial impact? 
  • What would have happened if it hadn't been resolved?

Numbers make vague pain specific and actionable.

Minutes 27–30: Ask for the next step

Whether that's a follow-up call, an introduction, or an expression of commitment – finish with a forward action, not just thanks. The close of every interview should produce something concrete.

One additional practice from The Mom Test: wherever possible, have two people attend. One leads the conversation, the other takes notes. The lead stays in listening mode. The notetaker catches things the lead misses and keeps notes grounded in actual quotes, not interpreted summaries. This matters because a single person interpreting all customer feedback creates an information bottleneck that can distort what the whole team believes.

How to find the right people to interview

The most valuable people to interview are early adopters – people who have the problem you're solving, know they have it, and are actively seeking a solution. They can be found in online communities, industry forums, and LinkedIn groups organised around the problem. The quickest sourcing method: ask every interviewee who else you should speak to.

Interview quality matters far more than interview quantity. But finding the right interviewees is often an issue.

The people worth talking to are early adopters, not average users. Early adopters are people who: have the problem you're solving, know they have it, and are actively looking for a solution. If they're not actively seeking a solution, they're polite at best and misleading at worst.

The signal you're looking for: if three of the last five people you talk to are actively seeking a solution to the problem you're solving, you've found something worth pursuing.
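If you track interviews as you go, that "three of the last five" signal is easy to keep honest with a running tally rather than a gut feel. A minimal sketch – the function name and sample data below are illustrative, not from the source:

```python
from collections import deque

def seeking_signal(interview_flags, window=5, threshold=3):
    """Return True when enough recent interviewees were actively
    seeking a solution (the 'worth pursuing' signal).

    interview_flags: booleans in interview order – was this person
    actively looking for a solution to the problem?
    """
    last = deque(interview_flags, maxlen=window)  # keep only the last `window`
    return sum(last) >= threshold

# Hypothetical log of seven interviews, oldest first
interviews = [False, True, False, True, True, False, True]

# Last five are [False, True, True, False, True] -> 3 active seekers
print(seeking_signal(interviews))  # True
```

The point of writing it down (in a spreadsheet or a script) is discipline: you decide the threshold before the interviews, so enthusiasm can't move the goalposts afterwards.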

How to find users to interview

  • Start with people you can reach through your existing network, then use those conversations to get warm introductions. One useful interview often yields several more. End every conversation with: 💬 "Is there anyone else you think I should talk to?"
  • Find communities where people congregate around the problem – forums, LinkedIn groups, industry events, Slack communities, subreddits. People who are posting about a problem are self-selected early adopters.
  • For B2B, customer support queues of adjacent products are gold. People who've already paid for an imperfect solution are among the most valuable people you can speak to.
  • Industry conferences, even virtually, can yield 10–20 valuable conversations in a few hours when you're not presenting, just having conversations.

Segment deliberately

❌ "Startups" is not a customer segment. 

❌ "B2B SaaS founders" is barely one. 

A strong segment definition includes a clear role, a context or environment, and a constraint that shapes behaviour. 

✅ "UK-based SaaS founders at seed stage managing a product team of under five people" is a segment. The narrower you go, the more comparable your interview data becomes, and the cleaner the patterns that emerge.

How many interviews do you actually need?

For early-stage customer discovery, five to ten interviews per customer segment is typically enough to surface reliable patterns. Customer discovery is qualitative research, not statistical sampling. The signal that you have done enough: when consecutive interviews stop producing genuinely new problems and the same core frustrations keep emerging without new dimensions.

"How many interviews should I do?" is one of the most common questions founders ask. The honest answer is: fewer than you think, but more qualitatively rigorous than most founders manage.

Customer discovery is qualitative research, not statistical sampling. You're looking for patterns, not percentages. The number usually lands around 5–10 well-chosen interviews per segment before patterns begin to solidify.

The qualifier is critical: per segment. Mixing interviews across very different customer types produces confusion. And if your third interview contradicts your second completely, you probably haven't narrowed your segment enough.

A useful practical signal: you've done enough interviews for a given phase when you stop hearing genuinely new problems. When the fifth consecutive person raises the same core frustration with no new dimensions, you have the signal you need. 👌

Interviews are one input, not the whole picture. If you want a faster, more systematic way to validate before committing to a build, we've put together the approach we use with founders at every stage: The systematic approach to product validation →

What comes after: how to use interviews to measure product-market fit

Once you have active users, product-market fit can be measured using the Sean Ellis test: ask users "how would you feel if you could no longer use this product?" with options of very disappointed, somewhat disappointed, or not disappointed. If 40% or more answer "very disappointed," you likely have product-market fit. Below that threshold, continued iteration is needed before scaling growth.

Customer interviews don't stop once you have a product. They evolve.

The Sean Ellis test

Once you have active users, the industry standard methodology for measuring product-market fit is the Sean Ellis test – a single survey question: "How would you feel if you could no longer use this product?" with three options: very disappointed, somewhat disappointed, not disappointed.

After benchmarking nearly a hundred startups, Ellis found a consistent pattern: companies that achieved sustainable growth almost always had 40% or more of users answer "very disappointed." Companies below that threshold consistently struggled to grow.
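The threshold check itself is simple arithmetic: count the share of respondents choosing "very disappointed" and compare it to 40%. A minimal sketch, with hypothetical survey data:

```python
from collections import Counter

def sean_ellis_score(responses):
    """Share of users answering 'very disappointed' to
    'How would you feel if you could no longer use this product?'."""
    counts = Counter(responses)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return counts["very disappointed"] / total

# Hypothetical results from 50 active users
responses = (
    ["very disappointed"] * 22
    + ["somewhat disappointed"] * 18
    + ["not disappointed"] * 10
)

score = sean_ellis_score(responses)
print(f"{score:.0%} very disappointed")  # 44% very disappointed
print("Likely PMF" if score >= 0.40 else "Keep iterating")  # Likely PMF
```

The nuance Superhuman's story adds: don't just compute the overall score – segment the respondents and recompute per segment, because the number that matters is the score among the users you intend to build for.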

The Sean Ellis test tells you how people feel about your product. Analytics tell you what they actually do. Before you hit product-market fit, most teams are tracking the wrong numbers entirely: The metrics that matter pre-PMF →

Rahul Vohra applied this at Superhuman with striking results. When they first ran the survey, their score was 22%, well below the threshold. By segmenting users to identify who found the product genuinely indispensable, then focusing product development exclusively on increasing that group, they got to 58% within months. That clarity (derived from systematically asking users a single honest question) transformed their roadmap and their trajectory.

The connection back to interviews: when you run the survey and want to understand why users answer as they do, the interview is still your best tool. Surveys tell you what. Conversations tell you why.

💡 Interviews tell you who to build for. Onboarding is your first chance to prove you listened. If your activation rate is stuck, the problem is usually upstream of the product – it's in how you bring people to value. How to nail your PLG onboarding →

The do and don't list

✅ Do:

  • Ask "why" repeatedly about specific past behaviours – not opinions, but actions
  • Dig into emotional signals: "Why does this bother you so much?" gets to motivation that "what's the problem?" never reaches
  • Ask whether they've ever searched for a solution, or paid for one. This is the best proxy for real pain
  • When someone requests a feature, ask what problem that feature would solve, not just what the feature would do
  • Take detailed notes with actual quotes; emotion-laden quotes in particular are hard to lie to yourself about later
  • Send a two-sentence follow-up after each interview summarising what you heard – it confirms you listened and surfaces corrections

❌ Don't:

  • Ask "would you use this?" or "would you pay for this?" – hypothetical future behaviour is almost always wrong
  • Pitch without asking for a specific commitment
  • Interview people who already support you, or who want to be encouraging
  • Treat compliments as demand signals – they are not
  • Do all the interviews yourself without the rest of the team ever talking to customers. This creates a single point of interpretation failure
  • Over-prepare with a rigid script; the best interviews follow threads, not questionnaires

Good interviews don't give you one clear answer. They give you ten competing signals, all of which feel urgent. Deciding what to act on first is a separate skill, and one worth having a framework for: How to prioritise when everything feels urgent →

Final thoughts

Discovery interviews tell you about the problem space. Validation interviews test commitment. User interviews gather product feedback. Mixing these up – running what feels like a discovery interview but interpreting it as validation – is where the most dangerous false positives come from.

AskTina, a live video chat widget for websites, failed to validate its business idea through real customer interviews before building. They launched with over 10,000 page loads and couldn't generate a single paid consultation. Had they talked directly to the people they were trying to serve, they might have discovered that their target audience didn't want a live video call option at all. 🤦

That's the thing about customer interviews. They feel optional until they're the thing you wish you'd done six months ago.

You've done the interviews. You have the insight. Now the question is how to translate what you heard into product decisions that hold up. That's where user-centred design comes in: The founder's guide to user-centred design →

Milosz Falinski is Co-Founder and UX & Strategy Lead at Lumi Studio – a product design studio working with startups from pre-seed to Series B. Since 2015, Lumi has partnered with 52+ startups helping them design products that find and keep customers. Their clients have gone on to serve 40M+ users, achieve 4 exits, and reach a combined valuation of $1.4B+.

Frequently asked questions

When is the right time to start interviewing customers? 

Before you write a line of code. Customer interviews are most valuable at the hypothesis stage, when you still have the option to change course without burning runway. Steve Blank's core principle: the facts reside outside your building.

How many customer interviews do I actually need? 

For early-stage discovery, 5–10 well-chosen interviews per customer segment is typically enough to surface reliable patterns. You're doing qualitative research, not building a statistically significant dataset. The signal you're looking for: when you stop hearing genuinely new problems, you have enough to act on.

What's the difference between a discovery interview and a validation interview?

Discovery interviews happen before you have a product: their goal is to understand the problem space. Validation interviews happen once you have a product or prototype – their goal is to test whether your specific solution fits the problem you've identified. Running them out of sequence (validation before discovery) is one of the most expensive mistakes in product development.

What if people say they'd use my product but never actually sign up?

That gap between stated intent and actual behaviour is one of the most reliable signals in early-stage validation. It usually means the problem isn't painful enough to motivate action, the friction in your signup is too high, or the people you're speaking to aren't your actual early adopters. Rob Fitzpatrick's fix: stop asking what people would do, and start asking what they've actually done.

What's the biggest mistake founders make in customer interviews?

Pitching instead of listening. The second biggest: interviewing people who are predisposed to be supportive rather than honest. The third: treating a conversation as validation when it was really just polite engagement.

How do I know when I have product-market fit?

The Sean Ellis test is the most reliable leading indicator: ask active users "How would you feel if you could no longer use this product?" If 40% or more say "very disappointed," you're in the zone. Below that, you're building on hope. Superhuman went from 22% to 58% by using this metric to drive every product decision for several months, and it became their most important number.
