SaaS Retention
Churn Survey
Your churned customers will tell you the truth. Are you asking?
A churn survey is the fastest way to find out why customers cancel. Three questions, shown at the right moment, give you the data to fix what is actually driving churn.
No credit card required
What is a churn survey?
A churn survey is a short questionnaire sent to customers who have cancelled, downgraded, or stopped using your product. The goal is simple: find out why they left and what would have kept them.
Churned customers are uniquely honest. They have already made their decision, so there is no social pressure to be polite. That makes churn surveys one of the most direct feedback sources available to a SaaS team.
35-45%: response rate when the survey is shown in-flow during cancellation
3 questions: the optimal churn survey length
90 days: typical time to see measurable churn reduction after acting on the data
Churn survey vs exit survey
The terms are often used interchangeably, but they describe different scopes.
Churn Survey
- Broad concept covering all ways of understanding why customers leave
- Includes proactive surveys (before cancellation) and reactive surveys (at or after cancellation)
- Covers voluntary churn (the user decides to leave) and involuntary churn (payment failure)
- A full churn survey strategy spans multiple touchpoints across the user lifecycle
Exit Survey
- Specific tool - the survey shown at the moment of cancellation
- Reactive only - triggered when a user clicks cancel
- Highest response rate of any churn survey type (35-45% in-flow)
- One implementation within a broader churn survey strategy
Most SaaS teams start with an exit survey tool because it is the easiest to implement and has the highest response rate. A full churn survey strategy adds proactive inactivity surveys and post-churn win-back questionnaires on top of that.
Voluntary vs involuntary churn
Your churn survey strategy depends on which type of churn you are measuring. They need different survey approaches.
Voluntary churn
The user chose to leave
- Cancelled the subscription directly
- Downgraded to free or a lower tier
- Stopped using the product (passive churn)
Survey approach: Exit survey in the cancel flow, inactivity survey at 14-30 days of no login. This is where churn surveys have the most impact.
Involuntary churn
The user did not intend to leave
- Payment failure (expired card, insufficient funds)
- Subscription lapsed due to no action
- Account closed due to non-payment
Survey approach: A short post-lapse survey asking if they want to reactivate. Not a churn analysis survey - this is a recovery touchpoint. Fix the payment, then ask how you can help.
Churn survey questions
Grouped by purpose. A complete churn questionnaire picks one from each category - never more than 3-4 questions total.
Cancellation reason - always include, pick one
What's the main reason you're cancelling?
Multiple choice
When to use: The standard churn survey question. Works for most SaaS products. Always include "Other" with a text field to capture edge cases.
Which best describes why you decided to leave?
Multiple choice
When to use: Value-framed alternative. Use when you want to understand ROI perception rather than just feature gaps.
How would you describe your experience with us overall?
Multiple choice
When to use: Softer framing. Use for customers who have been with you a long time - feels less like an interrogation.
Open-ended follow-up - always include, pick one
What would have kept you as a customer?
Open text
When to use: Best general-purpose open question. Invites solution-oriented feedback rather than venting.
Is there one thing we could have done differently?
Open text
When to use: Forces prioritization. Better when you already have quantitative data and need the specific fix.
What's the most important thing we should improve?
Open text
When to use: Product-team friendly. Good for routing responses directly to a roadmap discussion.
Win-back intent - include when you want to recover churned users
If we fixed this, would you consider coming back?
Single choice
When to use: Identifies your win-back pool. Users who say yes or maybe are worth a personal email when you ship the fix.
Would you recommend us to a colleague, even though you're leaving?
Single choice
When to use: A soft positive signal. A yes here means they left for situational reasons - budget, timing - not product quality.
Competitive intel - optional, only show if they selected "switching to competitor"
What tool are you switching to?
Open text
When to use: The most valuable competitive data you can collect. Segment these responses separately from all other churn reasons.
What does that tool do better than us?
Open text
When to use: Follow-up to the above. Do not show this to everyone - only as a conditional follow-up when competitor is selected.
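The conditional display described above can be sketched as a small rules check. The question IDs, option labels, and `show_if` structure are illustrative assumptions, not a real survey API:

```python
# Sketch of conditional follow-up logic: the competitor questions are only
# shown when the cancellation reason is "Switching to a competitor".
# Question IDs and reason labels are illustrative, not a real schema.

SURVEY = [
    {"id": "reason", "text": "What's the main reason you're cancelling?",
     "type": "multiple_choice"},
    {"id": "competitor", "text": "What tool are you switching to?",
     "type": "open_text",
     "show_if": {"reason": "Switching to a competitor"}},
    {"id": "competitor_gap", "text": "What does that tool do better than us?",
     "type": "open_text",
     "show_if": {"reason": "Switching to a competitor"}},
]

def visible_questions(answers: dict) -> list:
    """Return the IDs of questions to display given the answers so far."""
    shown = []
    for q in SURVEY:
        cond = q.get("show_if")
        if cond is None or all(answers.get(k) == v for k, v in cond.items()):
            shown.append(q["id"])
    return shown
```

A respondent who picks "Too expensive" only ever sees the reason question; picking the competitor option reveals both follow-ups.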
Churn survey templates
Three sample churn questionnaires ready to use. Copy directly or adapt the questions for your product.
Template 1
Standard customer churn survey (3 questions)
What's the main reason you're cancelling?
Multiple choice
What would have kept you as a customer?
Open text
If we fixed this, would you consider coming back?
Single choice
Template 2
Proactive churn survey for inactive users
We noticed you haven't logged in for a while. What got in the way?
Multiple choice
What would make it worth logging back in?
Open text
Would a quick 15-minute setup call help? We'll get it on the calendar.
Single choice
Template 3
Post-churn win-back questionnaire (sent 30 days after cancel)
Now that some time has passed, what do you wish had been different about your experience?
Open text
Are you currently using another tool to solve the same problem?
Single choice
We've shipped some updates since you left. Would any of these bring you back?
Multiple choice
Can we reach out to share what's new?
Single choice
When to run a churn survey
Timing determines response rate and the quality of insight you get.
At cancellation
Highest priority. Show inline during the cancel flow - the moment they click "cancel account". This is the highest-response window (35-45%). Do not wait and email them afterwards.
After inactivity (14-30 days)
Proactive. Users who have not logged in for 2-4 weeks are churn risks. Send a short survey asking what got in the way. You can still save them.
After a downgrade
Often missed. A downgrade is a soft churn signal. Ask what drove the decision - pricing, feature access, or a usage drop. You will get different answers than from a full cancellation.
Post-churn follow-up (30 days after)
Win-back signal. One month after cancellation, send a brief email survey. Response rates are lower (8-12%), but the perspective is more considered - they know what they switched to.
Churn survey best practices
The difference between a 10% and 40% response rate is mostly timing and format.
Show inline, not via email
Trigger the survey inside the cancellation flow, not as a follow-up email. In-flow response rates are 35-45%. Post-cancel emails get 8-12%.
Keep it to 3 questions
Every additional question drops completion. Three questions is the practical maximum. The survey should feel fast, not like a punishment for leaving.
Always include an "Other" option
Your preset options will never cover everything. "Other" with a text field captures the edge cases - and those are often the most useful responses.
Confirm cancellation first
Show the churn survey after cancellation is confirmed, not as a blocker before it. Gating cancellation creates resentment, not honesty.
Wait 30+ responses before acting
Churn survey data is noisy at small samples. The first five cancellations tell you nothing reliable. Collect at least 30 responses per reason category before changing anything.
Revisit quarterly
Churn reasons shift as your product and customer mix changes. A reason that was #1 six months ago may have been fixed. Run the analysis fresh every quarter.
The 5 churn reasons - and what to do about each
Most SaaS churn concentrates in 3-5 categories. Here is what each reason actually means and how to act on it.
Too expensive (38% avg)
Price objection - but often a value communication problem. Users who churn for price rarely compared you to alternatives; they just stopped seeing the ROI.
Fix: Add a pause or downgrade option in the cancel flow. Reach out with ROI data for similar customers.
Missing features (27% avg)
A specific capability gap. This is the most actionable churn reason - it tells you exactly what to build next.
Fix: Log every feature request from churn surveys. When you ship it, email the churned users who asked for it.
Not using it enough (19% avg)
An onboarding or habit formation failure. The product did not become part of their workflow.
Fix: Improve activation. Add in-app prompts that bring users back to core value moments.
Switched to a competitor (11% avg)
A positioning gap. Ask the follow-up: which competitor? This is your most valuable competitive intelligence.
Fix: Add a follow-up question asking which tool they switched to. Use this to sharpen your differentiation.
Other / personal (5% avg)
Budget cuts, company shutdown, role change. Unavoidable - do not optimize for this segment.
Fix: Flag for win-back in 3-6 months. Circumstances change.
How to analyze churn survey results
Collecting the data is 10% of the work. Here is how to turn responses into lower churn.
Categorize and rank by frequency
Group all responses into reason buckets. Rank by frequency. The top reason gets fixed first - not the most interesting one, the most common one. Most SaaS teams scatter effort across five problems instead of fully solving one.
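The ranking step above fits in a few lines. The `reason` field and the sample responses are illustrative, not a real schema:

```python
from collections import Counter

# Illustrative survey responses; the "reason" field is an assumed name.
responses = [
    {"reason": "Too expensive"},
    {"reason": "Missing features"},
    {"reason": "Too expensive"},
    {"reason": "Not using it enough"},
    {"reason": "Too expensive"},
]

def rank_reasons(responses: list) -> list:
    """Reason buckets ranked by frequency - the top entry is what to fix first."""
    return Counter(r["reason"] for r in responses).most_common()
```

`rank_reasons(responses)` puts the most common reason first, which is the one that gets product effort - not the most interesting one.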
Segment by customer cohort
Break the data down: do startups churn for different reasons than mid-market? Do month-1 churners give different answers than month-6 churners? Early churn is usually an onboarding failure. Late churn is usually a value or feature gap. Different problems require different fixes.
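A minimal sketch of the cohort breakdown, assuming each response carries a `cohort` label (plan, company size, or tenure bucket - whatever you segment by); the field names are assumptions:

```python
from collections import Counter, defaultdict

# Illustrative: find the top churn reason per customer cohort.
# The "cohort" and "reason" fields are assumed names, not a real schema.

def top_reason_by_cohort(responses: list) -> dict:
    buckets = defaultdict(Counter)
    for r in responses:
        buckets[r["cohort"]][r["reason"]] += 1
    return {cohort: counts.most_common(1)[0][0]
            for cohort, counts in buckets.items()}
```

Comparing month-1 against month-6 cohorts this way is what separates an onboarding failure from a value gap.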
Separate fixable from unavoidable churn
"No longer need it" and "company shut down" are not fixable. "Too hard to use" and "missing feature X" are. Focus all product effort on fixable churn - it is the only category that responds to what you build. Track the ratio of fixable to unavoidable over time.
Build a win-back list and use it
Every user who answered "yes" or "maybe" to coming back is a win-back candidate. When you ship the fix they named, send a personal email - not a campaign blast, a personal note. "We built the thing you asked for. Thought you should know first." Win-back conversion from this approach runs 15-25%.
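A sketch of building that list, assuming each response stores the win-back answer, the churn reason, and an email; all field names are illustrative:

```python
# Illustrative response records; field names are assumptions, not a real schema.

def win_back_list(responses: list, shipped_fix: str) -> list:
    """Churned users who said yes/maybe to returning AND named the reason
    we just fixed - the people worth a personal email, not a campaign blast."""
    return [r["email"] for r in responses
            if r["would_return"] in ("yes", "maybe")
            and r["reason"] == shipped_fix]
```

Filtering on the specific reason you shipped a fix for is what makes the follow-up email personal rather than a blast.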
Share results with product, CS, and sales weekly
Exit data siloed in one person's dashboard helps nobody. Product needs it to prioritize. CS needs it to spot at-risk users earlier. Sales needs it to tighten messaging and handle the objections they will hear on the next call. A weekly Slack summary of top churn reasons is enough.
Ready to run your first churn survey?
Mapster has a pre-built exit and churn survey template - live in minutes.
Frequently asked questions
What is a churn survey?
A churn survey is a short questionnaire sent to customers who have cancelled, downgraded, or gone inactive. It asks why they left and what would have kept them. Churned customers have nothing to lose and will tell you the truth - making churn surveys one of the most honest feedback sources in SaaS.
What should I ask in a churn survey?
Ask three things: (1) the main reason for cancelling (multiple choice), (2) what would have kept them (open text), and (3) whether they would return if the issue was fixed. Keep it to 3 questions - response rates drop sharply beyond that.
When should I send a churn survey?
Show it inline during the cancel flow - the moment they click cancel. In-flow response rates are 35-45% vs 8-12% for post-cancellation email. Also consider proactive churn surveys for users inactive for 14-30 days, before they cancel.
What is a good churn survey response rate?
For in-product surveys shown during cancellation, 35-45% is healthy. For email surveys sent after cancellation, 8-12% is typical. In-flow surveys significantly outperform email because motivation to share feedback peaks at the moment of leaving.
How is a churn survey different from an exit survey?
The terms are often used interchangeably. Exit surveys are shown at cancellation (reactive). Churn surveys can also be proactive - sent to at-risk users before they cancel based on inactivity signals. Most SaaS teams run both.
How do I use churn survey data?
Group responses by churn reason, find the top 3, then segment by customer type. Do startups churn for different reasons than mid-market? Do month-1 churners differ from month-6 churners? Fix the top reason, track whether churn drops, then move to the next. Treat it as a product experiment.
Find out why customers churn. Then stop it.
Mapster shows your churn survey to the right user at the right moment - and links every response to their plan, role, and usage data so you can segment instantly.
Run a Churn Survey Free. No credit card required.