Demo Case Study: This is a fictional example created to demonstrate Mapster's PMF analysis methodology. The company, data, and insights are illustrative examples showing how segmented PMF measurement works in practice.
A demo case study on systematic Product Market Fit measurement across customer segments, geographies, and acquisition channels
B2B Project Management SaaS
Mappbook had solid topline growth but struggled with 8% monthly churn. They didn't know which customer segments actually loved their product, why expansion into Europe was underperforming, or what was driving the gap between power users and churned customers.
"We're growing, but are we actually building something people love? Or just something people kind of use?"
– Andrew Walker, Co-Founder & CEO
Well below the 40% threshold for strong Product Market Fit
The segmented analysis revealed the real story behind the numbers. Mappbook's overall PMF masked dramatic differences across segments, geographies, and channels.
| Segment | PMF Score | Sample Size | Key Insight |
|---|---|---|---|
| Marketing Agencies (11-50) | 58% ✅ | 847 | Loved time tracking + client reporting |
| Software Dev Teams (11-50) | 42% ~ | 623 | Used for sprint planning |
| Consultants (Solo/Small) | 19% ❌ | 412 | Found it "too complex" |
| Enterprise (200+) | 15% ❌ | 521 | Lacked enterprise features |
Marketing agencies didn't just have higher PMF - they also had the highest power user conversion rate at 34%. Users in this segment were 3.2x more likely to adopt 3+ integrations compared to other segments.
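The segment scores above follow the standard Sean Ellis method: a segment's PMF score is the share of respondents who say they would be "very disappointed" if the product disappeared. A minimal sketch of that aggregation (the response labels and segment names here are illustrative assumptions, not Mappbook's actual schema):

```python
# Illustrative sketch: Sean Ellis PMF score per segment.
# PMF score = % of respondents answering "very disappointed".
from collections import defaultdict

def pmf_by_segment(responses):
    """responses: iterable of (segment, answer) pairs, where answer is one of
    'very_disappointed', 'somewhat_disappointed', 'not_disappointed'."""
    totals = defaultdict(int)
    very = defaultdict(int)
    for segment, answer in responses:
        totals[segment] += 1
        if answer == "very_disappointed":
            very[segment] += 1
    # Percentage of "very disappointed" answers per segment, rounded to 0.1
    return {seg: round(100 * very[seg] / totals[seg], 1) for seg in totals}

responses = [
    ("agency", "very_disappointed"),
    ("agency", "somewhat_disappointed"),
    ("consultant", "not_disappointed"),
    ("consultant", "very_disappointed"),
]
print(pmf_by_segment(responses))  # {'agency': 50.0, 'consultant': 50.0}
```

The same grouping key can be swapped for user type, geography, or channel to produce the other cuts shown in this case study.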
See the exact survey Mappbook used to measure PMF by segment
Mappbook segmented PMF scores by user type to understand how product value perception changed from free trial through paid tiers to power users. The results challenged conventional wisdom.
| User Type | PMF Score | Sample | 30-Day Retention | Key Characteristic |
|---|---|---|---|---|
| Power Users (3+ integrations) | 71% ✅ | 412 | 96% | Deep integration in workflow |
| Paid (Pro Plan) | 54% ✅ | 687 | 88% | Uses advanced features |
| Paid (Basic Plan) | 42% ~ | 834 | 72% | Core features only |
| Free Trial (Active) | 31% ~ | 614 | 47% | Evaluating fit |
| Free Trial (Inactive) | 12% ❌ | 300 | 8% | Never activated |
Power users who adopted 3+ integrations showed 71% PMF - dramatically higher than even paid Pro users (54%). The path to "very disappointed" wasn't just about paying; it was about depth of integration into daily workflow.
Meanwhile, active free-trial users already showed a 31% PMF score, revealing that activation quality mattered more than paywall friction for conversion.
| Location | PMF Score | MRR | Churn Rate |
|---|---|---|---|
| 🇺🇸 United States | 47% ✅ | $4.2M | 5.2% |
| 🇬🇧 United Kingdom | 38% ~ | $1.8M | 7.8% |
| 🇩🇪 Germany | 18% ❌ | $1.2M | 12.4% |
| 🇦🇺 Australia | 41% ✅ | $0.8M | 6.1% |
"We want to use it but compliance team won't approve - no data residency options"
"The workflow doesn't match how German teams operate"
Segmenting German users by industry showed that regulated industries (fintech, healthcare) had 14% PMF due to compliance blockers, while creative agencies had 24% PMF, citing workflow mismatches.
Decision: Prioritized EU data residency and GDPR features first (unblocking regulated industries), then localized workflows. This dual approach addressed both segments, resulting in an 18% → 43% PMF improvement.
| Channel | PMF Score | CAC | LTV | LTV/CAC |
|---|---|---|---|---|
| Referrals | 67% ✅ | $45 | $2,840 | 63x |
| Paid (Google) | 51% ✅ | $12 | $1,620 | 135x |
| Organic (SEO) | 44% ~ | $89 | $2,140 | 24x |
| Paid (LinkedIn) | 22% ❌ | $340 | $720 | 2.1x |
| Product Hunt | 19% ❌ | $280 | $580 | 2.1x |
Channel quality wasn't just about PMF scores - it was about user type quality. Referrals brought 2.8x more power users (users who adopted 3+ integrations) than paid channels, explaining their dramatically higher LTV.
Referrals and Google Ads brought users with roughly 3x higher PMF scores than LinkedIn ads and Product Hunt. Those low-PMF channels were attracting the wrong customer segments with poor retention.
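The LTV/CAC column is simple division of the two preceding columns. A quick sketch of that per-channel ranking, using the fictional figures from the table above:

```python
# Illustrative sketch: ranking acquisition channels by LTV/CAC,
# using the fictional figures from the channel table above.
channels = {
    "Referrals":       {"ltv": 2840, "cac": 45},
    "Paid (Google)":   {"ltv": 1620, "cac": 12},
    "Organic (SEO)":   {"ltv": 2140, "cac": 89},
    "Paid (LinkedIn)": {"ltv": 720,  "cac": 340},
    "Product Hunt":    {"ltv": 580,  "cac": 280},
}

# LTV divided by CAC, rounded to one decimal place
ratios = {name: round(c["ltv"] / c["cac"], 1) for name, c in channels.items()}
for name, ratio in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {ratio}x")
```

Ratio alone is not enough, though: as the case study notes, the *type* of user a channel brings (power users vs. never-activated trials) is what ultimately drives the LTV side of the fraction.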
See how to track which channels bring your best customers
Mappbook correlated PMF survey responses with actual churn behavior over 90 days:
| PMF Response | 90-Day Churn Rate | Prediction |
|---|---|---|
| "Very Disappointed" | 4.2% | High retention |
| "Somewhat Disappointed" | 18.7% | At-risk (41% of users) |
| "Not Disappointed" | 64.3% | Will churn |
"Somewhat disappointed" users were 4.5x more likely to churn than "very disappointed" users - and they represented 41% of the user base.
"What's the main thing preventing you from being 'very disappointed' if Mappbook disappeared?"
By segmenting "somewhat disappointed" responses by user attributes, Mappbook discovered that marketing agencies (their ICP) were asking for Slack/Asana integrations, while solo consultants (not ICP) wanted pricing changes.
Decision: Built Slack & Asana integrations first (serving 38% of at-risk ICP users), deferred pricing changes (requested by non-ICP). This targeted intervention moved 34% of at-risk ICP users from "somewhat" → "very disappointed."
Mappbook sent Customer Effort Score (CES) surveys at key milestones in the user journey:
| Journey Stage | CES Score | % "Very Difficult" | Main Friction |
|---|---|---|---|
| Initial Signup | 6.2/7 ✅ | 8% | Smooth |
| First Project Setup | 4.1/7 ❌ | 41% | Too many fields, unclear workflow |
| Inviting Team Members | 5.8/7 ~ | 15% | Permission settings confusing |
| Generating Reports | 3.2/7 ❌ | 58% | Can't find button, export broken |
"I wanted to create a simple task list but was forced to fill out 12 fields including 'project code' and 'budget allocation' - I'm a 3-person agency, I don't have a budget allocation!"
CES responses segmented by team size revealed that teams of 11-50 (their ICP) found project creation "very difficult," while teams of 200+ (enterprise, not ICP) complained about missing advanced fields.
Decision: Simplified for the ICP segment (11-50 teams) by reducing required fields from 12 → 3 and adding an "advanced mode" toggle for enterprise. This improved activation rate for the ICP by 28% while maintaining enterprise usability.
See how to measure friction at key user journey milestones
Mapster allowed Mappbook to layer PMF scores with usage data to discover their champion profile:
Beyond new customer acquisition, Mappbook analyzed how PMF scores correlated with expansion revenue. The discovery: users with >50% PMF had 4.2x higher expansion rates - and specific behavioral signals predicted who would upgrade.
| PMF Segment | Avg Expansion ARR | Upgrade Rate | Time to Upgrade |
|---|---|---|---|
| Very Disappointed (>50% PMF) | $1,840 | 67% | 28 days |
| Somewhat Disappointed (30-50% PMF) | $620 | 31% | 89 days |
| Not Disappointed (<30% PMF) | $140 | 8% | Never* |
*92% of users with <30% PMF never upgraded before churning
Users who invited 3+ team members within first 14 days had 78% upgrade rate vs 12% for solo users
"Stuck" users had 23% PMF vs 58% for expansion champions - PMF predicted expansion better than usage metrics
Key Learning: NPS strongly correlated with PMF scores. Use PMF for product decisions, NPS for customer health tracking.
Key Finding: 72% of churn came from users with <25% PMF
"Missing features" churn was preventable - users at 34% PMF who churned for 1-2 specific gaps
See how to measure customer loyalty and satisfaction
Results timeline: Month 0 → Month 6
Mappbook had 28% overall but 58% in their ICP segment. Segment-level analysis is critical.
Germany looked like solid revenue but ran 12.4% churn until the product was localized for GDPR and German workflow preferences.
Referrals had 3x higher PMF than paid ads. Focus on channels that bring champions.
"Somewhat disappointed" users represent 41% of the user base with an 18.7% churn rate. This segment needs targeted intervention.
Low-PMF users churned even with okay usage metrics. PMF score is a leading indicator.
CES surveys at Day 3/7 revealed onboarding issues causing silent churn later.
Mappbook stopped targeting enterprise and solo consultants (15% and 19% PMF), doubled down on agencies (58% PMF). ICP clarity drove all growth metrics.
Similar Discovery: Mapera's creator platform faced the same decision - stop targeting "all creators" and focus on newsletter/course creators in B2B niches (62% PMF vs 18% for video creators).
See how creator platforms segment for PMF →
Note: Fictional founder perspective created for illustrative purposes
"Before Mapster, we were flying blind. We knew churn was high but didn't know why. We thought we needed more features, but the real problem was we were building for the wrong customers.
The PMF surveys gave us permission to stop trying to be everything to everyone. We narrowed our ICP, fixed the onboarding friction, and focused on the segments where we already had strong PMF.
Six months later, we're at 52% PMF, churn is down 31%, and we have a clear roadmap based on what our champions actually need. This is the data I wish I had on Day 1."
Mappbook didn't just send surveys randomly - they triggered them at strategic moments in the user journey where feedback would be most actionable and authentic.
Sent monthly to rapidly iterate and understand segment-level patterns. High frequency enabled quick hypothesis testing.
Shifted to quarterly once ICP was identified and core issues addressed. Focus moved from discovery to tracking improvements.
Trigger: Sent to users with 30+ days usage and 5+ active sessions
Captured activation friction - revealed 12-field form was blocking ICP segment
Measured collaboration ease - found permission settings confusing
Assessed value delivery - discovered export UX was broken
Philosophy: Survey right after critical actions, not days later when memory fades
Users reach team member limits, storage caps, or feature restrictions - perfect upgrade moment
Users exporting reports 2+ times/week signal need for advanced features
Goal: Understand expansion intent when users are experiencing growth needs
Triggered within 1 hour of cancellation to capture authentic reasons while top-of-mind
Analyzed churn reasons differently for high-PMF users (preventable) vs low-PMF (inevitable)
Discovery: 72% of churn came from users with <25% PMF - focus retention on saveable users
Mappbook surveyed users at moments of truth - right after activation steps, during growth needs, and at churn. This timing ensured feedback was specific, actionable, and directly tied to product improvements.
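The trigger logic described above amounts to routing each user event to the right survey. A hedged sketch of that routing, where the event names, session thresholds, and survey labels are all invented for illustration (only the PMF qualification rule of 30+ days and 5+ sessions comes from the case study):

```python
# Illustrative sketch: event-driven survey routing.
# Event names and survey labels are assumptions; the PMF qualification
# thresholds (30+ days usage, 5+ sessions) come from the case study.
def pick_survey(event, user):
    if event == "cancelled_subscription":
        return "churn_survey"        # within 1 hour, while reasons are fresh
    if event in ("project_created", "team_invited", "report_generated"):
        return "ces_survey"          # right after the critical action
    if event == "hit_plan_limit":
        return "expansion_survey"    # user is experiencing growth needs
    if user["days_active"] >= 30 and user["sessions"] >= 5:
        return "pmf_survey"          # user qualifies for the PMF question
    return None                      # no survey; avoid over-surveying

print(pick_survey("report_generated", {"days_active": 3, "sessions": 2}))
# ces_survey
```

The ordering matters: churn and milestone events take priority over the periodic PMF check, so a cancelling user gets the churn survey even if they would otherwise qualify for PMF.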
Sample Size: 2,847 initial PMF survey responses from a base of 8,000 active users (30+ days usage). Segmentation analysis included customer_id, segment, user_type, attribution_channel, usage_frequency, and features_used. All data was anonymized and aggregated for analysis.
See how the same segmented PMF methodology works across different industries
This is a demo case study created to showcase Mapster's PMF analysis methodology. All companies, names, and data are fictional and for illustrative purposes only.
Start measuring your own PMF →
• No credit card needed • Install in minutes