Cohort Analysis

Cohort analysis is a method of grouping users by a shared starting event — usually the week or month they signed up — and tracking their behavior over time as a group. Instead of one rolling average, you see many parallel lines, each revealing how a specific generation of users actually behaved.

Why It Matters

Aggregate metrics lie. A product can show rising MAU while every individual cohort is churning faster than the last — the growth comes entirely from acquisition outpacing decay. Cohort analysis reveals this pattern immediately. Every serious growth team (Facebook, Airbnb, Shopify) runs retention conversations off cohort charts, not averages. Cohort analysis is also the only reliable way to know whether a product change actually worked — before-and-after averages mix old and new behavior; cohorts separate them.

How It Works

1. Pick a starting event: Sign-up, first purchase, first use of a feature. This defines "Week 0" for each user.

2. Group users by start period: All users who signed up in Week 1 of April are one cohort, Week 2 another, and so on.

3. Pick a retention event: What counts as "retained"? Logged in, completed a core action, paid — be specific.

4. Track each cohort's retention over time: For each cohort, compute the % still performing the retention event at Week 1, Week 2, Week 3, …

5. Plot them side by side: Each cohort becomes a row or line. Compare shapes, not just numbers.
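The five steps above can be sketched in plain Python. The event log below is a hypothetical stand-in for an analytics table: each user carries a cohort week and the set of week offsets at which they performed the retention event.

```python
# Hypothetical event log: user_id -> (cohort_week, week offsets where the
# retention event fired). Week 0 is the signup week itself.
events = {
    "u1": (0, {0, 1, 2, 3}),
    "u2": (0, {0, 1}),
    "u3": (0, {0}),
    "u4": (1, {0, 1, 2}),
    "u5": (1, {0}),
}

def retention_table(events, max_week=3):
    """Percent of each cohort still performing the retention event
    at each week offset -- one row per cohort, one column per week."""
    cohorts = {}
    for cohort_week, active_weeks in events.values():
        cohorts.setdefault(cohort_week, []).append(active_weeks)
    table = {}
    for cohort_week, users in sorted(cohorts.items()):
        row = []
        for offset in range(max_week + 1):
            retained = sum(1 for weeks in users if offset in weeks)
            row.append(round(100 * retained / len(users), 1))
        table[cohort_week] = row
    return table

print(retention_table(events))
# {0: [100.0, 66.7, 33.3, 33.3], 1: [100.0, 50.0, 50.0, 0.0]}
```

Each row of the result is one cohort's curve; plotting the rows side by side gives the chart described in step 5. In practice you would build the same table from an events database with a group-by, but the shape of the computation is identical.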

Shapes to Look For

Flattening curve: Retention drops sharply at first, then levels off at a stable percentage. This is the shape of product-market fit — a core group sticks.

Smile curve: Retention drops, then rises as dormant users return. Rare but powerful; seen when a product becomes a habit.

Slide to zero: Retention decays steadily to 0%. The product doesn't stick. Acquisition won't save you.

Cohort improvement over time: Newer cohorts retain better than older ones. This is the signal that a product change actually worked.

Cohort degradation: Newer cohorts retain worse. Something broke — either the product or the acquisition channel is pulling in users who don't fit.

Common Uses

Retention diagnosis: Is our product actually sticky?

Feature impact: Did launching X improve retention for cohorts that saw it?

Channel quality: Do users from Google Ads retain as well as users from organic?

Pricing experiments: Does a new plan's cohort retain better than the old plan's?

Churn forecasting: Apply cohort curves to new signups to predict future MRR.
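The churn-forecasting use works by layering each new cohort's expected decay curve on top of the cohorts before it. A minimal sketch, assuming a historical retention curve and a flat per-user price (both numbers here are illustrative, not real benchmarks):

```python
# Hypothetical retention curve: fraction of a cohort still paying at each
# month offset, estimated from historical cohorts.
retention_curve = [1.00, 0.60, 0.45, 0.40, 0.38]

def forecast_mrr(new_signups_per_month, price, curve):
    """Project MRR month by month: each cohort contributes
    signups * retained_fraction * price at every later offset."""
    horizon = len(new_signups_per_month)
    mrr = [0.0] * horizon
    for start, signups in enumerate(new_signups_per_month):
        for offset, retained in enumerate(curve):
            month = start + offset
            if month < horizon:
                mrr[month] += signups * retained * price
    return [round(m, 2) for m in mrr]

print(forecast_mrr([100, 100, 100], 10.0, retention_curve))
# [1000.0, 1600.0, 2050.0]
```

Month 2's figure stacks three generations: the newest cohort at 100% retention, the prior at 60%, and the oldest at 45% — which is exactly why a flattening curve, not a slide to zero, is what makes the forecast converge.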

Common Mistakes

Comparing to averages: Averages combine all cohorts, hiding the trend that matters.

Cohort size too small: Weekly cohorts of 20 users are mostly noise. Aggregate to monthly if volume is low.

Wrong start event: "Signed up" is not "activated." Pick the event that defines real usage.

Wrong retention event: A login proves nothing. Pick the action that creates value.

Looking only at one cohort: Single-cohort snapshots hide whether things are improving or worsening across time.

Cohort by acquisition month only: Also cohort by feature exposure, channel, plan, and other dimensions to find real drivers.
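Cohorting on a second dimension is just a composite group-by key. A minimal sketch, with hypothetical user records carrying a cohort month, an acquisition channel, and a month-1 retention flag:

```python
from collections import defaultdict

# Hypothetical records: (cohort_month, channel, retained_at_month_1)
users = [
    ("2024-04", "organic", True),
    ("2024-04", "organic", True),
    ("2024-04", "paid", False),
    ("2024-04", "paid", True),
    ("2024-05", "organic", True),
    ("2024-05", "paid", False),
]

def month1_retention(users):
    """Month-1 retention percent per (cohort_month, channel) slice."""
    counts = defaultdict(lambda: [0, 0])  # key -> [retained, total]
    for cohort, channel, retained in users:
        bucket = counts[(cohort, channel)]
        bucket[1] += 1
        if retained:
            bucket[0] += 1
    return {k: round(100 * r / t, 1) for k, (r, t) in counts.items()}

print(month1_retention(users))
```

Swapping channel for plan, or for a flag marking exposure to a feature launch, turns the same table into a feature-impact or pricing comparison.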
