In a company that prides itself on being data-driven, it’s tempting to track everything. But too often, this flood of information turns against us: we get lost in a maze of metrics, unsure which ones truly reflect performance, drive decisions, or simply dress up dashboards. To bring meaning back to measurement in Growth, we need to cut through the noise, focus on what matters, and turn the right KPIs into action levers.
We track a lot of things — often without knowing why. The result? Irrelevant metrics, cluttered dashboards, and decision-making that’s harder, not easier. For a data-informed approach to work, you need to start with a deceptively simple question: which metrics actually matter? The ones that steer priorities, support trade-offs, and lead to real action. The challenge: identify the right indicators, make them visible… and use them as strategic drivers.
More data isn’t always better
Being data-informed doesn’t mean measuring everything — it means measuring what’s useful. Before investing in tools, take the time to identify the right KPIs. Too often, the data we collect is neither used nor usable. Choosing the right metrics early on helps guide tool selection and prevents dashboard bloat.
There are two main types of indicators:
- Reporting metrics: to track performance over time, benchmark against targets, and share clear updates across the organization.
- Experimental metrics: more operational and granular, they’re used to frame internal experiments. Often invisible externally — but critical for learning and iteration.
Stop chasing vanity metrics: clean up your reporting
Reporting metrics should be clear, stable, and actionable. Yet many teams clutter their dashboards with vague or flattering indicators that don’t support any decision: page views, new visitor rates, average time on site… metrics that are rarely actionable — and often misleading.
Examples of vanity metrics or unreliable indicators:
- Page views: Unless your business model relies on advertising (and even then, only via CPM), raw page-view counts are usually irrelevant.
- Number of visits: Without knowing how many unique users are behind them, this number is meaningless. Is it one user visiting 50 times, or 50 users once?
- Average time on product: This only becomes useful when broken down by segment, page, or feature. Most analytics tools measure time-on-page imprecisely (the last page of a session often isn’t timed at all), making this metric unreliable.
- New visitor rate: Tracked by tools like Google Analytics, this is hard to interpret without context. A drop in returning users might reflect increased acquisition, weaker retention, or even a bug — you won’t know without digging into other KPIs.
- Account creations or app downloads: These can indicate acquisition, but they need to be paired with activation or engagement metrics to assess real usage and value.
These figures create the illusion of performance, but distract from what really matters. The priority? Cut the noise. Keep only what drives decisions.
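To make the visits-vs-users point concrete, here is a minimal sketch with a made-up event log (the `describe` helper and user IDs are illustrative, not from any specific analytics tool):

```python
# Hypothetical visit log: each entry is the user ID behind one visit.
visits_one_user = ["u1"] * 50                    # one user visiting 50 times
visits_many_users = [f"u{i}" for i in range(50)]  # 50 users visiting once each

def describe(log):
    """Return (total visits, unique users, visits per user)."""
    unique = len(set(log))
    return len(log), unique, len(log) / unique

print(describe(visits_one_user))    # (50, 1, 50.0) -- same headline number...
print(describe(visits_many_users))  # (50, 50, 1.0) -- ...very different reality
```

Both logs report "50 visits" on a dashboard, which is exactly why the raw count supports no decision on its own.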
Share your metrics
Measurement without communication is pointless. Once your KPIs are defined and your tools are up and running, build a reporting structure that fits your needs. A few best practices:
- Tailored dashboards: A tech lead and a marketer don’t need the same data. Keep it relevant for each audience.
- Less is more: A few sharp metrics beat a dashboard jungle.
- Smart alerts: Get notified at the right time, for the right reasons. Not too often, not too late.
- Automate the format, focus on the insight: Save time on setup so you can spend it on interpretation.
And above all: revisit your metrics regularly. What’s relevant today may not be tomorrow. Relevance is not a one-time task — it’s a mindset.
The right experimental metrics are the ones that hurt
Reporting isn’t enough. A truly data-informed team experiments constantly. Every test should aim to move the needle on one or two meaningful KPIs. Sometimes these overlap with your reporting metrics — but not always. They’re more targeted, more contextual, and more actionable.
The five stages of the AARRR framework (Acquisition, Activation, Retention, Referral, Revenue) are a great starting point.
Examples:
- Improve activation rates for a specific segment
- Reduce churn in a targeted cohort
- Increase LTV by working on a key behavior
The best metrics? The ones that make you uncomfortable — because they expose what’s not working. Those are the ones that push you to improve.
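As a sketch of what "reduce churn in a targeted cohort" can look like in practice, here is a minimal example with invented cohort numbers (the data and the 30-day window are assumptions for illustration):

```python
# Hypothetical cohort data: signups and users still active 30 days later.
cohorts = {
    "2024-01": {"signups": 400, "active_day_30": 180},
    "2024-02": {"signups": 520, "active_day_30": 210},
}

def churn_rate(cohort):
    """Share of a cohort no longer active after 30 days."""
    return 1 - cohort["active_day_30"] / cohort["signups"]

for month, data in cohorts.items():
    print(f"{month}: {churn_rate(data):.0%} churned")
```

A per-cohort breakdown like this is exactly the kind of uncomfortable, actionable number the section argues for: it tells you which cohort to investigate, not just that "churn exists".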
Example: monitoring an e-commerce conversion funnel
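A minimal sketch of such a funnel, with made-up step counts (the step names and numbers are illustrative): each step’s conversion rate is computed relative to the previous step, which is what pinpoints where users drop off.

```python
# Hypothetical e-commerce funnel: number of users reaching each step.
funnel = [
    ("Product page", 10_000),
    ("Add to cart", 2_500),
    ("Checkout started", 1_200),
    ("Payment completed", 900),
]

def step_conversions(steps):
    """Conversion rate of each step relative to the previous one."""
    return [
        (f"{prev_name} -> {name}", n / prev_n)
        for (prev_name, prev_n), (name, n) in zip(steps, steps[1:])
    ]

for label, rate in step_conversions(funnel):
    print(f"{label}: {rate:.0%}")

# Overall conversion: last step relative to the first.
print(f"Overall: {funnel[-1][1] / funnel[0][1]:.1%}")  # Overall: 9.0%
```

Here the weakest step (product page to cart, 25%) is where an experiment would have the most leverage, even though the checkout steps convert comparatively well.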
Give meaning to your metrics
The most powerful insights often come from the least obvious metrics — the ones you uncover by digging deep, cross-referencing data, and asking the right questions. These are the real breakthrough moments.
Always combine quantitative and qualitative inputs: metrics point to the problem, but user feedback tells you why. Surveys, heatmaps, usability tests — mix your sources to get the full picture.
And look ahead. Don’t just measure what happened — anticipate what’s next. Spot the early signs of churn, identify weak signals. It’s the only way to act before it’s too late.