Metadata
- Author: Gibson Biddle
- Full Title: #4 Proxy Metrics
- Category: 🗞️Articles
- Document Tags: Metrics layer
- URL:: https://gibsonbiddle.medium.com/4-proxy-metrics-a82dd30ca810
- Finished date:: 2024-04-28
Highlights
Using retention as a metric for all projects isn’t feasible, however. It’s a hard metric to move, and proving a retention improvement requires large-scale A/B tests. Lower-level metrics — proxy metrics — are easier and faster to move than high-level engagement metrics. Ideally, moving a proxy will improve the high-level metric (e.g., retention for Netflix), demonstrating a correlation between the two. Later, you can prove causation via an A/B test. (View Highlight)
Over the same period, we drove month-one retention from 88% to 90% — both retention and our “simple” metric moved together. We chose not to take the time, however, to execute a large-scale A/B test because we were confident that the more straightforward experience improved retention. (View Highlight)
First, you seek a correlation between your high-level metric and the proxy metric. Later you work to prove causation. (View Highlight)
A good proxy metric is not an average. The danger of averages is that you may move the metric by inspiring a small subset of customers to do a lot more of something — but this may not affect enough members to improve the overall product experience. (View Highlight)
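A small illustration of that danger, with made-up viewing-hours data: one power-user segment can nearly double the average while a percentage-of-members metric stays flat.

```python
# Hypothetical monthly viewing hours for ten members.
before = [2, 2, 3, 3, 3, 4, 4, 4, 5, 5]
# After a change, nine members are unchanged and one power user
# jumps from 5 to 30 hours.
after = [2, 2, 3, 3, 3, 4, 4, 4, 5, 30]

mean_before = sum(before) / len(before)  # 3.5
mean_after = sum(after) / len(after)     # 6.0 — the average soars

# A percentage-of-members metric ("% of members watching 3+ hours")
# is unmoved, because the typical member's experience didn't change.
pct_before = sum(h >= 3 for h in before) / len(before)  # 0.8
pct_after = sum(h >= 3 for h in after) / len(after)     # 0.8
```

This is why the article prefers metrics framed as the percentage of members doing something, rather than an average per member.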
Isolating the right proxy metric sometimes took six months. It took time to capture the data, to discover whether we could move the metric, and to see if there was causation between the proxy and retention. Given the trade-off between speed and finding the right metric, we focused on the latter. It’s costly to have a team focused on the wrong metric. (View Highlight)