![rw-book-cover](https://readwise-assets.s3.amazonaws.com/static/images/article4.6bc1851654a0.png)

## Metadata
- Author: [[olga|Olga]]
- Full Title:: Introduction to Proxy Metrics - Issue 204
- Category:: #🗞️Articles
- Document Tags:: [[proxy-metrics|Proxy Metrics]]
- Finished date:: [[2024-05-27]]

## Highlights
> KPIs (and North Star) are business ecosystem metrics (which are the opposite of being sensitive), so you should **not use them** to measure the impact. ([View Highlight](https://read.readwise.io/read/01hywzyrgdws14a92hbxbdk5re))

> How to determine if your proxy metric is effective
> 1. The proxy metric should be sensitive so that you can affect it in the short term - e.g., screen views, button clicks, session time, number of transactions, etc.
> 2. Good proxy metrics are simple and don’t involve many complex filters or calculations.
> 3. If possible, try to avoid using averages. [The danger of averages is real](https://substack.com/redirect/48534e41-ed9a-4863-affa-6e6fe78ae01c?j=eyJ1IjoiM2kxMHhzIn0.1Kpky8Knu0EyceTU0RV_phz4GkjLwPr5RprErnU4tsk). Remember, you can change the experience of a small subset of users (and significantly increase the average) without improving the overall experience for the vast majority of customers.
> 4. It should be independent of other product features, marketing initiatives, or similar factors. ([View Highlight](https://read.readwise.io/read/01hyx016wnc3a308gs29fq040e))

> referencing [Gibson Biddle](https://open.substack.com/users/1925031-gibson-biddle?utm_source=mentions) one last time, “you’re generally better off structuring the metric as a threshold”:
> • % of (new/existing customers) who do at least (X) of (new feature)
> • % of (new/existing customers) who do at least (minimum threshold of value) by (X period in time). ([View Highlight](https://read.readwise.io/read/01hyx0609nw291gnp0e8d50scb))

> • **DAU —>** Unique screen views, logins or app opens, median time spent per user per day, total views or clicks.
> • **Activation —>** Onboarding CVR, % of users doing X activity in their first day.
> • **Retention —>** Total screen views, logins or app opens, total number of days using the product/feature.
> • **Install-to-Paid —>** Signup-to-trial, total trials, number of initial transactions.
> • **New subscriptions —>** Total number of transactions.
> • **Subscription retention —>** Total number of transactions, unique paid customers, % paid customers from DAU.
> • **Churn rate —>** % of users who canceled, total cancelations, the median time spent per user per day.
> • **LTV —>** DAU-to-Paid, in some cases, average transactions or orders per user, % paid customers from DAU.
> • **Revenue —>** Total transactions, unique paid customers, upsell-to-trial CVR, average revenue per user. ([View Highlight](https://read.readwise.io/read/01hyx06kkevym1j39ajayvw29f))
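
The “danger of averages” point (item 3 above) is easy to check numerically. A minimal sketch, with made-up minutes-per-user figures that are not from the article: a couple of outlier power users lift the mean sharply while the median, i.e. the typical user’s experience, does not move at all.

```python
import statistics

# Hypothetical "time spent per user per day" samples (minutes), before and
# after a change that only a small subset of power users engages with.
before = [2, 2, 3, 3, 4, 4, 5, 5, 6, 6]
after = [2, 2, 3, 3, 4, 4, 5, 5, 60, 90]  # two outliers engage heavily

print(statistics.mean(before), statistics.median(before))  # 4.0 4.0
print(statistics.mean(after), statistics.median(after))    # 17.8 4.0
# The mean more than quadruples, yet the median user is unchanged —
# which is why the mapping above prefers "median time spent per user per day".
```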
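
Gibson Biddle’s threshold framing (“% of new customers who do at least X of the new feature by some period”) reduces to a per-user count plus a cutoff. A minimal sketch, not from the article, assuming a pandas events table already filtered to the feature of interest, with `user_id`/`ts` columns and illustrative thresholds (at least 2 uses within 14 days of signup):

```python
import pandas as pd

# Hypothetical event log, pre-filtered to the new feature (assumption).
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3],
    "ts": pd.to_datetime([
        "2024-05-01", "2024-05-02", "2024-05-03",
        "2024-05-01", "2024-05-20",
        "2024-05-05", "2024-05-06",
    ]),
})
signups = pd.DataFrame({
    "user_id": [1, 2, 3],
    "signup_ts": pd.to_datetime(["2024-05-01", "2024-05-01", "2024-05-01"]),
})

MIN_ACTIONS = 2                 # "at least (X) of (new feature)"
WINDOW = pd.Timedelta(days=14)  # "(X period in time)"

# Count each user's feature uses inside their personal window.
joined = events.merge(signups, on="user_id")
in_window = joined[joined["ts"] <= joined["signup_ts"] + WINDOW]
uses = in_window.groupby("user_id").size()

# Share of new users who reached the threshold (users with no uses count as 0).
hit_threshold = (uses >= MIN_ACTIONS).reindex(signups["user_id"], fill_value=False)
print(f"% of new users at threshold: {hit_threshold.mean():.0%}")  # 67%
```

Because each user contributes only a yes/no, one heavy user cannot drag the metric the way they can drag an average, which is exactly the property the threshold structure is after.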