Compare different metrics
This is an overview of the metrics available in Adapty Analytics. Use it to understand what each metric measures and how it differs from related metrics.
For a deeper explanation of how Adapty processes analytics data, see How Adapty analytics works.
This article does not cover Adapty User Acquisition metrics. Read UA analytics to learn more about ad campaign metrics (Spend, CPI, ROAS, CTR, among others).
Global metrics
Global metrics track the performance of your entire app, across all placements and paywalls.
Revenue
These metrics measure how much money the app generates and from which sources.
| Metric | Description | Key difference |
|---|---|---|
| Revenue | Total revenue from subscriptions and one-time purchases, minus refunds | Actual revenue generated. Can show gross revenue, revenue after commission, or revenue after tax and commission depending on chart controls |
| MRR | Monthly recurring revenue from active subscriptions | Predictable monthly revenue of your app. Excludes one-time purchases and non-recurring subscriptions. |
| ARR | Annual recurring revenue from active subscriptions | Calculated like MRR but on a yearly scale. Useful for projecting annual revenue |
| ARPU | Average revenue per user | Divides revenue by the total number of users — paying and non-paying. Shows how much revenue each user generates on average |
| ARPPU | Average revenue per paying user | Only counts users who made a purchase during the selected period, including refunded transactions. Always higher than ARPU |
| LTV (lifetime value) | Revenue from paying customers divided by the number of paying customers in a cohort | Realized value per paying customer over time. Unlike ARPPU (single period), LTV demonstrates total revenue over the course of the customer relationship. Can be viewed by renewals or by calendar days |
| Predicted LTV | Estimated lifetime value per user in a cohort | Forward-looking estimate. Unlike realized LTV, projects future value using gradient boosting on transaction patterns. Available for 3, 6, 9, or 12 months |
| Predicted revenue | Estimated total revenue a cohort will generate | Forward-looking estimate. Unlike realized Revenue, predicts the total a cohort will generate over the selected timeframe. Updated daily |
| Non-subscriptions | Count of in-app purchases: consumables, non-consumables, and non-renewing subscriptions | Excludes auto-renewable subscriptions. |
| Refund events | Count of refunded purchases or subscriptions | Attributed to the refund date, not the original purchase date. |
| Refund money | Total amount refunded during the selected period | Financial impact of refunds. Calculated before store fees. Unlike Refund events (a count), this shows the monetary amount |
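The difference between ARPU and ARPPU comes down to the denominator. A minimal sketch, using hypothetical numbers (these helper names are illustrative, not part of the Adapty API):

```python
def arpu(total_revenue: float, total_users: int) -> float:
    """Revenue averaged over ALL users, paying and non-paying."""
    return total_revenue / total_users

def arppu(total_revenue: float, paying_users: int) -> float:
    """Revenue averaged over paying users only; never lower than ARPU."""
    return total_revenue / paying_users

revenue = 5000.0   # revenue for the period, after refunds (hypothetical)
users = 10_000     # all users active in the period
payers = 250       # users with at least one purchase

print(arpu(revenue, users))    # 0.5
print(arppu(revenue, payers))  # 20.0
```

Because paying users are always a subset of all users, ARPPU is greater than or equal to ARPU for the same period.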
Subscribers and conversion
These metrics track how users enter the app and move through the funnel.
| Metric | Description | Key difference |
|---|---|---|
| Installs | Number of app installations during the period | Counts one of the following, depending on the install definition: • Device installations (a user who reinstalls the app is counted again) • Unique users (only users who set customer_user_id are counted; anonymous users are excluded entirely, so if no users are identified, the count is 0) |
| New trials | Trials activated during the period | Counts every trial start, even if the trial has already expired or converted to paid by the time you view the chart |
| Active trials | Number of trials that have not yet expired | Only counts trials active at the end of the period |
| New subscriptions | Subscriptions activated for the first time during the period | Excludes renewals and reactivations |
| Active subscriptions | Number of paid subscriptions that have not yet expired | Excludes trials and subscriptions with cancelled renewal |
| Install to trial | Percentage of installers who started a trial | The denominator includes all installers, not just paywall viewers, so the rate might be lower than Paywall view to trial. The two metrics may also diverge if the app does not log paywall views. This can happen with a custom paywall that does not call logShowPaywall, or when a user starts their trial from a promoted in-app purchase. |
| Paywall view to trial | Percentage of paywall viewers who started a trial | Only counts users who saw a paywall, so the rate may be higher than Install to trial |
| Trial to paid | Percentage of trial users who purchased a subscription | Measures trial quality and conversion efficiency. Unlike Install to paid, focuses only on users who completed a trial |
| Install to paid | Percentage of installers who purchased a first subscription | Counts all installers, not just paywall viewers. The rate might be lower than Paywall view to paid. Includes both direct purchases and trial-to-paid conversions |
| Paywall view to paid | Percentage of paywall viewers who eventually purchased a subscription | Only counts users who saw a paywall, so the rate might be higher than Install to paid. Includes users who completed a trial first |
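The Install-to-X and Paywall-view-to-X pairs measure the same conversions against different denominators. A sketch with hypothetical funnel counts:

```python
# Hypothetical funnel counts illustrating why Install to trial is usually
# lower than Paywall view to trial: the denominators differ.
installs = 1_000
paywall_viewers = 400   # only some installers ever see a paywall
trial_starts = 80

install_to_trial = trial_starts / installs              # denominator: all installers
paywall_view_to_trial = trial_starts / paywall_viewers  # denominator: paywall viewers only

print(f"{install_to_trial:.1%}")        # 8.0%
print(f"{paywall_view_to_trial:.1%}")   # 20.0%
```

The same logic applies to Install to paid versus Paywall view to paid.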
Retention and subscription renewal
These metrics track how well the app retains paying subscribers over time.
| Metric | Description | Key difference |
|---|---|---|
| Retention | Share of original subscribers that remain after each billing period — 1st renewal, 2nd renewal, etc. | Tracks subscribers from the first payment onward. Unlike period-to-period metrics below, always compares against the original group, so you see the full picture at a glance |
| Paid to 2nd period | Percentage of first-time subscribers who renewed for the second period | Measures the transition between two specific adjacent periods. Unlike Retention, focuses on the single most critical renewal — the first one |
| 2nd to 3rd period | Percentage renewing from the 2nd to the 3rd period | Indicates early retention stability after the initial renewal |
| 3rd to 4th period | Percentage renewing from the 3rd to the 4th period | Mid-term retention indicator |
| 4th to 5th period | Percentage renewing from the 4th to the 5th period | Long-term loyalty indicator |
| 6 Months+ | Percentage of first-time subscribers who remain subscribed for more than 6 months | Measures calendar time, not renewal count. An annual subscriber counts as retained at 6 months even without a renewal |
| 1 Year+ | Percentage of first-time subscribers who remain subscribed for more than 12 months | Annual retention milestone |
| 2 Years+ | Percentage of first-time subscribers who remain subscribed for more than 24 months | Long-term retention milestone |
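The key distinction in this table is the baseline: Retention always divides by the original cohort, while the period-to-period metrics divide by the previous period. A sketch with a hypothetical cohort:

```python
# Hypothetical cohort: subscribers still active after renewal 0, 1, 2, 3.
cohort = [1000, 600, 420, 336]

# Retention: every period is compared against the ORIGINAL group.
retention = [n / cohort[0] for n in cohort]
# -> [1.0, 0.6, 0.42, 0.336]

# Period-to-period rates (Paid to 2nd, 2nd to 3rd, ...): each period
# is compared against the previous one.
renewal_rates = [b / a for a, b in zip(cohort, cohort[1:])]
# -> [0.6, 0.7, 0.8]

print(retention)
print(renewal_rates)
```

Note how the period-to-period rates can improve (0.6 → 0.8) even while cumulative retention keeps falling; that is why the two views answer different questions.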
Churn
These metrics measure how many subscribers and trial users the app loses.
| Metric | Description | Key difference |
|---|---|---|
| Trials renewal cancelled | Trials where the user disabled automatic renewal | User keeps trial access until it ends but will not convert to paid automatically. Unlike Subscriptions renewal cancelled, applies to trial users who haven’t paid yet |
| Expired (churned) trials | Trials that expired — the user lost access to premium features | The user has already lost access. Attributed to the expiration date, even if the user cancelled renewal in a previous period. Can be grouped by reason (voluntary vs. billing) |
| Subscriptions renewal cancelled | Subscriptions where the user disabled auto-renew | User still has access until the period ends. Signals churn risk, not actual churn — the user may re-enable auto-renew before the period expires |
| Churned (expired) subscriptions | Subscriptions that expired — the user lost access to premium features | Actual churn. The user has already lost access. Attributed to the expiration date, even if the user cancelled renewal in a previous period. Can be grouped by reason (voluntary vs. billing) |
Billing issues and revenue recovery
These metrics track how effectively the app recovers revenue lost to billing issues.
| Metric | Description | Key difference |
|---|---|---|
| Grace period | Subscriptions that entered grace period due to a billing failure | Includes users who exceeded the grace period and lost access |
| Grace period to paid | Percentage of grace period users who renewed before the grace period ended | A rate (%). Answers “what share of grace period users recovered?” |
| Grace period converted | Absolute number of grace period subscriptions that successfully renewed | Same events as Grace period to paid, but shown as a count instead of a percentage |
| Grace period converted revenue | Revenue from grace period recoveries | Financial impact of the grace period feature |
| Billing issue | Subscriptions that entered billing issue state | Starts after grace period expires. Unlike Grace period, only counts users who have already lost premium access |
| Billing issue to paid | Percentage of billing issue users who renewed before the billing cycle ended | A rate (%). Answers “what share of billing issue users recovered?” |
| Billing issue converted | Absolute number of billing issue subscriptions that successfully renewed | Same events as Billing issue to paid, but shown as a count instead of a percentage |
| Billing issue converted revenue | Revenue from billing issue recoveries | Financial impact of billing issue recovery |
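The converted/to-paid pairs above report the same underlying events in two forms, a count and a rate. A sketch with hypothetical grace period numbers:

```python
# Hypothetical grace period recovery numbers.
grace_period_entries = 120   # subscriptions that hit a billing failure
recovered = 90               # renewed before the grace period ended

grace_period_converted = recovered                        # a count
grace_period_to_paid = recovered / grace_period_entries   # a rate

print(grace_period_converted)          # 90
print(f"{grace_period_to_paid:.0%}")   # 75%
```

Billing issue converted and Billing issue to paid relate to each other in exactly the same way.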
Paywall, placement, and onboarding metrics
These metrics are calculated for individual paywalls, placements, and onboardings. They measure the performance of a specific paywall or placement rather than the app as a whole. The Associated global metric column shows the corresponding metric from the global analytics section.
| Metric | Description | Key difference | Associated global metric |
|---|---|---|---|
| Proceeds | Revenue after tax and commission for an individual placement | Equivalent to Revenue after tax and commission | Revenue |
| ARPPU | Average revenue per paying user for this paywall or placement | Same calculation as global ARPPU but scoped to a single paywall or placement | ARPPU |
| ARPAS | Revenue divided by the number of active subscribers (trial and paid) | Counts trial users. Unlike ARPPU, reflects revenue potential of the entire subscriber base | — |
| Views | Total number of times a paywall or placement was displayed | Counts every display. A single user viewing the same paywall twice counts as 2 views | — |
| Unique views | Number of unique users who saw a paywall or placement | Each user counted once regardless of how many times they viewed it. Unlike Views, measures reach rather than engagement frequency | — |
| CR to purchases | Purchases divided by total views | Uses total views (including repeat views by the same user) as denominator | Paywall view to paid |
| Unique CR to purchases | Purchases divided by unique views | Uses unique views as denominator. Higher rate than non-unique CR because repeat viewers are counted once | Paywall view to paid |
| CR to trials | Trials started divided by total views | Measures how effectively a paywall converts views into trials | Paywall view to trial |
| Unique CR to trials | Trials started divided by unique views | Calculated like CR to trials but with unique viewers as the denominator | Paywall view to trial |
| Purchases | Total number of transactions for this paywall: new purchases, trial conversions, upgrades, downgrades, and returning subscriptions | Excludes renewals. | Revenue |
| Trials | Total activated trials through this paywall | Scoped to this paywall only | New trials |
| Trials canceled | Number of trials where the user disabled automatic renewal | Scoped to this paywall’s trials only | Trials renewal cancelled |
| Refund rate | Refunds divided by first-time purchases (renewals excluded) | A rate (%), not a count. Normalizes refunds against the number of purchases | Refund events (count, not rate) |
| Completions | Number of times users completed an onboarding flow from first to last screen | Placement and onboarding only. Counts every completion, including repeat completions by the same user | — |
| Unique completions | Number of unique users who completed an onboarding flow | Placement and onboarding only. Each user counted once. Unlike Completions, measures how many individuals finished the flow | — |
| Unique completions rate | Unique completions divided by unique views | Placement and onboarding only. Measures onboarding effectiveness: what share of users who started it actually finished | — |
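The unique and non-unique conversion rates share a numerator and differ only in the denominator, so the unique rate is never lower. A sketch with hypothetical paywall counts:

```python
# Hypothetical paywall counts: repeat views inflate the denominator
# of the non-unique rate.
views = 500          # every display, repeat views included
unique_views = 300   # distinct users who saw the paywall
purchases = 30

cr = purchases / views                 # CR to purchases
unique_cr = purchases / unique_views   # Unique CR to purchases

print(f"{cr:.0%}, {unique_cr:.0%}")    # 6%, 10%
```

CR to trials and Unique CR to trials, as well as Unique completions rate, follow the same pattern with their respective numerators.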