Most marketers have adopted the view that nothing is worth doing if it can’t be measured. That’s good. It forces us to articulate what success looks like and invest in those programs that are most likely to achieve that success.
It also forces us to think carefully about what we should measure. Sometimes, that’s easy. Those focused on advertising, for example, want consumers to remember, like, and engage with their ads.
As customers spend more of their time glued to their smartphones and tablets, marketers are gaining access to new data. At first glance, the data feels familiar. Think carefully, though, because metrics that make sense in digital marketing don’t necessarily offer the same value when it comes to measuring mobile app performance.
This realization first struck me while working with a top media company on marketing programs to support its mobile app for Windows Phone. We kicked off the discussion by proposing performance metrics. Downloads, time spent per session, number of pages viewed, and active users each emerged as candidates. Yet for every single one, we could think of a reason why maximizing the metric might lead to undesirable outcomes.
- Downloads are relatively easy to deliver, but most downloaded apps are rarely used more than once.
- Time spent per session and number of pages viewed each presume that more is better. They do not consider, for example, that smartphone users often face external time limits that force them to quit an app, or that spending more time consuming more pages could, in fact, be a sign of frustration that the app isn’t delivering the information a user wants or needs.
In the end, we agreed to measure active users because we concluded that returning to an app signaled a user’s stamp of approval. Even that metric evoked debate, though. That’s because the company had added a feature called “Live Tiles,” a capability unique to Windows that lets an app deliver some of its information directly to the user’s Start screen.

AccuWeather, one of my favorite apps, is a good example: it delivers temperature and weather conditions to its Live Tile. That means any customer who has the app and has activated its Live Tile function doesn’t need to open the app in order to access some (but not all) weather information.
The lesson learned: there is no single “magic metric” that can track every app’s performance. If you’re picking which metrics to optimize, think first about how customers will actually use the app. Those with the resources to conduct user testing, even informally, can determine what these behaviors are most likely to be.
In the case of AccuWeather, the number of times users pin the app to the Start screen could be a terrific engagement metric. So, too, could the number of users who activate the Live Tile function.
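For teams that instrument their apps, metrics like these reduce to simple aggregations over an event log. Here is a minimal sketch in Python, assuming a hypothetical log of (user, event, day) records; the event names and sample data are invented for illustration and do not come from any real analytics SDK:

```python
from datetime import date

# Hypothetical event log: each record is (user_id, event, day).
# Event names ("open", "pin_tile", "activate_tile") are illustrative.
events = [
    ("u1", "open", date(2013, 5, 1)),
    ("u1", "activate_tile", date(2013, 5, 1)),
    ("u2", "open", date(2013, 5, 1)),
    ("u1", "open", date(2013, 5, 2)),
    ("u3", "pin_tile", date(2013, 5, 2)),
]

def daily_active_users(events, day):
    """Count distinct users who opened the app on the given day."""
    return len({u for u, e, d in events if e == "open" and d == day})

def tile_activation_rate(events):
    """Share of users who opened the app and also activated the Live Tile."""
    openers = {u for u, e, _ in events if e == "open"}
    activators = {u for u, e, _ in events if e == "activate_tile"}
    return len(openers & activators) / len(openers) if openers else 0.0
```

With the sample log above, two users opened the app on May 1, and half of all openers activated the Live Tile. The point is not the code itself but that each candidate metric implies a different aggregation, and each aggregation rewards different user behavior.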
I also recommend conducting satisfaction surveys within an app. One of the most challenging decisions developers face is what information to serve through an app. Deliver too much content, or content not optimized for small-screen consumption, and you risk cluttering the experience. Serve too little and you risk user abandonment. You need to ask customers what they prefer.
The impulse to invest only in things a marketer can measure is just as sound in the mobile world as it is in digital and social media marketing. Just be prepared to rethink long-standing assumptions about performance metrics before setting expectations.