The dos and don’ts of product analytics


Analytics help product people realise their product goals by keeping them evidence and outcome driven, and today they are a vital part of any product person's toolkit. This deep dive gives you some background on analytics and some handy dos and don'ts, and shares key insights from a new Mixpanel report that investigates how product teams are using metrics to analyse user behaviour.

Perhaps we should start with the straightforward purpose of analytics: in a nutshell, they measure the state of the product and what users are doing. They tell you what is going on, although not necessarily why. As Mixpanel says in its e-book, The guide to product analytics, you can apply quantitative measurements to any product, no matter how intangible its value may appear to be. It follows that measuring the right things is fundamental to any successful analytics strategy.

Some eight years ago ProdPad CPO and Mind the Product co-founder Simon Cast wrote a blog post for Mind the Product, Everything a product manager needs to know about analytics. Simon says: “If you’re output driven you’re focused on pushing out features and not on whether they make a difference. If you want to be outcome driven rather than output driven, then you need analytics. The use of analytics means you can see whether you’re making improvement in the areas where you want to make improvement. What you don’t measure, you can’t improve.”

The concepts of analytics

Simon’s intention was that his article should be evergreen, and so it has proved to be. It takes us through the concepts of product analytics — data points, segmentation, funnels and cohorts — concepts which still apply today.

Data points

Data points are the individual pieces of collected data that measure particular items, so, as Simon's article explains, collecting the right data points is the foundation of analytics. Mind the Product has lots of content and case studies on data collection and developing a data-driven culture. For example, Helen Hewitt's ProductTank talk, Developing a data-driven culture at Dow Jones, looks at how Dow Jones collects and leverages its vast amounts of data, and the case study Helping a machine to distinguish toilet flush from kitchen tap shows how data collection and data-driven design can solicit and engage users to help improve a product.
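To make the idea concrete, a data point is one recorded measurement: an event with a name, a user, a timestamp and some properties. The sketch below is a hypothetical shape, not any particular vendor's API; the event and property names are made up for illustration.

```python
# A minimal sketch of collecting data points as analytics events.
# Event names, user ids and properties here are hypothetical examples.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Event:
    name: str            # what happened, e.g. "report_exported"
    user_id: str         # who did it
    properties: dict = field(default_factory=dict)   # context for later segmentation
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

log: list[Event] = []

def track(name, user_id, **properties):
    """Record one data point; all downstream analysis builds on these."""
    event = Event(name, user_id, properties)
    log.append(event)
    return event

track("report_exported", "u1", fmt="csv")
print(len(log), log[0].name)  # 1 report_exported
```

Recording properties alongside each event is what later makes segmentation, funnels and cohorts possible, which is why choosing the right data points up front matters so much.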

Segmentation

Segmentation allows you to group similar data based on your chosen parameters so that you can use it more efficiently. Again, Mind the Product has lots of insights and ideas on this: have a look at Rajat Harlalka's post, Are you segmenting your A/B test results?, or the ProductTank San Francisco talk Uncovering your most pivotal users, by Marieke McCloskey and Doug Puett, for further information.
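In code, segmentation is simply grouping records by a chosen parameter so each group can be analysed on its own. This sketch assumes hypothetical record fields (a user's plan and session count):

```python
# Segmentation sketch: group records by the value of a chosen parameter.
# The "plan" and "sessions" fields are hypothetical examples.
from collections import defaultdict

def segment(records, key):
    """Group records (dicts) by the value of `key`."""
    groups = defaultdict(list)
    for record in records:
        groups[record[key]].append(record)
    return dict(groups)

records = [
    {"user": "a", "plan": "free", "sessions": 3},
    {"user": "b", "plan": "pro", "sessions": 12},
    {"user": "c", "plan": "free", "sessions": 1},
]
by_plan = segment(records, "plan")
print({plan: len(rows) for plan, rows in by_plan.items()})  # {'free': 2, 'pro': 1}
```

The same grouping idea underlies segmenting A/B test results: you compare the metric within each segment rather than across the whole population.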

Funnels

A funnel measures the key event at each step of a flow or user journey, and funnel analysis helps to determine the point at which users drop off and abandon the task they set out to complete. The case study How Uniregistry used smoke testing in product validation shows how funnel analysis can be used in product discovery and validation.
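A funnel can be sketched as an ordered list of steps, counting how many users have completed each step and all the steps before it; the gap between consecutive counts is the drop-off. The step names below are hypothetical:

```python
# Minimal funnel sketch: count users reaching each step of a flow.
# Step and event names are hypothetical examples.
FUNNEL_STEPS = ["visit", "signup", "create_project", "invite_teammate"]

def funnel_counts(user_events):
    """user_events: {user_id: set of event names}.

    A user counts for a step only if they have completed every earlier
    step too, mirroring how funnel tools measure progression through a flow.
    """
    counts = []
    for i in range(len(FUNNEL_STEPS)):
        required = set(FUNNEL_STEPS[: i + 1])
        counts.append(sum(1 for events in user_events.values() if required <= events))
    return counts

events = {
    "u1": {"visit", "signup", "create_project", "invite_teammate"},
    "u2": {"visit", "signup"},
    "u3": {"visit"},
    "u4": {"visit", "signup", "create_project"},
}
print(funnel_counts(events))  # [4, 3, 2, 1]
```

Here the largest drop-off is between signup and creating a project and between each later step, which is exactly the kind of signal funnel analysis surfaces for discovery and validation work.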

Cohort analysis

While similar to segmentation, cohort analysis is different, says Simon, because the grouping is done using a point in time and a characteristic of the users, such as traffic source: "The primary purpose of cohort analysis is for comparative analysis to answer the question of how users' behaviour changes over time. Cohort analysis is also important for measuring the long-term value of the user. For example: how does the behaviour of users who registered a week ago differ from that of users who registered a month ago?" The post 7 ways cohort analysis can optimize company performance and results contains some useful advice on cohort analysis.
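Simon's registration example can be sketched directly: group users into cohorts by signup week, then compare a behaviour (here, whether they are still active) across those cohorts. The data shapes are assumptions for illustration:

```python
# Cohort sketch: group users by ISO signup week and compare retention
# across cohorts. Input shapes are hypothetical examples.
from collections import defaultdict
from datetime import date

def cohort_retention(users, active_ids):
    """users: [(user_id, signup_date)]; active_ids: set of currently
    active user ids. Returns {(year, iso_week): retained fraction}."""
    cohorts = defaultdict(list)
    for user_id, signed_up in users:
        year, week, _ = signed_up.isocalendar()
        cohorts[(year, week)].append(user_id)
    return {
        week: sum(1 for u in members if u in active_ids) / len(members)
        for week, members in cohorts.items()
    }

users = [
    ("a", date(2021, 3, 1)), ("b", date(2021, 3, 2)),  # same ISO week
    ("c", date(2021, 3, 8)),                            # one week later
]
print(cohort_retention(users, active_ids={"a", "c"}))
# {(2021, 9): 0.5, (2021, 10): 1.0}
```

Comparing the two fractions answers the question Simon poses: the week-9 cohort retained half its users while the week-10 cohort retained all of them, so behaviour is changing between cohorts.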

Implementing product analytics

The experts we spoke to agree that people continue to implement analytics badly. "People are still measuring the wrong thing, and still doing vanity metrics," says Simon. He points out that the biggest change since he wrote his foundational article has been the emergence of objectives and key results (OKRs). "But are people really tying what they're measuring to their key results?" he asks. "They might be measuring existing vanity metrics and not really measuring whether the changes they're making are, in fact, affecting the key results. This is core to what you try to do with OKRs."

Alex Kulbei, co-founder of data collection and analytics company Probe, says that, in the early stages of a business, founders may feel that they know what's going on, and they're so close to the action that there seems to be no need to invest in metrics. But then the business grows: "Then people try to capture everything and this is not a good approach." And Cassidy Fein, director of product management at Echo360 and an instructor on Mind the Product's Metrics for product managers training workshop, says: "We see a couple of things. First and foremost, people think that they don't really need metrics because they think that they know better. Secondly, people have so much data that they don't know what to do with it, and they don't have a proactive process around what kind of metrics they need to track to validate success." Mind the Product's managing director Emily Tate also comments that she's seen product managers get overwhelmed by the amount of things they track, such that all they're getting is "a lot of noise". "It's hard to find the signals and the little things that matter when you're just tracking everything."

It’s hard to find the signals and what matters when you track everything (Image: Shutterstock)

A product team needs to track different types of metrics, says Emily. Health metrics, like active users and retention rates, serve as a check that the business and product are not going in the wrong direction, and need to be tracked even if no one is actively making decisions about them. "These are your KPIs," says Emily. "Then there are metrics you track because you're trying to understand something or change behaviour, or to see if something you've built is having the impact you were hoping for." But the key is to interrogate your data rather than just track everything, she says: "The data is rarely going to just magically surface some insight that you never would have thought to look for."

Make no mistake, metrics are daunting. As Cassidy says, “it’s really hard to feel confident that the metrics you’re choosing to track and validate against will actually translate into improved outcomes or growth”.

How do you make sure you track the right metrics?

It means starting with a plan, not just implementing whatever tool you may be using. Says Cassidy: "A lot of times people will jump right to the tool, because they think the tool will make them successful, and they don't necessarily implement a holistic process." The article Why do you need to rethink your analytics strategy? looks at what happens when you record and report everything, and digs into how to plan an analytics implementation. Similarly, Alex Kulbei comments that product teams need to define their North Star metric and the key numbers that define their sector, and work out how they can capture them. Alex's post, How to get started with data collection and analytics at your business, published earlier this year, offers some helpful advice on getting started.

Mind the Product's training workshop Metrics for Product Managers also focuses on how to define a North Star metric and then craft an appropriate metrics plan. Emily points out that a product team may be unable to influence its North Star metric directly, so it's important also to look at leading indicators that might influence it; the post Why your North Star metric might be sending you south looks at other reasons to choose additional leading metrics. Says Cassidy: "You need to think about, ultimately, what problems you're trying to solve for the user. It always goes back to the user."

Create a metrics plan

A metrics plan should be developed in conjunction with others. Says Emily: "We talk a lot about balanced teams of the engineers and designers and the people directly associated with the product. Ideally, they should be part of crafting the metrics you use. If you can't get them involved directly in crafting metrics, you at least want to get alignment from your key stakeholders. Your metrics plan is what your team is focused on, your purpose. So you want to align those people who may want you to make different decisions." Encouragingly, Cassidy points out that it can be straightforward to get buy-in for a metrics plan, because execs typically want to see more numbers rather than fewer.

A few essentials to remember when implementing a metrics plan:

Measure the right data

Our experts say that the biggest mistake people make is not knowing what they want to measure. Work back from your OKRs or KPIs to make sure you measure the data that shows whether your key results are changing. If you can't measure directly, use proxy measures that indicate whether the key results are improving or not.

Focus on the users

Your metrics must ultimately serve or reflect your users and their behaviour.

Know there’s no right or wrong answer

As Cassidy points out, people tend to believe there’s always a right and wrong answer solely because numbers are involved, and this just isn’t the case — it won’t always be black and white.

Be dedicated

This is a practice and not a one-time exercise. You need buy-in from stakeholders, and the dedication to review and renew metrics with a regular cadence.

Measuring and reporting product metrics

Once the North Star and other key metrics are defined there comes the question of how to stay on top of and communicate results.

If a company is to be data-driven, then results from metrics need to be communicated efficiently and with relevant context. Alex says it’s the product manager’s job to define what success means for a product and provide this context: “Adoption of my product dropped by 10% this month tells you nothing without some context. Did you release a new feature? Were there any outages? We often find people just look at a spreadsheet or dashboard — it’s just numbers with no context around them.”

Metrics need to be communicated with relevant context (Image: Shutterstock)

Metrics also need to be reviewed and renewed as appropriate. Emily says that although, as ever in product, the answer is "it depends", a quarterly review can be appropriate for many product teams. A quarter is long enough to start to see a change, and short enough to fix something that isn't working before it becomes a problem. "Revisiting your metrics doesn't always mean you'll change them," she says, "and twice a year feels too infrequent." She adds: "Obviously you would be looking at your metrics much more frequently. So if your metrics plan is not driving the results or driving the behaviour you want, you'll know within a couple of weeks."

What, not why

As others have stated, metrics will tell you what’s happening but not why it’s happening. And you shouldn’t assume that you know why. Your quantitative work should be paired with qualitative analysis, as this post, Why you need quantitative AND qualitative data, explains. Says Emily: “It can be really tempting to just test and watch the data and experiment your way into the right things. But even if you have a hypothesis for why numbers are moving, you won’t actually know why until you talk to people. Don’t use data to avoid talking to your customers.”

Extracting value from data

The data in Mixpanel’s report provides a guide to what businesses with a metrics practice are doing to manage and extract value from their data — the results show us time frames and common practice among other insights.

The report splits Mixpanel users into casual and power users. Power users are the top 10%, who use data as the compass for nearly all their product decisions, while casual users use data to inform their product decisions but it doesn't propel them forward in a systematic way. The report finds that power users are intentional about incorporating data into their work: the median casual user runs seven queries a day, while power users run twice as many. Casual users don't spend as much time identifying why numbers have changed and are satisfied with knowing the status of a metric. Power users do daily analysis, spending five days a week with their data, whereas for casual users it's three days a week.

Mixpanel also segments the data by size of company, labelling them tech giants, digital transformers, scaleups and startups. There are power users in every size of company, but those in startups run queries the most often. Gina Gotthilf, co-founder of Latitud, a consultancy for early-stage entrepreneurs, comments on the reasons for this: "Startups need to use data to guide their product and business decisions because they can't afford to go down too many wrong paths or make mistakes. Larger companies can deal with things like bureaucracy, R&D, and so on, because they're not trying to survive."

Tech giants spend the longest with their data — taking 9+ minutes per session — and, while startups do the most with their data, their sessions are under 4 minutes, perhaps because there is less data or because their questions don’t need much investigation.

Hannah Maslar, Mixpanel's Senior Product Marketing Manager, says she had expected to see product teams use more filters and do bigger breakdowns of their data: "I think this simplicity is a good thing. It shows you don't need to be a technical expert, and you can come in and learn something with just a quick high-level analysis of your data." She was also pleasantly surprised by how little time product managers need to spend with their data: "It shows you can really iterate on your products by just spending a little bit of time with your data," she says.

In an environment where everyone tracks so much data, there is added value to the business in doing it well. The results in Mixpanel’s report certainly show there is value to be gained with relatively little effort. Emily sums up: “You won’t necessarily fail if you’re not good at managing metrics. But you reduce your risk when you can do this really well, because you’re not just throwing things at the wall. The added value is being intentional about your business and where your product is going.”

Read Mixpanel’s report: Frequently Asked Queries: Global benchmarks for how product teams analyze product data and user behavior

Discover more content on Product Management Metrics and KPIs.