Analytics and Decision-Making – Recap of August’s ProductTank London

If you’d told me that August’s ProductTank London on analytics and decision-making would be one of the most risqué to date, I’d have struggled to believe you. The talks had everything – sex, drugs and racy pictures of hot plumbing action. Read on for an 18-rated recap.

Making Data Actionable

Matty Curry (@mattycurry) opened with stark warnings (or promises?) of rude words and ruder pictures. Given that he’s director of e-commerce for Lovehoney, one of the largest online purveyors of sex toys, we were obliged to believe him. But hey, it was a warm Friday night in August and everyone had a drink in hand, so the audience was in the mood.

From the slightly incongruous and isolated setting of Bath, Lovehoney’s small and agile team tends to do things for themselves. (Team size isn’t everything. It’s what you do with it that counts. – Ed.) When there’s a problem with the plumbing, they fix it themselves. Their philosophy of self-reliance means they make the occasional mistake – Matty once accidentally set every product’s price to £0.01, prompting what can only be described as ‘a bit of a rush’. But when they make mistakes, they learn from them.

We needed a drink or two to prepare for Matty’s talk

The first thing they’d learned was that attempting to report on absolutely every possible metric was a lesson in futility. They had the vain hope that they might discover the magic bullet for their business, but in their search they were paralysing themselves.

The second thing they learned was that real-time analytics was essentially statistical pornography – and not even good quality porn at that. It goes on forever and doesn’t really lead anywhere.

After each metrics review meeting, they would return to their desks and would continue to do exactly what they’d been doing before. They were not learning anything. Instead, they needed to find the data that would cause them to take action and try something different.

The next thing they learned was that there was no point in reporting on high-level KPIs such as conversion rate. If they wanted to up the conversion rate, they could have a sale. The rate would rise, but it wouldn’t necessarily make any more money – see Matty’s accidental £0.01 fire sale for details.

So instead they needed to drill down into the detail, to find the human behaviour and emotions driving each metric.

When examining view-to-buy rate, they needed to strip out the misleading information that was clouding the real story. Out-of-stock products or fractured products (where 80 percent of the product varieties are out of stock) will have a zero or low view-to-buy rate because people can’t buy them. They also cleared in-stock but unviewed items out of the report. The merchandising team could easily generate some interest in these by sending out samples for review.
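As a rough sketch of that kind of clean-up (with invented column names and thresholds, not Lovehoney’s actual pipeline), the filtering might look something like this in pandas:

```python
import pandas as pd

# Hypothetical product report: one row per product with view and order counts
# plus how many of its varieties (sizes, colours) are currently in stock.
products = pd.DataFrame({
    "sku": ["A1", "B2", "C3", "D4"],
    "views": [5400, 0, 860, 1200],
    "orders": [160, 0, 2, 90],
    "varieties_total": [10, 4, 5, 8],
    "varieties_in_stock": [10, 4, 1, 8],
})

in_stock_share = products["varieties_in_stock"] / products["varieties_total"]

# Strip out the misleading rows: out-of-stock or "fractured" products
# (most varieties unavailable) and in-stock items nobody has viewed yet.
report = products[(in_stock_share > 0.2) & (products["views"] > 0)].copy()

report["view_to_buy"] = report["orders"] / report["views"]

# What's left are genuinely underperforming products worth investigating.
low_performers = report[report["view_to_buy"] < 0.05]
print(low_performers[["sku", "views", "orders", "view_to_buy"]])
```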

With all this dross data out of the way, they could concentrate on products with a low view-to-buy rate. Was it a problem with the content? Do people look at the pictures or the videos showing how to use the sex toy in the first place?

Gosh

They were also able to identify a segment they euphemistically call ‘tyre-kickers’, namely people who are only there for the provocative pictures and videos, and will never actually buy anything. They segmented these out based on their search terms, such as those specifically looking for “sexy lingerie pictures”.
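A minimal sketch of that kind of segmentation, assuming made-up search-term patterns and session data, might look like this:

```python
import re

# Hypothetical search-term patterns that suggest a visitor is only browsing
# for pictures or videos and is unlikely ever to buy ("tyre-kickers").
TYRE_KICKER_PATTERNS = [
    r"\bpictures?\b",
    r"\bphotos?\b",
    r"\bvideos?\b",
]

def is_tyre_kicker(search_terms: list[str]) -> bool:
    """Flag a session whose on-site searches look like browsing, not buying."""
    return any(
        re.search(pattern, term, re.IGNORECASE)
        for term in search_terms
        for pattern in TYRE_KICKER_PATTERNS
    )

# Example sessions keyed by a session id.
sessions = {
    "s1": ["sexy lingerie pictures"],
    "s2": ["bullet vibrator", "quiet vibrator"],
}

segments = {sid: ("tyre-kicker" if is_tyre_kicker(terms) else "shopper")
            for sid, terms in sessions.items()}
print(segments)  # {'s1': 'tyre-kicker', 's2': 'shopper'}
```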

Examining their on-site search also helped to reveal intention. If a customer searches for something, then refines the keywords, chances are they’re not sure what they’re looking for. Could the search algorithm be improved to help them?
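A crude way to spot that ‘searched, then refined’ behaviour (again with an assumed log format, not Lovehoney’s real schema) is to count sessions with more than one query:

```python
from collections import defaultdict

# Hypothetical search log: (session_id, order_within_session, query)
search_log = [
    ("s1", 1, "rabbit"),
    ("s1", 2, "rabbit vibrator"),
    ("s1", 3, "rabbit vibrator waterproof"),
    ("s2", 1, "massage oil"),
]

# Group queries by session, in the order they were made.
queries_by_session = defaultdict(list)
for session_id, order, query in sorted(search_log, key=lambda row: (row[0], row[1])):
    queries_by_session[session_id].append(query)

# A session that refines its search (more than one query) is a hint the
# visitor wasn't sure what they were looking for -- a candidate for better
# search suggestions or merchandising.
refining_sessions = {sid for sid, queries in queries_by_session.items() if len(queries) > 1}
refinement_rate = len(refining_sessions) / len(queries_by_session)
print(f"{refinement_rate:.0%} of searching sessions refined their query")
```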

Similarly, searches that return no results can reveal valuable trends. One such insight led Lovehoney to become the first UK shop other than Amazon to sell E.L. James’s 50 Shades of Grey, and from that they won the deal to be the first store to sell 50 Shades-branded products.

People come to Lovehoney’s site for different reasons, some to learn as well as to buy. As different content serves different purposes, content KPIs should reflect those purposes too. Has the content prompted the customer to fulfil the desired action on that page?
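One way to picture ‘different KPIs for different content’ (a purely illustrative mapping, not Lovehoney’s actual taxonomy) is to judge each content type against its own goal:

```python
# Illustrative mapping from content type to the action that counts as success.
CONTENT_GOALS = {
    "product_page": "add_to_basket",
    "buyers_guide": "click_through_to_product",
    "how_to_article": "newsletter_signup",
}

# Made-up traffic and goal-completion counts per content type.
traffic = {
    "product_page": {"sessions": 12000, "goal_completions": 900},
    "buyers_guide": {"sessions": 2000, "goal_completions": 340},
    "how_to_article": {"sessions": 3500, "goal_completions": 210},
}

# Each content type is judged against its own goal, not a blanket sales KPI.
for content_type, counts in traffic.items():
    rate = counts["goal_completions"] / counts["sessions"]
    print(f"{content_type}: {CONTENT_GOALS[content_type]} rate = {rate:.1%}")
```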

Matty’s takeaways, in brief:

  • Don’t bother trying to report on all of the things
  • Avoid metrics porn – focus on what will prompt you to do things differently as a result
  • Learn the human behaviours behind your metrics
  • Clear out the dross results hiding the real story
  • Search behaviour can reveal customers’ intent
  • Don’t treat all your content the same

Analytics for Non-Retail Environments

Digital analytics consultant Alec Cochrane (@whencanistop) had the unenviable task of following Matty’s talk. Through some case studies of clients he’s worked with, he spoke about how to measure the outcome of transactions when they occur somewhere other than your own site.

He started by recapping the analytics framework he uses with clients.

  • Strategic objective
    • Business objectives
      • Critical success factors (CSFs)
        • Key performance indicators (KPIs)
          • KPI targets
            • Segments

One pharmaceutical company had hired an agency to create content to achieve two objectives:

  1. to tell everyone about how to diagnose and prescribe for a particular condition;
  2. to distinguish this condition from a similar one.

However, they’d only see tangible success if prescriptions for their drug to treat that condition rose. Using this as a basis, they were able to define KPIs that would show whether the agency was writing effective content.
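To make the framework concrete, here’s how it might be filled in for that case (the wording and targets are invented for illustration, not taken from Alec’s client work):

```python
# Invented illustration of the strategic-objective-to-segments framework,
# applied to the pharma content example.
framework = {
    "strategic_objective": "Grow prescriptions of the drug for Condition A",
    "business_objectives": [
        "Educate clinicians on diagnosing and prescribing for Condition A",
        "Differentiate Condition A from the similar Condition B",
    ],
    "critical_success_factors": [
        "Clinicians read, and return to, the agency's educational content",
    ],
    "kpis": [
        "Repeat visits to the diagnosis content by healthcare professionals",
        "Completion rate of the A-versus-B comparison article",
    ],
    "kpi_targets": [
        "20% of visiting clinicians return within 30 days",
    ],
    "segments": [
        "GPs vs specialists",
        "New vs returning visitors",
    ],
}

for level, value in framework.items():
    print(level, "->", value)
```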

Alec mentioned other case studies:

  • a company moving away from online advertising towards lead generation, hosting branded content on its site to drive leads to the companies that provide that content;
  • a pharmaceutical company that answers questions from medical professionals, such as whether it’s safe to prescribe the company’s drug alongside another; and
  • another pharmaceutical company that needed to provide support for patients taking its drug because initial unpleasant side-effects were discouraging patients from continuing their course of medication before they saw the benefits.

To measure outcomes when they happen somewhere else, Alec suggested a few approaches:

  1. Can you connect different data sources? For example, sales conversions from one system and financial transactions from another.
  2. Intent surveys. If the desired outcome for some website content was to encourage the patient to see a doctor, ask half the visitors how likely they would be to go to the doctor at the beginning of the journey, and the other half at the end. You could then measure whether intent was higher among those who had seen the content than among those who hadn’t (see the sketch after this list), although this still wouldn’t tell you whether they had in fact gone to see the doctor.
  3. Econometric studies. These are expensive but look at minute changes over time, and can highlight where to spend money and what effect it has.
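A minimal sketch of how the intent-survey comparison could be scored, using invented 1–5 responses rather than real survey data:

```python
# "Entry" respondents were surveyed before seeing the content, "exit" after.
# Responses are on a 1-5 scale: how likely are you to go and see a doctor?
entry_responses = [3, 2, 4, 3, 2, 3, 4, 2]
exit_responses = [4, 4, 3, 5, 4, 3, 4, 5]

def mean(values):
    return sum(values) / len(values)

# A positive uplift suggests the content increased stated intent, but it
# still doesn't tell you whether anyone actually went to the doctor.
uplift = mean(exit_responses) - mean(entry_responses)
print(f"entry intent {mean(entry_responses):.2f}, "
      f"exit intent {mean(exit_responses):.2f}, "
      f"uplift {uplift:+.2f}")
```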

Thanks

August’s talks were curated by Peter O’Neill (@peter_oneill), who also gave a brief off-the-wrist off-the-cuff (That’s enough now – Ed.) summary of analytics essentials. Drinks were kindly sponsored by Tesco.

Do join us for our next ProductTank London. Tickets will be available shortly (join the waitlist if none are currently available).

You can also get in touch with us if you’re interested in speaking at or curating a ProductTank, writing for our blog on Mind The Product, or sponsoring our events. See you soon!