3 Tools to Validate Product Assumptions with Metrics

The lean accounting compass

As part of our series on beating the competition with an agile and iterative mobile product strategy, we look at one of the guiding principles of Lean Methodology: measuring what’s going on with your product.

If we accept that we’re not right 100 percent of the time, then we know that at some point we’re going to need to decide to change something that isn’t working. And if that’s the case, we need to track what’s working and what isn’t.

Tempting as it is to keep going back to our customers with intercept surveys to see whether they prefer version A or B of the product (known as split testing), can we really keep on surveying time-crunched users? What happens when we want feedback on a weekly basis? Perhaps we can instead look at metrics in our business that tell us whether what we’re doing is better.

With digital engagement, users kick up a whole host of eminently trackable information, and the reality is that we can use this data to attach success criteria to our assumptions.

We all know about analytics, but what to measure?

Lean Methodology avoids vanity metrics such as visitors, users or page views (these are often a function of marketing rather than the product) and instead draws on actionable measurements such as retention and referral metrics - KissMetrics has a great 9-point list.

In the case of a financial services business, the key success factor for the marketing team is likely to be conversion; where the revenue is advertorial, how long someone looks at the content (known as dwell time) is probably more important.

So what do actionable metrics look like? Let’s see with a couple of assumptions...

Example assumption: Push notifications will drive repeat visits to the app

How we could measure it: Percentage difference in average number of return visits over 30 days between users with push notifications enabled and those without

Example assumption: Providing an option to select a time for a call will reduce the number of calls

How we could measure it: Number of people who tap “contact” and opt to enter a preferred call-back time rather than calling directly from the app
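
To make the first assumption concrete, here’s a rough sketch (in TypeScript, using a made-up shape for exported visit data - your analytics tool will export something different) of how that percentage difference could be calculated:

    // Hypothetical shape of a user's visit log exported from an analytics tool.
    interface UserVisits {
      userId: string;
      pushEnabled: boolean;
      visitDates: Date[]; // every recorded app open
    }

    // Average number of return visits per user within 30 days of their first visit.
    function averageReturnVisits(users: UserVisits[]): number {
      if (users.length === 0) return 0;
      const total = users.reduce((sum, u) => {
        const sorted = [...u.visitDates].sort((a, b) => a.getTime() - b.getTime());
        const first = sorted[0];
        if (!first) return sum;
        const cutoff = first.getTime() + 30 * 24 * 60 * 60 * 1000;
        // Return visits = visits after the first one, within the 30-day window.
        const returns = sorted.slice(1).filter(d => d.getTime() <= cutoff).length;
        return sum + returns;
      }, 0);
      return total / users.length;
    }

    // Percentage difference between the push-enabled and push-disabled groups.
    function pushRetentionLift(users: UserVisits[]): number {
      const withPush = averageReturnVisits(users.filter(u => u.pushEnabled));
      const withoutPush = averageReturnVisits(users.filter(u => !u.pushEnabled));
      if (withoutPush === 0) return NaN; // no baseline to compare against
      return ((withPush - withoutPush) / withoutPush) * 100;
    }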

In our workshop we will look at some detailed assumptions, with visuals, to show how we can hook tests into the app build process.

Not getting your metrics in a twist…

Looking at the examples above, it’s pretty clear that these metrics are complex measurements rather than simple volume-based charts.

We’re going to have to get the learnings from these metrics reflected in our product: through new or amended features, through technology changes, and in the associated development cycles.

Agile philosophy and methodologies help us here, because metrics should drive the product requirements. Actionable metrics, chosen as key project success factors and defined alongside the requirements in a project initiation document (PID), generate dependent tasks to implement tracking in the project. For each feature, we ask the following questions (a sketch of what such a metric definition might look like follows the list):

  1. Can success be measured with an existing metric that we can split test with users?
  2. Does this feature need a new metric to be measured? If so what is it?
  3. If it can’t be measured then is it actually of value?
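
As a sketch of what a metric sitting alongside a requirement might look like, here’s one way to record it (the structure and field names are illustrative, not taken from any particular tool):

    // Illustrative structure for recording a success metric next to a requirement
    // in the PID, so that tracking work becomes an explicit, dependent task.
    interface SuccessMetric {
      assumption: string;          // the product assumption being tested
      metric: string;              // the actionable metric that measures it
      existingEvent?: string;      // reuse an existing tracked event, if one fits
      newEventNeeded?: string;     // otherwise, the new event that must be implemented
      splitTest: boolean;          // will this be validated with an A/B test?
    }

    const pushNotificationMetric: SuccessMetric = {
      assumption: "Push notifications will drive repeat visits to the app",
      metric: "Percentage difference in average 30-day return visits, push on vs off",
      newEventNeeded: "app_open with push_enabled property",
      splitTest: true,
    };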

Following the principle of “One Metric That Matters” for a project or start-up, any metric we want to test should be compatible with our cohorts (different user groups), split tests, app versions and campaigns, and should be tracked in our monitoring tool so we can see whether any change in user activity is significant (in terms of success or not).

What’s more, we’ve got different version releases going on, different split tests, different cohorts and a whole bunch of different channels feeding the apps. To make sense of all the metrics we’re tracking, we’re going to need to tool up with analytics software!
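
Here’s a minimal sketch of what that means at the instrumentation level: every event we track carries the dimensions we want to slice by. The wrapper and property names here are our own invention, not a specific tool’s API:

    // Dimensions every event should carry so one metric can be sliced by
    // cohort, split-test variant, app version and acquisition campaign.
    interface EventContext {
      cohort: string;      // e.g. sign-up week or user segment
      variant: "A" | "B";  // current split-test assignment
      appVersion: string;  // release the user is running
      campaign?: string;   // acquisition channel, if known
    }

    function trackEvent(name: string, context: EventContext, props: Record<string, unknown> = {}): void {
      const payload = { event: name, ...context, ...props, timestamp: new Date().toISOString() };
      // In a real app this would go to your analytics SDK; here we just log it.
      console.log(JSON.stringify(payload));
    }

    // Usage: the same event, attributable to a cohort, variant, version and campaign.
    trackEvent("return_visit", { cohort: "2016-W12", variant: "B", appVersion: "2.3.0", campaign: "spring_email" });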

3 tools fit for the job...

So, what tools do we use to gain insight from our metrics? This motley blend of tools forms our usual stack...

  1. Google Analytics (or another visitor tracking tool) - the starting point for user tracking, with powerful features and integrations to measure cohorts and hook into your existing stack.
  2. Apptimize or Optimizely - for multivariate or split testing within the app (a rough sketch of what variant assignment involves follows this list). Some hybrid frameworks are also rolling out their own A/B testing tools (hello there Ionic).
  3. KissMetrics or Mixpanel - to bring it all together as a reporting tool alongside your other metrics, with a focus on specific actionable metrics and on who is doing what.
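
Tools like Apptimize and Optimizely handle variant assignment and significance testing for you, but as a rough illustration of what split testing within the app boils down to, here’s a hand-rolled sketch (not any vendor’s API) that hashes a user id into a bucket so the same user always sees the same variant:

    // Deterministically assign a user to variant "A" or "B" so the same user
    // always sees the same experience across sessions.
    function assignVariant(userId: string, experiment: string): "A" | "B" {
      const input = `${experiment}:${userId}`;
      let hash = 0;
      for (let i = 0; i < input.length; i++) {
        hash = (hash * 31 + input.charCodeAt(i)) >>> 0; // simple 32-bit rolling hash
      }
      return hash % 2 === 0 ? "A" : "B";
    }

    // Usage: pick the variant, render the corresponding UI, and record it on every event
    // so the analytics tool can compare the two groups.
    const variant = assignVariant("user-42", "call-back-time-picker");
    console.log(`Showing variant ${variant} of the call-back time picker`);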

Our vote goes to KissMetrics, for how well their product aligns with this approach. Like any software engineering house, our focus is on creating software that users love, and any platform that helps product owners get that insight is worth its weight - err, bytes - in gold.

Written By: Ollie Maitland