Getting your users to engage
There was a bit of conversation recently around the measurement of engagement within the web analytics industry, and I think this has been brewing at this company for a while too (although under a different name). Whilst Avinash and Steve have been doing their impressions of Frankee and Eamon, this is actually quite an important issue.
A couple of weeks ago I wrote about how you can use HBX to find out which links people are clicking on a page, and before that how you can use Active Viewer to get a little ‘heat’ map of where people are going. Neither of these gives you an overall figure that you can put in front of your manager and say, this is how ‘engaged’ our users are.
So how do we do it? Do we get Research to do a survey so that we can find out how much our users love our site? Can we measure it through HBX? Should we go and ask some of them ourselves?
Well, to cut Avinash and Steve short, the answer to all three of those questions is yes. In order, here is why:
Do we get Research to do a survey so that we can find out how much our users love our site? There is no way to find out what your users think of your website without asking them. What is the best way of doing this? Set up a survey on your website and ask them: “Are you engaged with our website?” Answer yes or no. I know this sounds simplistic, but when you ask the same question in 4, 6, 8 or 12 months’ time, you can find out whether the effort you have been putting into the site has made a difference. Hell, you can even ask “Are you engaged with section A of our website?”. OK, that is a bit simplistic too; maybe we should look back at Avinash’s blog, work out what we want users to do on each part of the website, and then ask whether they felt they’d achieved it: applying for a job (finding one first), reading lots of relevant articles, etc.
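As a rough sketch of what tracking that yes/no question over time might look like (the survey waves and counts below are invented for illustration, not real data):

```python
# Hypothetical quarterly results for the question "Are you engaged with our website?"
# (invented numbers for illustration -- not real survey data)
survey_waves = {
    "Jan": {"yes": 120, "no": 280},
    "May": {"yes": 180, "no": 270},
    "Sep": {"yes": 260, "no": 240},
}

# Engagement rate per wave: yes / (yes + no)
rates = {
    wave: answers["yes"] / (answers["yes"] + answers["no"])
    for wave, answers in survey_waves.items()
}

for wave, rate in rates.items():
    print(f"{wave}: {rate:.0%} answered yes")
```

The point isn’t the single number in any one wave; it’s whether the line moves after you change the site.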
Can we measure it through HBX? Again, this is something we should, and can, do (here I am agreeing with Steve). How do we do it? Sitewide, we can look at average pages per visit (page views divided by visits). This is OK, but we know that some of our users consume loads of pages and some hardly any, so let’s do it differently. Let’s look at the number of visits consuming more than one page (i.e. they have at least in part engaged with the site).
Let’s not pretend though – this metric is pages per visit, not engagement.
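To make the two versions of this metric concrete, here is a minimal sketch. The per-visit page counts are made up for illustration; in practice they would come out of your analytics tool:

```python
# Hypothetical per-visit page counts exported from an analytics tool
# (invented numbers -- not real HBX data)
page_counts = [1, 5, 2, 1, 12, 3, 1, 7, 2, 1]

visits = len(page_counts)
page_views = sum(page_counts)

# Sitewide average: page views divided by visits
pages_per_visit = page_views / visits

# The alternative: share of visits that consumed more than one page
multi_page_rate = sum(1 for n in page_counts if n > 1) / visits

print(f"Pages per visit: {pages_per_visit:.1f}")         # 3.5
print(f"Visits viewing more than one page: {multi_page_rate:.0%}")  # 60%
```

Note how a handful of heavy readers (the 12-page visit) props up the average, while the percentage of multi-page visits tells you how many visitors got past the first page at all.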
Another method we can use is to measure the length of time that our users stay on the site. The theory is that the longer they stay, the more they have consumed (bearing in mind that different readers read at different speeds).
This metric shows not only an average, but also how many visits fall into each 30-second time period. My feelings on averages should be known by now! What we want to do is look at the percentage of visits that last longer than one minute. These users have interacted with our site long enough for us to say they’ve spent a while on it. The average, by contrast, will be skewed by those people spending hours on the site (and there will be some). Remember though – this metric is not engagement; let’s tell people what it is.
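The skew problem is easy to demonstrate with a small sketch. The durations below are invented; one marathon visit is enough to wreck the average while the over-a-minute percentage stays honest:

```python
# Hypothetical visit durations in seconds (invented -- not real HBX data).
# One marathon visit dominates the average.
durations = [5, 20, 45, 70, 90, 150, 30, 8000]

mean_duration = sum(durations) / len(durations)

# The more robust metric: share of visits lasting longer than one minute
over_a_minute = sum(1 for d in durations if d > 60) / len(durations)

print(f"Average time on site: {mean_duration:.0f}s")   # 1051s -- dominated by one outlier
print(f"Visits over one minute: {over_a_minute:.0%}")  # 50%
```

Half of these visits cleared a minute, yet the average claims over seventeen minutes per visit; that is the kind of number you don’t want to put in front of your manager unexplained.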
Should we go and ask some of them ourselves? Now we’ve got a survey result telling us what people think on average (going up or down over time), and we’ve got some HBX stats on how much our users consume. What do we do with these results? Say we’ve made a couple of changes. They’ve worked OK and made a difference to the metrics above, but what else? What you need to do now is go and ask some specific people. Get them in a room, write to them in an email, or ring them up and ask them what they think. They’ll be candid and they’ll let you know what actions you should take. These usability studies will tell you where, when and how your users are engaging.
Hi Alec,
Good post, and I largely agree with you (FURB aside, I respect Avinash’s work, I just think he was misleading the world and his forum needed to be told ;o). I should point out, though, that the engagement index I used was an index of visitors that we classed as engaged.
In other words, we were using an HBX Active segment to filter out the visitors we called “engaged”. We were also measuring engaged visitors in context with other KPIs. So, for instance, reach was measured against an engaged segment and activations compared. This meant we could optimize the campaigns we were working on based on activations/conversions *and engagement*, hence the millions in savings we made by redirecting marketing funds.
I just presented this at eMetrics in Stockholm and it was largely accepted by Jim Sterne and co., so I think I was right. It’s purely a semantics thing: what one man calls engagement, I call visitor lifecycle measurement. I will write more about this later in the month, after I’ve thoroughly examined Eric’s engagement index formula.