A/B testing to improve engagement

In a shock turnaround in fortunes this week I am going to blog about something I am planning on doing rather than something I have just done. Ergo this is probably not going to be much more than my ramblings about how I think it should best be done. Then in a couple of months, when I’ve hit all the pitfalls of doing it and realised that this post was a ‘dream’ scenario, I’ll come back and tell you all about where I went wrong. Sound like a deal?

This actually came out of an Omniture webinar that Mr James Kelway and I were listening to, which gave us a whole host of ideas for things we could do in the future. Specifically, it was around how you could use interaction design and ‘persuasion architecture’ (are they the same thing said in slightly different ways?). Whilst James got excited very quickly about the sorts of things he could test to see if they improved onsite engagement, it fell to me to come up with the ways we could prove it had worked (when we do it, if it works, etc).

First off, we came up with a couple of changes we wanted to make that were relatively minor (text changes, small bits of functionality, etc). However, the idea of A/B testing (or even multivariate testing) is that you have to build each slight change into the site, alongside a control, so that you can work out which option is best. So with that in mind, we picked the two changes that we wanted to include and set about setting up our test. It would involve 4 versions of a page:

  • Control (the original site design)
  • Page with change A
  • Page with change B
  • Page with changes A & B

Ok – we want to find out if change A works. That means we have to compare it to a control (the original version for a page that already exists, or a standard control for one that doesn’t). If we want to make two changes, then suddenly we need 4 pages: each change can be switched on or off independently, so two changes give 2 × 2 = 4 combinations (and we need to test that changes A and B work together).
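To make the combinatorics concrete, here is a quick sketch (the function and names are my own invention, not from HBX or any other tool) that enumerates every version you would need for a given set of changes:

```typescript
// Every on/off combination of N changes gives 2^N variants.
// Illustrative only – the names here are mine, not from any tool.
function enumerateVariants(changes: string[]): string[][] {
  return changes.reduce<string[][]>(
    (variants, change) => variants.flatMap((v) => [v, [...v, change]]),
    [[]] // start from the empty combination, ie the control
  );
}

// Two changes -> four versions: the control plus [A], [B] and [A, B]
console.log(enumerateVariants(["A", "B"]));
// A third change would already mean eight versions of the page.
```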

Now what we are talking about is making some changes that will affect a series of pages (they are the nice fluffy bits around the outside of an article). We don’t really want to test it on just one article, we want to test it on all articles, and this gives us a bit more of a dilemma: we don’t just have 4 pages, we have 4 sites. This makes the whole thing more troublesome, because step number one in the whole process is not to muck up any existing reporting – it would just make us look like idiots if it turns out that the best option is to stay as we are.

So this is how to do it in HBX: use your custom metrics. You may already be using your custom metrics, but this shouldn’t have to impact that (even if you are forced to reuse one). Each custom metric has two dimensions to it, which allows you to use the first dimension to record the fact that you are running an A/B test and the second, associated dimension to record the variant being served (ie control, A, B, A+B).
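I won’t reproduce the actual HBX tag syntax here (check the HBX documentation for the real variable names), but conceptually the page tag just needs two extra values attached to every page view. A rough sketch of the idea, with entirely hypothetical field names:

```typescript
// Hypothetical sketch of the idea, NOT the real HBX tag API.
// Dimension 1 says an A/B test is running; dimension 2 says which variant.
type Variant = "control" | "A" | "B" | "A+B";

interface PageViewTag {
  pageName: string;
  customMetricDim1: string;  // hypothetical: name of the test
  customMetricDim2: Variant; // hypothetical: variant served to this visitor
}

function tagPageView(pageName: string, variant: Variant): PageViewTag {
  return {
    pageName,
    customMetricDim1: "engagement-ab-test", // made-up test name
    customMetricDim2: variant,
  };
}
```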

These custom metrics now allow you to do some clever segmentation without mucking up your existing reporting. All you need to do is set up a segment for visitors who saw each of your site versions (A, B, A+B and control). You can then compare these segments (either using active viewing in the user interface or via report builder) to work out which one works best (more on that in a second).

Obviously this gives you a few technical headaches. You need four concurrently running versions of the site (easier if you have a natural load-balancing set-up). You need to ensure that each one is equally likely to be served (otherwise you’ll end up with skewed results), which means they should all have the same URL. You’ll also need to ensure that the same version is given to the user each time they visit the page and throughout their visit (cookies come in here).
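Something along these lines would handle both the assignment and the stickiness (a browser-side sketch under my own assumptions – the cookie name, variant labels and 90-day lifetime are all mine):

```typescript
// Sticky variant assignment: pick one of the four versions uniformly at
// random on first visit, then reuse it via a cookie on every return visit.
const VARIANTS = ["control", "A", "B", "A+B"] as const;
type SiteVariant = (typeof VARIANTS)[number];
const COOKIE_NAME = "ab_variant"; // hypothetical cookie name

function getOrAssignVariant(): SiteVariant {
  // Reuse a stored variant so the visitor always sees the same site.
  const match = document.cookie.match(
    new RegExp(`(?:^|; )${COOKIE_NAME}=([^;]*)`)
  );
  const stored = match ? decodeURIComponent(match[1]) : null;
  if (stored && (VARIANTS as readonly string[]).includes(stored)) {
    return stored as SiteVariant;
  }
  // Otherwise choose uniformly so no version gets skewed traffic.
  const variant = VARIANTS[Math.floor(Math.random() * VARIANTS.length)];
  const ninetyDays = 60 * 60 * 24 * 90;
  document.cookie = `${COOKIE_NAME}=${encodeURIComponent(variant)}; path=/; max-age=${ninetyDays}`;
  return variant;
}
```

Doing the same thing server-side, keyed off the same cookie, would keep the URL identical across all four versions.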

Persuasion architecture is all about getting the user to go down the route that you want them to take on the page, with the minimum of fuss. That means each step needs to be made clearly and concisely, so that the user wants to continue down the route you have made for them. Forcing a user down a funnel when they don’t understand why they are going down it, or what they will get at the end, is just going to make them want to leave (which they will do in droves).

This means that when you are looking at your results you need to consider two things. Firstly, what is the point of the page (ie what do you want the user to do on it)? Secondly, is one action growing at the expense of another? Getting the user to do more of one thing (eg looking at the specifications of a product) to the detriment of another, more profitable one (eg buying the product) is not the ideal situation.

In our case we want our users to stay on the site for longer and consume more of our lovely pages (whilst coming back more often in the future). So when we set up our persuasion architecture tests, we want to ensure that we are not forcing our users to consume only the pages we are pointing them at. Eg if we are promoting the ‘related articles’ or ‘most read today’ links in the right-hand nav, we need to ensure that it is not to the detriment of in-article links or the natural left-hand nav. We also need to make sure that we aren’t directing users away from other parts of the site that may be more beneficial (sign-up forms, email newsletters, sponsored white papers, jobs, etc). Therefore a bit of weighting may be needed for each set of links.

The advantage of running this test in HBX (and, I am sure, other analytics tools) is that, with the active segments set up, we should be able to find out which version works best not only across all pages (with our page impressions, pages per visit, bounce rates, etc) but also for each page individually (entry pages v single access for each segment per page, links clicked per segment, conversions per segment, etc).
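Once the segment numbers are exported (via report builder or otherwise), the comparison itself is just arithmetic. A sketch of what I mean, with a record shape that is my assumption rather than an HBX export format:

```typescript
// Compare engagement per variant from exported segment totals.
// The record shape and the numbers below are illustrative assumptions.
interface SegmentStats {
  variant: string;
  visits: number;
  pageImpressions: number;
  singleAccessVisits: number; // visits that only ever saw one page
}

function compareSegments(stats: SegmentStats[]): void {
  for (const s of stats) {
    const pagesPerVisit = s.pageImpressions / s.visits;
    const bounceRate = (s.singleAccessVisits / s.visits) * 100;
    console.log(
      `${s.variant}: ${pagesPerVisit.toFixed(2)} pages/visit, ` +
        `${bounceRate.toFixed(1)}% bounce`
    );
  }
}

// Entirely made-up numbers, purely to show the comparison:
compareSegments([
  { variant: "control", visits: 1000, pageImpressions: 2100, singleAccessVisits: 520 },
  { variant: "A+B", visits: 1000, pageImpressions: 2600, singleAccessVisits: 430 },
]);
```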
