First job of this week's post is to introduce my lovely colleague Kate Duffy. OK, it's not that one really, but Kate has actually been writing some posts this week live from SES in New York. Firstly she wrote about Jeffrey Eisenberg and his presentation on people being cats and not dogs. Her second exciting installment is a piece on turning data into action, especially if you are a Star Trek fan.
Secondly, and in my long-running theme of writing about things I've been doing at work, I am going to write about something I have been doing at work. Specifically, I've been looking at a series of segmentations based either on campaigns or on referring domains/URLs. Here is the best way to do it, with a bit of a warning at the start.
As part of your SEO effort, one of the things you may do is build a URL rewriting system so that all of your URLs point to a canonical version. That is, each page now has only one URL for the content within it.
How does this work? Well, you'll have to find your own way of doing it, but one thing you'll need to make sure of is that you remove all query strings and normalise everything to one case.
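To make the idea concrete, here's a minimal sketch in Python. The real rewriting would normally live in your web server's rewrite rules, and the exact canonical form is up to you; the logic is just "drop the query string and fragment, fold everything to lower case":

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalise(url):
    """Reduce a URL to one canonical form: lower-case, no query string, no fragment."""
    parts = urlsplit(url.lower())
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(canonicalise("http://Example.com/Products/Widgets?cmp=email01"))
# http://example.com/products/widgets
```

Every variant of a page's URL now collapses to the same string, which is exactly what you want the search engines to see.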
Why is this important? The reason is that search engines will treat your URL with a query string on the end as a different page from your URL without one, and will therefore split any link juice you have between the two. They may eventually fold it back in when they realise, but you may lose out in the meantime.
What impact does this have? Query strings are your lifeblood if you want to run campaigns all day long, because most web analytics tools rely on them. You send people to a page, the tag loads on the page, and the tag looks at the query string to see if there is a campaign code there. If you've just rewritten your URLs, it won't be there any longer.
How do you get around this problem? Session cookies. Collect your campaign codes in session cookies before you rewrite the URL, then populate the parameters in the tag from the session cookie. Everyone's a winner. Even Errol Brown.
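As a rough sketch of that capture step — with a hypothetical `cmp` parameter name (use whatever your analytics tool expects) and a plain dict standing in for the session cookie — it looks something like this:

```python
from urllib.parse import urlsplit, parse_qs

def capture_campaign(url, session):
    """Stash the campaign code from the query string into the session
    *before* the URL is rewritten to its canonical form.
    'cmp' is a made-up parameter name for illustration."""
    query = parse_qs(urlsplit(url).query)
    if "cmp" in query:
        session["campaign"] = query["cmp"][0]
    return session

session = {}
capture_campaign("http://example.com/landing?cmp=email01", session)
# the tag on the page can now read session["campaign"] ("email01")
# even after the query string has been stripped from the URL
```

Run this before your rewrite rule fires and the campaign code survives the redirect to the canonical URL.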
Anyway, this aside (always track all your campaigns – emails, PPC, RSS, banners, affiliates, etc.) brings us on to our next section on segmenting your data. Or adding context, as Avinash put it. If you want to know why your traffic is going up or down, you have to segment it.
My tricks for segmenting always start this way:
- Look at the total visits to the site over a time period
- Look at your key entry content over a time period
- Compare the two
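The steps above boil down to a very simple comparison. With made-up daily visit numbers (both series are assumptions for illustration), the idea looks like this:

```python
# Hypothetical daily visit counts for the whole site and for one key
# set of entry pages, over the same five-day period.
total_visits = [1200, 1250, 1400, 1100, 900]
product_entries = [300, 310, 420, 180, 120]  # assumed segment

def pct_change(series):
    """Percentage change from the first to the last value in the series."""
    return 100.0 * (series[-1] - series[0]) / series[0]

print(f"Site overall: {pct_change(total_visits):+.0f}%")       # -25%
print(f"Product entry pages: {pct_change(product_entries):+.0f}%")  # -60%
# The segment has fallen much faster than the site as a whole --
# that's where to start digging.
```

A segment dropping faster (or rising faster) than the site average is the signal that tells you where to look next.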
This first level of segmentation will usually help you the most. You'll have key sets of pages that you're sending your users into, and you want to be able to work out whether one has gone up more than the others. More specifically, you'll probably get a better idea of which of your campaigns is working best (all campaigns should have different types of landing pages).
The next option is to segment your campaigns by the entry point. This is actually really easy to do in Google Analytics:
I've just gone into the search engines report and chosen to segment by landing page from the drop-down. You can choose the same option in the campaigns report and the referring sites report once you've clicked through to each of them.
The other important thing about this table is the search box at the bottom, which lets you filter the content to URLs containing a certain set of words; if your URL system is structured into the right folders, you can use that too. Also, don't forget that you can then compare this segment on the graph against the whole site.
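That contains/folder filter is easy to sketch outside the tool as well — here with made-up landing-page paths, assuming your URLs are organised into sensible folders:

```python
import re

# Hypothetical landing pages pulled from a report
landing_pages = [
    "/products/widgets",
    "/products/gadgets",
    "/blog/2007/03/ses-new-york",
    "/about/contact",
]

# Keep only entries in the /products/ folder -- the same idea as
# typing "products" into the search box under the report table.
product_pages = [p for p in landing_pages if re.match(r"^/products/", p)]
print(product_pages)  # ['/products/widgets', '/products/gadgets']
```

This is why a sensible folder structure pays off twice: once for your users, and again every time you want to slice a report.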
In HBX it is just as easy to do this if you have Report Builder and you are looking at referring domains. Choose the cross-referenced metrics and the referring domains and entry pages report, and then you are away. The advantage in HBX is that, because you've structured your multi-level content in a sensible way, you can use that structure in your filters. The disadvantage is that if you want to do it based on campaigns, you'll need to create segments for your campaigns first.
How does this help? Well, it means that when I was sitting trying to work out why one of our sites' traffic had gone down this week, I could isolate the cause very quickly (a broken RSS feed). And when I am reporting which areas of the site are going up and why, I can isolate that equally quickly.