All About Website Usability Blog – Holly Phillips


Google Website Optimizer vs other multivariate tools
August 27, 2009, 4:27 am
Filed under: multivariate testing

OK, my last post waxed poetic about GWO. For our purposes it was the perfect tool and let us do everything we needed. But there are many other multivariate tools and vendors out there, and in some cases one of those would be a better choice than GWO.

We ran into a few limitations with GWO. The main one was its inability to track success events separately. If you have an ad where you’d be equally happy if someone clicked either “Send me a quote” or “Contact me”, you have to identify both of those as success events, and GWO then tracks them as a single combined goal rather than reporting each one on its own. This wasn’t a problem for us, because we have a separate site metrics package that lets us split out the success events if we need to. But it may be a limiting factor for other companies or purposes.
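For concreteness, here is roughly what lumping both calls-to-action into one goal looks like in practice. This is a minimal sketch, not code from our test: it assumes the classic ga.js-based GWO conversion script, and the account ID, test ID, and button IDs are all placeholders.

```typescript
// Sketch: count a click on either CTA as the same GWO success event.
// Both handlers fire the same conversion pageview, which is why GWO
// reports them as one combined goal rather than two separate ones.

// The ga.js tracker object is a browser global loaded by Google's script.
declare const _gat: {
  _getTracker(account: string): { _trackPageview(path: string): void };
};

function recordGwoConversion(): void {
  try {
    // Same call the standard GWO conversion script makes on a goal page,
    // triggered here from a click instead of a page load.
    const gwoTracker = _gat._getTracker("UA-XXXXXXX-X"); // placeholder account
    gwoTracker._trackPageview("/1234567890/goal");       // placeholder test ID
  } catch (err) {
    // Never let a tracking error block the click-through itself.
  }
}

// Hypothetical button IDs for the two calls-to-action.
document.getElementById("send-me-a-quote")?.addEventListener("click", recordGwoConversion);
document.getElementById("contact-me")?.addEventListener("click", recordGwoConversion);
```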

Another minor limitation is the inability to specify how much traffic is sent to each variation (GWO splits the traffic evenly between all variants). This became especially disappointing as the test went on and it became clear that the current design was under-performing. It would have been nice to send only 10% of the traffic there and the remaining 90% to the other variants. But if we had really wanted to do this, we could have devised a pretty simple workaround, along the lines of the sketch below.
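We never built that workaround, but one obvious shape for it is to weight visitors client-side before they ever reach the tagged pages. A minimal sketch, with made-up paths and weights:

```typescript
// Sketch of a weighted traffic split done outside GWO: send ~10% of visitors
// to the under-performing original and ~90% to the page running the experiment.
// Paths and weights are illustrative only.

const variants = [
  { path: "/landing-original.html", weight: 0.1 }, // throttled under-performer
  { path: "/landing-test.html", weight: 0.9 },     // page tagged for the GWO test
];

function pickVariant(): string {
  const roll = Math.random();
  let cumulative = 0;
  for (const v of variants) {
    cumulative += v.weight;
    if (roll < cumulative) {
      return v.path;
    }
  }
  return variants[variants.length - 1].path;
}

// Redirect unless the visitor is already on the chosen page. A real version
// would also store the assignment in a cookie so returning visitors stay in
// the same bucket instead of being re-rolled on every visit.
const target = pickVariant();
if (window.location.pathname !== target) {
  window.location.replace(target);
}
```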

Tim Ash, in his excellent book Landing Page Optimization, goes into more detail on the pros and cons of GWO. (The book is also a great primer on multivariate testing in general and is definitely worth a look.) For basic A/B or multivariate testing, or if you’re just starting out in this area, I still recommend you give GWO a try.


4 Comments so far

There is a workaround for tracking goals separately. See:

http://www.gwotricks.com/2009/01/multiple-goals.html

Comment by Eric Vasilik

Great! Thanks Eric – that was one of the main complaints we had with GWO so it’s great to know there’s a workaround.

Comment by hollyhphillips

Hi Holly,

Regarding the splitting of traffic: if you really feel that the original is significantly under-performing, you could stop the experiment and start a follow-up experiment. This takes just a minute and won’t require you to retag any pages. The follow-up lets you run one variation against the original and does allow you to run, for example, a 90/10 split.

Keep in mind, unless you’re seeing red or green bars, you can’t be statistically sure that the variation is actually better or worse.

Comment by Trevor Claiborne
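For a rough sense of what “statistically sure” means here, a two-proportion z-test is one standard way to check it. The sketch below uses made-up numbers and is not necessarily the same math GWO uses behind its red and green bars.

```typescript
// Two-proportion z-test: is the variation's conversion rate really different
// from the original's, or could the gap be noise? Figures are made up.

function zScore(convA: number, visitorsA: number, convB: number, visitorsB: number): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// |z| above ~1.96 corresponds to roughly 95% confidence that the difference is real.
const z = zScore(120, 4000, 150, 4000); // original vs. variation (made-up counts)
console.log(`z = ${z.toFixed(2)} -> ${Math.abs(z) > 1.96 ? "significant" : "not significant"}`);
```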

Yes, we had thought of that, but in our case we had to route people to the original page if they weren’t in the US or UK (we ran the A/B test in English only). So we couldn’t take down the under-performing page until we could fix it for the other languages. Otherwise, what you suggest is a simple and quick solution — we just have to try to avoid these corner cases 🙂

Comment by hollyhphillips



