Splitter: A split-testing plugin for WordPress

I built a new little WordPress plugin called Splitter, and I’m using it to run A/B tests on Truth (plus lies). Its purpose is to show different users slightly different page layouts and report back to Google Analytics who was looking at what. It works by randomly setting a cookie for new visitors and then using JavaScript to show/hide elements based on that cookie. It reports back to Google Analytics by setting a custom variable with the cookie’s value.

Here is the source for the plugin, which is made up of three files — two JavaScript, one PHP. The PHP file splitter.php defines the plugin and its settings, and inserts JavaScript based on those settings into the header of the page. The JavaScript file splitter.js does the work of actually setting the cookie, modifying the page, and setting the Google Analytics custom variable.
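
Stripped down to its essentials, the splitter.js side works something like this sketch (the function names, the cookie lifetime, and the plain-DOM calls are simplifications of mine, not the literal source):

```javascript
// A simplified sketch of what splitter.js does, not the literal source.
function readCookie(name) {
    var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
    return match ? match[1] : null;
}

function splitterInit(cookieName, selector, states) {
    var value = readCookie(cookieName);
    if (!value) {
        // New visitor: pick a state at random and remember it (the 30-day lifetime is a guess)
        value = states[Math.floor(Math.random() * states.length)];
        document.cookie = cookieName + '=' + value +
            '; max-age=' + 60 * 60 * 24 * 30 + '; path=/';
    }
    if (value === states[1]) {
        // Visitors in the "B" state get the tested element hidden
        var els = document.querySelectorAll(selector);
        for (var i = 0; i < els.length; i++) {
            els[i].style.display = 'none';
        }
    }
    return value;
}
```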

As for trigger_yoast.js — unfortunately, I ended up coding Splitter to work specifically with Yoast’s Google Analytics plugin. This was a tricky issue for me to figure out because the custom variable has to be set after the Google Analytics scripts are included but before the pageview event is fired. With a Google Analytics plugin, Splitter’s JavaScript can’t run until after the plugin is initialized, but the plugin then has to wait for Splitter to set the variable before it can finish what it’s doing. Yoast’s plugin lets me do that by providing the “Where should the tracking code be placed” and “Custom Code” settings. If the first is set to “Insert manually”, then it’s up to me to trigger the pageview event, so I added a footer action in splitter.php to do that. Then I set the “Custom Code” to call the function in splitter.js — gaVariable() — that sets the custom variable.
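
In other words, the code that ends up on the page runs in this order (a rough sketch; the _setCustomVar slot and scope shown here are placeholders, not necessarily what the settings produce):

```javascript
// Called from Yoast's "Custom Code" box: the _gaq queue exists at this point,
// but no pageview has been tracked yet.
function gaVariable() {
    var state = readCookie(splitterCookieName);   // cookie name comes from the Splitter settings
    if (state) {
        // Slot 1 and visitor-level scope are placeholders here
        _gaq.push(['_setCustomVar', 1, splitterCookieName, state, 1]);
    }
}

// With the tracking code set to "Insert manually", the footer action added by
// splitter.php emits the pageview itself, so it always runs after gaVariable():
_gaq.push(['_trackPageview']);
```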

I used this to explore whether a couple of the widgets I’ve got in the sidebar of Truth (plus lies) were worth their real estate. The first was the tag cloud. I decided that the goal of my tag cloud was to encourage visitors to explore the most common topics on my blog, so if it was successful, the tag cloud would be driving down the bounce rate.

In my Splitter settings, I created state A, which showed the tag cloud widget, and state B, which hid it. The widget’s element is picked out by a CSS selector. I defined my cookie name as ‘tagcloud’ and my two states as ‘yes’ and ‘no’. With this information, I created two custom segments in Google Analytics that matched on the custom variable name and values, which are the same as in the cookie. Then I applied my segments and checked out the Bounce Rate report for the home page of my blog. (The home page is the only page that had the tag cloud.)
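
In terms of the sketch above, the whole test boils down to one call (the selector here is a made-up stand-in for the tag cloud widget’s real one):

```javascript
// Hypothetical selector; the cookie name and state values are the real settings.
splitterInit('tagcloud', '#tag_cloud-2', ['yes', 'no']);
```

Here were my results: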

[Screenshot: Google Analytics report]

The bounce rates were 60.53% overall, 68.18% with the tag cloud, and 53.33% with no tag cloud. So, a definite win for removing the tag cloud, right? Not necessarily. Here’s where I’m a total beginner at A/B testing. These results were for just 65 total pageviews — 40 with the tag cloud and 24 without. (I don’t know where the 65th one went.) This leads me into the biggest A/B testing trap: statistical significance. Crazy things can happen by chance, so it’s important to be rigorous about making sure your test results actually say something useful about your design.
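
The usual check here is something like a two-proportion z-test. A rough sketch (the example counts are stand-ins picked to match the percentages above, not my real visit numbers):

```javascript
// Rough two-proportion z-test: |z| above about 1.96 means the gap between two
// bounce rates is unlikely to be pure chance at the 95% level.
function zTest(bounces1, visits1, bounces2, visits2) {
    var p1 = bounces1 / visits1;
    var p2 = bounces2 / visits2;
    var pooled = (bounces1 + bounces2) / (visits1 + visits2);
    var se = Math.sqrt(pooled * (1 - pooled) * (1 / visits1 + 1 / visits2));
    return (p1 - p2) / se;
}

// With stand-in counts like zTest(15, 22, 8, 15), z comes out around 0.9,
// nowhere near 1.96, so a gap this size in a sample this small proves nothing.
```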

The home page of Truth (plus lies) doesn’t get too many hits. Most of the traffic goes to single post pages, so I decided to end my split test without a strong conclusion. I did go ahead and remove the tag cloud though. Not only did my aborted test show no advantage, but some click tracking I was doing at the same time showed that no one ever clicked the tag cloud.

Before doing my second test, I improved Splitter by automatically turning my A/B tests into A/B/A tests. That is, when you create two states to test, the plugin will create a third “control” state that is a copy of the A state. The control state is tracked separately, so in Google Analytics, you’ve got three segments to look at. But now you know that two of these segments should be producing pretty much the same results and that any difference between them is pure chance, which is helpful when you’re trying to decide if a difference between states A and B is significant.
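
Sketched out, the assignment now looks roughly like this (the even three-way split and the ‘-control’ label are simplifications of mine):

```javascript
// Rough sketch of the A/B/A assignment. The control bucket gets state A's layout
// but reports its own label, so Google Analytics sees it as a third segment.
function pickState(stateA, stateB) {
    var roll = Math.random();
    if (roll < 1 / 3) {
        return { value: stateA, hide: false };                // A
    } else if (roll < 2 / 3) {
        return { value: stateB, hide: true };                 // B: element hidden
    } else {
        return { value: stateA + '-control', hide: false };   // control: same layout as A
    }
}
```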

Armed with this new feature, I set up a test that either shows or hides the “About Me” widget in Truth (plus lies)’s sidebar. The test is actually still running; it’s been going for over a month now. Here are the current results in Google Analytics:

[Screenshot: Google Analytics report]

Still ambiguous! There’s a slight advantage for the “With” states. But for bounce rate, the difference between “With About Box” and “With About Box (Control)” is bigger than the difference between the control state and “No About Box”. A little clearer are the numbers for “Pages/Visit” and “Avg. Time on Site”. Both show “With” and “Control” close together and doing better than “No About Box”.

But only slightly. More than anything, my tests show me that neither of these widgets makes much difference to users.
