Testing, testing, 1, 2, 3

A/B testing is the best way to figure out which marketing messages and creative content appeal most to your audience. Learn how the SilverStripe Marketing team implements variant testing so you, too, can increase your website’s engagement and conversions.

Improving onsite engagement with your audience is an iterative process that changes day to day but, when done well, delivers notable gains. A significant part of this process is constantly changing and updating content, but simply making assumptions about which content will be most effective is risky. We take the guesswork out of the equation by running experiments on our site. Let’s look at how you can use various testing tools to identify what content and imagery resonate best with your audience.

We regularly use testing to investigate how users interact with our content. The results of these tests tell us which content is most engaging and which needs a refresh. The specific types of testing we currently use are A/B testing, multivariate testing and split-URL testing. Each lets us compare different variations of our pages against the metrics we care about.

Types of tests

A/B testing

Potentially the simplest to understand, A/B testing compares two versions of a web page to determine which performs better. The two variants, let’s call them A and B, are compared by presenting one portion of your site traffic with A and the other portion with B. Whichever produces the better conversion rate is considered the more effective, and we can keep iterating on the winner until we’re happy with the results.
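
Under the hood, this split is just deterministic bucketing of visitors, which tools like VWO handle for you. As a rough illustration, here’s a minimal PHP sketch (a hypothetical helper, not how any particular tool implements it) that assigns each visitor a stable variant:

```php
<?php

/**
 * Assign a visitor to variant 'A' or 'B' deterministically.
 * Hashing a stable visitor ID means the same person always sees
 * the same variant on repeat visits, which keeps results clean.
 * (Illustrative sketch only; tools like VWO do this for you.)
 */
function assignVariant(string $visitorId): string
{
    // crc32 gives a stable integer for a given string
    $bucket = crc32($visitorId) % 100;

    // 50/50 split; adjust the threshold for an uneven split
    return ($bucket < 50) ? 'A' : 'B';
}

// Example: bucket a visitor identified by a cookie value
echo assignVariant('visitor-12345'); // same output for this ID every time
```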

Screenshot: VWO A/B test results dashboard

This is really great for small experiments, such as comparing different headings, imagery or even button colours, where we’re looking to make 2–3 variations. It is particularly useful for small websites, as it does not require large amounts of traffic to obtain a valid result.

This does, however, become problematic if we need to test a larger number of variations: say, 3 headings, 3 types of forms and 4 call-to-action buttons, which is 3 × 3 × 4 = 36 possible combinations. How now brown cow? Multivariate testing, that’s how!

Multivariate testing

Unlike A/B testing, which is great for comparing two or three variations of a page, multivariate testing lets you experiment with as many changes, and combinations of those changes, as you want. As you make edits to your landing pages through a multivariate editor such as VWO, the tool generates every possible combination for you to test, then tells you which combination performs best.
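
To see where the combinations come from, here’s the Cartesian product a multivariate tool builds behind the scenes (the copy below is made up for illustration; VWO generates and serves these for you):

```php
<?php

// Every heading combined with every form and every button:
// 3 x 3 x 4 = 36 variants to test. (Illustrative values only.)
$headings = ['Heading 1', 'Heading 2', 'Heading 3'];
$forms    = ['Short form', 'Long form', 'Two-step form'];
$buttons  = ['Buy now', 'Learn more', 'Get started', 'Try free'];

$combinations = [];
foreach ($headings as $heading) {
    foreach ($forms as $form) {
        foreach ($buttons as $button) {
            $combinations[] = [$heading, $form, $button];
        }
    }
}

echo count($combinations); // 36
```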

Screenshot: VWO multivariate test dashboard

With multivariate testing, the page you’re testing needs a sizable amount of traffic (your sample size) and a reasonable number of conversions to produce relevant data; otherwise you’ll be waiting a long time to reach a statistically significant result. How quickly a test becomes conclusive depends on several factors, such as your existing conversion rate, daily traffic and the number of variations. You can get an indication of how your test might run by using the VWO calculator.
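
To get a feel for the numbers, here’s a back-of-the-envelope duration estimate in PHP. It uses the common approximation of roughly 16 × p(1−p) / (p × MDE)² visitors per variation (at 95% confidence and 80% power); this is our own rough sketch, and the VWO calculator does the job properly:

```php
<?php

/**
 * Rough estimate of how many days a test needs to run.
 * Sample size per variation is approximated as:
 *   n = 16 * p * (1 - p) / (p * mde)^2
 * where p is the baseline conversion rate and mde is the minimum
 * relative lift you want to detect (e.g. 0.10 for a 10% lift).
 * A sketch only; use your tool's calculator for real planning.
 */
function estimateTestDays(
    float $baselineRate,  // e.g. 0.03 for a 3% conversion rate
    float $mde,           // e.g. 0.10 to detect a 10% relative lift
    int $variations,      // total variations, including the original
    int $dailyVisitors    // visitors entering the test per day
): float {
    $absoluteEffect = $baselineRate * $mde;
    $perVariation = 16 * $baselineRate * (1 - $baselineRate) / ($absoluteEffect ** 2);

    return ($perVariation * $variations) / $dailyVisitors;
}

// Example: 3% conversion rate, 10% lift, 36 combinations, 2,000 visitors/day
echo round(estimateTestDays(0.03, 0.10, 36, 2000)); // ~931 days: far too long!
```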

The level of traffic your site experiences will limit the number of variations you can test. We recommend starting small, perhaps a simple A/B test with messages tailored to two distinct personas.

Split-URL testing

Split-URL testing is the process of testing multiple versions of your website hosted on different URLs. Your website traffic is split between the variations, and conversions are measured to decide which version wins. The key difference from an A/B test is that in a split-URL test the variations live at different URLs.

As a rule of thumb, A/B testing is preferred when you’re testing only front-end cosmetic changes, while split-URL testing is preferred when back-end or significant design changes are being made.
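
In practice, a split-URL test simply redirects a share of visitors to the variant’s URL. Here’s a minimal server-side sketch (the URLs are hypothetical, and tools like VWO manage the redirect and tracking for you):

```php
<?php

// Reuse the deterministic bucketing idea from the A/B example:
// send half of all visitors to a variant hosted at its own URL.
$visitorId = $_COOKIE['visitor_id'] ?? bin2hex(random_bytes(8));
setcookie('visitor_id', $visitorId, time() + 60 * 60 * 24 * 30);

if (crc32($visitorId) % 100 < 50) {
    // Hypothetical variant URL
    header('Location: https://example.com/landing-v2', true, 302);
    exit;
}

// Otherwise fall through and serve the original page at this URL.
```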

Image: A/B testing versus split-URL testing

What are we testing?

At SilverStripe, our marketing team uses these types of tests to explore the content direction for a new product or feature. We may decide to focus on a particular user group and tailor messages to suit that audience.

These tests give us clarity on what users are looking for and, in some cases, how easily they find it. They show us which content is most popular and which content incites engagement in the form of a click-through: the interaction that is our ultimate goal with any content. We can also prove or disprove any assumptions we make as a team about our content. Regardless of how many brains your team brings together, there is no way to guess exactly what your ideal audience is really looking for.

Getting started

1. Choose an A/B testing solution
Rather than building your own A/B testing tools, we recommend you use one of the great solutions out there, such as Optimizely or VWO. They are easy to use, quick to set up and will save you a lot of time. We found the latter suits our needs best, but you should do a little research to find the solution that works for you.

2. Install your chosen solution
A lot of solutions out there are incredibly easy to set up, usually requiring you to insert a code snippet into the <head> of the page HTML. We implemented this on our site by simply adding a DataExtension, which adds a TextField to the page’s “Settings” tab, then outputting the value of that field in our page template. We copy and paste the A/B testing tool’s snippet into the field, publish, and away we go.
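
Here’s a minimal sketch of that extension, assuming SilverStripe 4 conventions (the class and field names below are our own; adapt them to your project):

```php
<?php

use SilverStripe\ORM\DataExtension;
use SilverStripe\Forms\FieldList;
use SilverStripe\Forms\TextareaField;

/**
 * Adds a snippet field to each page's Settings tab.
 * Apply it to pages via YAML config:
 *
 *   SilverStripe\CMS\Model\SiteTree:
 *     extensions:
 *       - ABTestSnippetExtension
 */
class ABTestSnippetExtension extends DataExtension
{
    private static $db = [
        'ABTestSnippet' => 'Text',
    ];

    public function updateSettingsFields(FieldList $fields)
    {
        $fields->addFieldToTab(
            'Root.Settings',
            TextareaField::create('ABTestSnippet', 'A/B testing snippet')
        );
    }
}
```

Run dev/build to add the new database column, then output the field inside the <head> of your page template with $ABTestSnippet.RAW so the script tags aren’t escaped.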

3. Have a play around
These are really powerful tools with a lot of features to get your head around, especially if you’re new to this. To that we say “dive straight in”. Both have great guides on how to run an insightful test: see the VWO guides and the Optimizely guides.

Screenshot: resources from the testing tools (image credit: VWO and Optimizely)

Findings

Finally, plan for what actions you’ll take once you’ve finalised your test. The results could be overwhelmingly in favour of your original assumption or change. However, they could also prove that your change made no difference at all, or even reduced traffic! Either way, plan for what you may need to change following the test.

The results from the testing tools will be in flux for the first few weeks. If you can, hold off on drawing any conclusions until you see a trend emerging: something that remains constant week to week. And at the outset, remember to set yourself a clear goal; how will you measure success if you don’t know what the test is trying to achieve?

After a few weeks have gone by, you’re bound to be wrapped up in other work, making it harder to remember to check your test and find time to implement the changes. This is where your original goals play a vital role. If your goal was to test which variant increased traffic to a particular page on your site, you’ll be able to look at those results and choose the most successful option. Implementing the changes should be simple; you’ve already created all the combinations in one way or another. Now you can hit publish with confidence, because you have the data to back up your decision.

About the author
Dani Smith

Bridging the gap between design and marketing, digital and print. 
