Whether it’s designing newsletters or online advertisements, marketers are constantly trying to improve their online marketing methods to appear more user-friendly to potential customers. Ultimately, they want to optimise all possible online marketing opportunities to obtain more conversions, which results in more profit for the company. It doesn’t matter if you’re optimising a whole website, individual functions, or individual landing pages: A/B testing has proven to be a good means of achieving an optimal web design over time.

A/B testing: the basic principle

A/B testing (also known as 'split testing') is a method used to compare different versions of a website, or its individual elements and functions. Here the original version of a website is usually tested against a slightly altered, improved version to see which fares best. The target group (i.e. the website visitors, or users) is divided into group A and group B. A different version of the test object (version A and version B of the advert, for instance) is shown to each group; their reactions are recorded and then compared. The desired reactions have been previously defined by the website operator. For an advertisement, the desired reaction could be a click or a conversion. If it is a landing page test, this might be a download or newsletter registration.
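This random division of visitors can be sketched in a few lines of Python. The snippet below is a minimal illustration, not code from any particular testing tool (the function and experiment names are invented): hashing the visitor ID keeps each visitor in the same group on every visit, while still splitting traffic roughly 50/50.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "cta-colour") -> str:
    """Deterministically bucket a visitor into group 'A' or 'B'.

    Hashing the visitor ID together with an experiment name means the
    same visitor always lands in the same group, while different
    experiments split the audience independently of each other.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Deterministic bucketing like this is usually preferable to a per-request coin flip, because a returning visitor always sees the same version instead of bouncing between the two.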

Why is A/B testing used?

Website A/B testing is used in various disciplines of online marketing. It’s a matter of personal preference what you decide to test: whether it is the whole site, single elements, the wording, or just the colour scheme. Unlike multivariate testing, split testing only tests one hypothesis at a time (i.e. a single variable). By comparing the two versions, the following aspects can be tested and optimised:

  • Web design
  • Revised features and functions
  • Landing pages
  • Website elements such as call-to-action (CTA) buttons
  • AdWords advertisements
  • Newsletters

You can try out as many different usage scenarios as you want. Here are three examples:

  • An online retailer notices a rise in visitors cancelling their shopping baskets and not going ahead with their purchases. A possible solution would be to emphasise individual elements in the purchasing process and then test the new, improved version against the original.
  • A web service wants to show targeted advertisements for one of the products on offer. They can test two different versions of the ad by using a different keyword in each of them.
  • A blogger who finances their work through advertising (i.e. a publisher) is looking to increase their visitor numbers. They can test different titles for their articles, or vary the use of images.

Website A/B testing: step by step

You shouldn’t implement any A/B testing without first coming up with an in-depth strategic plan. Many marketers adhere to a strict basic principle:

1. Identify problems

An optimisation can only be carried out if there’s something to improve. The first step, therefore, is to identify the problem. This could be, for example, that a button isn’t being clicked on enough. Once the problem has been detected, the goal is obvious. In this case, it would be more clicks on the CTA button.

2. Research and collect ideas

Before you formulate a hypothesis, you should acquire some background knowledge and research accordingly. It’s easy to claim that a blue button would work better than a red one, but without empirical evidence this claim is unfounded. It is useful to look at some studies in advance, e.g. research that analyses the influence of colour use on user behaviour. In addition, you may also pick up some good tips on which elements can be modified.

3. Define hypotheses

You can formulate the hypothesis once you have an impression of the results of your research. This could be: a yellow CTA button will result in a higher click-through rate (CTR). Or, if you changed the position of a menu item, the prediction might be: this menu item will be easier to find in the new version than in the old one.

4. Test phase

Create two versions of the site that you want to test, e.g. version A with a blue CTA button and version B with the new, yellow CTA button. Split testing enables both versions to run against each other in the test. This can happen at different times or through different URLs. With A/B testing software, it’s possible to forward the user at random to one of the two options.
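As a rough sketch of what the testing software does during this phase (the URLs and counters below are hypothetical, not from any real tool), the test boils down to forwarding each visitor to one version at random and logging exposures and desired reactions:

```python
import random
from collections import defaultdict

# Hypothetical landing-page URLs for the two versions under test.
VARIANT_URLS = {"A": "/landing-blue-button", "B": "/landing-yellow-button"}

exposures = defaultdict(int)    # how many visitors saw each version
conversions = defaultdict(int)  # desired reactions per version

def serve_visitor() -> str:
    """Forward a visitor to version A or B at random and log the exposure."""
    variant = random.choice(("A", "B"))
    exposures[variant] += 1
    return VARIANT_URLS[variant]

def record_conversion(variant: str) -> None:
    """Record a desired reaction (e.g. a click or sign-up) for that version."""
    conversions[variant] += 1
```

The exposure and conversion counts per version are exactly the figures the evaluation step needs.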

5. Analysis and report

If the test has reached a sufficient number of samples over a certain time period, evaluation can take place. If it turns out that the click-through rate has improved significantly with the yellow CTA button, this version will now become the new 'original'.
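One common way to check whether such an improvement is significant is a two-proportion z-test, sketched below using only the standard library (the figures in the usage note are made up for illustration):

```python
from math import sqrt

def two_proportion_z(clicks_a: int, n_a: int, clicks_b: int, n_b: int) -> float:
    """z statistic for the difference between two click-through rates.

    Under a two-sided test, |z| > 1.96 corresponds to p < 0.05, i.e. the
    difference is unlikely to be due to chance alone.
    """
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se
```

For example, 200 clicks from 10,000 impressions (version A) against 260 clicks from 10,000 impressions (version B) gives z ≈ 2.83, above the 1.96 threshold, so version B’s improvement would count as significant at the 5% level.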

This sequence can be repeated as often as you wish: in the next step, you can test whether the position of the CTA button on the website has an additional impact on the CTR. Essentially, you can test even the smallest element of a web page – as the website’s operator you have free rein in what you decide to test.

Pros and cons of A/B testing

There are many benefits to A/B testing for online marketers. It enables an objective comparison that takes place independently of your own point of view, focusing instead on the view of the target group. Thanks to numerous (some even free) testing tools, you can perform a split test without having any prior technical knowledge, and the test will still deliver clear results. These results can be implemented immediately afterwards.

On the other hand, testing is only useful when comparing individual elements with each other. If you change lots of elements at once it defeats the purpose, since you don’t know which change caused which outcome. There’s always a danger of overwhelming or confusing the user if you keep making changes and then reverting to the original. For this reason, it is not advisable to carry out these tests on an audience made up largely of returning customers. Finally, there is the question of statistical significance. For small sites with little traffic, it is difficult and time-consuming to obtain figures that are actually significant.
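To see why small sites struggle with significance, a simplified textbook sample-size formula gives a rough estimate of the traffic a test needs (real testing tools refine this calculation, and the rates in the usage note are illustrative):

```python
from math import ceil

def sample_size_per_variant(p_base: float, p_target: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Rough visitors needed per variant to detect a lift from p_base to
    p_target, using the default z values for 5% significance (two-sided)
    and 80% power.
    """
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    effect = (p_target - p_base) ** 2
    return ceil((z_alpha + z_beta) ** 2 * variance / effect)
```

Detecting a lift from a 2% to a 2.5% conversion rate at these settings already requires roughly 14,000 visitors per variant – more traffic than many small sites collect in months.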

Pros and cons at a glance

Pros:
  • Objective comparison
  • Reflects the interest of the target group
  • Easy implementation thanks to testing tools
  • Clear analysis is possible
  • Immediate implementation of results is possible

Cons:
  • Only one hypothesis per test
  • Confusion among users is possible
  • Statistical significance for small sites is difficult to achieve

Testing tools

There are numerous testing tools on the market that enable you to carry out split testing. One free option is 'Content Experiments', which you can use through Google Analytics. The range of functions that this Google solution offers is smaller than that of programmes like Optimizely, which for a monthly fee offers a relatively easy testing tool for small and medium-sized businesses. A similar service is offered by Kameleoon, which is very intuitive and easy to install. Kameleoon also offers a freemium account for up to 2,500 visitors per month, making it a good entry-level solution for small websites. A slightly more expensive service (due to additional features such as heat and click maps) is the Visual Website Optimizer. For large businesses that want to use website A/B testing, there are extensive options such as OpenText Optimost or SiteSpect, but these come with a heftier price tag.
