You want to make the most of your website by increasing your conversion rate. And you have heard that the best way to do it is to run A/B tests, also known as split tests – case studies have shown companies such as Dell increasing their conversions by up to 300% thanks to them.
In a nutshell, in an A/B test you take one piece of content in two different versions and show each to a same-sized audience – one sample is your control group, the other your challenge group. The variant that outperforms the other with at least 95% statistical confidence is your winner. For instance:
In the example above, the colour of a call-to-action button was tested to see which one would bring more conversions.
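To make that 95% threshold concrete, here is a minimal sketch (in Python, with hypothetical visitor and conversion counts) of the two-proportion z-test that most A/B testing tools run under the hood to decide whether a difference is real or just noise:

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns the confidence level (0..1)
    that the two conversion rates really differ."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    # two-sided confidence = 1 - p-value, via the normal CDF
    return math.erf(z / math.sqrt(2))

# Hypothetical numbers: control converts 200/5000, challenger 260/5000
conf = ab_significance(200, 5000, 260, 5000)
print(f"{conf:.1%}")  # above 95%, so the challenger can be called a winner
```

If the returned confidence stays below 95%, the honest conclusion is "no winner yet", not "the control wins".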
Still, you might not be sure what split testing really involves when so many myths surround the process. So let's bust the most common A/B testing myths and help you stay ahead of your competitors.
MYTH 1 – A/B TESTING IS TOO EXPENSIVE
If dealing with numbers is not a problem for you, and you have some technical background, there is good news. Free tools are available for those on small budgets – one option you should definitely check out is Google Analytics' Content Experiments.
On the other hand, there are plenty of paid services out there, and it is worth considering them before dismissing them as too expensive.
You might find that hiring one of them is a more cost-effective way to spend your budget, as they will give you faster results. Plus, you won't have to go through the learning curve, the hassle of setting the tests up, or the work of analyzing the data yourself – remember, your time also costs money, and you should treasure it.
Some of the A/B testing services available are HubSpot, A/Bingo, Five Second Test, Optimizely, Shareprogress and Unbounce.
MYTH 2 – YOU ALREADY KNOW THE BEST THING TO DO
You have years of experience in online marketing, so why waste money on A/B testing?
Perhaps because, by the time a test tells you what is best, your competitor might already have grabbed the sales you wished for. And, more than that, everything about online business is based on guesswork anyway – or so the myth goes.
In reality, the online world is a fast-paced environment, but that doesn't mean a breakthrough is bound to happen right now. You can take your time and try a few options without fearing that your competitor will destroy your business overnight.
And when it comes to marketing decisions, statistics will always be your best friend. No matter how many years of experience you and your team have accumulated, it is very unlikely that any of you has developed the ability to read your target audience's minds.
That is to say, you will need to ask your clients what they think about what you are proposing. And online content still needs to be tried out in practice to prove that it works – meaning you can't rely on questionnaires alone.
And even if you think that everything gurus and experts say about online marketing is pure guessing, there is nothing like well-researched data to boost your confidence and revenue.
MYTH 3 – YOU SHOULD TEST EVERYTHING
If not testing at all is the choice of the disorganized, testing everything is the favorite option of perfectionists. You don't want to take any risk, so you test every single thing and every small change to stay on the safe side.
This attitude is even more tempting if you are part of a team, so you can justify your decisions with statistics in case anyone disagrees with you.
Unfortunately, this is not the best approach to A/B testing. Each test still takes time, and you need to allocate financial and human resources to it. For this reason, testing should be used wisely. Plus, except in rare cases, very small changes won't move your final results much, so don't try to be too thorough.
And if your traffic is still small (fewer than 1,000 page views), you may want to wait until it increases so you can make the most of the data available.
MYTH 4 – MVT TESTING BRINGS MUCH BETTER RESULTS
This myth could not be more wrong. MVT (multivariate) testing and A/B testing serve very different purposes, so you shouldn't be comparing them in the first place.
While the first analyses the impact of a combination of factors, the second checks just one at a time.
For instance, you would run an A/B test to decide which text works best with your call-to-action (without changing anything else on your landing page).
On the other hand, an MVT test would be great for finding out whether one text is more effective on a blue background on the right-hand side of a static landing page, compared to another text on a dark green background on the left-hand side of a non-static landing page.
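To see why an MVT test demands far more traffic than an A/B test, here is a small sketch (with hypothetical factor names taken from the example above) that counts how many page versions such a test would have to split your visitors across:

```python
from itertools import product

# Factors under test in a hypothetical MVT experiment
headlines   = ["Text A", "Text B"]
backgrounds = ["blue", "dark green"]
positions   = ["left", "right"]
layouts     = ["static", "non-static"]

# Every combination of factor levels becomes its own page version
variants = list(product(headlines, backgrounds, positions, layouts))
print(len(variants))  # 2 * 2 * 2 * 2 = 16 versions to split traffic across
```

An A/B test splits visitors between just 2 versions; this MVT setup needs 16, so each variant receives an eighth of the traffic and takes correspondingly longer to reach significance.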
As you can see, they are very distinct tests, so pick between them according to the sort of data you expect to collect.
MYTH 5 – GOING THROUGH A/B TESTING CASE STUDIES IS ALL THAT YOU NEED TO DO
Many people are happy to share their A/B testing results, so you can easily find lots of case studies online. And you think you can just read them carefully and apply their results to your business. Right?
I am afraid not.
If you believe you can just transfer a competitor's statistics to your business, you are also saying that your business is exactly like theirs: you offer the same product or service to the same audience for the same price through the same channel. If so, go ahead – just watch out for the copycat patrol.
But it is more likely that you are trying to provide something unique or, at least, with some added value. As a consequence, their data, goals and products are different from yours and it will affect the results of your A/B testing.
So forget this shortcut, run your own A/B tests, and make sure you are getting the right answers.
MYTH 6 – A/B TESTING IS BAD FOR SEO
The biggest fear of any SEO expert is being penalized for duplicate content. And the idea of testing two versions of the same content with just one small change to their background colours, for example, gives them nightmares.
Still, this fear is unfounded. You can run your split tests without any risk of getting into trouble if you follow Google's instructions. As mentioned above, Google even offers its own free testing tool, so it is clearly not against the practice.
For starters, Google makes clear that small changes, such as the text of a call-to-action, will often have little or no impact on that page’s search result snippet or ranking.
They also explain a few things you should know about it:
- Never show one version of content to users and another to Googlebot – a practice known as "cloaking"
- Use a rel="canonical" link attribute to indicate which URL is your preferred version
- Keep the testing running only for as long as strictly necessary.
As you can see, it is no big deal to run an A/B test without hurting your SEO strategy. Just keep up to date with Google's recommendations and you should be fine.
MYTH 7 – IF YOUR RESULTS LOOK GOOD ENOUGH, YOU CAN STOP RUNNING THE TEST
There is a reason you should run an A/B test for a certain amount of time and over a specific number of visitors: it provides you with relevant data, which will guide you to better decisions. If you stop the process halfway, you may end up with inaccurate statistics.
It is as if you went out to the streets during your lunch hour asking people what they think about your product. As the first 10 people say only great things about it, you feel happy enough and head back to the office.
But the fact is that you only stayed there for one hour, and you may have met a very specific type of person who walks through that same area at that same hour every day. Can you see how this sample is not reliable at all?
Plus, this attitude might leave you with inconclusive results that won't help you, such as the example below:
Of course, you can still get some ideas from inconclusive tests, but not when they result from a limited time window. Therefore, wait until the minimum sample size is reached, so you can rest assured that your results reflect reality.
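As a rough guide to what "minimum sample" means in practice, here is a sketch of the standard sample-size formula most calculators use (95% confidence and 80% power as defaults; the baseline rate and lift below are hypothetical):

```python
import math

def min_sample_size(base_rate, min_detectable_lift,
                    z_alpha=1.96, z_beta=0.84):
    """Visitors needed PER VARIANT to detect a given relative lift
    at 95% confidence and 80% power (the usual defaults)."""
    p1 = base_rate
    p2 = base_rate * (1 + min_detectable_lift)  # rate after the lift
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2
    return math.ceil(n)

# Hypothetical: 4% baseline conversion, want to detect a 20% relative lift
print(min_sample_size(0.04, 0.20))  # roughly 10,000 visitors per variant
```

Note how the required sample grows as the lift you want to detect shrinks – which is exactly why stopping after the first promising hundred visitors proves nothing.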
MYTH 8 – ONE TEST IS ENOUGH
This is hardly ever the case. Except when you are testing something very specific, A/B testing is usually a continuous process. You start with colors, then move on to buttons, texts, images, and so on.
Plus, you will need to find out how significant each of these factors is for all your metrics, such as lead generation, growth in the number of subscribers or visitors, and the impact on your sales.
And one more reason: your customers' behavior changes quickly, so the results you get today will probably be outdated in the near future.
That is to say, A/B testing is never finished, and you should treat it as a recurring task.
The Bottom Line
Now that you have taken all these myths out of your head, you can start using A/B testing to improve your business with confidence, knowing what to expect from it. Split testing can certainly increase your sales and improve your brand awareness, so make the most of it.