Client-side and server-side A/B testing

My take on the differences, and on what many might miss.

Created on 4 August 2024.

I don't know why, but I get this feeling that most articles about A/B testing and the differences between client-side and server-side are written by people who are either biased (they sell a tool focused on one type of testing, so they promote that one) or who haven't really used such a tool.

Why?

Because the list of pros and cons only touches on shallow things. Kinda like client-side testing.

Even AI-generated summaries, such as this one by Perplexity, reveal the same thing.

Let's get straight to the point. There are 2 main differences, which become obvious once you've spent more than 30 minutes actually running A/B tests.

  1. With Client-side testing, you have certain limitations in terms of what can be tested.
  2. With Client-side testing, you don't rely on the dev team — there are even visual tools that a person can use to make basic changes.

One could argue about a third point: the accuracy of the data collected is better on the server side. For example, there's little chance of counting a returning user as new just because they switched browsers, their cookies expired, or what have you.
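Here's a minimal sketch of how a purely client-side tool often identifies a visitor, a random ID stored in a first-party cookie (the cookie name and 30-day expiry are my assumptions, not any specific vendor's):

```ts
// A random ID in a first-party cookie: the typical client-side approach.
// "visitorId" and the 30-day expiry are assumptions for illustration.
function getVisitorId(): string {
  const match = document.cookie.match(/(?:^|; )visitorId=([^;]+)/);
  if (match) return match[1];
  // No cookie found: this person is counted as brand new, even if they
  // converted yesterday in another browser or before the cookie expired.
  const id = crypto.randomUUID();
  document.cookie = `visitorId=${id}; max-age=${60 * 60 * 24 * 30}; path=/`;
  return id;
}
```

Server-side, the same person can instead be keyed to something stable, like an account ID, so switching browsers or losing a cookie doesn't inflate your "new user" counts.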

Let's get back to the main points.

Do those limitations affect you? It depends on where you are in the journey. If you are just starting out, I'm confident there are loads of tests to run and assumptions to confirm, so those limitations certainly do not apply to you.

If, however, you have been doing this for a while, then yes: you will feel the need for more advanced scenarios, where client-side testing shows its limitations.

But even before you reach those limitations, I would argue that some of the changes you'd like to make will still require code to be written. And just like that, the second advantage goes out the window.

Ok, Vladi, what is your actual point?

My point is that the touted benefit of client-side testing is shallow: "Allows marketers to easily create and launch experiments without involving developers".

Let's go a bit deeper, shall we? First, yeah, let's assume you really are starting out, and you could really produce 10-20 meaningful tests that can be created with a visual tool alone, no devs involved.

Great. And you start working on them. Let's also assume everything else is perfectly set up, data is being tracked correctly, etc. Fantastic. You launch one test. You launch another one.

Eventually, you will reach statistical significance and some of your tests will be declared winners or losers. Now what? The losing tests are a whole other story. We won't go there.

What about the winners?

Let's say you ran 5 tests and 2 of them are winners. What do you do with them then?

You have 2 options:

  • A. Leave the winning test running, with the winning variation at 100% traffic allocation.

  • B. Actually implement the changes.

Now, if we actually choose Option A, then all the downsides of client-side testing start showing up.

  • Foremost, you will see the flicker, or flash of original content. Not that big of a deal when you have 1 test, but it becomes horrible when you have 2-3-4 tests running at 100% traffic allocation and another 2-3-4 live tests making changes on the page (see the sketch after this list).
  • Then you have more and more JavaScript on the page. And it keeps growing. The user experience degrades little by little with each change. Your e-commerce shop will perform worse over time, and thus your tests will be affected too. It sucks.
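That flicker is usually fought with an anti-flicker workaround along these lines (a minimal sketch; abTool and onVariationsApplied are hypothetical names, not a real vendor API):

```ts
// Hide the whole page until the testing script has applied its DOM
// changes, or a failsafe timeout fires. This is the classic anti-flicker
// pattern; it trades flicker for a visible delay.
const hider = document.createElement("style");
hider.id = "anti-flicker";
hider.textContent = "body { opacity: 0 !important; }";
document.head.appendChild(hider);

function revealPage(): void {
  document.getElementById("anti-flicker")?.remove();
}

// Reveal when the variations are in, or after 2 seconds no matter what.
declare const abTool: { onVariationsApplied(cb: () => void): void }; // hypothetical hook
const failsafe = setTimeout(revealPage, 2000);
abTool.onVariationsApplied(() => {
  clearTimeout(failsafe);
  revealPage();
});
```

The trade-off: every visitor now stares at a hidden page for up to the full timeout, on every view, whether or not they're in a test.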

And if you pick Option B, you will effectively drop the main benefit of client-side testing. You will now have to rely on the dev team to implement those changes.

But guess what, now you did the same thing twice! Think about it. You thought of the changes that would test your hypothesis. You implemented them using the A/B testing tool. They won, and now you go to the dev team to implement the same things again. And usually it's not something basic like "change this text here" or "change this color there", so it's not a simple matter of copy-paste-done. Ergo, you walk them through what changed all over again, and they implement and push the changes live.

That's a lot of work! And redundant.

All right, what do you propose?

Firstly, talk with people who know their stuff. Don't rely on marketing content when making decisions.

Then, I would actually choose a tool that does both! Some experiments are better suited for client-side and others for server-side. Obviously it depends on the company and the way the teams are structured, but most would be fine this way.
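One reason a single tool can straddle both worlds is that variant assignment can be made deterministic. A minimal sketch (the hash choice and the IDs are illustrative assumptions), where the same function run in the browser or on the server puts the same user in the same bucket:

```ts
// FNV-1a is one common hash choice for bucketing; any stable hash works.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash >>> 0;
}

// Same experiment + same user = same variant, in the browser or on the server.
function assignVariant(experimentId: string, userId: string, variants: string[]): string {
  return variants[fnv1a(`${experimentId}:${userId}`) % variants.length];
}

// assignVariant("checkout-cta", "user-42", ["control", "variation-b"])
// returns the same variant on every call, on every machine.
```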

Then I would pay attention to the impact of the tool. How does it affect loading speed? How much JS will it pollute your pages with? What about cookies and getting consent, etc.? And then look at the transparency of the data provided by the tool.

For example, when a tool says a goal has a 95% chance of improvement... all right... how? Based on what data? How is it tracked and calculated? Don't just blindly trust a tool telling you that you had 100,000 views and 400 signups.
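As a sanity check, you can redo the math yourself. Here's a minimal two-proportion z-test (the figures are made up, and many tools report a Bayesian "chance to beat control" instead, so don't expect an exact match):

```ts
// Two-proportion z-test: is variant B's conversion rate significantly
// different from control A's?
function zScore(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se; // |z| > 1.96 roughly maps to 95% confidence
}

// Control: 400 signups out of 100,000 views. Variant: 470 out of 100,000.
console.log(zScore(400, 100_000, 470, 100_000).toFixed(2)); // ≈ 2.38, significant
```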

And lastly, don't dismiss the dev team. They need to understand what your objectives are and figure out the best way to help you. That's their role. Or it should be! In fact, they might even help you pick the right tool for the job.

So is a pure client-side A/B testing tool ever useful?

I can think of only one scenario: when you completely outsource your conversion rate optimization (CRO) process. It makes no sense to give an outside team access to your systems and codebase.

But don't forget that, eventually, the dev team will still need to be involved to implement the winning changes. Also, if your dev team is truly agile, the outsourced team will run into issues, because the shop will keep changing and the client-side code will keep breaking.
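The breakage is mundane in practice. A minimal illustration (the selector and copy are made up): the moment the dev team renames or restructures the targeted element, the variation silently stops applying:

```ts
// A typical client-side variation: find an element and change it.
const cta = document.querySelector<HTMLButtonElement>(".checkout .btn-primary");
if (cta) {
  cta.textContent = "Buy now, free shipping";
} else {
  // The dev team renamed the class last sprint. The "winning" change
  // quietly disappears, and nobody gets an error about it.
  console.warn("A/B variation target not found; skipping change");
}
```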

So, only in that scenario, a not-so-agile shop plus outsourced CRO services, does a pure client-side A/B testing tool work best. For everything else, either server-side or a tool that does both will suit your business goals better.


If you enjoyed this article and think others should read it, please share it on Bluesky or LinkedIn.