Nelio A/B Testing offers two different algorithms for optimizing conversions in your WordPress site: the traditional A/B testing algorithm and the multi-armed bandit algorithm, also known as the greedy algorithm.

A/B testing and greedy algorithms are just two different ways to reach the same goal: improving conversions on your site. They cannot really be compared; there is no single best way to do conversion optimization. It is up to you to decide which of the two philosophies is better aligned with the way you like to act.

The key difference between the two methods is the following:

  • In traditional A/B testing, all alternatives are always shown with the same probability. If you’re testing two versions of the same page, each alternative is shown with an equal probability of 50%, even if one seems to work better than the other.
  • With the multi-armed bandit (greedy) algorithm, all alternatives are initially shown with the same probability, but as soon as one of them seems to work better, that one is shown much more frequently (typically around 90% of the time).
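The difference in how each method assigns a visitor to an alternative can be sketched in a few lines of Python. This is an illustrative sketch, not Nelio’s actual implementation; the function names and the 10% exploration probability (the complement of the “around 90%” figure above) are assumptions.

```python
import random

def choose_variant_ab(variants):
    """Traditional A/B test: every alternative has the same probability."""
    return random.choice(variants)

def choose_variant_greedy(observed_rates, explore_prob=0.1):
    """Greedy bandit: show the current leader most of the time (~90%),
    and a random alternative the rest of the time."""
    if random.random() < explore_prob:
        return random.choice(list(observed_rates))        # exploration
    return max(observed_rates, key=observed_rates.get)    # exploitation
```

With `explore_prob=0.1`, the alternative with the best observed conversion rate is shown roughly 90% of the time (plus its share of the random exploration picks).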

We could say that the traditional A/B testing algorithm is more conservative: the fact that one alternative seems to convert better does not change the way the algorithm works. Until one alternative is significantly better (in the statistical sense of the word), all alternatives are shown with equal probability.

The greedy algorithm, instead, takes the risk of assuming that the alternative that seems to work better will eventually be proven significantly better, so it shows it more often in order to maximize the number of conversions (the exploitation phase), while still showing the others from time to time (the exploration phase), just in case the assumption was wrong. If the assumption holds, most visitors will see the “good” alternative. If it doesn’t (what looked like the good alternative was just the result of random fluctuations), most visitors will see the bad alternative until the algorithm realizes it and switches to a new “best” alternative, which may take some time, since the algorithm favors exploitation over exploration.
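The exploration/exploitation loop described above can be simulated end to end. The sketch below uses a hypothetical epsilon-greedy policy with a 10% exploration rate and made-up “true” conversion rates; it is not Nelio’s actual algorithm, only an illustration of how traffic shifts toward the leader as evidence accumulates.

```python
import random

def run_greedy_bandit(true_rates, visitors=10000, explore_prob=0.1, seed=42):
    """Epsilon-greedy sketch: track views and conversions per alternative,
    send most traffic to the current leader (exploitation), and keep showing
    a random alternative to a small fraction of visitors (exploration)."""
    rng = random.Random(seed)
    views = {v: 0 for v in true_rates}
    conversions = {v: 0 for v in true_rates}
    for _ in range(visitors):
        if rng.random() < explore_prob:
            shown = rng.choice(list(true_rates))  # exploration phase
        else:
            # exploitation phase: the alternative with the best observed rate
            shown = max(views, key=lambda v: conversions[v] / views[v] if views[v] else 0.0)
        views[shown] += 1
        if rng.random() < true_rates[shown]:      # simulated visitor behavior
            conversions[shown] += 1
    return views, conversions
```

Running this with, say, a 5% and a 15% true conversion rate, the better alternative ends up receiving the large majority of the views, which is exactly the bet the greedy algorithm makes.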

With A/B testing, you may lose conversions by showing a worse alternative to 50% of your visitors. With the multi-armed bandit, you try to minimize the number of people who see the worse alternative by betting early on a winner.