
A/A Test

A type of test in which two groups receive exactly the same experience (same page, same content, same conditions) in order to verify the reliability of the testing system. In a CRO (Conversion Rate Optimization) context, an A/A test ensures that:

  • the distribution of traffic between the variants is well balanced,
  • conversion tracking works correctly,
  • there is no technical or behavioral bias in the testing tool,
  • the differences observed between variants in real-life conditions are not due to chance or misconfiguration.
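The first of these checks, traffic-split balance, can be verified statistically. A minimal sketch (the function name and counts are illustrative, not part of any testing tool's API): under a 50/50 allocation, the number of visitors assigned to variant A follows a binomial distribution, so a simple z-score tells you whether the observed split is plausible.

```python
import math

def split_balance_z(n_a: int, n_b: int) -> float:
    """Two-sided z-score for the hypothesis that visitors were
    assigned to A and B with equal probability (a 50/50 split)."""
    n = n_a + n_b
    # Under H0, n_a ~ Binomial(n, 0.5); use the normal approximation.
    expected = n / 2
    std = math.sqrt(n * 0.25)
    return (n_a - expected) / std

# Illustrative counts: 10,120 visitors in A vs 9,880 in B.
z = split_balance_z(10_120, 9_880)
print(round(z, 2))  # |z| below 1.96 → split is consistent with 50/50
```

If |z| exceeds roughly 1.96 (the 95% threshold), the randomization itself is suspect and any downstream A/B results should not be trusted.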

This type of test is often used before launching a critical A/B test, especially when:

  • a new testing tool has been installed (e.g. Dynamic Yield, AB Tasty, Optimizely),
  • a new segmentation or targeting logic is in place,
  • server-side variations are used and must be validated.

💡 Good to know: It is normal for small statistical differences to appear in an A/A test, but none should be statistically significant. If one variant outperforms the other with a high level of confidence, this may indicate a bias in the configuration, a tracking bug, or a randomization problem.
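The significance check described above can be sketched with a standard two-proportion z-test (the counts below are hypothetical; real A/A data would come from your testing tool's reports):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: is the conversion-rate difference
    between the two identical variants statistically significant?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under H0 (both variants share one true rate).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# A/A example: both variants convert around 3% on 10,000 visitors each.
z = two_proportion_z(312, 10_000, 296, 10_000)
# |z| below 1.96 → not significant at 95%, as expected in a healthy A/A test
```

If an A/A test repeatedly produces |z| above 1.96 (about a 5% false-positive rate is expected by chance on any single run), that is the signal to audit the tool's configuration rather than to trust the "winning" variant.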
