Experiments - possible bug - GA may be miscalculating significance levels

Visitor ✭ ✭ ✭
# 1

Hi,

 

Disclaimer: I'm not a statistician, nor can I calculate it myself; I'm relying on other A/B testing significance calculators.

We've recently implemented A/B testing with Google Analytics. We've shown the first experiment to a decent number of visitors, and it's reporting a much lower confidence level than the calculators we've found.

 

Here are my exact numbers: 

 * variant A - 3466 sessions, 122 conversions

 * variant B - 7551 sessions, 332 conversions

 

 * Google Analytics shows only a 77.2% "probability of outperforming original".

 * http://www.hubspot.com/ab-test-calculator says there's 98.74% confidence.

 * http://getdatadriven.com/ab-significance-test says there's 99% certainty (they probably round up).
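For context on where numbers like these come from: online calculators of this kind typically run a classical one-sided two-proportion z-test. A minimal sketch (my assumption about their method, using an unpooled standard error) reproduces the ~98.7% figure from the data above:

```python
from math import sqrt, erf

def confidence(conv_a, n_a, conv_b, n_b):
    """One-sided two-proportion z-test with unpooled standard error.
    Returns the confidence that variant B's rate exceeds variant A's."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    z = (p_b - p_a) / se
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z

print(round(100 * confidence(122, 3466, 332, 7551), 2))  # close to the 98.74% HubSpot reports
```

The exact percentage varies slightly between calculators depending on whether they pool the standard error and whether the test is one- or two-sided, which would explain the small gap between the two calculator results.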

 

I lowered all the input numbers 10x (346.6 sessions, etc.) and got a 76% confidence level... which means we could have run 10x as many experiments in that time and been as sure as Google Analytics tells us now.

 

Am I misunderstanding something, or is GA reporting the wrong confidence level?

1 Expert reply

Re: Experiments - possible bug - GA may be miscalculating significance levels

Rising Star
# 2
Hi Mateusz,

Each tool can use a different calculation method. GA uses a multi-armed bandit model and re-balances the experiment daily. Just let it run until it has enough data to declare a winner.

More on the science behind it:

https://support.google.com/analytics/answer/2844870?hl=en

If the experiment is over, you can reasonably assume that the new variant would outperform the original.
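To illustrate the difference in method: a plain Bayesian "probability to beat original" computed once from the raw totals (a Monte Carlo sketch with flat Beta(1,1) priors; this is my illustration, not GA's actual code) lands near 99% on your numbers, so GA's lower 77.2% likely reflects its day-by-day bandit accounting rather than a single posterior comparison:

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=100_000):
    """Monte Carlo estimate of P(rate_B > rate_A) under independent
    Beta(1 + conversions, 1 + failures) posteriors for each variant."""
    wins = 0
    for _ in range(draws):
        a = random.betavariate(1 + conv_a, 1 + n_a - conv_a)
        b = random.betavariate(1 + conv_b, 1 + n_b - conv_b)
        wins += b > a
    return wins / draws

print(prob_b_beats_a(122, 3466, 332, 7551))  # roughly 0.99
```

A bandit re-weights traffic toward the leading variant every day, so its reported probability is conditioned on that adaptive history, not just on the final session and conversion counts.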

Best,

Theo Bennett

Analytics Evangelist at MoreVisibility