
Regarding ACE, does Google randomly assign traffic to prevent bias?

Not applicable
# 1

Hi, I have a question regarding Google's AdWords Campaign Experiments (ACE).


I would like to run an experiment testing the effectiveness of 2 different styles of text ads, say ad1 as the control and ad2 as the experimental variant. Both ads are in the same ad group (using the same keywords), and I want to see how much CTR or how many conversions each style generates.


My question is this: for the results of the experiment to be statistically unbiased, I understand that the assignment of a subject to either the control or experimental group must be random. So, does Google split search traffic randomly to prevent bias?


Also, if I cannot get an even split (50%/50%), could I correct for the sampling imbalance by comparing the percentage of targeted actions over the total number of impressions (or visits) instead of the raw number of targeted actions observed?
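To illustrate the idea above: comparing rates rather than raw counts does remove the effect of unequal group sizes. A minimal sketch, with hypothetical click and impression counts:

```python
# Hypothetical counts from an uneven (non-50/50) experiment split.
control = {"clicks": 120, "impressions": 4000}
experiment = {"clicks": 90, "impressions": 2500}

def ctr(group):
    """Click-through rate: clicks divided by impressions."""
    return group["clicks"] / group["impressions"]

# Comparing rates instead of raw clicks removes the effect of the
# unequal impression counts between the two groups.
print(f"control CTR:    {ctr(control):.4f}")     # 0.0300
print(f"experiment CTR: {ctr(experiment):.4f}")  # 0.0360
```

Note that rates fix the scale problem but not statistical uncertainty: the smaller group's rate is noisier, which a significance test has to account for.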


Thank you very much,



Re: Regarding ACE, does Google randomly assign traffic to prevent bias?

Participant ✭ ☆ ☆
# 2

Hello Ted,

Here is some information about AdWords Campaign Experiments:

Even if you create an experiment that uses 50 percent of your auctions for experimental changes and 50 percent for your control, these groups might get different numbers of impressions. This can happen because differences in the performance of keywords, ads, ad groups, and bids can cause more or fewer impressions on either side.

Once your experiment is running, you can begin viewing data for your experiment within your campaign. You'll find arrow icons in your account performance data that'll help you figure out how certain you can be that the change in data is due to the changes you've made.

If your experimental data is statistically significant, AdWords displays an up arrow or down arrow next to that data depending on whether your performance has increased or decreased. As many as three arrows can appear in the same direction, and the more arrows in the same direction, the more statistically significant the results are. One arrow means we're 95 percent certain that the change is not due to chance, two arrows mean 99 percent certain, and three arrows mean we're 99.9 percent certain. Two gray arrows in opposite directions mean the results are not statistically significant.
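The arrow thresholds above correspond to the standard 95%, 99%, and 99.9% confidence levels. A sketch of how you could reproduce that judgment yourself with a two-proportion z-test on CTRs (the click and impression counts here are hypothetical, and this is an independent re-implementation, not AdWords' internal method):

```python
import math

def two_proportion_z(c1, n1, c2, n2):
    """z-statistic for the difference between two proportions,
    using the pooled standard error."""
    p1, p2 = c1 / n1, c2 / n2
    pooled = (c1 + c2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

def arrows(z):
    """Map a z-statistic to the 1/2/3-arrow convention: 95%, 99%,
    and 99.9% confidence that the change is not due to chance."""
    # Two-sided p-value from the normal CDF (via math.erf).
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    if p < 0.001:
        return 3
    if p < 0.01:
        return 2
    if p < 0.05:
        return 1
    return 0  # not statistically significant

# Hypothetical control vs. experiment: 120/4000 vs. 90/2500 clicks.
z = two_proportion_z(120, 4000, 90, 2500)
print(f"z = {z:.2f}, arrows = {arrows(z)}")
```

With these made-up numbers the difference in CTR is not significant at the 95% level, which matches the help text's point: even a visible difference in rates can still be chance when the samples are small.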

** I'm learning AdWords and find it very interesting.