Prevent overlapping between 2 Experiments

# 1
Visitor ✭ ✭ ✭

Hi,


Could you please help me with a rule that can exclude users who have already been exposed to one of the other experiments in Google Optimize?

 

What is the best approach?

 

I am thinking about using a 1st-party cookie variable or some other custom variable that would mark the user as "exposed" so that another experiment will not affect them.

In addition, I could use the "run custom JavaScript" option in Optimize's visual editor to create such a cookie. Will that solve the problem?
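Roughly, I imagine the snippet would do something like this (the cookie name "optimize_exposed" and the 90-day lifetime are just placeholders I made up):

// Sketch only: mark this visitor as already exposed to an experiment,
// so other experiments can exclude them with a 1st-party cookie rule.
document.cookie = 'optimize_exposed=1; path=/; max-age=' + 60 * 60 * 24 * 90;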

 

Also, I can't understand how to prevent 2 experiments from running simultaneously, so that a user who sees experiment A will not see experiment B or C (the free version is limited to 3 experiments). Are there any rules or configuration options that can help with that?

 


Prevent overlapping between 2 Experiments

# 2
Google Employee

Hi

 

I hope that there will soon be better in-product support for this.

 

One idea could be to use "bucketing" based on the Google Analytics client ID, so that you don't need to set another cookie.

 

Just create a "Custom JavaScript" variable named something like "Experiment Traffic Bucket" with code like:

 

function() {
  // Derive a stable bucket (1, 2 or 3) from the Google Analytics client ID.
  return parseInt(ga.getAll()[0].get('clientId')) % 3 + 1;
}

 

This will allow you to split your users into 3 buckets, and then you can use experiment targeting rules like:

 

"Experiment Traffic Bucket" equals 3

 

or you could target multiple buckets:

 

"Experiment Traffic Bucket" equals 1, 3

 

 

Hope that this helps

Prevent overlapping between 2 Experiments

# 3
Visitor ✭ ✭ ✭

@DimitrisD

 

Just to confirm, for the JS code, can we just replace the "3" with "10" if we want to create 10 separate buckets?

 

Secondly, do you know of any way to check in the developer console (or anywhere else) which bucket we are currently in? I'm trying to find a way to verify the bucket and all the Optimize tests that are being served to the user.

Prevent overlapping between 2 Experiments

# 4
Google Employee

Yes, you can replace '3' with '10' or any other number to have more buckets.

 

If you want to see which bucket you are in from the developer console, you could change the code above to print the bucket number, or you can assign it to a global variable so that you can inspect it at any time.

 

For example:

 

function() {
  // Store the bucket number on window so it can be inspected in the dev console.
  window.optimize_bucket = parseInt(ga.getAll()[0].get('clientId')) % 3 + 1;
  return window.optimize_bucket;
}

 

Now in the dev console you can just type window.optimize_bucket and see what it has been assigned to.

 

Now, if you want to see which experiments have been "triggered" on a page, there are a few methods (a small console sketch follows the list).

 

- You can inspect the gaData window property: the entry for your GA property has an "experiments" field, which should be an object with all the experiments that were triggered and the assigned variant.

 

- You can search for the "collect" requests in the Network tab and inspect their parameters. If there are experiments running, a parameter named "exp" should contain the assigned variants.

 

- You can inspect the value of the _gaexp cookie in the Application tab (for your domain). This cookie stores the variants assigned to the visitor and is available on any page.
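
For example, here is a rough, unofficial snippet you could paste into the dev console to check the first and third methods at once (it only reads what analytics.js already exposes; adjust if your gaData layout differs):

// Log the experiments recorded by analytics.js for each GA property on the page.
Object.keys(window.gaData || {}).forEach(function(propertyId) {
  console.log(propertyId, (window.gaData[propertyId] || {}).experiments);
});

// Print the raw _gaexp cookie value, if it has been set.
var gaexp = document.cookie.match(/(?:^|;\s*)_gaexp=([^;]*)/);
console.log('_gaexp:', gaexp ? gaexp[1] : '(not set)');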

Prevent overlapping between 2 Experiments

# 5
Visitor ✭ ✭ ✭

Dimitris D, appreciate the help. By chance, do you know if it's possible, post-launch of a test, to force yourself into one of the variations?

 

As opposed to opening incognito windows until you happen to get into the experience.

Prevent overlapping between 2 Experiments

# 6
Google Employee

Hi Brian

 

You can always use the preview functionality for the variants or the original. This is not identical to being in the test - for example, you will not be sending any data to GA - but that is usually a good thing if you want to see how the page looks without affecting the real experiment data.

 

If you really want to force your way into the test, you can use the browser devtools and modify or set the experiment cookie. It is called _gaexp and contains the variant that a user is assigned to for each experiment. Its value looks like:

GAX1.2.<experimentId1>.<expireInfo>.<variantIndex>!<experimentId2>.<expireInfo>.<variantIndex>

 

So you could, for example, change the variantIndex to force yourself into another variant (and then reload the page).
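
If you prefer to do it from the console instead of editing the cookie by hand, a rough sketch could look like this (the experiment ID is a placeholder, and you may need to add a domain attribute matching the original cookie):

// Sketch only: rewrite the variant index for one experiment in _gaexp, then reload.
var m = document.cookie.match(/(?:^|;\s*)_gaexp=([^;]*)/);
if (m) {
  var updated = m[1].replace(/(YOUR_EXPERIMENT_ID\.\d+\.)\d+/, function(_, prefix) {
    return prefix + '1'; // 1 = the variant index you want to force
  });
  document.cookie = '_gaexp=' + updated + '; path=/';
  location.reload();
}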