Google Optimize experiment ID as context

Since Google consumed Optimizely and now provides this functionality inside GA, the GA tracker populates an ‘exp’ parameter with an experiment ID + variant.

Using the sp-ga-plugin, this value is passed to the Snowplow collector, but it looks like v1 of the GoogleAnalytics driver ignores it.
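
For reference, a hit to the GA endpoint with an active experiment looks roughly like this (the IDs are made up) - the experiment ID and variant index are joined by a dot in the exp parameter:

     v=1&t=pageview&tid=UA-12345-1&cid=555&dp=%2Fhome&exp=ByXbNSgwRkqGk3p7bLeMvg.1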

Are there any plans to add Optimize experiments (not “Content Experiments”, which were shut down by Google after they acquired Optimizely) as an enriched context to the GoogleAnalytics schema, or pass it through enrichment in some other way (other than copying it to a Custom Dimension in the JS)?

Thanks!

Hi @Kennon_Ballou,

Our JavaScript tracker has a native Optimizely integration.

Of course, it’s also possible to retrieve this data and send it as a custom context, as you can for any other data you want to collect via Snowplow.
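
For example, a minimal sketch of attaching a custom context to a page view (the schema and fields here are placeholders for whatever you define in Iglu):

     window.snowplow('trackPageView', null, [{
       schema: 'iglu:com.acme/experiment/jsonschema/1-0-0', // placeholder schema
       data: {
         experimentId: 'abc123', // placeholder values
         variantId: '1'
       }
     }]);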

Best,

Hi Colm, thanks for the reply!

I see the Optimizely integration for the normal Snowplow events in the tracker, but the sp-ga-plugin does not use the normal tracker endpoint since it is just ‘forking’ the payload that goes to GA. Also, we are not using Optimizely; we are using Google Optimize via Google Analytics.

This flow works perfectly since the schema for GoogleAnalytics events is part of scala-common-enrich. However, this does not appear to take account of the exp parameter, which is the Google Optimize experiment ID.

My question was whether this is on the roadmap to implement, or whether Snowplow Analytics has published any roadmap covering details like this.

Thank you!

I can’t speak to the roadmap for Snowplow to natively support Google Optimize the way you’re describing. However, I’ve developed an approach that works well for my needs, may be of use to you, and could be the basis for a more robust feature & enrichment.

  1. Parse the _gaexp cookie and attach custom contexts

GA & Optimize use the _gaexp cookie to keep track of a user’s exposure to experiments. We can parse that cookie to attach a custom experiment context to Snowplow events. Something like this:

     // _gaexp cookie: client side
     // minimal cookie reader (any cookie helper will do)
     function getCookie(name) {
       var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
       return match ? decodeURIComponent(match[1]) : '';
     }

     // each entry is <22-char experiment id>.<5-digit timestamp>.<variant>
     var expRx = /(\S{22})\.(?:\d{5})\.(\d{1})/g;
     var co = getCookie('_gaexp');
     var a;

     var customContexts = [];

     while ((a = expRx.exec(co)) !== null) {
       var expId = a[1];
       var expVariant = a[2];

       var optimize_experiment_viewed = {
         schema: "iglu:com.acme_company/optimize_experiment_viewed/jsonschema/1-0-0",
         data: {
           expId: expId,
           variantId: expVariant
         }
       };

       customContexts.push(optimize_experiment_viewed);
     }

     window.snowplow('trackPageView', null, customContexts);

Yet another approach might be to handle this with a custom API Request Enrichment that attaches the context.
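
As a rough sketch of the shape of an API Request Enrichment config (the endpoint, input field, and output schema below are all made up - see the Snowplow enrichment documentation for the full reference):

     {
       "schema": "iglu:com.snowplowanalytics.snowplow.enrichments/api_request_enrichment_config/jsonschema/1-0-0",
       "data": {
         "name": "api_request_enrichment_config",
         "vendor": "com.snowplowanalytics.snowplow.enrichments",
         "enabled": true,
         "parameters": {
           "inputs": [
             { "key": "user", "pojo": { "field": "user_id" } }
           ],
           "api": {
             "http": {
               "method": "GET",
               "uri": "https://api.acme.com/experiments/{{user}}?format=json",
               "timeout": 5000,
               "authentication": {}
             }
           },
           "outputs": [
             {
               "schema": "iglu:com.acme_company/optimize_experiment_viewed/jsonschema/1-0-0",
               "json": { "jsonPath": "$.record" }
             }
           ],
           "cache": { "size": 3000, "ttl": 60 }
         }
       }
     }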

  2. Register a Google Analytics Task and send a custom event

To activate a client-side experiment after the GA page view has been sent, you have to fire an activation event, which is a separate ‘data’ hit sent to GA to capture the experiment exposure. In the same way that Simo Ahava outlined in his excellent post (and as the sp-ga-plugin does), we can register a customTask with GA to look for these hits and send a corresponding event to Snowplow.

This is needed because the activation event is sent after the page view.

Example task:

     ga(function(tracker) {
       // grab a reference to the default sendHitTask
       var originalSendHitTask = tracker.get('sendHitTask');

       tracker.set('sendHitTask', function(model) {
         // send the original hit first
         originalSendHitTask(model);

         // check the model to determine whether this is a 'data' hit
         // and whether 'exp' is a valid experiment id
         var exp = model.get('exp');         // experiment id, e.g. nnNnnNnsNNNABDACDNNNNNN1.0
         var hitType = model.get('hitType'); // pageview, data, adtiming, etc.

         var expRx = /(\S{22})\.(\d{1})/g;   // experiment param format: <id>.<variant>
         var a;

         if (hitType === 'data' && typeof exp !== 'undefined' && (a = expRx.exec(exp)) !== null) {
           // valid data hit: get attrs
           var expId = a[1];
           var expVariant = a[2];

           // track a custom Snowplow event (struct or unstruct as needed)
           window.snowplow('trackUnstructEvent', {
             schema: 'iglu:com.acme_company/optimize_experiment_viewed/jsonschema/1-0-0',
             data: {
               expId: expId,
               variantId: expVariant
             }
           });
         }
       });
     });

I hope this helps! I’d love to hear any feedback that others have on this, thanks!

Also, just for completeness, I have one minor clarification regarding something @Kennon_Ballou said:

Google didn’t acquire Optimizely; they expanded their own experiment platform (formerly Content Experiments, as you noted). In fact, the two products’ approaches to experiment segmentation and analysis have large philosophical and technical differences, but that is an entirely different topic.


You’re right, apologies - I didn’t pay enough attention before I replied.

Disclaimer - I don’t influence the roadmap and am giving you a best guess answer here. There’s no planned development on the GA integration at the moment, and I’m not sure there’s likely to be in the short-medium term - but we’d likely be happy to review a PR if someone wanted to submit one for this.

I don’t know a lot about the domain here but @brad.inscoe’s approach looks ok to me (aside from a general aversion to regex parsing if it can be avoided - it could be the case that it can’t here).

This is a pretty good approach, attaching to a native Snowplow event. I haven’t tried it yet, but you may also be able to achieve this in the enrichment process by extracting the _gaexp* and _opt* cookie values with the cookie extractor enrichment.
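
For reference, the cookie extractor enrichment config looks something like this (as far as I know it matches exact cookie names rather than wildcards, so each _opt* cookie would need to be listed individually):

     {
       "schema": "iglu:com.snowplowanalytics.snowplow/cookie_extractor_config/jsonschema/1-0-0",
       "data": {
         "name": "cookie_extractor_config",
         "vendor": "com.snowplowanalytics.snowplow",
         "enabled": true,
         "parameters": {
           "cookies": ["_gaexp"]
         }
       }
     }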


Thanks everyone!

It seems like the “best” (but definitely not the simplest) approach would be to modify the actual GoogleAnalytics scala code in stream-common-enrich to look for the ‘exp’ parameters and enrich them as a separate table type akin to custom_dimension. For us, though, that would require forking + modifying that code, implementing a new JSON schema, running a separate Iglu resolver for our new schema, adding a new Redshift table, and bumping the GA schema version, so we will probably just start populating the exp ID as another custom dimension. I was mainly asking whether this kind of thing was known to be on the roadmap, and it sounds like it’s not, which is totally fine :slight_smile:
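
If anyone else goes the custom dimension route, here’s a minimal sketch of copying the first _gaexp entry into a dimension (the dimension index is made up, and the cookie parsing is borrowed from the snippet above):

     // pull the _gaexp cookie value and parse the first experiment entry
     var gaexp = (document.cookie.match(/_gaexp=([^;]+)/) || [])[1] || '';
     var m = /(\S{22})\.\d{5}\.(\d)/.exec(gaexp);
     if (m) {
       ga(function(tracker) {
         // copy "experimentId.variant" into a custom dimension; index 5 is made up
         tracker.set('dimension5', m[1] + '.' + m[2]);
       });
     }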

Thanks all for the insight!

Actually, using a custom dimension is the best plan IMO! Glad to hear you’ve got a straightforward solution.