Email / Push messaging engines and Snowplow

Hi all, I’m curious how others are connecting their customer messaging to their customer analytics.

With Snowplow we have robust customer analytics that tie behavior all the way from anonymous entry onto our website through the whole user lifecycle (sign-up, purchase, etc.). However, our customer messaging platforms for email / push are currently siloed. The rules engines for sending emails and push messages are provided by other vendors, to whom we must send customer data from our client applications. Their customer data views are less robust than Snowplow’s (data collection is not separated from user stitching the way it is in Snowplow).

So we’re over here scratching our heads trying to figure out whether we can use our Snowplow data to power our messaging engines. Would this necessitate using the real-time pipeline as a first step? Please chime in if you have thoughts on using Snowplow in conjunction with email / push messaging. Thanks!

1 Like

Great topic! We’re struggling with exactly that challenge - despite a very robust data warehouse that consolidates all customer info, we still can’t act on that data to improve our communications.

We’ve signed with Salesforce/ExactTarget to use their “Journey Builder”, but after months of trying to sync data and build journeys, we still think the solution is highly limited. We’re also experimenting with other SaaS tools (Outbound.io, Customer.io, Autopilot, etc.), but most of them look limited as well.

Ideally, we should be able to define workflows (journeys) with a specific goal (e.g. purchase), and put users through that flow until they reach the goal. A few points to consider:

  1. For transactional communications, the server should be able to start the flow (and send out emails) immediately, even without web-analytics data, which we currently process in nightly batches.
  2. The flow should obviously end as soon as the user reaches the goal.
  3. Each step of the flow should trigger a webhook to one of our “agents” (a server in charge of performing a given task). Could be a POST request with a defined payload (see the sketch after this list).
  4. What service should be querying our Redshift cluster to (a) start workflows and (b) perform checks within the workflow?
  5. How should we deal with users on multiple workflows?
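To make points 2-4 concrete, here is a minimal Python sketch of what a single workflow step could look like: check Redshift for the goal event, and only call the agent’s webhook if the user hasn’t converted yet. The connection details, table/column names and agent URL are all hypothetical placeholders, not a real setup:

```python
# Hypothetical sketch of one workflow step: check Redshift for the goal event,
# and only call the "agent" webhook if the user hasn't reached the goal yet.
import psycopg2
import requests

def run_step(user_id: str, step_name: str) -> None:
    # (4b) A check within the workflow: has the user already reached the goal?
    conn = psycopg2.connect(host="redshift.example.internal", port=5439,
                            dbname="analytics", user="workflow", password="...")
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT count(*) FROM atomic.events "
            "WHERE user_id = %s AND event_name = 'purchase'",
            (user_id,),
        )
        converted = cur.fetchone()[0] > 0

    if converted:
        return  # (2) end the flow as soon as the goal is reached

    # (3) Each step triggers a webhook to an agent with a defined payload
    requests.post(
        "https://agents.example.com/hooks/send-email",  # hypothetical agent endpoint
        json={"user_id": user_id, "step": step_name},
        timeout=10,
    )
```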

I’ve recently talked to a company using a BPMN workflow engine called Camunda to address this issue. Workflows are triggered server-side and each step of the process calls a webhook that can send emails, pushes, etc. It seems to be working well for them - we’re discussing internally whether it makes sense for us.
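For anyone curious what that looks like in practice, starting such a workflow server-side against Camunda’s REST API is roughly a single call. A hedged Python sketch - the host, process key and variable names are my own assumptions:

```python
import requests

# Start a Camunda process instance server-side; each task in the BPMN model can
# then call out to a webhook (e.g. via an external task worker or connector)
# that sends the email/push. Host and process key below are hypothetical.
CAMUNDA = "http://camunda.example.internal:8080/engine-rest"

resp = requests.post(
    f"{CAMUNDA}/process-definition/key/abandoned-cart/start",
    json={
        "variables": {
            "userId": {"value": "u-123", "type": "String"},
            "goal":   {"value": "purchase", "type": "String"},
        }
    },
    timeout=10,
)
resp.raise_for_status()
print(resp.json()["id"])  # id of the newly started process instance
```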

@alex How do you think Sauna fits in this context? Any other tools or approaches you would consider?

Cheers,
Bernardo

2 Likes

Hey @bernardosrulzon - great topic indeed.

I think a business rules engine is a good avenue to explore. We are deliberately holding back from building a “fat” decisioning layer into Sauna, focusing for now on the implementation of the actual agents, as you call them - or responders, as Sauna calls them. (Expect some exciting news on the Sauna real-time roadmap soon!)

I haven’t heard of Camunda - I’ll have to check it out. I have heard good things about Drools, and I think AWS Step Functions could play an interesting role…

2 Likes

This is indeed a great topic. We are working on a solution to address these exact business challenges.
The goal is to automate marketing and analytical processes in two modes:

  1. Rules based
  2. AI based

The main components:

  1. Trackers - we currently have one webhook we created, and we are working on integrating with the Snowplow trackers
  2. Collector
  3. Enrichment
  4. Logic - Docker containers for:
    a. Machine learning algorithms
    b. Rules-based logic & workflows - we are currently looking at Camunda
  5. Message queues triggering actions - push, email, page content, etc. (a rough sketch of this hand-off follows at the end of this post)

We will probably end up with a hybrid solution that uses part of the Snowplow framework and part ours.
We are keen to collaborate with other members of the Snowplow community on this topic and build formal extensions to the Snowplow framework.
@Alex - what is the best way to do this while staying fully aligned with you and the team?
@bernardosrulzon and @travisdevitt - it would be great to get in touch and further discuss these business needs and possible solutions.
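To illustrate the hand-off from the logic layer (component 4) to the action queues (component 5), here is a rough Python sketch. The Kinesis stream of enriched events, the SQS action queue and the event fields are all hypothetical placeholders, not our actual setup:

```python
import json
import boto3

kinesis = boto3.client("kinesis")
sqs = boto3.client("sqs")

ACTION_QUEUE = "https://sqs.us-east-1.amazonaws.com/123456789012/actions"  # hypothetical

# Read enriched events (assumed here to already be JSON) from a Kinesis stream...
shard_it = kinesis.get_shard_iterator(
    StreamName="enriched-good",
    ShardId="shardId-000000000000",
    ShardIteratorType="LATEST",
)["ShardIterator"]

while True:
    out = kinesis.get_records(ShardIterator=shard_it, Limit=100)
    shard_it = out["NextShardIterator"]
    for record in out["Records"]:
        event = json.loads(record["Data"])
        # ...apply the rules-based logic (this is where Camunda or an ML model
        # would plug in)...
        if event.get("event_name") == "add_to_cart":
            # ...and push an action message onto the queue that triggers the send
            sqs.send_message(
                QueueUrl=ACTION_QUEUE,
                MessageBody=json.dumps({"action": "push", "user_id": event.get("user_id")}),
            )
```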

Ziv

1 Like

The best way of aligning with Snowplow at this time is to contribute back to Sauna your specific implementations of item 5 above, “Message queues triggering actions - push, email, page content, etc.”

This is a little difficult at the moment because we haven’t shared our Sauna real-time (RT) roadmap, but we are aiming to do this soon. Once we’ve done this, it should be fairly clear how to contribute new action implementations into Sauna RT. And of course this will all be very new and experimental - so it will be a good opportunity to “get in early” and influence Sauna RT so it works the way you want it to work…

2 Likes

@zivbaram and @travisdevitt - Awesome! Can you please add me on Skype (bernardo.srulzon)? We can schedule a chat from there.

Cheers!

1 Like

We use Iterable for messaging (email, push, SMS, etc.) and have had really great success. They also support webhooks to send all this data to your Snowplow endpoint, so it was also trivial to get all our messaging data into Snowplow. That might solve half your problem?

I might not be understanding what you’re looking to do, but it sounds like you also want to take data you’ve collected with Snowplow (possibly after analysis) and use it in your messaging system? For instance, targeting people who visited some app and engaged, based on Snowplow event data, and sending them a push message.

For us right now on Iterable, we are basically doubling up on event collection. We have a separate codebase and logic to send info into Iterable, and we’re doubling up by also sending it into Snowplow. That’s something we are interested in fixing in the future. But right now, if we sent everything we collect with Snowplow to Iterable, they probably wouldn’t be too happy - it’s a ton of data.
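As a rough illustration of that double collection, a single tracked event currently has to go out twice - something like the hedged Python sketch below, where the collector host, the Iterable event name and the payload fields are all placeholders rather than our real setup:

```python
import requests

ITERABLE_API_KEY = "..."  # placeholder

def track_signup(email: str, user_id: str) -> None:
    # 1) Send the event to our Snowplow collector (shown here via the raw
    #    GET /i tracker-protocol endpoint; in practice a tracker SDK does this).
    requests.get(
        "https://collector.example.com/i",  # hypothetical collector host
        params={"e": "se", "se_ca": "account", "se_ac": "signup", "uid": user_id},
        timeout=5,
    )

    # 2) Send the same event again to Iterable so it can drive messaging
    requests.post(
        "https://api.iterable.com/api/events/track",
        headers={"Api-Key": ITERABLE_API_KEY},
        json={"email": email, "eventName": "signup", "dataFields": {"userId": user_id}},
        timeout=5,
    )
```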

Ideally we want to use more ML to help send better messaging, using the data we’ve collected with Snowplow, and we’re excited to see where Sauna and some other AWS tools take us.

Would love to hear if you find a better solution!

1 Like

Thanks for sharing that @dillondoyle, really interesting.

We have users who are treating Snowplow as (amongst other things) an intelligent tag manager, and so in this case they would write an AWS Lambda function to take the Snowplow event stream and POST a well-scoped, simplified subset of those events to Iterable. This is much more about preventing unnecessary data leakage to the 3rd-party vendor than it is about minimizing data traffic.
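For reference, a minimal sketch of what such a Lambda could look like in Python - the event filter, the enriched-TSV field positions and the Iterable payload are all assumptions about one possible well-scoped subset, so treat it as illustrative only:

```python
# Hypothetical Lambda: read Snowplow enriched events from a Kinesis stream and
# forward only a small, well-scoped subset of them to Iterable.
import base64
import os

import requests

ITERABLE_URL = "https://api.iterable.com/api/events/track"

def handler(event, context):
    for record in event["Records"]:
        tsv = base64.b64decode(record["kinesis"]["data"]).decode("utf-8")
        fields = tsv.split("\t")
        event_type = fields[5]   # 'event' column - position is an assumption
        user_id = fields[12]     # 'user_id' column - position is an assumption

        # Forward only ecommerce transactions for identified users;
        # everything else stays inside our own pipeline (no data leakage).
        if event_type != "transaction" or not user_id:
            continue

        requests.post(
            ITERABLE_URL,
            headers={"Api-Key": os.environ["ITERABLE_API_KEY"]},
            json={"userId": user_id, "eventName": "purchase"},
            timeout=5,
        )
```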

That sounds cool @dillondoyle - would you be able to share the schemas you wrote for Iterable, and PR them into Iglu Central? I’m sure the Snowplow community would love to use these.

Yes - we are super-excited about this too, and are working hard on Sauna real-time (using Kinesis streams of commands) now. I’ve just added a placeholder for Iterable support:

1 Like

Thanks @dillondoyle, that’s helpful! We actually had a demo of Iterable the other day, as we are looking at marketing automation solutions, and I really like how API-driven they are.

We came to the conclusion that a near-real-time, centralized customer messaging platform such as Iterable would be needed regardless of our analytics stack, because we didn’t want to build/configure each messaging medium ourselves. We do want the systems to be able to talk to one another though, in a setup similar to yours. That requires some bidirectional data flows between Snowplow and Iterable. For instance, Iterable will send drip emails and track each open, and we push those email events to Snowplow. In the opposite direction, we might calculate conversion likelihood scores for users in Snowplow and send that data to Iterable so it can be used as a trigger for a follow-up push notification or email.
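For that second direction, here is a hedged sketch of what pushing the scores might look like - the Redshift table, column names and the use of Iterable’s users/update endpoint are assumptions on my part:

```python
import psycopg2
import requests

ITERABLE_API_KEY = "..."  # placeholder

# Pull modelled conversion-likelihood scores out of Redshift...
conn = psycopg2.connect(host="redshift.example.internal", port=5439,
                        dbname="analytics", user="etl", password="...")
with conn, conn.cursor() as cur:
    cur.execute("SELECT email, conversion_score FROM derived.user_scores")  # hypothetical table
    rows = cur.fetchall()

# ...and push them to Iterable as user data fields, so a journey or
# push/email trigger can key off them.
for email, score in rows:
    requests.post(
        "https://api.iterable.com/api/users/update",
        headers={"Api-Key": ITERABLE_API_KEY},
        json={"email": email, "dataFields": {"conversionScore": float(score)}},
        timeout=5,
    )
```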

1 Like

Yup @travisdevitt, that’s our future vision too: make Snowplow our single source of data collection. The hard part is then setting up the logic for when to push data to which other platform, and from where in the Snowplow pipeline - especially as we’re multi-tenant (we have multiple campaigns/clients at once) and AWS-hosted, so there are multiple ways we could go.

We’re excited and playing with Sauna, the SQL router and the new AWS pipeline tools. But for now our double data collection works (and a lot of the time it’s even more than double, once you count Google Analytics, retargeting code, 3rd-party verification code, etc.). Over the next few years we hope to cut way back, since it’s a better end-user experience - and we get that this is a big reason ads suck…

For what it’s worth, I don’t own any shares in Iterable or anything, but I think I’m one of their first 100 - maybe even 50 - customers, and with politics we send a lot of volume without experiencing many problems; just a few growing-pain bumps at end of quarter that I would guess we caused (sorry Iterable, if you have Google Alerts set up)… The API/customization plus simple workflow and sending UIs for non-programmer marketers is why we chose them. Iterable did just get a big funding round and they are scaling up, so I hope they keep up their customer service and helpfulness. Stay cool Iterable!

1 Like

Yes @alex, I’ll add it to our to-do list. We have a couple of other changes we’re talking about, and we’ll definitely try to contribute back to the community.

2 Likes

Thanks @dillondoyle! If Iterable is very API-centric as @travisdevitt suggests, it sounds like it will be a great fit for Snowplow + Sauna integrations…

1 Like

Hi @alex, any news on the Sauna RT roadmap?

Best

Hey @spatialy,

Sauna 0.2.0 with initial Kinesis Observer support is in QA right now. I think we can expect it very soon.

1 Like

Hi @anton, great news!

Thanks