Event size error when passing large JSON variable as part of self describing event

Hi guys,

We’ve been implementing some self-describing events (using dataLayer events and GTM triggers/tags) and for the most part everything is working as expected and the events are successfully making it into our Snowflake atomic events table.

Some events have a large JSON array passed through as an event property (the full product API response). I am trying to do this as Snowflake supports semi-structured JSON data really well and I can easily pick and choose what values I want from the JSON array.

The problem is that these events are not being accepted by the collector (CloudFront).

In the Chrome console, I can see a warning: “Snowplow: Event of size 221383 is too long - the maximum size is 40000”.

Just wondering if anyone else has come across this limitation and if there are any suggested workarounds to allow storing of large JSON event properties?

Cheers,
Ryan

@Ryan_Newsome, the CloudFront collector accepts only GET requests, which are inherently limited in size (each browser also imposes its own limit). You need to use the Scala collector and switch to sending events via POST.
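To illustrate why the GET path breaks down: with GET tracking the self-describing event is base64url-encoded into the query string, which inflates the payload by roughly a third before any URL length limit is even considered. A rough sketch (the product payload here is made up, sized to mimic the ~220 KB event in this thread; Node runtime assumed for `Buffer`):

```typescript
// Hypothetical product payload, shaped like a large product API response.
const payload = JSON.stringify({
  products: Array.from({ length: 1500 }, (_, i) => ({
    sku: `SKU-${i}`,
    desc: "x".repeat(100),
  })),
});

// GET tracking carries the event base64url-encoded in the query string,
// so the URL grows ~4/3 as fast as the JSON itself.
const encoded = Buffer.from(payload).toString("base64url");

console.log(`raw JSON:   ${payload.length} bytes`);
console.log(`in the URL: ${encoded.length} bytes`);
```

A ~220 KB event therefore becomes a ~290 KB URL, far beyond what browsers and CloudFront will accept, whereas a POST body has no such constraint.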

Thanks @ihor, just another reason to switch across to the Scala collector. I’ll chat with internal stakeholders and try to get some resources behind this.

Cheers,
Ryan

Yep - I’d highly recommend switching to the Scala collector. I’m not sure if it’s clear enough in the documentation, but the CloudFront / Clojure collectors have effectively been deprecated.
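Once a collector that accepts POST is in place, the switch on the tracker side is just a config change. A minimal sketch, assuming the Snowplow JavaScript tracker v2 (the collector hostname and `appId` are illustrative):

```typescript
// Assumes sp.js is already loaded on the page and exposes the
// global `snowplow` queue function (declared here for TypeScript).
declare function snowplow(method: string, ...args: unknown[]): void;

snowplow("newTracker", "sp", "collector.example.com", {
  appId: "web",
  eventMethod: "post", // send events in a POST body instead of the GET query string
  bufferSize: 1,       // flush each event immediately rather than batching (optional)
});
```

With `eventMethod: "post"` the event payload goes in the request body, so the 40000-byte ceiling on the query string no longer applies, though it’s still worth trimming the product API response to just the fields you need before tracking it.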


This might be a useful resource for you when the time comes: AWS batch pipeline to real-time pipeline upgrade guide
