We are running the streaming pipeline with JS tracker version 2.11.0, and are noticing issues with enableFormTracking on large forms.
In the situation we've been seeing (GET-based tracking in this case, not POST), large e=ue events are generated (as expected for submit_form self-describing events) but fail to go outbound from the browser, presumably because the payload pushes the request URL past browser or server length limits. These failed events are then retained in localStorage, via the tracker's snowplowOutQueue_, and continue to fail on every retry, which blocks all subsequent events that build up behind them in the queue.
Has anyone experienced this before? Is the only workaround to set maxLocalStorageQueueSize to something relatively small, presuming the queue purges oldest-first once the threshold is exceeded?
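For reference, this is roughly what I'd try at tracker init: capping the queue, and possibly switching to POST so large events go in the request body rather than the URL. This is just a sketch; the collector endpoint, tracker name, and appId below are placeholders, and the queue size of 100 is an arbitrary guess.

```javascript
// Hypothetical v2.x init; endpoint and appId are placeholders.
window.snowplow('newTracker', 'sp', 'collector.example.com', {
  appId: 'my-app',
  // POST sends the event payload in the request body, sidestepping
  // GET URL-length limits on large submit_form events.
  eventMethod: 'post',
  // Cap the offline queue so a few stuck events can't block
  // (or grow) the queue indefinitely.
  maxLocalStorageQueueSize: 100
});
```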
Note: I do know how to whitelist and blacklist form tracking, but that assumes forms have consistent class names (and/or aren't hashed), or don't change very often, none of which I can rely on. My real concern is not losing the form submission event itself; rather, the failure to fire this (relatively low-value) event appears to be blocking all subsequent (high-value) events.
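For completeness, the whitelist approach I mentioned looks something like the following in 2.x, with the caveats above; the form class name here is hypothetical:

```javascript
// Hypothetical form-tracking config; 'checkout-form' is a placeholder class.
window.snowplow('enableFormTracking', {
  forms: {
    // Only track forms carrying this class. This breaks as soon as
    // class names are hashed by the build, or change between releases.
    whitelist: ['checkout-form']
  }
});
```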