I’ve recently been looking at our options for running some integration/end-to-end tests of our Snowplow trackers.
Snowplow Micro looks great, and the two recent blog posts with examples were very informative. However, our application’s current CI/CD process does not lend itself well to spinning the application up for integration testing during the test/build pipeline (it’s mostly just unit tests being run during the build).
We do, however, have e2e tests running against the deployed infrastructure in our lower (AWS) environments. I would like to add some e2e tests (i.e. using a browser automation framework) around the Snowplow tracking to this test suite.
FWIW — we already have Snowplow Mini up and running, which would normally be used for manual verification/QA during the development phase.
If we were to use Snowplow Micro in this setup, I would need to host it inside the VPC somewhere (so it is visible to the deployed application/services). My question is: does this defeat the purpose of Micro? If we already have a Mini instance deployed, can we instead use that?
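For context, the kind of check I have in mind against a Micro instance is roughly the following sketch. It uses Micro's documented REST endpoints (`/micro/all` for event counts, `/micro/reset` to clear state); the hostname is a placeholder assumption for wherever Micro would live inside the VPC, and the health-check thresholds are just illustrative:

```python
import json
import urllib.request

# Hypothetical internal hostname -- whatever the Micro service would resolve
# to inside the VPC in this setup.
MICRO_URL = "http://snowplow-micro.internal:9090"


def fetch_event_counts(base_url: str) -> dict:
    """GET /micro/all returns a summary like {"total": 7, "good": 7, "bad": 0}."""
    with urllib.request.urlopen(f"{base_url}/micro/all") as resp:
        return json.loads(resp.read())


def tracking_looks_healthy(counts: dict, expected_good: int) -> bool:
    """Pass if all expected events arrived and none failed validation."""
    return counts.get("bad", 0) == 0 and counts.get("good", 0) >= expected_good


# In a test, after driving the browser through the tracked user journey:
#   counts = fetch_event_counts(MICRO_URL)
#   assert tracking_looks_healthy(counts, expected_good=3)
```

The appeal here is that Micro validates events against the schemas, so a `bad` count surfacing in the summary is itself a meaningful test failure.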
Is there a (public) API endpoint for Elasticsearch on the Mini instance that I could query for events (e.g. from a running test)? I haven’t been able to figure this out yet (I’m not too familiar with the Mini setup). From the usage guide it looks like the Elasticsearch API might be private only. Is this correct?
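If the Elasticsearch API were reachable, I imagine the test-side query would look something like the sketch below. To be clear, both the URL path and whether Mini exposes Elasticsearch at all are exactly what I'm unsure about; the hostname is a placeholder, and the field names (`app_id`, `event_name`, `collector_tstamp`) are the standard enriched-event fields:

```python
import json
import urllib.request

# Placeholder -- assumes Mini's Elasticsearch is reachable at some proxied
# path, which is the open question above.
ES_SEARCH_URL = "http://snowplow-mini.internal/elasticsearch/good/_search"


def build_event_query(app_id: str, event_name: str, minutes: int = 5) -> dict:
    """Bool filter query: recent events from one app with a given event name."""
    return {
        "query": {
            "bool": {
                "filter": [
                    {"term": {"app_id": app_id}},
                    {"term": {"event_name": event_name}},
                    {"range": {"collector_tstamp": {"gte": f"now-{minutes}m"}}},
                ]
            }
        }
    }


def search_events(url: str, query: dict) -> dict:
    """POST the query to Elasticsearch and return the parsed response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(query).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# In a test:
#   hits = search_events(ES_SEARCH_URL, build_event_query("web", "page_view"))
#   assert hits["hits"]["total"] > 0
```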
I’m wondering what my best option for e2e testing against (already deployed) trackers is: should I try to get Mini to do what I want, or instead host a Micro instance (on ECS/Fargate), either ephemerally or permanently?
Any suggestions or feedback is welcome!