Question: Unstructured events and CSV import

Does it make sense to send unstructured events to import data from a CSV? The reason is that I already have historical data that needs to be loaded as a “migration”, and afterwards I will start using the tracker to add new data.

To import the data I was about to create a CSV file and use the Node.js tracker to send unstructured events to the collector, but I am worried because the file contains more than 4 million rows.
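Roughly what I had in mind is below: a minimal sketch assuming the snowplow-tracker npm package, where the collector endpoint, schema URI and CSV column names are placeholders for my real ones.

```javascript
// Sketch: replay rows of a CSV as unstructured (self-describing) events.
// Endpoint, schema and column names are placeholders.
const fs = require('fs');
const readline = require('readline');
const snowplow = require('snowplow-tracker');

const e = snowplow.emitter(
  'collector.mydomain.com', // collector endpoint (placeholder)
  'http',
  80,
  'POST',
  50, // buffer this many events per POST request
  function (error) {
    if (error) console.error('Request to the collector failed', error);
  }
);
const t = snowplow.tracker([e], 'csv-migration', 'my-app', false);

const rl = readline.createInterface({ input: fs.createReadStream('legacy_data.csv') });
rl.on('line', (line) => {
  const [id, name, value] = line.split(','); // assumes a simple comma-only CSV
  t.trackUnstructEvent({
    schema: 'iglu:com.mycompany/legacy_record/jsonschema/1-0-0', // placeholder schema
    data: { id, name, value }
  });
});
```

For 4 million rows I would probably also pause the read stream every few thousand lines so the emitter isn't flooded, but that's the general shape.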

If there is some trick to upload the data to S3 and load all the events at once using just the ETL runner, that would be awesome.

Hi @Germanaz0 - this sounds like a good use case for a combination of:

It would be super-awesome if you would blog your experience, or share it in this thread as a tutorial!

Sure I will, I'm going to check how it goes. With Node.js it works fine, but I think the CLI would be much faster.

The nice thing about the CLI is that you can script up a bash file that sends increasingly large batches of your events as you gain confidence that the events are making it through to your collector…

Yes, the only con I’ve found is that there is no buffering. In some ways it is safer without a buffer, but the response is slow.
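To illustrate the trade-off (again just a sketch with the snowplow-tracker package; endpoint and numbers are placeholders): the emitter's buffer size controls how many events are grouped into a single POST, so a bigger buffer is faster but leaves more events in flight if the process dies mid-batch.

```javascript
// Larger emitter buffer = fewer, faster POSTs; a buffer of 1 = no buffering,
// safer per event but much slower. Endpoint and sizes are placeholders.
const snowplow = require('snowplow-tracker');

const e = snowplow.emitter(
  'collector.mydomain.com', // placeholder collector endpoint
  'http',
  80,
  'POST',
  100, // flush once 100 events are buffered; set to 1 to send immediately
  function (error) {
    if (error) console.error('Batch failed to reach the collector', error);
  }
);
```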