CORS issue with AWS Quickstart

Hi. I have a Snowplow deployment running on AWS. I believe this is the open-source Quickstart version, deployed using Terraform.

It’s using HTTP. Locally I have an HTML file with the contents of the ‘tracking events → page views’ JavaScript snippet; the sp.js file is in the same folder.

I have my collector url set to http://MY_AWS_INSTALL.eu-central-1.elb.amazonaws.com/api
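For reference, the setup in the HTML file looks roughly like this. This is a sketch of the standard sp.js loader pattern, not my exact file: in a real page the `snowplow` command queue is created by the loader snippet and sp.js is pulled in via `<script src="sp.js">`; the hostname is the placeholder from above and the `appId` is made up.

```javascript
// Sketch of the sp.js loader pattern: `snowplow` is a command queue that
// sp.js drains once it loads. In a real page the loader snippet creates
// this and <script src="sp.js"></script> loads the tracker itself.
const snowplow = function () {
  (snowplow.q = snowplow.q || []).push(arguments);
};

// Point the tracker at the collector (here still with the /api suffix
// that turns out to be the problem later in the thread):
snowplow('newTracker', 'sp', 'http://MY_AWS_INSTALL.eu-central-1.elb.amazonaws.com/api', {
  appId: 'my-app',   // hypothetical app id
  platform: 'web'
});

// Queue a page-view event:
snowplow('trackPageView');
```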

When I run the page and check the console I’m getting a CORS error.

Access to XMLHttpRequest at 'http://XXXXX.eu-central-1.elb.amazonaws.com/api/com.snowplowanalytics.snowplow/tp2' from origin 'http://localhost:8080' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.
sp.js:7          POST http://XXXXX.eu-central-1.elb.amazonaws.com/api/com.snowplowanalytics.snowplow/tp2 net::ERR_FAILED

I have no idea how to solve this.

You don’t need /api at the end of your collector URL.

The /api suffix suggests this is either your Iglu Server URL, or you’ve appended /api to your collector URL yourself.

Try navigating to http://MY_AWS_INSTALL.eu-central-1.elb.amazonaws.com/health
If that returns OK, then set your collector URL to:
http://MY_AWS_INSTALL.eu-central-1.elb.amazonaws.com
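In other words, the fix is just dropping the /api suffix so events go to the collector root. As a tiny illustration (the helper name is mine, not part of sp.js or any Snowplow tool):

```javascript
// Hypothetical helper: normalise a collector URL by stripping any trailing
// slash and a trailing /api suffix, since the collector listens on the
// root path and sp.js appends its own endpoint paths.
function normalizeCollectorUrl(url) {
  return url.replace(/\/+$/, '').replace(/\/api$/, '');
}

console.log(normalizeCollectorUrl('http://MY_AWS_INSTALL.eu-central-1.elb.amazonaws.com/api'));
// http://MY_AWS_INSTALL.eu-central-1.elb.amazonaws.com
```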

It could be that I have the wrong URL or service then…

http://sp-iglu-lb-XXXXXX.eu-central-1.elb.amazonaws.com/health

I get “the endpoint does not exist”

So that URL is for your Iglu Server (there is no health endpoint on that). That’s not where you need to send events; the Iglu Server is where your schemas are stored for use by the pipeline.

You should have spun up two things during the quickstart: the Iglu Server and the Pipeline stacks.

The pipeline should have given you a collector URL in its output. This is the URL you need to use for your tracking.

That was the issue: the Pipeline stack wasn’t deployed.

I have sent an event into the pipeline now and am just trying to read it back out.


Hi @PaulBoocock, thanks for your help getting this set up.

My next challenge is connecting to Kinesis since we want to stream the data in real time.

These are the settings for our connector:
(screenshot of connector settings)

I think the only one of these you might be able to help with is the name given to the AWS stream. I’m not sure of the exact terminology here, so I’ll have to do some digging.

I went for ‘sp-raw-stream’, which is working nicely.

I’m also going to try the other streams: enriched, bad-1 and bad-2.

You’ll want to use the Enriched Stream. The pipeline goes something like this:

Collector → Raw Stream → Enrichment → Enriched Stream → Loader → Postgres

So the Enriched Stream carries the richest set of data. You’ll likely want to use one of our Analytics SDKs to parse the TSV into JSON: Analytics SDKs - Snowplow Docs
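To make that concrete, here is a hand-rolled sketch of what those SDKs do under the hood: map the tab-separated enriched record onto named fields. Only the first handful of the ~130 columns are shown, and the exact ordering is defined by Snowplow’s enriched event model, so treat the field list as illustrative; the sample record is made up.

```javascript
// Sketch: turn one enriched-event TSV line into a JS object. A real
// Analytics SDK covers the full field list and type conversions; this
// shows only the first few columns of the standard enriched event model.
const FIELDS = [
  'app_id', 'platform', 'etl_tstamp', 'collector_tstamp',
  'dvce_created_tstamp', 'event', 'event_id'
];

function parseEnrichedTsv(line) {
  const values = line.split('\t');
  const event = {};
  FIELDS.forEach((name, i) => {
    // Empty columns in the TSV mean "no value", so skip them.
    if (values[i] !== undefined && values[i] !== '') event[name] = values[i];
  });
  return event;
}

// Hypothetical record, truncated to the fields above:
const sample = [
  'my-app', 'web', '2023-01-01 00:00:00', '2023-01-01 00:00:01',
  '2023-01-01 00:00:00', 'page_view', 'a0e8b7a1-0000-0000-0000-000000000000'
].join('\t');

console.log(parseEnrichedTsv(sample).event); // page_view
```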