[Scala Stream Enrich] Caught exception while sync'ing Kinesis shards and leases

I’m trying to set up Snowplow Stream Enrich on my local machine, and I couldn’t figure out why this is happening:

[WARN] [06/08/2016 16:07:52.976] [snowplow-scala-tracker-akka.actor.default-dispatcher-7] [akka://snowplow-scala-tracker/user/IO-HTTP/host-connector-0/0] Connection attempt to 169.254.169.254:80 failed in response to GET request to /latest/dynamic/instance-identity/document/ with 1 retries left, retrying...
[WARN] [06/08/2016 16:08:02.997] [snowplow-scala-tracker-akka.actor.default-dispatcher-16] [akka://snowplow-scala-tracker/user/IO-HTTP/group-0/5] Configured connecting timeout of 10 seconds expired, stopping
[WARN] [06/08/2016 16:08:02.997] [snowplow-scala-tracker-akka.actor.default-dispatcher-12] [akka://snowplow-scala-tracker/user/IO-HTTP/host-connector-0/1] Connection attempt to 169.254.169.254:80 failed in response to GET request to /latest/dynamic/instance-identity/document/ with no retries left, dispatching ...
[INFO] [06/08/2016 16:08:03.000] [snowplow-scala-tracker-akka.actor.default-dispatcher-10] [akka://snowplow-scala-tracker/deadLetters] Message [akka.actor.Status$Failure] from Actor[akka://snowplow-scala-tracker/user/IO-HTTP/host-connector-0/1#75910535] to Actor[akka://snowplow-scala-tracker/deadLetters] was not delivered. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.

I also set up the Elasticsearch sink, but the data wasn’t stored in my Elasticsearch.

Hello @Spycomb

Those messages indicate that the tracker embedded in Stream Enrich cannot fetch the EC2 instance metadata context, which is expected: you’re running it on a local machine, so no such context is available.
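You can confirm this yourself: `169.254.169.254` is the EC2 instance metadata endpoint, which is only reachable from inside an EC2 instance. A quick sketch of the same request the tracker is making (the timeout value here is just an illustrative choice):

```shell
# Query the EC2 instance metadata endpoint the tracker tries to reach.
# On EC2 this returns an instance-identity document; on a local machine
# it times out or is refused, which is what the warnings above reflect.
if curl --silent --max-time 3 \
    http://169.254.169.254/latest/dynamic/instance-identity/document/ \
    > /dev/null
then
  echo "metadata endpoint reachable (running on EC2)"
else
  echo "metadata endpoint unreachable (expected on a local machine)"
fi
```

If that prints the “unreachable” message, the tracker’s warnings are just the same failure surfacing through Akka, and you can safely ignore them when running locally.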

So, could you please clarify whether there’s anything else in your stdout that looks like an error? Those snowplow-scala-tracker-akka.actor messages are only warnings and shouldn’t be critical.