Can't start kinesis-s3 with new configuration


I am using the new configuration for the Kinesis stream. The configuration details are below.

# Default configuration for s3-loader

# Sources currently supported are:
# 'kinesis' for reading records from a Kinesis stream
# 'nsq' for reading records from a NSQ topic
source = "kinesis"

# The sink is used for sending events whose processing failed.
# Sinks currently supported are:
# 'kinesis' for writing records to a Kinesis stream
# 'nsq' for writing records to a NSQ topic
sink = "kinesis"

# The following are used to authenticate for the Amazon Kinesis sink.
# If both are set to 'default', the default provider chain is used
# (see
# If both are set to 'iam', use AWS IAM Roles to provision credentials.
# If both are set to 'env', use environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
aws {
  accessKey = "xxxx"
  secretKey = "xxxxxxx"
}

# Config for NSQ
nsq {
  # Channel name for NSQ source
  # If more than one application is reading from the same NSQ topic at the same time,
  # each must have a unique channel name to receive all the data from that topic
  # channelName = "{{nsqSourceChannelName}}"
  # Host name for NSQ tools
  # host = "{{nsqHost}}"

  # HTTP port for nsqd
  # port = {{nsqdPort}}

  # HTTP port for nsqlookupd
  # lookupPort = {{nsqlookupdPort}}
}

kinesis {
  # LATEST: most recent data.
  # TRIM_HORIZON: oldest available data.
  # "AT_TIMESTAMP": Start from the record at or after the specified timestamp
  # Note: This only affects the first run of this application on a stream.
  initialPosition = "TRIM_HORIZON"

  # Needs to be specified when initialPosition is "AT_TIMESTAMP".
  # The timestamp format needs to be "yyyy-MM-ddTHH:mm:ssZ".
  # Ex: "2017-05-17T10:00:00Z"
  # Note: The time needs to be specified in UTC.
  initialTimestamp = "2017-12-17T10:00:00Z"

  # Maximum number of records to read per GetRecords call     
  maxRecords = 100000

  region = "us-east-1"

  # "appName" is used for a DynamoDB table to maintain stream state.
  appName = "snowplowunilog"
}

streams {
  # Input stream name
  inStreamName = "GoodStream"

  # Stream for events for which the storage process fails
  outStreamName = "BadStream"

  # Events are accumulated in a buffer before being sent to S3.
  # The buffer is emptied whenever:
  # - the combined size of the stored records exceeds byteLimit or
  # - the number of stored records exceeds recordLimit or
  # - the time in milliseconds since it was last emptied exceeds timeLimit
  buffer {
    byteLimit = 4500000 # Not supported by NSQ; will be ignored
    recordLimit = 500
    timeLimit = 60000 # Not supported by NSQ; will be ignored
  }
}

s3 {
  region = "us-east-1"
  bucket = "sample1bucketevents"

  # Format is one of lzo or gzip
  # Note that gzip can only be used for the enriched data stream.
  format = "lzo"

  # Maximum Timeout that the application is allowed to fail for
  maxTimeout = 1
}

# Optional section for tracking endpoints
monitoring {
  collectorUri = ""
  collectorPort = 8082
  appId = "1"
  method = "GET"
}

I am running it with the command below.

 java -jar snowplow-s3-loader-0.6.0.jar --config kinesisnew2.conf

I am getting the error below.

configuration error: ConfigReaderFailures(KeyNotFound(nsq.channelName,Some(ConfigValueLocation(file:/home/ubuntu/kinesis-s3/target/scala-2.11/kinesisnew2.conf,25)),Set()),List(KeyNotFound(,Some(ConfigValueLocation(file:/home/ubuntu/kinesis-s3/target/scala-2.11/kinesisnew2.conf,25)),Set()), KeyNotFound(nsq.port,Some(ConfigValueLocation(file:/home/ubuntu/kinesis-s3/target/scala-2.11/kinesisnew2.conf,25)),Set()), KeyNotFound(nsq.lookupPort,Some(ConfigValueLocation(file:/home/ubuntu/kinesis-s3/target/scala-2.11/kinesisnew2.conf,25)),Set())))

Please help me resolve this configuration error.


Try rerunning with the NSQ keys uncommented (channelName, host, etc.).
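The error shows that the config parser expects nsq.channelName, nsq.port and nsq.lookupPort to exist even though your source and sink are both kinesis. A minimal sketch of the uncommented block (the values here are placeholders, not real endpoints — 4151 and 4161 are just the standard nsqd/nsqlookupd HTTP ports):

```
nsq {
  channelName = "s3-loader"
  host = "127.0.0.1"
  port = 4151
  lookupPort = 4161
}
```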


Hey @mike,
I don't have NSQ key details to add to the uncommented section.


That’s fine - you can leave those keys empty.
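Something like this should do (dummy values, since NSQ isn't actually used when source and sink are both kinesis — the parser just needs the keys to be present):

```
nsq {
  channelName = ""
  host = ""
  port = 0
  lookupPort = 0
}
```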