Kinesis Data Firehose
Integrating Batch with Kinesis Data Firehose provides a powerful and efficient interface for searching and storing events. In this example, we will demonstrate an integration with Amazon SES.
1. Log in to the Batch console at https://console.streamdal.com
2. Navigate to 'Collections'
3. Click 'New Collection' at the top right
4. Name your collection and select JSON format

JSON Collection
These steps are specific to the SES integration, but you can adapt most of them to your use case.
1. Navigate to Kinesis in the AWS console
2. Under 'Data Firehose', select 'Create delivery stream'
3. Enter a 'Delivery stream name'
4. Select 'Direct PUT or other sources'
5. Optionally check 'Enable server-side encryption for source records in delivery stream'
6. At the bottom of the page, select Next
7. On the next page ('Processing Records'), leave the defaults and select Next
8. Under 'Choose a destination', select 'HTTP Endpoint'
9. Set the 'HTTP endpoint URL' to the HTTP endpoint for your Batch collection
10. Set 'Access key' to the 'Collection Token' of the collection we created earlier, found under the Collections tab at https://console.streamdal.com
11. Configure an S3 bucket for failed records and select Next
12. On the next page, leave the defaults and select Next
13. Review and select 'Create delivery stream'
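If you prefer to script the stream creation rather than click through the console, the sketch below shows roughly the same configuration using boto3. The stream name, endpoint URL, collection token, IAM role, and S3 bucket are placeholder values you would replace with your own; the actual HTTP endpoint URL and Collection Token come from your collection in the Batch console.

```python
# Minimal sketch: create the delivery stream with boto3 instead of the console.
# All names, ARNs, the endpoint URL, and the collection token are placeholders.
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.create_delivery_stream(
    DeliveryStreamName="ses-events-to-batch",      # step 3: delivery stream name
    DeliveryStreamType="DirectPut",                # step 4: Direct PUT or other sources
    HttpEndpointDestinationConfiguration={         # step 8: HTTP Endpoint destination
        "EndpointConfiguration": {
            "Name": "batch-collection",
            "Url": "https://<your-batch-http-endpoint>",  # step 9: endpoint URL (placeholder)
            "AccessKey": "<your-collection-token>",       # step 10: Collection Token (placeholder)
        },
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "S3BackupMode": "FailedDataOnly",          # step 11: S3 bucket for failed records only
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
            "BucketARN": "arn:aws:s3:::my-firehose-failures",
        },
    },
)
```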
Follow the guide at https://docs.aws.amazon.com/ses/latest/DeveloperGuide/event-publishing-add-event-destination-firehose.html to create an SES configuration set and tie it to the Kinesis Data Firehose delivery stream you created earlier.
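As a rough scripted equivalent of that guide, the sketch below creates a configuration set and attaches a Kinesis Data Firehose event destination with boto3. The configuration set name, role ARN, and delivery stream ARN are placeholders for your own values.

```python
# Sketch: create a configuration set and attach a Firehose event destination.
# The configuration set name, IAM role ARN, and delivery stream ARN are placeholders.
import boto3

ses = boto3.client("ses", region_name="us-east-1")

ses.create_configuration_set(ConfigurationSet={"Name": "ses-to-firehose"})

ses.create_configuration_set_event_destination(
    ConfigurationSetName="ses-to-firehose",
    EventDestination={
        "Name": "firehose-destination",
        "Enabled": True,
        # Publish the event types you want to be able to search in Batch
        "MatchingEventTypes": ["send", "delivery", "open", "click", "bounce", "complaint"],
        "KinesisFirehoseDestination": {
            "IAMRoleARN": "arn:aws:iam::123456789012:role/ses-firehose-role",
            "DeliveryStreamARN": "arn:aws:firehose:us-east-1:123456789012:deliverystream/ses-events-to-batch",
        },
    },
)
```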
Follow the guide at https://docs.aws.amazon.com/ses/latest/DeveloperGuide/event-publishing-send-email.html to send your email using the SES configuration set.
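Specifying the configuration set when you send mail is what routes the events to Firehose. Below is a minimal boto3 sketch; the sender and recipient addresses are placeholders and must be verified identities in your SES account (for example, while your account is still in the SES sandbox).

```python
# Sketch: send a test email through the configuration set created above.
# Sender and recipient addresses are placeholders and must be verified in SES.
import boto3

ses = boto3.client("ses", region_name="us-east-1")

ses.send_email(
    Source="sender@example.com",
    Destination={"ToAddresses": ["recipient@example.com"]},
    Message={
        "Subject": {"Data": "Batch + Firehose test"},
        "Body": {"Text": {"Data": "Hello from SES event publishing."}},
    },
    ConfigurationSetName="ses-to-firehose",  # ties this send to the Firehose event destination
)
```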
1. Trigger an SES email that uses the SES configuration set we created earlier
2. Wait a few minutes for Kinesis to send the event to Batch (or push a test record directly into the stream, as sketched below)
3. Open your collection in the Batch console to see the incoming events
4. Click on a message to view the JSON and all searchable fields
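If you want to confirm the Firehose-to-Batch leg independently of SES, you can push a record straight into the delivery stream. The sketch below uses a made-up test payload and the placeholder stream name from earlier.

```python
# Optional sketch: put a test record directly into the delivery stream to confirm
# it arrives in your Batch collection without waiting for an SES event.
import json
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.put_record(
    DeliveryStreamName="ses-events-to-batch",
    Record={"Data": (json.dumps({"eventType": "Test", "source": "manual-check"}) + "\n").encode()},
)
```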

In the image above, I clicked on the 'eventType' field on the right-hand side. You can see the search field was automatically updated with the correct search syntax to find more SES events whose event type matches the value 'Open'.
You now have a much better and simpler way of dealing with Kinesis data than dumping it to an S3 bucket or managing your own Elasticsearch cluster. You also get access to many more features, such as the ability to replay these events into other destinations for further processing.