Integrating Batch with Kinesis Data Firehose provides a powerful and efficient interface for storing and searching events. In this example, we will demonstrate an integration with Amazon SES.
Log in to https://console.batch.sh
Navigate to 'Collections'
Click 'New Collection' at the top right
Name your collection and select JSON format
These steps are specific to the SES integration, but most of them can be adapted to your own use case
Navigate to Kinesis in AWS console
Under 'Data Firehose' select 'Create delivery stream'
Enter a 'Delivery stream name'
Select 'Direct PUT or other sources'
Enable 'Enable server-side encryption for source records in delivery stream' if desired
At the bottom of the page select Next
On the next page, 'Process records', leave the defaults and select Next
Under 'Choose a destination' select 'HTTP Endpoint'
Set 'HTTP endpoint URL' to https://http-collector.dev.batch.sh/v1/kinesis
Set 'Access key' to the 'Collection Token' found under the collection we created earlier, in https://console.batch.sh under the 'Collections' tab
Configure an S3 bucket for failed records and select Next
On the next page leave the defaults and select Next
Review and select 'Create delivery stream'
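The console steps above can also be sketched with boto3 (an assumption on my part; the guide itself uses the AWS console). The stream name, role ARN, and bucket ARN below are hypothetical placeholders, and the collection token comes from the Batch collection created earlier:

```python
# Sketch of the Firehose delivery stream configured in the console steps
# above. All ARNs and names are hypothetical placeholders.
COLLECTION_TOKEN = "your-collection-token"  # from console.batch.sh

config = {
    "DeliveryStreamName": "ses-events-to-batch",  # hypothetical name
    "DeliveryStreamType": "DirectPut",            # 'Direct PUT or other sources'
    "HttpEndpointDestinationConfiguration": {
        "EndpointConfiguration": {
            "Name": "batch-http-collector",
            "Url": "https://http-collector.dev.batch.sh/v1/kinesis",
            "AccessKey": COLLECTION_TOKEN,        # the 'Access key' field
        },
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-role",  # hypothetical
        "S3BackupMode": "FailedDataOnly",         # S3 bucket for failed records
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-role",
            "BucketARN": "arn:aws:s3:::my-firehose-failures",       # hypothetical
        },
    },
}

# With real ARNs in place, the actual call would be:
# import boto3
# boto3.client("firehose").create_delivery_stream(**config)
```

The call is left commented out so the snippet can be reviewed without AWS credentials; fill in real ARNs before running it.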
Follow the guide https://docs.aws.amazon.com/ses/latest/DeveloperGuide/event-publishing-add-event-destination-firehose.html to create an SES configuration set and tie it to the Kinesis Data Firehose delivery stream you created earlier.
Follow the guide https://docs.aws.amazon.com/ses/latest/DeveloperGuide/event-publishing-send-email.html to associate your email with the SES configuration set
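The two SES guides above can be sketched with boto3's classic SES client (again an assumption; the guides walk through the console). The configuration set name, role ARN, and delivery stream ARN are hypothetical placeholders:

```python
# Sketch of an SES configuration set whose events publish to the
# Firehose delivery stream. All names and ARNs are hypothetical.
CONFIG_SET_NAME = "ses-to-firehose"  # hypothetical name

event_destination = {
    "Name": "firehose-destination",
    "Enabled": True,
    # Event types to publish; 'open' produces the 'Open' events searched later
    "MatchingEventTypes": ["send", "delivery", "bounce", "complaint", "open", "click"],
    "KinesisFirehoseDestination": {
        "IAMRoleARN": "arn:aws:iam::123456789012:role/ses-firehose-role",  # hypothetical
        "DeliveryStreamARN": "arn:aws:firehose:us-east-1:123456789012:deliverystream/ses-events-to-batch",
    },
}

# With real ARNs in place, the actual calls would be:
# import boto3
# ses = boto3.client("ses")
# ses.create_configuration_set(ConfigurationSet={"Name": CONFIG_SET_NAME})
# ses.create_configuration_set_event_destination(
#     ConfigurationSetName=CONFIG_SET_NAME,
#     EventDestination=event_destination,
# )
```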
Trigger an SES email that uses the configuration set we created earlier
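One hedged way to trigger that email is boto3's `send_email` with the configuration set name attached (any SES send path that sets the configuration set works the same way). Addresses and names below are hypothetical placeholders, and in an SES sandbox account both addresses must be verified:

```python
# Sketch of sending a test email through the configuration set.
# All addresses and names are hypothetical placeholders.
email_params = {
    "Source": "sender@example.com",
    "Destination": {"ToAddresses": ["recipient@example.com"]},
    "Message": {
        "Subject": {"Data": "Batch + Firehose test"},
        "Body": {"Text": {"Data": "Hello from SES event publishing."}},
    },
    # Attaching the configuration set is what routes this email's
    # events through Firehose to Batch.
    "ConfigurationSetName": "ses-to-firehose",  # hypothetical name
}

# With verified addresses, the actual call would be:
# import boto3
# boto3.client("ses").send_email(**email_params)
```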
Wait a few minutes for Kinesis to send the event to Batch
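As an optional sanity check (my suggestion, not part of the original guide): because the stream's source is 'Direct PUT or other sources', you can also push a hand-made record straight to Firehose to verify that events reach your Batch collection without depending on SES. The stream name and payload fields are illustrative only:

```python
# Push a fake SES-style event directly to the delivery stream.
# Field values and the stream name are illustrative placeholders.
import json

payload = json.dumps({"eventType": "Open", "mail": {"messageId": "test-0001"}})
record = {"Data": payload.encode("utf-8")}  # Firehose records carry raw bytes

# The actual call would be:
# import boto3
# boto3.client("firehose").put_record(
#     DeliveryStreamName="ses-events-to-batch", Record=record
# )
```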
Navigate to the collection we created earlier at https://console.batch.sh
Click on a message to view the JSON and all searchable fields
In the image above, I clicked the 'eventType' field on the right-hand side. You can see that the search field was automatically updated with the correct search syntax to find more SES events whose type matches the value 'Open'.
You now have a simpler, more capable way of dealing with Kinesis data than dumping it to an S3 bucket or managing your own Elasticsearch cluster. You also get access to many more features, such as replaying these events into other destinations for further processing.