Getting started is simple thanks to automatic schema inference: you do not have to define any tables, columns, or data types.
This guide shows you how to relay data from your message bus (we'll use Kafka in this example) into Batch.
The quickest way to get an event into the Batch platform is to sign up for a new account, log in, and follow the onboarding wizard. In just a few steps, you will push an event using curl to the HTTP collector.
Configure plumber to read event data from your message bus and launch it using the collection token.
This example assumes your Kafka deployment requires TLS and SASL authentication.
$ plumber relay kafka \
    --address "your-kafka-address.com:9092" \
    --token YOUR-COLLECTION-TOKEN-HERE \
    --topics orders \
    --tls-skip-verify \
    --sasl-username your-username \
    --sasl-password your-password
After executing the above, plumber will start up and begin relaying events from the orders topic to your collection.
For production, we suggest running two or more instances of plumber in Docker. Plumber supports clean shutdown: it listens for SIGTERM and flushes its internal batches before exiting.