Quickstart

Getting started is simple thanks to automatic schema inference: you do not have to define any tables, columns, or data types.

This guide shows you how to relay data from your message bus (we'll use Kafka in this example) into Batch.

The quickest way to get an event into the Batch platform is to sign up for a new account, log in, and follow the onboarding wizard. In just a few steps, you will push an event to the HTTP collector using curl.
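For reference, a push to the HTTP collector looks roughly like the sketch below. The exact endpoint and authentication scheme come from the onboarding wizard; the URL and token here are placeholders.

$ curl -X POST "https://your-collector-endpoint.example.com/v1/collect" \
  -H "Authorization: Bearer YOUR-COLLECTION-TOKEN-HERE" \
  -H "Content-Type: application/json" \
  -d '{"sample": "json"}'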

Step 1

Sign up and/or log in.

Step 2

Create a collection.

Step 3

Copy the collection token.

Step 4

Configure plumber to read event data from your message bus, then launch it with the collection token.

This example assumes your Kafka deployment requires TLS and uses SASL authentication.

$ plumber relay kafka \
  --address "your-kafka-address.com:9092" \
  --token YOUR-COLLECTION-TOKEN-HERE \
  --topics orders \
  --tls-skip-verify \
  --sasl-username your-username \
  --sasl-password your-password

After executing the above, plumber will start up and begin relaying events from the orders topic to your collection.

For production, we suggest running two or more instances of plumber in Docker; a minimal invocation is sketched below. Plumber supports "clean shutdown": it listens for SIGTERM and flushes its internal batches before exiting.

Directions on how to do this can be found here.
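As a rough sketch (assuming the batchcorp/plumber image and that its entrypoint is the plumber binary; adjust to match the directions above), a single containerized relay might look like this:

$ docker run -d --name plumber-relay batchcorp/plumber \
  relay kafka \
  --address "your-kafka-address.com:9092" \
  --token YOUR-COLLECTION-TOKEN-HERE \
  --topics orders \
  --tls-skip-verify \
  --sasl-username your-username \
  --sasl-password your-password

Note that docker stop sends SIGTERM by default, so stopping the container triggers plumber's clean shutdown.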

Step 5

Write some data to your message bus.

$ plumber write kafka \
  --address "your-kafka-address.com:9092" \
  --topics orders \
  --tls-skip-verify \
  --sasl-username your-username \
  --sasl-password your-password \
  --input '{"sample": "json"}'

Plumber will confirm that the message was written to the topic.
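To double-check, you can read the event back with plumber. This sketch reuses the same connection flags as the write above:

$ plumber read kafka \
  --address "your-kafka-address.com:9092" \
  --topics orders \
  --tls-skip-verify \
  --sasl-username your-username \
  --sasl-password your-password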

Step 6

Watch the event appear in the Batch console.
