Plumber Terraform Provider
Terraform can be used to manage Plumber when it is running in server mode. You can manage connections, relays, and tunnels.
https://registry.terraform.io/providers/batchcorp/plumber/latest

Usage

You will first need to be running Plumber in server mode. See Plumber Server Mode for more information on getting it set up.
First, define the provider config. In our example, we're putting plumber's authentication token directly in the config, but it is recommended that you use the PLUMBER_TOKEN environment variable to set this value, or pull it in via a secrets manager such as Vault.
You will also need to fill in the address variable so terraform knows where to reach your plumber server instance.
```hcl
terraform {
  required_providers {
    plumber = {
      version = "~> 0.1.0"
      source  = "batchcorp/plumber"
    }
  }
}

provider "plumber" {
  plumber_token = "your-plumber-servers-token"
  address       = "address-of-your-plumber-server:9090"
}
```
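If you use Vault as mentioned above, the token can be read at plan time instead of being hardcoded. A minimal sketch using the official Vault provider's `vault_generic_secret` data source (the secret path and key name here are hypothetical):

```hcl
# Assumes a Vault KV secret at "secret/plumber" containing a "token" key
# (hypothetical path and key — adjust to your Vault layout).
data "vault_generic_secret" "plumber" {
  path = "secret/plumber"
}

provider "plumber" {
  plumber_token = data.vault_generic_secret.plumber.data["token"]
  address       = "address-of-your-plumber-server:9090"
}
```

This keeps the token out of version control; only the Vault path is committed.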

Creating Connections

You can use the plumber_connection resource to manage connections.
See https://registry.terraform.io/providers/batchcorp/plumber/latest/docs/resources/connection for all available connection types and their parameters.
```hcl
resource "plumber_connection" "my_kafka_server" {
  name = "test kafka"

  kafka {
    address            = ["kafka.default.svc.cluster.local:9092"]
    connection_timeout = 5
    tls_skip_verify    = true
    sasl_type          = "plain"
    sasl_username      = "plumberconn"
    sasl_password      = "uLZ29]q%cHhW$bWe"
  }
}
```
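If other configuration or modules need to reference this connection, its ID can be exposed as an output (a sketch; the output name is our own):

```hcl
output "kafka_connection_id" {
  value = plumber_connection.my_kafka_server.id
}
```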

Create Relays

Need to ship messages to your Batch.sh collections? Create a relay using the connection defined above:
```hcl
resource "plumber_relay" "my_kafka_relay" {
  # Use our connection ID from earlier
  connection_id = plumber_connection.my_kafka_server.id

  # Fill in with the token from your collection in https://console.batch.sh
  collection_token = "48b30466-e3cb-4a58-9905-45b74284709f"

  # Relay details for your kafka connection
  kafka {
    # You can specify multiple topics
    # Batch recommends a collection per topic if message structure differs
    topics              = ["new_orders"]
    consumer_group_name = "plumber"
  }
}
```
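As with the plumber token, the collection token is better passed in than hardcoded. One option is a sensitive Terraform variable (a sketch; the variable name is our own):

```hcl
variable "collection_token" {
  type      = string
  sensitive = true # keeps the token out of plan/apply output
}
```

The relay resource would then use `collection_token = var.collection_token`, supplied via `terraform apply -var="collection_token=..."` or the `TF_VAR_collection_token` environment variable.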

Create Replay Tunnel

Plumber can act as a replay destination, allowing you to replay messages to your bus without the need to punch holes in firewalls or hand out credentials.
For this functionality, you will need to obtain an API token from https://console.batch.sh/account/security
As with the plumber token, we don't recommend storing the API token in your terraform files. You can use the environment variable BATCHSH_TOKEN to provide it to terraform. However, for demo purposes, we'll just put it in the terraform config:
```hcl
resource "plumber_tunnel" "my_replay_tunnel" {
  name = "Tunnel to infra kafka"

  # Obtained from your https://console.batch.sh account
  batchsh_api_token = "batchsh_3b17c235a49a871d2c9715c80acdef33c9bfe6e1bc881a61f5659021eac9"

  # Use our connection ID from earlier
  connection_id = plumber_connection.my_kafka_server.id

  # Start the tunnel immediately after creating
  active = true

  # Specify the kafka topic that messages will be written to when replaying to this tunnel
  kafka {
    topics = ["new_orders"]

    # Write the message with a key if needed
    key = "my_msg_key"

    # Let's also specify kafka headers!
    headers = {
      replayed = "true"
    }
  }
}
```
After applying, go to https://console.batch.sh/destinations and you will see your newly created "Tunnel to infra kafka" destination. You can replay messages from your collections to this destination, and plumber will write them to the new_orders topic in your kafka cluster. No fussing with credentials or firewalls!
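For completeness, both secrets mentioned in this guide can be supplied through the environment rather than committed to .tf files, as recommended above (a sketch; the token values are placeholders):

```shell
# PLUMBER_TOKEN is read by the plumber provider block;
# BATCHSH_TOKEN provides the Batch.sh API token for the tunnel resource.
export PLUMBER_TOKEN="your-plumber-servers-token"
export BATCHSH_TOKEN="your-batchsh-api-token"

# Then run terraform as usual, e.g.:
#   terraform init && terraform apply
echo "tokens set: ${PLUMBER_TOKEN:+yes} ${BATCHSH_TOKEN:+yes}"
```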