Screencast: simple event pipes for Kafka

Sebastien Goasguen

Jan 25, 2023

Using Kafka Connect can be overwhelming. In addition, letting developers within an enterprise choose their own Kafka clients can lead to a proliferation of undifferentiated code.

In these short screencasts we show you how to set up a Kafka producer and consumer using the powerful TriggerMesh CLI, `tmctl`. Running locally and executing every component in containers, it empowers developers to quickly create event pipes similar to what you can do with Kafka Connect or Amazon EventBridge.

All connectors run as local containers and can then be exported as a Docker Compose file or a Kubernetes manifest.
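As a minimal sketch of that export step, `tmctl dump` prints a manifest describing the current broker and its components, and redirecting it to a file is all that's needed:

```shell
# Export the current broker and all its connectors as a Kubernetes manifest.
tmctl dump > manifest.yaml
```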

In this first screencast we set up a Kafka producer and send an event to a Kafka topic in a cluster running in Confluent Cloud.
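As a rough sketch of what the screencast does, the commands below create a local broker and a Kafka target pointing at Confluent Cloud, then send a test event through it. The broker name, topic, bootstrap server, and credentials are placeholders, and the exact flag names may vary across tmctl versions, so treat this as an outline rather than a copy-paste recipe:

```shell
# Create a local broker; every component we create next attaches to it.
tmctl create broker demo

# Create a Kafka target (the producer side). The bootstrap server, topic,
# and credentials below are placeholders for your Confluent Cloud values.
tmctl create target kafka \
  --topic demo-topic \
  --bootstrapServers <bootstrap-server>:9092 \
  --auth.username <confluent-api-key> \
  --auth.password <confluent-api-secret>

# Send a test CloudEvent into the broker; it is routed to the Kafka target.
tmctl send-event --eventType demo.event '{"hello":"world"}'
```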

After producing events you can just as easily consume them. In this next screencast we set up a Kafka source and produce a message within the Confluent console; the message is consumed and displayed with our `tmctl watch` command.
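Sketched out, the consuming side mirrors the producer; again the flag names and values are placeholders:

```shell
# Create a Kafka source that consumes from the Confluent Cloud topic
# (same placeholder values as the target above).
tmctl create source kafka \
  --topic demo-topic \
  --bootstrapServers <bootstrap-server>:9092 \
  --auth.username <confluent-api-key> \
  --auth.password <confluent-api-secret>

# Print every event flowing through the local broker to the terminal.
tmctl watch
```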

Building on the simple producing and consuming use case, we can start having fun and configure an AWS SQS source. Here we publish a message in AWS SQS, consume it locally, inject it into Kafka, and see the message in the Confluent console.
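In outline, only one new component is needed; the queue ARN and AWS credentials below are placeholders:

```shell
# Create an AWS SQS source; messages published to the queue are pulled
# into the local broker and routed to the Kafka target created earlier.
tmctl create source awssqs \
  --arn arn:aws:sqs:us-east-1:123456789012:demo-queue \
  --auth.credentials.accessKeyID <aws-access-key-id> \
  --auth.credentials.secretAccessKey <aws-secret-access-key>
```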

Finally, we can build a complete pipe: consuming messages from AWS SQS, injecting them into a Kafka topic, consuming those same messages for a round trip through Confluent, and then triggering a Google Cloud Run function.
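The last hop might look like the sketch below. A Cloud Run function is simply an HTTPS endpoint that can receive CloudEvents, so we point a target at its URL; the target kind and trigger syntax here are assumptions to illustrate the shape of the pipe, and the URL and event type are placeholders:

```shell
# Point a target at the Cloud Run function's HTTPS endpoint (URL is a placeholder).
tmctl create target cloudevents \
  --endpoint https://<service>-<hash>-<region>.run.app

# Route only the Kafka source's events to that target
# (the event type is a placeholder; tmctl watch displays the real one).
tmctl create trigger \
  --eventTypes <kafka-source-event-type> \
  --target cloudevents
```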

If you liked these demos, you can get started in less than 5 minutes with our quick start guide.

And join us on Slack for any questions.
