Description
More and more projects for the BPCE customer need to consume messages from, and write messages to, GCP PubSub.
Currently there are already two different projects that need to consume messages from GCP PubSub and send them to an HTTP endpoint or a Kafka topic.
Proposed solution
Starlake already provides an easy way to consume messages from Kafka and forward them to an HTTP endpoint.
A new implementation for GCP PubSub could be modeled on the existing Kafka one.
The existing flow:
(Consumer) KAFKA <--> Starlake <--> HTTP ENDPOINT / KAFKA (Writer)
The desired flow:
(Consumer) GCP PubSub <--> Starlake <--> HTTP ENDPOINT / KAFKA / GCP PubSub (Writer)
Considered alternatives
Without a native Starlake consumer/writer for GCP PubSub, we currently use the following setup to fulfill our projects' needs:
(Consumer) GCP PubSub <--> Custom Scala code application <--> HTTP ENDPOINT (Writer) <--> Starlake based application <--> KAFKA (Writer)
The consequence is that two separate projects are needed to handle the consumption of GCP PubSub messages and the writing to Kafka.
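For reference, the first hop of the workaround above (consume from GCP PubSub, forward to an HTTP endpoint) can be sketched roughly as below, using the official Google Cloud Pub/Sub client and the JDK HTTP client. The project ID, subscription name, and endpoint URL are placeholders, and this is a minimal illustration rather than the actual project code:

```scala
import com.google.cloud.pubsub.v1.{AckReplyConsumer, MessageReceiver, Subscriber}
import com.google.pubsub.v1.{ProjectSubscriptionName, PubsubMessage}
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object PubSubToHttpBridge {
  // Placeholder identifiers for illustration only.
  val subscription = ProjectSubscriptionName.of("my-gcp-project", "my-subscription")
  val endpoint = URI.create("https://example.com/ingest")
  val http = HttpClient.newHttpClient()

  // Forward each PubSub message body to the HTTP endpoint;
  // ack on a 2xx response, nack otherwise so the message is redelivered.
  val receiver: MessageReceiver = (message: PubsubMessage, consumer: AckReplyConsumer) => {
    val request = HttpRequest.newBuilder(endpoint)
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString(message.getData.toStringUtf8))
      .build()
    val response = http.send(request, HttpResponse.BodyHandlers.ofString())
    if (response.statusCode() / 100 == 2) consumer.ack() else consumer.nack()
  }

  def main(args: Array[String]): Unit = {
    val subscriber = Subscriber.newBuilder(subscription, receiver).build()
    subscriber.startAsync().awaitRunning()
    subscriber.awaitTerminated() // block until the subscriber stops
  }
}
```

A native Starlake connector would replace both this bridge and the downstream HTTP-to-Kafka application with a single configuration-driven pipeline.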
Additional context
Scala source code to consume messages from GCP PubSub and send them to an HTTP endpoint is available to share upon request.
Scala source code to listen on an HTTP endpoint and send messages to Kafka is available to share upon request.
Life cycle
Possible implementation
I can provide further context, source code, and support as needed to develop this feature.
Complexity estimation
T-Shirt Size: M