Batching of records #535
Unanswered
davidkingg asked this question in Q&A
Hey guys,
I need some help or clarification on how to batch data properly.
Right now I can't accurately predict my data traffic, so I would like records to be inserted into ClickHouse once every 30 minutes.
Is there a connector config to handle this effectively?
I tried using `consumer.override.fetch.max.wait.ms`, but it seems the consumer can only wait up to about 9 minutes; longer wait times fail with an error.
Is there a way to make the Connect worker or connector poll for records only every 30 minutes?
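For reference, this is roughly the connector config I was experimenting with. A sketch only: the connector name and topic are placeholders, the values are illustrative, and I haven't verified that the ClickHouse sink tolerates a 30-minute fetch wait. My understanding is that `fetch.max.wait.ms` is bounded by the consumer's `request.timeout.ms`, so that would likely need raising as well:

```properties
# Hypothetical settings — placeholders, not verified against the ClickHouse sink.
name=clickhouse-sink
connector.class=com.clickhouse.kafka.connect.ClickHouseSinkConnector
topics=my_topic

# Ask the broker to hold fetch responses until data accumulates
# (30 min = 1,800,000 ms).
consumer.override.fetch.max.wait.ms=1800000

# fetch.max.wait.ms is capped by the consumer's request timeout,
# so raise that above the wait time.
consumer.override.request.timeout.ms=1900000

# A high fetch.min.bytes so the broker doesn't return early on small batches.
consumer.override.fetch.min.bytes=104857600
```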
My second question: is there a way to implement automatic schema evolution?
My last question: currently, if a column is added to the source table, the connector just skips it and keeps writing the existing columns to ClickHouse.
I would prefer that an error be thrown, so I can make the schema change in ClickHouse, restart the connector, and not lose any data.
Is there a way to change this behavior?
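In the meantime I was considering failing fast outside the connector with a small guard like the one below. This is a hypothetical helper, not connector functionality: `check_schema` and its inputs are names I made up, and in practice the table's column list would come from querying ClickHouse's `system.columns`.

```python
def check_schema(record_fields, table_columns):
    """Raise if the source record carries fields the target table lacks,
    instead of letting them be silently dropped (hypothetical guard)."""
    missing = sorted(set(record_fields) - set(table_columns))
    if missing:
        raise ValueError(f"columns missing in ClickHouse table: {missing}")


# Example: a new 'email' column appeared upstream but not in ClickHouse yet.
try:
    check_schema(["id", "name", "email"], ["id", "name"])
except ValueError as e:
    print(e)  # → columns missing in ClickHouse table: ['email']
```

Running this before (or alongside) the sink would surface the mismatch so the connector can be stopped, the `ALTER TABLE` applied, and the connector restarted without losing records.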
Thank you!