Replies: 1 comment

- Any update?
Hi,
I have configured fluentd with the Kafka output plugin.
Fluentd's configuration is as follows:
On the Kafka side I have set message.max.bytes=2000000.
I get the following error in fluentd's logs:
2023-09-28 07:42:22 +0000 [warn]: #1 failed to flush the buffer. retry_times=1 next_retry_time=2023-09-28 07:42:24 +0000 chunk="60666729618260cf29737b0db4803708" error_class=Kafka::MessageSizeTooLarge error="Kafka::MessageSizeTooLarge"
but
[td-agent@ztsl-bssc-fluentd-statefulset-2 /]$ ls -lRt /data/fluentdlogs/logskafka/ | grep 60666729618260cf29737b0db4803708
-rw-r--r--. 1 td-agent 2006 2043098 Sep 28 07:42 buffer.q60666729618260cf29737b0db4803708.log
-rw-r--r--. 1 td-agent 2006 84 Sep 28 07:42 buffer.q60666729618260cf29737b0db4803708.log.meta
It seems that the chunk is bigger than 2 MB (the buffer file is 2,043,098 bytes). Why is this, since chunk_limit_size is 2MB?
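The numbers above can be checked directly. A minimal sketch, assuming fluentd parses the `2MB` in chunk_limit_size as mebibytes (1024-based, fluentd's usual size-suffix behavior), while the broker's message.max.bytes=2000000 is a plain byte count:

```python
# Sizes taken from the question above; the 2 MiB interpretation of
# chunk_limit_size is an assumption about fluentd's size parsing.
chunk_file_bytes = 2043098              # buffer.q60666...log from ls -l
kafka_message_max_bytes = 2000000       # broker message.max.bytes
fluentd_chunk_limit = 2 * 1024 * 1024   # chunk_limit_size "2MB" = 2,097,152 bytes

# The chunk is within fluentd's limit...
print(chunk_file_bytes <= fluentd_chunk_limit)      # True
# ...but already exceeds the broker's limit, even before the Kafka
# protocol adds its own per-message framing overhead.
print(chunk_file_bytes <= kafka_message_max_bytes)  # False
```

So a chunk can pass fluentd's 2 MiB chunk_limit_size and still be rejected by a broker whose message.max.bytes is 2,000,000 bytes; the two limits use different units and are checked by different systems.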