Replies: 4 comments
-
Thanks for the performance data!
It's normal for flushing to take longer if the size of the chunk being sent increases or if the server's response becomes slower.
-
Thanks for getting back 🙏 We never faced such flush delays when running Fluentd v1.11 with Elasticsearch 6.8 on EC2. However, with OpenSearch 2.17.1 and Fluentd v1.16.2 deployed on k8s, we're seeing flush latency increase during performance testing. This behaviour directly impacts application performance, causing delays. We are working on implementing asynchronous logging, but wanted to check whether it's possible to keep the flush time as low as possible. Do you see any improvements we could make?
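In case it helps to experiment: flush latency in an output plugin is largely governed by the `<buffer>` section. A minimal sketch for `fluent-plugin-opensearch` is below; the values are illustrative assumptions for tuning, not recommendations, and the match pattern is hypothetical:

```
<match app.**>
  @type opensearch
  # host / port / scheme / auth omitted

  <buffer>
    @type memory
    flush_interval 1s        # how often queued chunks are flushed
    flush_thread_count 4     # parallel flush threads; raise if the server can absorb it
    chunk_limit_size 4M      # smaller chunks mean more frequent, cheaper bulk requests
    total_limit_size 512M    # cap on total buffered data before back-pressure
    retry_max_interval 30s   # cap the exponential backoff on failed flushes
  </buffer>
</match>
```

Raising `flush_thread_count` or lowering `chunk_limit_size` usually trades throughput for latency, so it is worth changing one parameter at a time under the same load test.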
-
Do you think https and authentication could cause any significant delays? We didn't have these enabled in our previous setup.
-
I see.
I'm not sure... It could be due to differences in such config, or to version differences in the components involved. Could you narrow down the cause through comparative testing?
-
What is a problem?
Hi Team,
We’re running an OpenSearch cluster deployed on k8s using opensearch-k8s-operator with 3 master nodes and 5 hot nodes. Fluentd (2 pods) is handling log forwarding within the same cluster, sending logs to the hot nodes via a headless service. We’ve also tested pointing directly to the pod IPs. Under normal load, everything works great — flush times stay around 10–25ms. But during performance testing, we’re seeing flush times increase to 400ms or more.
Appreciate any pointers!
I've also attached screenshots of metrics such as flush time, flush errors, and retries.
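For anyone who wants numbers rather than screenshots: Fluentd's `monitor_agent` input plugin exposes per-plugin buffer metrics as JSON over HTTP. A minimal sketch that summarizes the fields relevant here, run against a canned response (the `plugin_id` and sample values are made up for illustration):

```python
import json

# Canned example of a monitor_agent response; a live one would come from
# http://<fluentd-host>:24220/api/plugins.json (24220 is the default port).
SAMPLE = json.loads("""
{
  "plugins": [
    {
      "plugin_id": "out_opensearch",
      "type": "opensearch",
      "output_plugin": true,
      "buffer_queue_length": 12,
      "buffer_total_queued_size": 4194304,
      "retry_count": 3
    }
  ]
}
""")

def summarize(payload):
    """Return (plugin_id, queue_length, queued_bytes, retries) for each output plugin."""
    rows = []
    for plugin in payload["plugins"]:
        if plugin.get("output_plugin"):
            rows.append((plugin["plugin_id"],
                         plugin["buffer_queue_length"],
                         plugin["buffer_total_queued_size"],
                         plugin["retry_count"]))
    return rows

for plugin_id, qlen, qbytes, retries in summarize(SAMPLE):
    # A queue length that keeps growing during a load test points at
    # flushes that are slower than the incoming log rate.
    print(f"{plugin_id}: queue={qlen} chunks, {qbytes} bytes buffered, retries={retries}")
```

Polling this endpoint during the performance test would show whether the 400ms flushes coincide with queue growth or retries on the OpenSearch side.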



Describe the configuration of Fluentd
Describe the logs of Fluentd
No response
Environment