-
Hello, I'm using the Kubernetes logs source and the Kafka sink to send logs to Azure Event Hub. After comparing the number of messages in the Event Hub with the number of events in the log search engine, it looks like 1 event == 1 message in the Event Hub. Recently we had a problem reading the large volume of messages produced by one AKS cluster that is located in a different Azure region (the Event Hub is also in a different region). We are reading the messages with Vector. There are no problems within the same region, where a single Vector instance can handle a similar load. Vector 0.24.1. Sink config:
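(The original sink configuration was not captured above. Purely for illustration, a Vector kafka sink pointed at the Event Hubs Kafka-compatible endpoint often looks roughly like the sketch below; the namespace, topic, input name, and connection string are placeholders, not values from this setup.)

sinks:
  event_hub_output:
    type: kafka
    inputs:
      - kubernetes_logs            # placeholder input name
    # Event Hubs exposes a Kafka-compatible endpoint on port 9093
    bootstrap_servers: "my-namespace.servicebus.windows.net:9093"
    topic: "my-event-hub"          # the Event Hub name acts as the Kafka topic
    encoding:
      codec: json
    tls:
      enabled: true
    sasl:
      enabled: true
      mechanism: PLAIN
      username: "$ConnectionString"
      password: "Endpoint=sb://my-namespace.servicebus.windows.net/;..."  # placeholder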
-
One approach you could try is to aggregate your events in a transform:

transforms:
  wrap_event:
    type: remap
    inputs:
      - name_of_component
    source: |-
      . = { "events": . }
  aggregated_events:
    type: reduce
    inputs:
      - wrap_event
    merge_strategies:
      events: array
sinks:
  kafka_output:
    type: kafka
    inputs:
      - aggregated_events

What this is doing is:
We're sort of abusing the reduce transform: wrap_event puts each event under an events key, and reduce with the array merge strategy then collects those values into a single array, so several events reach the sink as one aggregated event, and hence one Kafka message.
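To make the effect concrete, here is a sketch of the event shapes at each stage, plus an optional expire_after_ms setting on the reduce transform to control how long a batch accumulates; the field values and the 5000 ms figure are illustrative assumptions, not something from this thread.

# Events from the Kubernetes logs source (example values):
#   { "message": "line one", "pod": "app-1" }
#   { "message": "line two", "pod": "app-1" }
#
# After wrap_event:         { "events": { "message": "line one", "pod": "app-1" } }
# After aggregated_events:  { "events": [ { "message": "line one", ... },
#                                          { "message": "line two", ... } ] }
# so the kafka sink publishes one message per aggregated batch.

transforms:
  aggregated_events:
    type: reduce
    inputs:
      - wrap_event
    merge_strategies:
      events: array
    # Consider a batch complete once no new event has arrived for 5 seconds
    # (illustrative value; tune it to your latency and throughput needs).
    expire_after_ms: 5000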
-
I think the solution to this problem will be the implementation of the reduce max_events feature from #14817 (when it becomes available).