Batch / Bulk Data loader #4003
Unanswered · skanagavelu asked this question in Q&A
Replies: 0 comments
I was previously using the Splunk Universal Forwarder with a SINKHOLE policy to push batch file updates, but the UF is not transparent about what is going on behind the scenes.
I would like to use Fluentd for this use case, with the tail plugin and a wildcard pattern to watch multiple files.
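As a sketch of that setup, a minimal `in_tail` source with a wildcard path might look like the following. The paths, tag, and output destination here are assumptions for illustration; `read_from_head true` makes Fluentd ingest existing file content from the start, which is what a batch load needs.

```
<source>
  @type tail
  path /var/log/app/*.log          # wildcard pattern (assumed path)
  pos_file /var/log/fluent/app.pos # tracks per-file read offsets
  tag batch.app
  read_from_head true              # batch load: read existing content from the beginning
  <parse>
    @type none                     # pass lines through unparsed
  </parse>
</source>

<match batch.app>
  @type stdout                     # replace with the real destination
</match>
```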
I read this thread, which describes exactly my concern: https://groups.google.com/g/fluentd/c/yNamDfnVq78/m/vkuuOt-UBAAJ
In brief, the problem statement:
Is it possible to run Fluentd over a set of application logs once, as a script? I don't want Fluentd to be running all the time; all I need is to launch it from a cron job, have it perform the necessary work, and quit, instead of running continuously on the server.
And the answer:
If you want to stop Fluentd, send it a signal at a time of your choosing.
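The mechanics of that answer can be sketched as below. Fluentd handles SIGTERM as a graceful shutdown (flush buffers, then exit); the `sleep` below is only a stand-in for the real daemon, and the fluentd invocation shown in the comment is an assumption, not a verified command line.

```shell
# Stand-in demonstration of the "send a signal" shutdown pattern.
sleep 60 &                        # in reality: fluentd -c /etc/fluent/batch.conf &
FLUENTD_PID=$!
# ... wait until the batch has been ingested (the open question below) ...
kill -TERM "$FLUENTD_PID"         # SIGTERM asks Fluentd to flush and exit gracefully
wait "$FLUENTD_PID" 2>/dev/null || true
```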
The answer seems satisfactory, but I am not sure when to send the signal.
The number and size of the files are unknown, so how do I calculate the timing?
Can I use the pos_file information to conclude that a file has been completely read? If yes, how?
Or does Fluentd give some other clue that there is nothing more to read, i.e. that a file has been read completely and Fluentd is only waiting for new content?
For example, if the file size and the number of bytes Fluentd has read from that file are equal, I would consider the file completely read.
An answer would really help me stick with the Fluentd solution.
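The size-vs-offset comparison described above can be sketched against the pos_file itself. This assumes the `in_tail` pos_file layout of one tab-separated line per file (path, byte offset as 16-digit hex, inode as hex); that format is an observation about recent Fluentd versions, not a stable documented contract, so verify it against your own pos_file before relying on it.

```python
import os

def files_fully_read(pos_file_path):
    """Return {path: True/False} for each file tracked in a Fluentd
    in_tail pos_file, True when the recorded offset has reached the
    file's current size.

    Assumed line format: <path>\t<offset as 16-digit hex>\t<inode hex>.
    """
    results = {}
    with open(pos_file_path) as f:
        for line in f:
            parts = line.rstrip("\n").split("\t")
            if len(parts) != 3:
                continue  # skip malformed or compacted entries
            path, offset_hex, _inode_hex = parts
            offset = int(offset_hex, 16)
            # 0xffffffffffffffff marks an unwatched (rotated/removed) file
            if offset == 0xFFFFFFFFFFFFFFFF:
                continue
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # file vanished since Fluentd recorded it
            results[path] = (offset >= size)
    return results
```

A cron-driven wrapper could poll this until every value is True, then send SIGTERM to the fluentd process for a graceful shutdown.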