Replies: 1 comment
-
It might be longer than that. However, the result might be huge and require a lot of memory.
-
Q1 - I have a log file containing 500k JSON objects, each of which is complex in itself.
Any idea how to ingest this data with the best performance, for example within 30 seconds?
Q2 - Is there a way to push multiple documents to the index at once, instead of pushing them one by one?
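On Q2, batching is the usual approach: group the 500k objects into chunks and send each chunk in a single bulk request rather than one document per call. A minimal sketch, assuming the log file is newline-delimited JSON; `client.bulk_index` in the comment is a hypothetical stand-in for whatever batch API your indexing engine actually exposes:

```python
import json
from itertools import islice

def iter_batches(lines, batch_size=1000):
    """Yield lists of parsed JSON objects, batch_size at a time."""
    it = iter(lines)
    while True:
        # islice consumes up to batch_size lines from the iterator
        batch = [json.loads(line) for line in islice(it, batch_size)]
        if not batch:
            return
        yield batch

# Hypothetical usage with a file of newline-delimited JSON:
# with open("app.log") as f:
#     for batch in iter_batches(f, batch_size=5000):
#         client.bulk_index(batch)  # one request per batch, not per doc
```

Tuning the batch size (trading memory per request against the number of round trips) is usually the main lever for hitting a throughput target like 500k documents in 30 seconds.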