GPU specifications for haystack model #4165
Unanswered · Harshal1810 asked this question in Questions · 1 comment · 1 reply
-
@Harshal1810 We generally suggest splitting long documents using the PreProcessor node. We have a few benchmarks here using an NVIDIA V100, but not for other GPUs. We will be working on creating more benchmarks in the near future. Stay tuned! If you have specific requests for benchmark information, please feel free to open a feature request and we will try to address it when we pick it up.
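For reference, a minimal sketch of what that document splitting can look like with Haystack 1.x (the split settings below are illustrative and should be tuned to your documents):

```python
from haystack import Document
from haystack.nodes import PreProcessor

# Illustrative settings: split on word count and keep sentences intact.
preprocessor = PreProcessor(
    clean_empty_lines=True,
    clean_whitespace=True,
    split_by="word",
    split_length=200,                      # roughly 200 words per passage
    split_overlap=0,
    split_respect_sentence_boundary=True,  # avoid cutting sentences in half
)

long_text = open("my_document.txt").read()  # hypothetical input file
passages = preprocessor.process([Document(content=long_text)])
print(f"Split into {len(passages)} passages")
```

Passing these smaller passages to the downstream nodes keeps peak memory much lower than handing them the whole document at once.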
-
Hi everyone,
I am designing a search engine using Haystack that takes a document as input and can either answer questions asked about it or generate questions based on the query (the words typed). I have created API endpoints for all of this using FastAPI and tested them through the Swagger UI, and everything works fine for small documents. But for slightly bigger documents (not even that big), the question generator API returns a server error because the available RAM is not sufficient. I am currently running all computations on CPU and am planning to buy a server-grade GPU to boost performance. Can you please suggest the best GPU specifications for this? The documents I upload in the future will be bigger.
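For illustration, a question-generation endpoint of the kind described might look roughly like the sketch below (assuming Haystack 1.x's QuestionGenerator/QuestionGenerationPipeline and FastAPI; the endpoint path, request model, and split settings are invented for this example, not the poster's actual code):

```python
from fastapi import FastAPI
from pydantic import BaseModel

from haystack import Document
from haystack.nodes import PreProcessor, QuestionGenerator
from haystack.pipelines import QuestionGenerationPipeline

app = FastAPI()

# use_gpu=True uses a CUDA device when one is available and falls back to CPU otherwise.
question_generator = QuestionGenerator(use_gpu=True)
qg_pipeline = QuestionGenerationPipeline(question_generator)
preprocessor = PreProcessor(split_by="word", split_length=200,
                            split_respect_sentence_boundary=True)


class DocumentRequest(BaseModel):
    text: str


@app.post("/generate-questions")
def generate_questions(request: DocumentRequest):
    # Split the uploaded document into passages first so the generator
    # never has to process the whole document in one go.
    passages = preprocessor.process([Document(content=request.text)])
    result = qg_pipeline.run(documents=passages)
    return {"generated_questions": result["generated_questions"]}
```

On a CPU-only machine, use_gpu=True simply falls back to CPU; once a CUDA-capable GPU is installed, the same code moves the generation model onto it.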