book: training, fine-tuning, or a large model? #3439
Replies: 4 comments
-
Probably neither, assuming you mean that you want to "give" the AI a book and have it "read" it. You can try privateGPT, which is designed exactly for that. If you want to implement it yourself with llama.cpp, this is typically done with vector stores, embedding queries, and agents. llama.cpp can handle the embedding part, but you'd have to write your own code for the agent, as that isn't provided directly by llama.cpp.
-
Thank you.
-
In imartinez/privateGPT the context limit is 4096. Is it the same here?
-
@staviq: Is there another way to use something like privateGPT to read books, with fast responses in another language such as Spanish, and with CUDA?
-
What is best, and what is the difference: training, fine-tuning, or using a large model?
I want llama to read a book and answer my questions.
In what order should I use these?