Replies: 2 comments
-
I have not pushed the code to the remote because it uses a 10GB LLM, and I don't think I can push such a large file to GitHub. Until we figure out how to serve Llama 2 properly, I don't think the code is ready to be merged.
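One common workaround (a sketch, not something decided in this thread) is to keep the weights out of the repository entirely and fetch them on first run. The path and downloader names below are hypothetical:

```python
# Sketch: keep the 10GB weights out of git and fetch them lazily.
# MODEL_PATH and the downloader are hypothetical names, not from this repo.
from pathlib import Path

MODEL_PATH = Path("models/llama-2-7b.bin")  # hypothetical location; add to .gitignore

def ensure_model(path: Path, download) -> Path:
    """Download the weights only if they are not already present locally.

    `download` is any callable that writes the file to `path`, e.g. a thin
    wrapper around huggingface_hub.hf_hub_download or a plain HTTP fetch.
    """
    if not path.exists():
        path.parent.mkdir(parents=True, exist_ok=True)
        download(path)
    return path

# Usage (hypothetical):
# ensure_model(MODEL_PATH, my_downloader)
```

This keeps the repository small while letting every checkout reconstruct the model on demand; Git LFS is an alternative, but GitHub's LFS quotas make a 10GB file awkward there too.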
-
This is currently blocked by issue #81.
-
Description:
Our Slack bot is sometimes unable to resolve implicit coreference. For example, in a previous conversation we asked questions about vector stores in LangChain, then asked for a Python example. Sherpa did not understand that we wanted a Python example of vector stores in LangChain.
Steps to Reproduce:
Ask about vector stores in LangChain, then follow up by asking for a Python example: the example returned has nothing to do with vector stores or LangChain.
Actual Results:
See above
Expected Results:
I would expect the bot to understand the implicit coreference and give a response with correct information.
Additional Information:
N/A
Reproducibility:
In certain cases, the bot is able to catch the implicit coreference.
In other cases, such as the one described above, the bot fails.
Possible Solutions:
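One possible direction (a sketch, not something implemented in Sherpa): rewrite each follow-up question into a standalone question using the recent conversation history before answering, so "a Python example" becomes "a Python example of vector stores in LangChain". The function and parameter names below are hypothetical, and `llm` stands in for whatever completion backend the bot uses:

```python
# Sketch of history-aware query rewriting. `llm` is any callable mapping a
# prompt string to a completion string (hypothetical; not Sherpa's real API).
def rewrite_query(history: list[str], question: str, llm) -> str:
    """Ask the model to restate a follow-up question so it stands alone,
    resolving references like "a Python example" against the history."""
    prompt = (
        "Given the conversation below, rewrite the final question so it "
        "is fully self-contained.\n\n"
        + "\n".join(history)
        + f"\nFinal question: {question}\nStandalone question:"
    )
    return llm(prompt).strip()
```

The rewritten question would then be passed to the normal question-answering path, so the answering step never has to see the raw, context-dependent follow-up.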
Related Issues:
Steps Taken So Far:
N/A
Environment:
N/A