Or, could LLAMA CHAT be made able to call the APIs of external systems? Please provide an implementation idea.
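One possible pattern (just a sketch, not a built-in LLAMA CHAT or llama.cpp feature): instruct the model to emit a single structured line such as `CALL <name> <argument>`, keep a dispatch table of allowed external functions in the host program, run the requested function, and feed its result back for a second inference pass. Every name below (`RunModel`, `get_time`, `get_weather`, the `RESULT` prefix) is a hypothetical stand-in for your own inference and API code.

```cpp
// Sketch: dispatch-table approach for letting the model trigger external systems.
// RunModel() and the registered functions are hypothetical placeholders.
#include <functional>
#include <iostream>
#include <map>
#include <sstream>
#include <string>
#include <vector>

// Placeholder for the real llama inference call on the chat history.
std::string RunModel(const std::vector<std::string>& history) {
    for (const auto& h : history)
        if (h.rfind("RESULT", 0) == 0)                 // a result was injected
            return "It is currently 10:30 on 2024-05-01.";
    return "CALL get_time now";                        // model asks the host to run a function
}

int main() {
    // Registry of external functions the model is allowed to trigger (placeholders).
    std::map<std::string, std::function<std::string(const std::string&)>> tools = {
        {"get_time",    [](const std::string&)     { return "2024-05-01 10:30"; }},
        {"get_weather", [](const std::string& city) { return "Sunny in " + city; }}
    };

    std::vector<std::string> history = {
        "SYSTEM: To use an external system, reply with a single line: CALL <name> <argument>.",
        "USER: What time is it?"
    };

    std::string reply = RunModel(history);

    // If the model emitted a CALL line, dispatch it and ask the model again.
    std::istringstream iss(reply);
    std::string verb, name, arg;
    if (iss >> verb >> name && verb == "CALL") {
        std::getline(iss, arg);                        // rest of the line is the argument
        auto it = tools.find(name);
        std::string result = it != tools.end() ? it->second(arg)
                                               : std::string("unknown function");
        history.push_back("ASSISTANT: " + reply);
        history.push_back("RESULT " + name + ": " + result);
        reply = RunModel(history);                     // second pass with the result injected
    }

    std::cout << reply << std::endl;
    return 0;
}
```

The key design point is that the model never calls anything itself; it only emits a line the host program recognizes, and the host decides which functions are allowed and what their results look like.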
How can I let LLAMA CHAT learn to look up information and request support? I want LLAMA CHAT to return a fixed string tag plus keywords whenever it encounters a question it cannot answer during reasoning, so that I can query those keywords in the background and feed the result back as a reference to help the AI answer better. For example:
AppendChatSentence(JAICR_System, 4, _T("If you don't understand or don't have enough information to answer a user's question, please reply directly with [UNKNOWN QUESTION]."));
AppendChatSentence(JAICR_Assistant, 5, _T("I understand, System."));
AppendChatSentence(JAICR_User, 6, _T("What is today's date and time?"));
AppendChatSentence(JAICR_Assistant, 7, _T("[UNKNOWN QUESTION]"));
But the AI still replies with something like: "The current time is [insert date and time]."
How can I implement this so that the AI actively requests extended data support from external systems?
If my approach is wrong, please give me some ideas. Thank you.
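Below is a minimal sketch, in plain standard C++, of the loop described above: scan the model's reply for the [UNKNOWN QUESTION] tag as a substring (exact-match checks tend to be fragile, and a single few-shot example is often not enough, so the instruction goes in the system prompt), extract the keywords after the tag, look them up in the background, append the result as a reference message, and run inference again. `RunChat` and `LookupKeywords` are hypothetical placeholders for the real AppendChatSentence/inference calls and the real backend query.

```cpp
// Sketch of the [UNKNOWN QUESTION] loop: detect the tag, look up keywords, retry.
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Msg { std::string role, content; };

// Placeholder for the real model call (AppendChatSentence + inference in the app).
std::string RunChat(const std::vector<Msg>& history) {
    for (const auto& m : history)
        if (m.content.rfind("Reference:", 0) == 0)     // reference data was injected
            return "Today is 2024-05-01 and the time is 10:30.";
    return "[UNKNOWN QUESTION] current date time";     // tag plus keywords
}

// Hypothetical background lookup: map the extracted keywords to reference text.
std::string LookupKeywords(const std::string& keywords) {
    static const std::map<std::string, std::string> kb = {
        {"current date time", "The system clock reports 2024-05-01 10:30."}
    };
    auto it = kb.find(keywords);
    return it != kb.end() ? it->second : std::string();
}

int main() {
    std::vector<Msg> history = {
        {"system", "If you do not have enough information to answer, reply only with "
                   "[UNKNOWN QUESTION] followed by search keywords."},
        {"user",   "What is today's date and time?"}
    };

    std::string reply = RunChat(history);

    // Search for the tag anywhere in the reply; the model may wrap it in extra text.
    const std::string tag = "[UNKNOWN QUESTION]";
    std::size_t pos = reply.find(tag);
    if (pos != std::string::npos) {
        std::string keywords = reply.substr(pos + tag.size());
        keywords.erase(0, keywords.find_first_not_of(' '));   // trim leading spaces

        std::string reference = LookupKeywords(keywords);
        history.push_back({"assistant", reply});
        history.push_back({"system", "Reference: " + reference +
                                     " Now answer the user's original question."});
        reply = RunChat(history);                              // second pass with the reference
    }

    std::cout << reply << std::endl;
    return 0;
}
```

If the model keeps ignoring the tag instruction, lowering the temperature or constraining the output format (for example with llama.cpp's GBNF grammar support, if your backend exposes it) can make the tag come out reliably.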