Replies: 1 comment
-
The JSON grammar includes whitespace since that's valid JSON. If you don't want whitespace after the returned object, you could remove the whitespace from the end of json.gbnf (it's the trailing `ws` after the closing brace that lets the model keep emitting newlines):

```diff
-root   ::= object
+root   ::= object-no-ws
 value  ::= object | array | string | number | ("true" | "false" | "null") ws

-object ::=
+object-no-ws ::=
   "{" ws (
             string ":" ws value
     ("," ws string ":" ws value)*
-  )? "}" ws
+  )? "}"
+
+object ::= object-no-ws ws

 array  ::=
   "[" ws (
             value
     ("," ws value)*
   )? "]" ws

 string ::=
   "\"" (
     [^"\\\x7F\x00-\x1F] |
     "\\" (["\\/bfnrt] | "u" [0-9a-fA-F] [0-9a-fA-F] [0-9a-fA-F] [0-9a-fA-F]) # escapes
   )* "\"" ws

 number ::= ("-"? ([0-9] | [1-9] [0-9]*)) ("." [0-9]+)? ([eE] [-+]? [0-9]+)? ws

 # Optional space: by convention, applied in this grammar after literal chars when allowed
 ws ::= ([ \t\n] ws)?
```
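As a quick check that the modified grammar behaves as intended, something like the following could be used. This is a minimal sketch, assuming a llama.cpp server on the default `http://localhost:8080` and the modified grammar saved as `json_no_ws.gbnf`; both the filename and the prompt are illustrative, not from the original reply.

```python
import requests

# Assumptions (not from the original reply): llama.cpp server on the default
# port, and the modified grammar above saved locally as json_no_ws.gbnf.
SERVER_URL = "http://localhost:8080/completion"

with open("json_no_ws.gbnf") as f:
    grammar = f.read()

payload = {
    "prompt": "Describe three fruits as a JSON object.",  # illustrative prompt
    "n_predict": 256,
    "grammar": grammar,
}

resp = requests.post(SERVER_URL, json=payload).json()

# With the trailing `ws` removed from the root object, the grammar no longer
# permits endless whitespace after the closing brace, so generation can end
# as soon as the object is complete.
print(repr(resp["content"]))
print("stopped_eos:", resp.get("stopped_eos"))
print("stopped_limit:", resp.get("stopped_limit"))
```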
-
When I send the prompt below without grammars to a model served with a Llama.cpp server, the model ends the response with `<|im_end|><dummy32000>` and `stopped_eos` is `true` in the response. However, when I send the same prompt with the JSON grammar, it ends the response with hundreds of newlines (`\n`s), and `stopped_eos` comes back as `false` and `stopped_limit` as `true` in the response. So how can I preserve the model's ability to end the response when it actually has nothing more to say? In other words, how can I make it stop when it reaches special tokens (like the `eos` token) while using grammars?

The prompt:

Data sent to the server:

and `'grammar': grammar` is added of course when using grammars. Thanks in advance!
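For context, here is a rough sketch of the kind of request/response comparison described above, assuming the llama.cpp server's `/completion` endpoint on the default port. The actual prompt and payload were not included in the post, so the values below are only placeholders.

```python
import requests

SERVER_URL = "http://localhost:8080/completion"  # assumed default llama.cpp server address
PROMPT = "..."  # stand-in; the real prompt is omitted above

with open("json.gbnf") as f:
    grammar = f.read()

base = {"prompt": PROMPT, "n_predict": 512}

no_grammar = requests.post(SERVER_URL, json=base).json()
with_grammar = requests.post(SERVER_URL, json={**base, "grammar": grammar}).json()

for label, resp in (("no grammar", no_grammar), ("with grammar", with_grammar)):
    # stopped_eos:   the model emitted its end-of-sequence token
    # stopped_limit: generation was cut off by the n_predict token limit
    print(label, "stopped_eos:", resp.get("stopped_eos"),
          "stopped_limit:", resp.get("stopped_limit"))
```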