
Commit d58f8a1

server : update readme
ggml-ci
1 parent 79a8176 commit d58f8a1

File tree

1 file changed: +4 -2 lines changed


examples/server/README.md

Lines changed: 4 additions & 2 deletions
@@ -444,13 +444,14 @@ These words will not be included in the completion, so make sure to add them to
 
 **Response format**
 
-- Note: In streaming mode (`stream`), only `content` and `stop` will be returned until end of completion. Responses are sent using the [Server-sent events](https://html.spec.whatwg.org/multipage/server-sent-events.html) standard. Note: the browser's `EventSource` interface cannot be used due to its lack of `POST` request support.
+- Note: In streaming mode (`stream`), only `content`, `tokens` and `stop` will be returned until end of completion. Responses are sent using the [Server-sent events](https://html.spec.whatwg.org/multipage/server-sent-events.html) standard. Note: the browser's `EventSource` interface cannot be used due to its lack of `POST` request support.
 
 - `completion_probabilities`: An array of token probabilities for each completion. The array's length is `n_predict`. Each item in the array has the following structure:
 
 ```json
 {
-  "content": "<the token selected by the model>",
+  "content": "<the token generated by the model>",
+  "tokens": [ generated token ids ],
   "probs": [
     {
       "prob": float,
@@ -468,6 +469,7 @@ These words will not be included in the completion, so make sure to add them to
 Notice that each `probs` is an array of length `n_probs`.
 
 - `content`: Completion result as a string (excluding `stopping_word` if any). In case of streaming mode, will contain the next token as a string.
+- `tokens`: Same as `content` but represented as raw token ids.
 - `stop`: Boolean for use with `stream` to check whether the generation has stopped (Note: This is not related to stopping words array `stop` from input options)
 - `generation_settings`: The provided options above excluding `prompt` but including `n_ctx`, `model`. These options may differ from the original ones in some way (e.g. bad values filtered out, strings converted to tokens, etc.).
 - `model`: The path to the model loaded with `-m`
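For illustration only (not part of this commit), here is a minimal TypeScript sketch of a client that consumes the streaming response format described above. It assumes the server's default address `http://localhost:8080`, the `/completion` endpoint, and that each server-sent event is a single `data: {...}` line whose JSON payload carries the `content`, `tokens` and `stop` fields; the request parameters shown are illustrative.

```typescript
// Minimal sketch of a streaming client for the /completion endpoint.
// Assumptions (not taken from this commit): the server listens on
// http://localhost:8080 and each SSE event is one "data: {...}" line
// whose JSON payload carries the fields documented above.
async function streamCompletion(prompt: string): Promise<string> {
  // EventSource cannot be used here because it only issues GET requests,
  // so we POST with fetch and read the response body incrementally.
  const res = await fetch("http://localhost:8080/completion", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, n_predict: 64, stream: true }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let completion = "";

  while (true) {
    const { value, done } = await reader.read();
    if (done || !value) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE events are separated by a blank line.
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? ""; // keep any incomplete trailing event
    for (const event of events) {
      const line = event.trim();
      if (!line.startsWith("data: ")) continue;
      const chunk = JSON.parse(line.slice("data: ".length));
      completion += chunk.content;               // next piece of generated text
      if (chunk.tokens) {
        console.log("token ids:", chunk.tokens); // raw token ids, if present
      }
      if (chunk.stop) return completion;         // generation has stopped
    }
  }
  return completion;
}
```

Reading the raw response body with `fetch` mirrors the note in the README: the SSE stream is parsed by hand because the request must be a `POST`.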

0 commit comments
