
Commit e5cb7bb

fix MODELS env (#663)
1 parent 2fda730 commit e5cb7bb

File tree

1 file changed: 4 additions, 4 deletions


README.md

Lines changed: 4 additions & 4 deletions
@@ -288,7 +288,7 @@ If you want to run chat-ui with llama.cpp, you can do the following, using Zephyr
 3. Add the following to your `.env.local`:
 
 ```env
-MODELS=[
+MODELS=`[
   {
     "name": "Local Zephyr",
     "chatPromptTemplate": "<|system|>\n{{preprompt}}</s>\n{{#each messages}}{{#ifUser}}<|user|>\n{{content}}</s>\n<|assistant|>\n{{/ifUser}}{{#ifAssistant}}{{content}}</s>\n{{/ifAssistant}}{{/each}}",
@@ -308,7 +308,7 @@ MODELS=[
       }
     ]
   }
-]
+]`
 ```
 
 Start chat-ui with `npm run dev` and you should be able to chat with Zephyr locally.
@@ -324,7 +324,7 @@ ollama run mistral
 Then specify the endpoints like so:
 
 ```env
-MODELS=[
+MODELS=`[
   {
     "name": "Ollama Mistral",
     "chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}} {{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s> {{/ifAssistant}}{{/each}}",
@@ -345,7 +345,7 @@ MODELS=[
       }
     ]
  }
-]
+]`
 ```
 
 #### Amazon
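
For context on the fix: the `MODELS` value in `.env.local` is a multi-line JSON array, and the other examples in the README wrap such values in backticks so the env parser reads the whole array as a single variable; these two examples were missing them. A minimal sketch of the corrected form (the model entry here is a placeholder for illustration, not part of the commit):

```env
# Hypothetical minimal example: the entire JSON array is wrapped in backticks
# so the multi-line value is parsed as one MODELS variable.
MODELS=`[
  {
    "name": "Local Zephyr"
  }
]`
```

Without the backticks, the env parser typically stops at the first newline, leaving `MODELS=[` as the value and breaking the JSON parse at startup.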
