docs/plugins/stream.md
30 additions & 2 deletions
````diff
@@ -72,9 +72,9 @@ Below is the example to integrate ChatGPT to Elysia.
 
 ```ts
 new Elysia()
-    .post(
+    .get(
         '/ai',
-        ({ body, query: { prompt } }) =>
+        ({ query: { prompt } }) =>
             new Stream(
                 openai.chat.completions.create({
                     model: 'gpt-3.5-turbo',
````
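For context, here is a minimal sketch of what the updated handler could look like as a whole. The hunk above only shows the changed lines; the imports, the OpenAI client construction, the `stream: true` flag, the `messages` array, and `.listen(3000)` are assumptions based on typical OpenAI streaming usage, not part of the diff. The `.get('/ai', ...)` route, the `Stream` wrapper, and the `gpt-3.5-turbo` model come from the diff itself.

```ts
import { Elysia } from 'elysia'
import { Stream } from '@elysiajs/stream'
import OpenAI from 'openai'

// Assumed setup: an OpenAI client reading OPENAI_API_KEY from the environment
const openai = new OpenAI()

new Elysia()
    .get(
        '/ai',
        // After this change, the prompt comes from the query string instead of the request body
        ({ query: { prompt } }) =>
            new Stream(
                // `stream: true` makes the SDK return an AsyncIterable of chunks,
                // which Stream then forwards to the client
                openai.chat.completions.create({
                    model: 'gpt-3.5-turbo',
                    stream: true,
                    messages: [{ role: 'user', content: prompt }]
                })
            )
    )
    .listen(3000)
```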
```diff
@@ -90,6 +90,32 @@ new Elysia()
 
 By default, the [openai](https://npmjs.com/package/openai) ChatGPT completion returns an `AsyncIterable`, so you should be able to wrap the OpenAI response in `Stream`.
 
+## Fetch Stream
+You can pass a fetch from an endpoint that returns a stream to proxy that stream.
+
+This is useful for endpoints that use AI text generation, since you can proxy them directly, e.g. [Cloudflare AI](https://developers.cloudflare.com/workers-ai/models/llm/#examples---chat-style-with-system-prompt-preferred).
```
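The hunk is cut off before the accompanying example, but based on the description above, a proxying handler could look roughly like the following sketch. Only the idea of passing a `fetch` to `Stream` comes from the text; the Cloudflare REST endpoint shape, the `accountId`/`apiToken` placeholders, and the model name are illustrative assumptions.

```ts
import { Elysia } from 'elysia'
import { Stream } from '@elysiajs/stream'

// Hypothetical placeholders: substitute your own Cloudflare account ID, API token and model
const accountId = process.env.CF_ACCOUNT_ID
const apiToken = process.env.CF_API_TOKEN
const model = '@cf/meta/llama-2-7b-chat-int8'

new Elysia()
    .get('/ai', ({ query: { prompt } }) =>
        // Pass the fetch Promise straight to Stream so the upstream
        // streaming response is proxied to the client as-is
        new Stream(
            fetch(
                `https://api.cloudflare.com/client/v4/accounts/${accountId}/ai/run/${model}`,
                {
                    method: 'POST',
                    headers: {
                        authorization: `Bearer ${apiToken}`,
                        'content-type': 'application/json'
                    },
                    body: JSON.stringify({
                        stream: true,
                        messages: [
                            { role: 'system', content: 'You are a friendly assistant' },
                            { role: 'user', content: prompt }
                        ]
                    })
                }
            )
        )
    )
    .listen(3000)
```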