ensure the session is maintained across multiple calls.
### Embedding Generation

You can generate embedding vectors using a suitable Ollama or Mistral embedding model:

```go
import (
  "context"

  "github.com/mutablelogic/go-llm"
)

func embedding(ctx context.Context, agent llm.Agent) error {
  // Generate an embedding vector for the text "hello"
  vector, err := agent.Model(ctx, "mistral-embed").Embedding(ctx, "hello")
  // ...
}
```
### Attachments & Image Caption Generation
Some models have the `vision` capability, and some can also summarize text. For example, to generate a caption for an image:

```go
import (
  "context"
  "os"

  "github.com/mutablelogic/go-llm"
)

func generate_image_caption(ctx context.Context, agent llm.Agent, path string) (string, error) {
  f, err := os.Open(path)
  if err != nil {
    return "", err
  }
  defer f.Close()

  // Describe the image, attaching the file to the prompt
  r, err := agent.Model(ctx, "claude-3-5-sonnet-20241022").UserPrompt(
    ctx, "Provide a short caption for this image", llm.WithAttachment(f),
  )
  if err != nil {
    return "", err
  }

  // Return the first completion text
  return r.Text(0), nil
}
```

Summarizing a text or PDF document works exactly the same way with an Anthropic model, perhaps with a different prompt.
### Streaming
Streaming is supported with all providers, but Ollama cannot use streaming and tools simultaneously. Provide a callback function with the signature `func(llm.Completion)`, which is called each time a completion is received.

```go
import (
  "context"
  "fmt"

  "github.com/mutablelogic/go-llm"
)

func generate_completion(ctx context.Context, agent llm.Agent, prompt string) (string, error) {
  r, err := agent.Model(ctx, "claude-3-5-sonnet-20241022").UserPrompt(
    ctx, prompt,
    llm.WithStream(stream_callback),
  )
  if err != nil {
    return "", err
  }

  // Return the final completion text
  return r.Text(0), nil
}

func stream_callback(completion llm.Completion) {
  // Print the completion text on each call
  fmt.Println(completion.Text(0))
}
```
### Tool Support
All providers support tools, but not all models do.

TODO
## Options