@@ -2,17 +2,20 @@
 
 Large Language Model API interface. This is a simple API interface for large language models
 which run on [Ollama](https://github.com/ollama/ollama/blob/main/docs/api.md),
-[Anthropic](https://docs.anthropic.com/en/api/getting-started) and [Mistral](https://docs.mistral.ai/).
+[Anthropic](https://docs.anthropic.com/en/api/getting-started) and [Mistral](https://docs.mistral.ai/)
+(OpenAI might be added later).
 
 The module includes the ability to utilize:
 
 * Maintaining a session of messages
-* Tool calling support
-* Creating embeddings from text
+* Tool calling support, including using your own tools (aka Tool plugins)
+* Creating embedding vectors from text
 * Streaming responses
+* Multi-modal support (aka images and attachments)
 
 There is a command-line tool included in the module which can be used to interact with the API.
-For example,
+If you have Docker installed, you can use the following command to run the tool without
+installation:
 
 ```bash
 # Display help
@@ -21,12 +24,13 @@ docker run ghcr.io/mutablelogic/go-llm:latest --help
 # Interact with Claude to retrieve news headlines, assuming
 # you have an API key for Anthropic and NewsAPI
 docker run \
   --interactive -e ANTHROPIC_API_KEY -e NEWSAPI_KEY \
   ghcr.io/mutablelogic/go-llm:latest \
-  chat claude-3-5-haiku-20241022
+  chat claude-3-5-haiku-20241022 --prompt "What is the latest news?"
 ```
 
-See below for more information on how to use the command-line tool.
+See below for more information on how to use the command-line tool (or how to install it
+if you have a `go` compiler).
 
 ## Programmatic Usage
 
@@ -46,7 +50,7 @@ import (
 )
 
 func main() {
-    // Create a new agent
+    // Create a new agent - replace the URL with the one to your Ollama instance
     agent, err := ollama.New("https://ollama.com/api/v1/")
     if err != nil {
         panic(err)
@@ -57,7 +61,7 @@ func main() {
 
 To create an
 [Anthropic](https://pkg.go.dev/github.com/mutablelogic/go-llm/pkg/anthropic)
-agent,
+agent with an API key stored as an environment variable,
 
 ```go
 import (
@@ -66,7 +70,7 @@ import (
 
 func main() {
     // Create a new agent
-    agent, err := anthropic.New(os.Getev("ANTHROPIC_API_KEY"))
+    agent, err := anthropic.New(os.Getenv("ANTHROPIC_API_KEY"))
     if err != nil {
         panic(err)
     }
@@ -83,7 +87,7 @@ import (
 
 func main() {
     // Create a new agent
-    agent, err := mistral.New(os.Getev("MISTRAL_API_KEY"))
+    agent, err := mistral.New(os.Getenv("MISTRAL_API_KEY"))
     if err != nil {
         panic(err)
     }
@@ -94,6 +98,28 @@ func main() {
 You can append options to the agent creation to set the client/server communication options,
 such as user agent strings, timeouts, debugging, rate limiting, adding custom headers, etc. See [here](https://pkg.go.dev/github.com/mutablelogic/go-client#readme-basic-usage) for more information.
 
+There is also an _aggregated_ agent which can be used to interact with multiple providers at once. This is useful if you want
+to use models from different providers simultaneously.
+
+```go
+import (
+    "github.com/mutablelogic/go-llm/pkg/agent"
+)
+
+func main() {
+    // Create a new agent which aggregates multiple providers
+    agent, err := agent.New(
+        agent.WithAnthropic(os.Getenv("ANTHROPIC_API_KEY")),
+        agent.WithMistral(os.Getenv("MISTRAL_API_KEY")),
+        agent.WithOllama(os.Getenv("OLLAMA_URL")),
+    )
+    if err != nil {
+        panic(err)
+    }
+    // ...
+}
+```
+
 ### Chat Sessions
 
 You create a **chat session** with a model as follows,
@@ -120,6 +146,9 @@ func session(ctx context.Context, agent llm.Agent) error {
 }
 ```
 
+The `Context` object will continue to store the current session and options, and will
+ensure the session is maintained across multiple calls.
+
 ### Embedding Generation
 
 TODO
@@ -146,16 +175,16 @@ type Model interface {
     // Set session-wide options
     Context(...Opt) Context
 
-    // Add attachments (images, PDFs) to a user prompt
+    // Add attachments (images, PDFs) to a user prompt for completion
     UserPrompt(string, ...Opt) Context
 
-    // Set embedding options
+    // Create an embedding vector with embedding options
     Embedding(context.Context, string, ...Opt) ([]float64, error)
 }
 
 type Context interface {
     // Add single-use options when calling the model, which override
-    // session options. You can also attach files to a user prompt.
+    // session options. You can attach files to a user prompt.
     FromUser(context.Context, string, ...Opt) error
 }
 ```