This component provides a simple way to integrate the OpenAI Chat API (or other OpenAI-compatible APIs) on Edgee, served directly at the edge. You map the component to a specific endpoint such as `/chat`, and then you invoke it from your frontend code.
- Download the latest component version from our releases page
- Place the `openai.wasm` file on your server (e.g., `/var/edgee/components`)
- Add the following configuration to your `edgee.toml`:
```toml
[[components.edge_functions]]
id = "openai"
file = "/var/edgee/components/openai.wasm"
settings.edgee_path = "/chat"
settings.api_key = "sk-XYZ"
settings.model = "gpt-3.5-turbo"

# optional settings:
settings.max_completion_tokens = "100" # unlimited by default
settings.default_system_prompt = "You are a funny assistant, always adding a short joke after your response." # no automatic system prompt by default
settings.api_hostname = "api.openai.com" # in case you're using a different OpenAI-compatible API
```
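For example, to route requests to a self-hosted OpenAI-compatible server instead of OpenAI, override the hostname and model. This is a sketch: the hostname and model name below are placeholders, not values from the Edgee documentation.

```toml
[[components.edge_functions]]
id = "openai"
file = "/var/edgee/components/openai.wasm"
settings.edgee_path = "/chat"
settings.api_key = "sk-XYZ"                 # key issued by your provider
settings.model = "llama-3-8b"               # placeholder: a model your provider serves
settings.api_hostname = "llm.example.com"   # placeholder: your provider's hostname
```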
You can send requests to the endpoint and show the response message as follows:
```javascript
const response = await fetch('/chat', {
  method: 'POST',
  body: JSON.stringify({
    messages: [{
      role: 'user',
      content: 'Hello! Please say "ok" if this API call is working.',
    }],
  }),
});
const json = await response.json();
console.log(json.content);
```
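To keep context across turns, you can resend earlier messages with each request. The sketch below assumes the component forwards the `messages` array in the same role/content shape as the OpenAI Chat Completions API; the helper name is ours, not part of the component.

```javascript
// Sketch: build a multi-turn request body for the /chat endpoint.
// Each entry is a { role, content } pair, as in the Chat Completions API.
function buildChatPayload(history, userMessage) {
  return JSON.stringify({
    messages: [...history, { role: 'user', content: userMessage }],
  });
}

// Earlier turns of the conversation, kept by the frontend.
const history = [
  { role: 'user', content: 'My name is Ada.' },
  { role: 'assistant', content: 'Nice to meet you, Ada!' },
];

// `body` is ready to pass to fetch('/chat', { method: 'POST', body }).
const body = buildChatPayload(history, 'What is my name?');
```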
Prerequisites:
Build command:
```shell
edgee component build
```
Test command (with local HTTP emulator):
```shell
edgee component test
```
Test coverage command:
```shell
make test.coverage[.html]
```
Interested in contributing? Read our contribution guidelines.
Report security vulnerabilities to security@edgee.cloud.