
Commit 8a2f0b7

docs: Polish model setup doc for better onboarding experience (#1002)
1 parent b43ee6f commit 8a2f0b7

File tree

5 files changed: +136, -149 lines changed


camel/models/openai_compatibility_model.py

Lines changed: 19 additions & 6 deletions
@@ -12,6 +12,7 @@
 # limitations under the License.
 # =========== Copyright 2023 @ CAMEL-AI.org. All Rights Reserved. ===========

+import os
 from typing import Any, Dict, List, Optional, Union

 from openai import OpenAI, Stream
@@ -25,14 +26,14 @@


 class OpenAICompatibilityModel:
-    r"""Constructor for model backend supporting OpenAI compatibility."""
+    r"""LLM API served by OpenAI-compatible providers."""

     def __init__(
         self,
         model_type: str,
         model_config_dict: Dict[str, Any],
-        api_key: str,
-        url: str,
+        api_key: Optional[str] = None,
+        url: Optional[str] = None,
         token_counter: Optional[BaseTokenCounter] = None,
     ) -> None:
         r"""Constructor for model backend.
@@ -51,13 +52,25 @@ def __init__(
         """
         self.model_type = model_type
         self.model_config_dict = model_config_dict
-        self._token_counter = token_counter
+        self._url = url or os.environ.get("OPENAI_COMPATIBILIY_API_BASE_URL")
+        self._api_key = api_key or os.environ.get(
+            "OPENAI_COMPATIBILIY_API_KEY"
+        )
+        if self._url is None:
+            raise ValueError(
+                "For OpenAI-compatible models, you must provide the `url`."
+            )
+        if self._api_key is None:
+            raise ValueError(
+                "For OpenAI-compatible models, you must provide the `api_key`."
+            )
         self._client = OpenAI(
             timeout=60,
             max_retries=3,
-            api_key=api_key,
-            base_url=url,
+            base_url=self._url,
+            api_key=self._api_key,
         )
+        self._token_counter = token_counter

     def run(
         self,
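The constructor change above follows a common pattern: prefer the explicit argument, fall back to an environment variable, and fail fast if neither is set. A minimal standalone sketch (the helper `resolve_setting` is hypothetical and not part of CAMEL; the environment-variable names, spelling included, are taken from the diff):

```python
import os
from typing import Optional


def resolve_setting(value: Optional[str], env_var: str) -> str:
    """Return the explicit value if given, else the environment
    variable, else raise immediately rather than failing later."""
    resolved = value or os.environ.get(env_var)
    if resolved is None:
        raise ValueError(
            f"Provide a value or set the `{env_var}` environment variable."
        )
    return resolved


# Explicit arguments win over the environment.
os.environ["OPENAI_COMPATIBILIY_API_BASE_URL"] = "https://env.example/v1"
print(resolve_setting("https://arg.example/v1",
                      "OPENAI_COMPATIBILIY_API_BASE_URL"))
# prints https://arg.example/v1
print(resolve_setting(None, "OPENAI_COMPATIBILIY_API_BASE_URL"))
# prints https://env.example/v1
```

Raising in the constructor, rather than at first request, surfaces a missing `url` or `api_key` at setup time, which matches the onboarding goal of the commit.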

docs/get_started/installation.md

Lines changed: 73 additions & 0 deletions
@@ -0,0 +1,73 @@
+# Installation
+
+## [Option 1] Install from PyPI
+To install the base CAMEL library:
+```bash
+pip install camel-ai
+```
+Some features require extra dependencies:
+- To install with all dependencies:
+```bash
+pip install 'camel-ai[all]'
+```
+- To use the HuggingFace agents:
+```bash
+pip install 'camel-ai[huggingface-agent]'
+```
+- To enable RAG or use agent memory:
+```bash
+pip install 'camel-ai[tools]'
+```
+
+## [Option 2] Install from Source
+### Install from Source with Poetry
+```bash
+# Make sure your python version is later than 3.10
+# You can use pyenv to manage multiple python versions in your system
+
+# Clone github repo
+git clone https://github.com/camel-ai/camel.git
+
+# Change directory into project directory
+cd camel
+
+# If you didn't install poetry before
+pip install poetry # (Optional)
+
+# We suggest using python 3.10
+poetry env use python3.10 # (Optional)
+
+# Activate CAMEL virtual environment
+poetry shell
+
+# Install the base CAMEL library
+# It takes about 90 seconds
+poetry install
+
+# Install CAMEL with all dependencies
+poetry install -E all # (Optional)
+
+# Exit the virtual environment
+exit
+```
+
+### Install from Source with Conda and Pip
+```bash
+# Create a conda virtual environment
+conda create --name camel python=3.10
+
+# Activate CAMEL conda environment
+conda activate camel
+
+# Clone github repo
+git clone -b v0.2.1a https://github.com/camel-ai/camel.git
+
+# Change directory into project directory
+cd camel
+
+# Install CAMEL from source
+pip install -e .
+
+# Or if you want to use all other extra packages
+pip install -e '.[all]' # (Optional)
+```

docs/get_started/setup.md

Lines changed: 24 additions & 138 deletions
@@ -1,105 +1,29 @@
-# Installation and Setup
-## 🕹 Installation
-
-### [Option 1] Install from PyPI
-To install the base CAMEL library:
-```bash
-pip install camel-ai
-```
-Some features require extra dependencies:
-- To install with all dependencies:
-```bash
-pip install 'camel-ai[all]'
-```
-- To use the HuggingFace agents:
-```bash
-pip install 'camel-ai[huggingface-agent]'
-```
-- To enable RAG or use agent memory:
-```bash
-pip install 'camel-ai[tools]'
-```
-
-### [Option 2] Install from Source
-#### Install from Source with Poetry
-```bash
-# Make sure your python version is later than 3.10
-# You can use pyenv to manage multiple python verisons in your sytstem
-
-# Clone github repo
-git clone https://github.com/camel-ai/camel.git
-
-# Change directory into project directory
-cd camel
-
-# If you didn't install peotry before
-pip install poetry # (Optional)
-
-# We suggest using python 3.10
-poetry env use python3.10 # (Optional)
-
-# Activate CAMEL virtual environment
-poetry shell
-
-# Install the base CAMEL library
-# It takes about 90 seconds
-poetry install
-
-# Install CAMEL with all dependencies
-poetry install -E all # (Optional)
-
-# Exit the virtual environment
-exit
-```
-
-#### Install from Source with Conda and Pip
-```bash
-# Create a conda virtual environment
-conda create --name camel python=3.10
-
-# Activate CAMEL conda environment
-conda activate camel
-
-# Clone github repo
-git clone -b v0.2.1a https://github.com/camel-ai/camel.git
-
-# Change directory into project directory
-cd camel
-
-# Install CAMEL from source
-pip install -e .
-
-# Or if you want to use all other extra packages
-pip install -e '.[all]' # (Optional)
-```
-
-
-## 🕹 API Setup
+# API Setup
 Our agents can be deployed with either OpenAI API or your local models.

-### [Option 1] Using OpenAI API
-Assessing the OpenAI API requires the API key, which you may obtained from [here](https://platform.openai.com/account/api-keys). We here provide instructions for different OS.
+## [Option 1] Using OpenAI API
+Accessing the OpenAI API requires the API key, which you could get from [here](https://platform.openai.com/account/api-keys). We here provide instructions for different OS.

-#### Unix-like System (Linux / MacOS)
+### Unix-like System (Linux / MacOS)
 ```bash
 echo 'export OPENAI_API_KEY="your_api_key"' >> ~/.zshrc

-# If you are using other proxy services like Azure
-echo 'export OPENAI_API_BASE_URL="your_base_url"' >> ~/.zshrc # (Optional)
+# # If you are using other proxy services like Azure [TODO]
+# echo 'export OPENAI_API_BASE_URL="your_base_url"' >> ~/.zshrc # (Optional)

 # Let the change take place
 source ~/.zshrc
 ```

 Replace `~/.zshrc` with `~/.bashrc` if you are using bash.

-#### Windows
+### Windows
 If you are using Command Prompt:
 ```bash
 set OPENAI_API_KEY="your_api_key"

-# If you are using other proxy services like Azure
-set OPENAI_API_BASE_URL="your_base_url" # (Optional)
+# If you are using other proxy services like Azure [TODO]
+# set OPENAI_API_BASE_URL="your_base_url" # (Optional)
 ```
 Or if you are using PowerShell:
 ```powershell
@@ -110,68 +34,30 @@ $env:OPENAI_API_BASE_URL="your_base_url" # (Optional)
 ```
 These commands on Windows will set the environment variable for the duration of that particular Command Prompt or PowerShell session only. You may use `setx` or change the system properties dialog for the change to take place in all the new sessions.

+### General method

-### [Option 2] Using Local Models
-In the current landscape, for those seeking highly stable content generation, OpenAI's GPT-3.5 turbo, GPT-4o are often recommended. However, the field is rich with many other outstanding open-source models that also yield commendable results. CAMEL can support developers to delve into integrating these open-source large language models (LLMs) to achieve project outputs based on unique input ideas.
-
-#### Example: Using Ollama to set Llama 3 locally
+Create a file named `.env` in your project directory, with the following setting.

-- Download [Ollama](https://ollama.com/download).
-- After setting up Ollama, pull the Llama3 model by typing the following command into the terminal:
 ```bash
-ollama pull llama3
+OPENAI_API_KEY=<your-openai-api-key>
 ```
-- Create a ModelFile similar the one below in your project directory.
-```bash
-FROM llama3

-# Set parameters
-PARAMETER temperature 0.8
-PARAMETER stop Result
+Then, load the environment variables in your python script:

-# Sets a custom system message to specify the behavior of the chat assistant
+```python
+from dotenv import load_dotenv
+import os

-# Leaving it blank for now.
+load_dotenv()

-SYSTEM """ """
+OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
 ```
-- Create a script to get the base model (llama3) and create a custom model using the ModelFile above. Save this as a .sh file:
-```bash
-#!/bin/zsh

-# variables
-model_name="llama3"
-custom_model_name="camel-llama3"

-#get the base model
-ollama pull $model_name
+## [Option 2] Using other APIs
+
+If you are using other APIs that are not provided by OpenAI, you can refer to [Models/Using Models by API calling](../key_modules/models.md#using-models-by-api-calling)
+
+## [Option 3] Using Local Models
+If you are using local models, you can refer to [Models/Using Local Models](../key_modules/models.md#using-on-device-open-source-models)

-#create the model file
-ollama create $custom_model_name -f ./Llama3ModelFile
-```
-- Navigate to the directory where the script and ModelFile are located and run the script. Enjoy your Llama3 model, enhanced by CAMEL's excellent agents.
-```python
-from camel.agents import ChatAgent
-from camel.messages import BaseMessage
-from camel.models import ModelFactory
-from camel.types import ModelPlatformType
-
-ollama_model = ModelFactory.create(
-    model_platform=ModelPlatformType.OLLAMA,
-    model_type="llama3",
-    url="http://localhost:11434/v1",
-    model_config_dict={"temperature": 0.4},
-)
-
-assistant_sys_msg = BaseMessage.make_assistant_message(
-    role_name="Assistant",
-    content="You are a helpful assistant.",
-)
-agent = ChatAgent(assistant_sys_msg, model=ollama_model, token_limit=4096)
-
-user_msg = BaseMessage.make_user_message(
-    role_name="User", content="Say hi to CAMEL"
-)
-assistant_response = agent.step(user_msg)
-print(assistant_response.msg.content)
-```
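The new setup doc's `.env` approach relies on `python-dotenv`'s `load_dotenv()`. Conceptually it does something like the minimal sketch below (the helper `load_env_file` is hypothetical; the real library also handles quoting, `export` prefixes, and variable interpolation that this sketch skips):

```python
import os


def load_env_file(path: str) -> dict:
    """Read KEY=VALUE lines from a .env-style file into os.environ."""
    loaded = {}
    with open(path) as fh:
        for raw in fh:
            line = raw.strip()
            # Skip blanks, comments, and malformed lines.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip()
    os.environ.update(loaded)
    return loaded


# Write a demo .env file and load it.
with open(".env", "w") as fh:
    fh.write("# example file\nOPENAI_API_KEY=sk-demo-key\n")

load_env_file(".env")
print(os.getenv("OPENAI_API_KEY"))  # prints sk-demo-key
```

Keeping the key in `.env` (and out of version control) is the main win over hard-coding it in the shell profile, which is why the doc calls this the "General method".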

docs/index.rst

Lines changed: 1 addition & 0 deletions
@@ -16,6 +16,7 @@ Main Documentation
    :caption: Get Started
    :name: getting_started

+   get_started/installation.md
    get_started/setup.md

 .. toctree::

docs/key_modules/models.md

Lines changed: 19 additions & 5 deletions
@@ -21,6 +21,7 @@ The following table lists currently supported model platforms by CAMEL.
 | Azure OpenAI | gpt-4-turbo | Y |
 | Azure OpenAI | gpt-4 | Y |
 | Azure OpenAI | gpt-3.5-turbo | Y |
+| OpenAI Compatible | Depends on the provider | ----- |
 | Mistral AI | mistral-large-2 | N |
 | Mistral AI | open-mistral-nemo | N |
 | Mistral AI | codestral | N |
@@ -49,9 +50,9 @@ The following table lists currently supported model platforms by CAMEL.
 | Together AI | https://docs.together.ai/docs/chat-models | ----- |
 | LiteLLM | https://docs.litellm.ai/docs/providers | ----- |

-## 3. Model Calling Template
+## 3. Using Models by API calling

-Here is the example code to use a chosen model. To utilize a different model, you can simply change three parameters the define your model to be used: `model_platform`, `model_type`, `model_config_dict` .
+Here is an example code to use a specific model (gpt-4o-mini). If you want to use another model, you can simply change these three parameters: `model_platform`, `model_type`, `model_config_dict`.

 ```python
 from camel.models import ModelFactory
@@ -69,15 +70,28 @@ model = ModelFactory.create(

 # Define an assitant message
 system_msg = BaseMessage.make_assistant_message(
-role_name="Assistant",
-content="You are a helpful assistant.",
+    role_name="Assistant",
+    content="You are a helpful assistant.",
 )

 # Initialize the agent
 ChatAgent(system_msg, model=model)
 ```

-## 4. Open Source LLMs
+And if you want to use an OpenAI-compatible API, you can replace the `model` with the following code:
+
+```python
+from camel.models.openai_compatibility_model import OpenAICompatibilityModel
+
+model = OpenAICompatibilityModel(
+    model_type="a-string-representing-the-model-type",
+    model_config_dict={"max_tokens": 4096},  # and other parameters you want
+    url=os.environ.get("OPENAI_COMPATIBILIY_API_BASE_URL"),
+    api_key=os.environ.get("OPENAI_COMPATIBILIY_API_KEY"),
+)
+```
+
+## 4. Using On-Device Open Source Models
 In the current landscape, for those seeking highly stable content generation, OpenAI’s gpt-4o-mini, gpt-4o are often recommended. However, the field is rich with many other outstanding open-source models that also yield commendable results. CAMEL can support developers to delve into integrating these open-source large language models (LLMs) to achieve project outputs based on unique input ideas.

 While proprietary models like gpt-4o-mini and gpt-4o have set high standards for content generation, open-source alternatives offer viable solutions for experimentation and practical use. These models, supported by active communities and continuous improvements, provide flexibility and cost-effectiveness for developers and researchers.

0 commit comments
