Feature/chatbot secrets compatibility #622

Open · wants to merge 14 commits into main
5 changes: 5 additions & 0 deletions docs/sdk/chat.mdx
@@ -47,6 +47,11 @@ Implemented in `services/openai.js`
- OpenAI chat completions integration
- GPT-3.5-turbo implementation
- Conversation threading support
- The API key can be loaded from the project's `secrets.toml` file by adding a `[data.openai]` section and setting `api_key` to your key:
```toml
[data.openai]
api_key = "sk-YOUR-KEY"
```

## Basic Usage

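A quick aside on the `secrets.toml` snippet above: the `[data.openai]` table parses into nested dictionaries, so the key is reachable at `["data"]["openai"]["api_key"]`. A minimal sketch with Python's built-in `tomllib`, mirroring how `chat()` in `preswald/interfaces/components.py` reads it later in this PR (the file name and table path are taken from the docs snippet; adjust them if your project layout differs):

```python
import tomllib

# Parse the project-level secrets file (tomllib expects binary mode).
with open("secrets.toml", "rb") as f:
    secrets = tomllib.load(f)

# [data.openai] in TOML becomes nested dicts: secrets["data"]["openai"].
api_key = secrets["data"]["openai"]["api_key"]
print("OpenAI key loaded:", bool(api_key))
```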
1 change: 1 addition & 0 deletions frontend/src/components/DynamicComponents.jsx
@@ -218,6 +218,7 @@ const MemoizedComponent = memo(
{...props}
sourceId={component.config?.source || null}
sourceData={component.config?.data || null}
apiKey={component.config?.apiKey || null}
value={component.value || component.state || { messages: [] }}
onChange={(value) => {
handleUpdate(componentId, value);
8 changes: 7 additions & 1 deletion frontend/src/components/widgets/ChatWidget.jsx
@@ -14,6 +14,7 @@ import { createChatCompletion } from '@/services/openai';
const ChatWidget = ({
sourceId = null,
sourceData = null,
apiKey = null,
value = { messages: [] },
onChange,
className,
@@ -23,9 +24,14 @@ const ChatWidget = ({
const messagesEndRef = useRef(null);
const chatContainerRef = useRef(null);

// Persist the API key passed down from secrets.toml (via component config), if present
if (apiKey) {
sessionStorage.setItem('openai_api_key', apiKey.trim());
}

const [inputValue, setInputValue] = useState('');
const [showSettings, setShowSettings] = useState(false);
const [apiKey, setApiKey] = useState('');
// const [apiKey, setApiKey] = useState('');
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState(null);
const hasApiKey = useMemo(() => !!sessionStorage.getItem('openai_api_key'), []);
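One thing to note about the hunk above: writing to `sessionStorage` directly in the component body runs on every render. A minimal sketch of an effect-based alternative, assuming the same `apiKey` prop and storage key (an illustration, not part of this PR; the `useOpenAiKey` helper name is made up):

```jsx
import { useEffect } from 'react';

// Persist the key only when the prop changes, rather than on every render.
function useOpenAiKey(apiKey) {
  useEffect(() => {
    if (apiKey) {
      sessionStorage.setItem('openai_api_key', apiKey.trim());
    }
  }, [apiKey]);
}
```

Inside `ChatWidget`, calling `useOpenAiKey(apiKey)` would replace the inline `sessionStorage.setItem` call.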
12 changes: 12 additions & 0 deletions preswald/interfaces/components.py
@@ -12,6 +12,7 @@
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import tomllib

# from PIL import Image
# try:
@@ -97,6 +98,16 @@ def chat(source: str, table: Optional[str] = None) -> Dict:
if current_state is None:
current_state = {"messages": [], "source": source}

# Get API key from secrets.toml
with open("secrets.toml", "rb") as toml:
secrets = tomllib.load(toml)

if secrets and secrets["data"]["openai"]["api_key"]:
api_key = secrets["data"]["openai"]["api_key"]

else:
api_key = None

# Get dataframe from source
df = (
service.data_manager.get_df(source)
@@ -132,6 +143,7 @@ def chat(source: str, table: Optional[str] = None) -> Dict:
"config": {
"source": source,
"data": serializable_data,
"apiKey": api_key,
},
}

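As written, the secrets lookup above raises `FileNotFoundError` when `secrets.toml` is absent and `KeyError` when the `[data.openai]` table or `api_key` entry is missing, so the `else: api_key = None` branch is only reached when the key exists but is empty. A minimal defensive sketch, keeping the same file name and table path (an illustration, not the PR's implementation; the `load_openai_key` helper name is made up):

```python
import tomllib


def load_openai_key(path: str = "secrets.toml") -> str | None:
    """Return the OpenAI API key from secrets.toml, or None if it is unavailable."""
    try:
        with open(path, "rb") as f:
            secrets = tomllib.load(f)
    except (FileNotFoundError, tomllib.TOMLDecodeError):
        return None
    # Missing tables or keys fall through to None instead of raising KeyError.
    return secrets.get("data", {}).get("openai", {}).get("api_key")


```

Inside `chat()`, `api_key = load_openai_key()` would then feed the same `"apiKey"` entry in the component config.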