Closed as not planned
What happened?
It is impossible to use a realtime model through the LiteLLM proxy over a WebSocket connection; the same code works fine directly against OpenAI. The problem could also be related to models associated with a team (model alias?). The client connects to LiteLLM and the connection is closed immediately.
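Since the connection opens and then closes right away, one way to see why is to register on_error and on_close callbacks and enable handshake tracing in websocket-client. This is a minimal diagnostic sketch, not from the original report; the env var name and the redacted proxy host are assumptions:

# Diagnostic sketch (assumptions: env var name, redacted proxy host).
import os
import websocket

API_KEY = os.environ.get("OPENAI_API_KEY")
url = "wss://xxxxxxxx/v1/realtime?model=gpt-4o-realtime-preview"

def on_error(ws, error):
    # Transport-level errors raised during or after the handshake.
    print("Error:", error)

def on_close(ws, close_status_code, close_msg):
    # The close code/reason often indicates why the proxy dropped the session.
    print("Closed:", close_status_code, close_msg)

websocket.enableTrace(True)  # log the HTTP upgrade handshake and frames
ws = websocket.WebSocketApp(
    url,
    header=["Authorization: Bearer " + API_KEY,
            "OpenAI-Beta: realtime=v1"],
    on_error=on_error,
    on_close=on_close,
)
ws.run_forever()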
Relevant log output
# Example requires the websocket-client library:
# pip install websocket-client
import os
import json
import websocket

OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")

# LiteLLM proxy realtime endpoint (host redacted).
url = "wss://xxxxxxxx/v1/realtime?model=gpt-4o-realtime-preview"

headers = [
    "Authorization: Bearer " + OPENAI_API_KEY,
    "api_key: Bearer " + OPENAI_API_KEY,
    "OpenAI-Beta: realtime=v1",
]

def on_open(ws):
    # Fires once the WebSocket handshake succeeds.
    print("Connected to server.")

def on_message(ws, message):
    # Pretty-print every server event received over the socket.
    data = json.loads(message)
    print("Received event:", json.dumps(data, indent=2))

ws = websocket.WebSocketApp(
    url,
    header=headers,
    on_open=on_open,
    on_message=on_message,
)
ws.run_forever()
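To rule out the client code itself, the same script can be pointed directly at OpenAI's realtime endpoint; this is the cross-check the report refers to when it says the code works fine against OpenAI. A sketch of the two lines that change:

# Swap the proxy URL for OpenAI's realtime endpoint; everything else stays the same.
url = "wss://api.openai.com/v1/realtime?model=gpt-4o-realtime-preview"
headers = [
    "Authorization: Bearer " + OPENAI_API_KEY,
    "OpenAI-Beta: realtime=v1",
]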
Are you an ML Ops Team?
Yes
What LiteLLM version are you on?
v1.61
Twitter / LinkedIn details
No response