✅ Asynchronous chat (Channels, WebSocket)
✅ AI/LLM API consuming (OpenAI)
✅ Chat memory using Redis layer
✅ Production: Docker, Railway
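The heart of the async chat is a Channels WebSocket consumer that keeps the running conversation in Redis. The snippet below is a minimal sketch of that pattern, not this repository's actual code: the module path, consumer name, Redis URL, and key format are assumptions, and `generate_reply` is a placeholder (a hedged OpenAI version is sketched further down).

```python
# chat/consumers.py -- minimal sketch of the async chat flow (names and paths assumed).
import json

import redis.asyncio as redis
from channels.generic.websocket import AsyncWebsocketConsumer

# Assumed Redis location; the project may read this from settings instead.
REDIS_URL = "redis://localhost:6379/0"


class ChatConsumer(AsyncWebsocketConsumer):
    """Receives a user message over WebSocket, stores the conversation in Redis,
    and sends back an AI reply."""

    async def connect(self):
        self.redis = redis.from_url(REDIS_URL, decode_responses=True)
        self.history_key = f"chat:{self.channel_name}"  # per-connection memory
        await self.accept()

    async def disconnect(self, code):
        await self.redis.close()

    async def receive(self, text_data=None, bytes_data=None):
        user_message = json.loads(text_data)["message"]

        # Append the user turn to the conversation memory kept in Redis.
        await self.redis.rpush(
            self.history_key, json.dumps({"role": "user", "content": user_message})
        )
        history = [
            json.loads(m) for m in await self.redis.lrange(self.history_key, 0, -1)
        ]

        reply = await generate_reply(history)

        # Remember the assistant turn as well, then push it to the client.
        await self.redis.rpush(
            self.history_key, json.dumps({"role": "assistant", "content": reply})
        )
        await self.send(text_data=json.dumps({"message": reply}))


async def generate_reply(history):
    # Placeholder; see the hedged OpenAI sketch below.
    return "..."
```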
OPENAI_API_KEY=<api-key>
SECRET_KEY=<secret-key>
DEBUG=True
ALLOWED_HOSTS=localhost,127.0.0.1
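These variables are typically read from the environment in `config/settings.py`. The excerpt below is a sketch of that pattern using plain `os.environ`; the project itself may handle them differently (for example via django-environ).

```python
# config/settings.py (excerpt) -- sketch of reading the variables above from the environment.
import os

SECRET_KEY = os.environ["SECRET_KEY"]
DEBUG = os.environ.get("DEBUG", "False") == "True"
ALLOWED_HOSTS = os.environ.get("ALLOWED_HOSTS", "").split(",")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", "")
```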
https://platform.openai.com/docs/quickstart?api-mode=chat
Set `nbytes` to 32 for a 64-character key.
python -c "import secrets;print(secrets.token_hex(32))"
docker build .
docker-compose -f docker-compose-dev.yml up --build
Removing containers:
docker-compose -f docker-compose-dev.yml down
http://localhost:8000
- Add new project from GitHub
- Set the domain on port 8080
- Set the same variables as in `.env` (except `DEBUG`) in the Settings panel, but change `ALLOWED_HOSTS` to the new domain
- Use the Start Command: `daphne -b 0.0.0.0 -p 8080 config.asgi:application` (a sketch of this ASGI entrypoint follows the list)
- Grab `docker-compose-railway.yml` from your file browser and drop it onto the project canvas
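The Start Command points Daphne at `config.asgi:application`. For a Channels project, that entrypoint typically looks like the sketch below; the `chat.routing` module name is an assumption, not necessarily what this repository uses.

```python
# config/asgi.py -- typical Channels ASGI entrypoint (sketch; routing module name assumed).
import os

from django.core.asgi import get_asgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")
django_asgi_app = get_asgi_application()  # initialise Django before importing app code

from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter

import chat.routing  # assumed module holding websocket_urlpatterns

application = ProtocolTypeRouter({
    "http": django_asgi_app,               # regular Django views
    "websocket": AuthMiddlewareStack(       # WebSocket traffic -> Channels consumers
        URLRouter(chat.routing.websocket_urlpatterns)
    ),
})
```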