Issues: jmemcc/streamlit-llm-interface
#3 Implement Functionality for LLMs Loaded in the Browser
enhancement · opened Jan 25, 2025 by jmemcc
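One possible direction for this issue is client-side inference via a browser library (for example, something like WebLLM), embedded in the Streamlit page with the built-in HTML component. The sketch below is a hypothetical scaffold, not a working integration: the script body is a placeholder, and only `streamlit.components.v1.html` is assumed from the actual Streamlit API.

```python
# A minimal sketch, assuming a browser-side inference library would be
# embedded via Streamlit's HTML component. The script contents are
# placeholders, not a real WebLLM integration.
import streamlit.components.v1 as components

BROWSER_LLM_HTML = """
<div id="output">Loading model in the browser...</div>
<script type="module">
  // Hypothetical: load a browser inference library and run prompts
  // entirely client-side, so no server-side model is needed, e.g.:
  // import { CreateMLCEngine } from "@mlc-ai/web-llm";
  document.getElementById("output").textContent =
    "Browser LLM placeholder - wire an inference library in here.";
</script>
"""

# Render the embedded page inside the Streamlit app.
components.html(BROWSER_LLM_HTML, height=200)
```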
#2 Add Support for More Local LLM Inferencing Tools
enhancement · opened Jan 23, 2025 by jmemcc
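Many local inference servers (Ollama, llama.cpp's server, LM Studio) expose an OpenAI-compatible endpoint, so one way to support several tools is a single client that switches base URLs. This is a sketch of that approach, not the repo's actual design; the ports are the tools' usual defaults and may differ on a given machine.

```python
# A minimal sketch, assuming the `openai` Python package and
# OpenAI-compatible local endpoints. Ports are common defaults.
from openai import OpenAI

LOCAL_BACKENDS = {
    "ollama": "http://localhost:11434/v1",
    "llama.cpp": "http://localhost:8080/v1",
    "lmstudio": "http://localhost:1234/v1",
}

def chat(backend: str, model: str, prompt: str) -> str:
    # Local servers generally ignore the API key, but the client requires one.
    client = OpenAI(base_url=LOCAL_BACKENDS[backend], api_key="not-needed")
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(chat("ollama", "llama3.2", "Say hello in one sentence."))
```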
#1 Ollama Running on Host via Bridge Instead of Inside of the Docker Container
enhancement · opened Jan 23, 2025 by jmemcc
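Assuming the issue's intent, the app container would reach an Ollama server running on the Docker host rather than bundling Ollama in the image. From a container, the host is reachable as `host.docker.internal` (on Linux this requires starting the container with `--add-host=host.docker.internal:host-gateway`). A minimal sketch of the container-side call:

```python
# A minimal sketch, assuming Ollama runs on the Docker host and the
# container reaches it through the host.docker.internal bridge alias.
import requests

OLLAMA_URL = "http://host.docker.internal:11434"

def generate(model: str, prompt: str) -> str:
    # /api/generate is Ollama's single-turn completion endpoint.
    response = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

print(generate("llama3.2", "Why run Ollama on the host instead of in the container?"))
```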