[Section request] AI/LLM #755
Replies: 3 comments 2 replies
-
That's more than one request.
-
For things like Stable Diffusion, I do think there need to be multiple helper scripts (or users should be forced to go through the manual setup script), since not everyone runs it on supported hardware: an NVIDIA GPU is recommended for Stable Diffusion. The Stable Diffusion helper script should not have an "AUTO-install" option. An example from my own setup: I run Proxmox on an Intel N100 and would be able to run a Stable Diffusion fork called "easydiffusion" in an LXC container.
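The hardware-gating idea above could be sketched roughly like this. This is a hypothetical snippet, not taken from the actual helper-script repo: it checks for a working NVIDIA GPU before offering the full install, and otherwise falls back to a CPU-friendly fork. The variable and target names are illustrative assumptions.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: gate the Stable Diffusion install path on GPU availability.
# Names (INSTALL_TARGET, "easydiffusion") are illustrative, not from the real scripts.

if command -v nvidia-smi >/dev/null 2>&1 && nvidia-smi >/dev/null 2>&1; then
  # nvidia-smi exists and can talk to a GPU: offer the full install.
  echo "NVIDIA GPU detected: offering full Stable Diffusion install."
  INSTALL_TARGET="stable-diffusion"
else
  # No usable NVIDIA GPU (e.g. an Intel N100 host): suggest a CPU-friendly fork.
  echo "No NVIDIA GPU found: falling back to a CPU-friendly fork (e.g. easydiffusion)."
  INSTALL_TARGET="easydiffusion"
fi

echo "Selected target: ${INSTALL_TARGET}"
```

A check like this would let the script skip any "AUTO-install" shortcut on unsupported hardware and drop the user into the manual setup path instead.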
-
I would like this.
-
Application Name
Ollama, Stable Diffusion, LocalAI, gpt4all, etc
Website
Description
I believe a lot of the readily-available LXC-based scripts already target (self-hosted) AI tools, so we might as well start an AI section for them.
Due Diligence