[Script request]: IPEX-LLM option added to the OPENWEBUI install script/update script #6747
Unanswered
PinkWaters00 asked this question in Request script
Application Name
OPENWEBUI
Website
https://openwebui.com
Description
Hello folks,
I am wondering if it is possible at all to incorporate the IPEX-LLM fork of Ollama, which supports Intel Arc GPUs.
Project: https://github.com/intel/ipex-llm
They have a portable install option that creates a conda environment to run Ollama with the Intel library, which I think could make adding this as an option fairly easy:
https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.md#linux-quickstart
I think it would be fantastic to have this added to both the install script and the update script, in case someone gets an Arc GPU and wants to update their installation to work with it.
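To illustrate the idea, here is a minimal, hypothetical sketch of how the install script could branch between the upstream Ollama build and the IPEX-LLM portable build. The `choose_backend` function name and the GPU-detection heuristic (grepping `lspci` output for "Arc") are my own assumptions for illustration, not part of the actual OpenWebUI script:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: pick an Ollama backend based on detected GPU hardware.
# choose_backend takes PCI device info as an argument (normally "$(lspci)")
# so the logic can be exercised without real hardware.
choose_backend() {
  local pci_info="$1"
  if echo "$pci_info" | grep -qi 'Arc'; then
    # Intel Arc GPU found: use the IPEX-LLM portable Ollama build
    echo "ipex-llm"
  else
    # No Arc GPU: fall back to upstream Ollama
    echo "ollama"
  fi
}

# Example usage:
choose_backend "03:00.0 VGA compatible controller: Intel Corporation [Arc A770]"
```

The update script could call the same check and, if the backend choice has changed since the last run, fetch the matching portable zip from the IPEX-LLM releases instead of the standard Ollama binary.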
Due Diligence