Replies: 2 comments 7 replies
- Same here...
- I think only libOpenCL.so is needed, so you can copy it to another directory and point LD_LIBRARY_PATH at that directory, avoiding overriding any other libraries. E.g. just copy it to the current dir (~/llama.cpp/build/bin). Though I'm not sure this really worked (or whether I went wrong somewhere else), because tokens/sec performance does not seem better than the version compiled without OpenCL. I need to do more testing... maybe it works better for you?
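The copy-one-library approach described above can be sketched as below. The paths and the model filename are assumptions for a typical Termux setup, not taken from the thread; adjust them for your device:

```shell
# Sketch of the "copy only libOpenCL.so" approach (assumed paths).
# Assumption: the vendor OpenCL driver lives in /vendor/lib64, which is
# common on Android but device-specific.
OCL_DIR=~/llama.cpp/build/bin          # directory llama.cpp runs from
cp /vendor/lib64/libOpenCL.so "$OCL_DIR"/

# Point the dynamic linker only at that directory, so no other vendor
# libraries shadow the system ones (unlike prepending all of /vendor/lib64).
export LD_LIBRARY_PATH="$OCL_DIR"

# Hypothetical model path; -ngl offloads layers to the GPU backend.
./main -m models/llama-7b.Q4_K_M.gguf -ngl 33
```

The point of copying a single .so instead of adding all of /vendor/lib64 to LD_LIBRARY_PATH is that vendor directories often contain copies of common libraries (libc++, OpenSSL, etc.) that can shadow the ones your binaries were linked against.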
- As written in the README, to run llama.cpp with CLBlast support on Android we need to set this before launching: export LD_LIBRARY_PATH=/vendor/lib64:$LD_LIBRARY_PATH
This works fine for main, but running server does not work, because it cannot find some required libs for networking.
This is the error:
CANNOT LINK EXECUTABLE "./server": cannot locate symbol "__emutls_get_address" referenced by "/data/data/com.termux/files/home/llama.cpp/server"...
Does anyone have an idea how to make that work?