Replies: 1 comment
I had directly downloaded it from the website. Today I downloaded it with a command and compiled it myself; now I no longer have this problem.
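The fix described in this reply (fetching the source with a command and compiling it) can be sketched as follows. This is a hedged sketch of a typical llama.cpp source build: the repository URL and the CMake flags are assumptions, not the exact commands used in the reply.

```shell
# Fetch the llama.cpp source and build it from scratch.
# -DBUILD_SHARED_LIBS=OFF produces static .lib files on Windows,
# which is typically what you want when linking into another project.
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build -DBUILD_SHARED_LIBS=OFF
cmake --build build --config Release
```

Building from current source ensures the libraries and their model loader match the file format of recently downloaded models.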
I have an issue loading a model. I am not familiar with llama.cpp, GGML, or GGUF.
My steps:
I downloaded the repo.
My CMake command:
I built it this way to make the .lib files compatible with Unreal Engine 5.
I made a new VS2022 C++ console app to test the .libs.
C/C++:
/permissive- /ifcOutput "x64\Release\" /GS /GL /W3 /Gy /Zc:wchar_t /I"C:\Users\gomi\source\repos\llamaTest\include\ggml" /I"C:\Users\gomi\source\repos\llamaTest\include\llama" /Gm- /O2 /sdl /Fd"x64\Release\vc143.pdb" /Zc:inline /fp:precise /D "NDEBUG" /D "_CONSOLE" /D "_CRT_SECURE_NO_WARNINGS" /D "_UNICODE" /D "UNICODE" /errorReport:prompt /WX- /Zc:forScope /Gd /Oi /MD /std:c++17 /FC /Fa"x64\Release\" /EHsc /nologo /Fo"x64\Release\" /Fp"x64\Release\llamaTest.pch" /diagnostics:column
Linker:
/OUT:"C:\Users\gomi\source\repos\llamaTest\x64\Release\llamaTest.exe" /MANIFEST /LTCG:incremental /NXCOMPAT /PDB:"C:\Users\gomi\source\repos\llamaTest\x64\Release\llamaTest.pdb" /DYNAMICBASE "ggml.lib" "ggml-base.lib" "ggml-cpu.lib" "llama.lib" "kernel32.lib" "user32.lib" "gdi32.lib" "winspool.lib" "comdlg32.lib" "advapi32.lib" "shell32.lib" "ole32.lib" "oleaut32.lib" "uuid.lib" "odbc32.lib" "odbccp32.lib" /DEBUG /MACHINE:X64 /OPT:REF /PGD:"C:\Users\gomi\source\repos\llamaTest\x64\Release\llamaTest.pgd" /SUBSYSTEM:CONSOLE /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /ManifestFile:"x64\Release\llamaTest.exe.intermediate.manifest" /LTCGOUT:"x64\Release\llamaTest.iobj" /OPT:ICF /ERRORREPORT:PROMPT /ILK:"x64\Release\llamaTest.ilk" /NOLOGO /LIBPATH:"C:\Users\gomi\source\repos\llamaTest\lib" /TLBID:1
I downloaded this model: https://huggingface.co/TheBloke/Llama-2-13B-chat-GGUF
Output:
Project files: llamaTest.zip
I get the same error for any model I download.
How can I solve this problem?
Thank you.