Git commit
9ab42dc (HEAD as of the time I post this)
Operating systems
Linux
GGML backends
SYCL
Problem description & steps to reproduce
I'm trying to compile llama.cpp with SYCL support on Arch Linux with the glibc 2.41+r2+g0a7c7a3e283a-1 and gcc 14.2.1+r730+gc061ad5a36ba-1 packages. Prior to installing these updates I had no problems with the compile. I am also using the Intel oneAPI Base Toolkit, version 2025.0.1.
First Bad Commit
N/A
Compile command
I do `rm -rf build` and then:
cmake -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON -DGGML_SYCL=ON -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DLLAMA_BUILD_TESTS=ON -DCMAKE_BUILD_TYPE=Release
followed by:
cmake --build build --config Release -j 10
Relevant log output
There are multiple instances of the following type of output:
In file included from /vol_KUtil2/LLM_Models/llama.cpp/ggml/src/ggml-sycl/conv.cpp:13:
In file included from /vol_KUtil2/LLM_Models/llama.cpp/ggml/src/ggml-sycl/conv.hpp:16:
In file included from /vol_KUtil2/LLM_Models/llama.cpp/ggml/src/ggml-sycl/common.hpp:16:
In file included from /usr/lib64/gcc/x86_64-pc-linux-gnu/14.2.1/../../../../include/c++/14.2.1/fstream:40:
In file included from /usr/lib64/gcc/x86_64-pc-linux-gnu/14.2.1/../../../../include/c++/14.2.1/istream:40:
In file included from /usr/lib64/gcc/x86_64-pc-linux-gnu/14.2.1/../../../../include/c++/14.2.1/ios:40:
In file included from /usr/lib64/gcc/x86_64-pc-linux-gnu/14.2.1/../../../../include/c++/14.2.1/iosfwd:42:
In file included from /usr/lib64/gcc/x86_64-pc-linux-gnu/14.2.1/../../../../include/c++/14.2.1/bits/postypes.h:40:
In file included from /usr/lib64/gcc/x86_64-pc-linux-gnu/14.2.1/../../../../include/c++/14.2.1/cwchar:44:
In file included from /usr/include/wchar.h:30:
/usr/include/bits/floatn.h:79:52: error: unsupported machine mode '__TC__'
79 | typedef _Complex float __cfloat128 __attribute__ ((__mode__ (__TC__)));
   |                                                    ^
Hello @dpriedel, I'm not sure the issue is coming from llama.cpp. The error seems to come from including <fstream> in your particular environment. Have you tried compiling a simpler C++ application that also includes <fstream>, using the same compiler, environment, and compilation flags that are used to compile ggml/src/ggml-sycl/conv.cpp?
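For example, something along these lines should hit the same include chain (a minimal sketch; the icpx flags in the comment are an assumption meant to mirror what CMake passes for the SYCL build, so adjust them to match your compile_commands.json):

// repro.cpp - standalone check that merely including <fstream> reproduces the error
// (hypothetical test file, not part of llama.cpp)
// Build roughly the way CMake builds conv.cpp, e.g. (assumed flags):
//   icpx -fsycl -std=c++17 -O2 repro.cpp -o repro
#include <fstream>

int main() {
    std::ofstream out("repro.txt");  // use the stream so the header is genuinely needed
    out << "hello\n";
    return 0;
}

If this fails with the same "unsupported machine mode '__TC__'" error, the problem lies in the compiler/glibc combination rather than in llama.cpp itself.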
I removed the include for fstream since it's not needed, but the same error recurs from the iostream header, which triggers the same sequence of includes and errors out on the complex-float typedef.
The code in the floatn.h header indicates it is there to support 'older' compilers, so I wonder if this is actually an Intel oneAPI compiler problem?
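As far as I can tell, the guards in bits/floatn.h key off __GNUC_PREREQ and __cplusplus, so it may help to see what GCC compatibility version icpx advertises. Below is a small probe I'd try (a hypothetical helper, not something from llama.cpp); it declares printf by hand so that it does not pull in any libc header that would itself include bits/floatn.h:

// probe.cpp - print the compiler-identification macros that glibc's
// bits/floatn.h guards appear to check (hypothetical helper).
// No standard headers are included, so this compiles even when those
// headers are what is breaking.
extern "C" int printf(const char *, ...);

int main() {
    printf("__cplusplus = %ld\n", __cplusplus);
#ifdef __GNUC__
    printf("__GNUC__    = %d.%d\n", __GNUC__, __GNUC_MINOR__);
#endif
#ifdef __clang_major__
    printf("clang       = %d.%d\n", __clang_major__, __clang_minor__);
#endif
    return 0;
}

Comparing that output against the version checks in /usr/include/bits/floatn.h should show which branch the compiler ends up in.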
Thanks.
PS. If I build for the Vulkan back-end, llama.cpp compiles.
JohannesGaessler changed the title from "Compile bug:" to "Compile bug: I'm trying to compile llama.cpp with SYCL support on Arch Linux with the glibc 2.41+r2+g0a7c7a3e283a-1" on Feb 6, 2025.