error: implicit declaration of function ‘getcpu’ #5537

Closed
davesworking opened this issue Feb 16, 2024 · 9 comments

Comments

@davesworking

[dave@Rocky8_6 build]$ cmake --build . --config Release

[  1%] Building C object CMakeFiles/ggml.dir/ggml.c.o
/home/dstiff/Repos/GitHub/ggerganov/llama.cpp/ggml.c: In function ‘ggml_numa_init’:
/home/dstiff/Repos/GitHub/ggerganov/llama.cpp/ggml.c:2054:22: error: implicit declaration of function ‘getcpu’; did you mean ‘getopt’? [-Werror=implicit-function-declaration]
     int getcpu_ret = getcpu(&current_cpu, &g_state.numa.current_node);
                      ^~~~~~
                      getopt
cc1: some warnings being treated as errors
@ashishbharthi

I ran into this issue as well. I am trying to run on Amazon Linux 2.

@nbbull

nbbull commented Feb 18, 2024

I hit the same error.

@DeadBranches

On aarch64, Termux:

~/.../llama.cpp/build $ cmake .. -DBUILD_SHARED_LIBS=ON -DCMAKE_INSTALL_PREFIX=$PREFIX -DLLAMA_CLBLAST=OFD -DLLAMA_OPENBLAS=ON
-- The C compiler identification is Clang 17.0.6
-- The CXX compiler identification is Clang 17.0.6
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /data/data/com.termux/files/usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /data/data/com.termux/files/usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: /data/data/com.termux/files/usr/bin/git (found version "2.43.2")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Check if compiler accepts -pthread
-- Check if compiler accepts -pthread - yes
-- Found Threads: TRUE
-- CLBlast found
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with LLAMA_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: aarch64
-- ARM detected
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E
-- Performing Test COMPILER_SUPPORTS_FP16_FORMAT_I3E - Failed
-- Configuring done (2.6s)
-- Generating done (0.2s)
CMake Warning:
  Manually-specified variables were not used by the project:

    LLAMA_OPENBLAS


-- Build files have been written to: /data/data/com.termux/files/home/dl-repos/llama.cpp/build
~/.../llama.cpp/build $ cmake --build . --config Release
[  1%] Building C object CMakeFiles/ggml.dir/ggml.c.o
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:1315:5: warning: implicit conversion increases floating-point precision: 'float32_t' (aka 'float') to 'ggml_float' (aka 'double') [-Wdouble-promotion]
 1315 |     GGML_F16_VEC_REDUCE(sumf, sum);
      |     ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:830:41: note: expanded from macro 'GGML_F16_VEC_REDUCE'
  830 |     #define GGML_F16_VEC_REDUCE         GGML_F32Cx4_REDUCE
      |                                         ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:820:38: note: expanded from macro 'GGML_F32Cx4_REDUCE'
  820 |     #define GGML_F32Cx4_REDUCE       GGML_F32x4_REDUCE
      |                                      ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:750:11: note: expanded from macro 'GGML_F32x4_REDUCE'
  750 |     res = GGML_F32x4_REDUCE_ONE(x[0]);         \
      |         ~ ^~~~~~~~~~~~~~~~~~~~~~~~~~~
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:735:34: note: expanded from macro 'GGML_F32x4_REDUCE_ONE'
  735 | #define GGML_F32x4_REDUCE_ONE(x) vaddvq_f32(x)
      |                                  ^~~~~~~~~~~~~
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:1363:9: warning: implicit conversion increases floating-point precision: 'float32_t' (aka 'float') to 'ggml_float' (aka 'double') [-Wdouble-promotion]
 1363 |         GGML_F16_VEC_REDUCE(sumf[k], sum[k]);
      |         ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:830:41: note: expanded from macro 'GGML_F16_VEC_REDUCE'
  830 |     #define GGML_F16_VEC_REDUCE         GGML_F32Cx4_REDUCE
      |                                         ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:820:38: note: expanded from macro 'GGML_F32Cx4_REDUCE'
  820 |     #define GGML_F32Cx4_REDUCE       GGML_F32x4_REDUCE
      |                                      ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:750:11: note: expanded from macro 'GGML_F32x4_REDUCE'
  750 |     res = GGML_F32x4_REDUCE_ONE(x[0]);         \
      |         ~ ^~~~~~~~~~~~~~~~~~~~~~~~~~~
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:735:34: note: expanded from macro 'GGML_F32x4_REDUCE_ONE'
  735 | #define GGML_F32x4_REDUCE_ONE(x) vaddvq_f32(x)
      |                                  ^~~~~~~~~~~~~
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:1963:5: error: unknown type name 'cpu_set_t'
 1963 |     cpu_set_t cpuset; // cpuset from numactl
      |     ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2001:8: error: unknown type name 'cpu_set_t'
 2001 | static cpu_set_t ggml_get_numa_affinity(void) {
      |        ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2002:5: error: use of undeclared identifier 'cpu_set_t'
 2002 |     cpu_set_t cpuset;
      |     ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2005:5: error: call to undeclared function 'CPU_ZERO'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
 2005 |     CPU_ZERO(&cpuset);
      |     ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2005:15: error: use of undeclared identifier 'cpuset'
 2005 |     CPU_ZERO(&cpuset);
      |               ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2006:5: error: call to undeclared function 'pthread_getaffinity_np'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
 2006 |     pthread_getaffinity_np(thread, sizeof(cpu_set_t), &cpuset);
      |     ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2006:5: note: did you mean 'pthread_getattr_np'?
/data/data/com.termux/files/usr/include/pthread.h:190:5: note: 'pthread_getattr_np' declared here
  190 | int pthread_getattr_np(pthread_t __pthread, pthread_attr_t* _Nonnull __attr);
      |     ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2006:43: error: use of undeclared identifier 'cpu_set_t'
 2006 |     pthread_getaffinity_np(thread, sizeof(cpu_set_t), &cpuset);
      |                                           ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2006:56: error: use of undeclared identifier 'cpuset'
 2006 |     pthread_getaffinity_np(thread, sizeof(cpu_set_t), &cpuset);
      |                                                        ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2007:12: error: use of undeclared identifier 'cpuset'
 2007 |     return cpuset;
      |            ^
/data/data/com.termux/files/home/dl-repos/llama.cpp/ggml.c:2054:22: error: call to undeclared function 'getcpu'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
 2054 |     int getcpu_ret = getcpu(&current_cpu, &g_state.numa.current_node);
      |                      ^
2 warnings and 10 errors generated.
make[2]: *** [CMakeFiles/ggml.dir/build.make:76: CMakeFiles/ggml.dir/ggml.c.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:742: CMakeFiles/ggml.dir/all] Error 2
make: *** [Makefile:146: all] Error 2

@Zhenzhong1

Zhenzhong1 commented Feb 19, 2024

Hi, I got the same error on Linux x86_64 arch.


Similar issue:
#5562

Related PR:
#5377

git reset --hard 60ed04cf82dc91ade725dd7ad53f0ee81f76eccf

Resetting to this commit (the last one before #5377 was merged) works.

@bmtwl
Contributor

bmtwl commented Feb 19, 2024

This should be fixed now.
Solution PR #5557: master...bmtwl:llama.cpp:master
Please let me know if the issue persists.

@dspasyuk
Contributor

dspasyuk commented Mar 5, 2024

Has this issue been resolved? I am still getting this error with a fresh install. @bmtwl, the solution in your link gives me the same error. Or am I missing something?

/cvmfs/soft.computecanada.ca/gentoo/2020/usr/x86_64-pc-linux-gnu/binutils-bin/2.33.1/ld: ggml.o: in function `ggml_numa_init':
ggml.c:(.text+0x973a): undefined reference to `getcpu'
collect2: error: ld returned 1 exit status
make: *** [Makefile:830: q8dot] Error 1
/cvmfs/soft.computecanada.ca/gentoo/2020/usr/x86_64-pc-linux-gnu/binutils-bin/2.33.1/ld: ggml.o: in function `ggml_numa_init':
ggml.c:(.text+0x973a): undefined reference to `getcpu'
/cvmfs/soft.computecanada.ca/gentoo/2020/usr/x86_64-pc-linux-gnu/binutils-bin/2.33.1/ld: ggml.o: in function `ggml_numa_init':
ggml.c:(.text+0x973a): undefined reference to `getcpu'
collect2: error: ld returned 1 exit status
make: *** [Makefile:817: benchmark-matmult] Error 1
collect2: error: ld returned 1 exit status

@bmtwl
Contributor

bmtwl commented Mar 6, 2024

@dspasyuk Hmmm, your system should be able to use the stable syscall wrapper since your glibc is above 2.28.
If you change ggml.c line 2153 to:
#if __GLIBC__ > 2 || (__GLIBC__ == 2 && __GLIBC_MINOR__ > 33)

does it compile?
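
For context, a minimal standalone sketch of that approach (not the actual ggml.c code; my_getcpu is just an illustrative name): keep the glibc getcpu() wrapper behind the version guard above and fall back to the raw syscall everywhere else.

```c
// Sketch only: assumes Linux; the version guard mirrors the check suggested above.
#define _GNU_SOURCE
#include <stdio.h>
#include <sched.h>        // getcpu() wrapper on newer glibc
#include <unistd.h>       // syscall()
#include <sys/syscall.h>  // SYS_getcpu

static int my_getcpu(unsigned *cpu, unsigned *node) {
#if defined(__GLIBC__) && (__GLIBC__ > 2 || (__GLIBC__ == 2 && __GLIBC_MINOR__ > 33))
    // New enough glibc: use the library wrapper.
    return getcpu(cpu, node);
#else
    // Older glibc, musl, or bionic (Termux): call the kernel directly.
    return (int) syscall(SYS_getcpu, cpu, node, NULL);
#endif
}

int main(void) {
    unsigned cpu = 0, node = 0;
    if (my_getcpu(&cpu, &node) == 0) {
        printf("cpu %u, numa node %u\n", cpu, node);
    }
    return 0;
}
```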

@M3Dade

M3Dade commented Mar 27, 2024

@dspasyuk Hmmm, your system should be able to use the stable syscall wrapper since your glibc is above 2.28. If you change ggml.c line 2153 to: #if __GLIBC__ > 2 || (__GLIBC__ == 2 && __GLIBC_MINOR__ > 33)

does it compile?

@bmtwl Hello, my glibc version is 2.29. I solved the same problem that @dspasyuk asked about by following your method. Thanks for your help.
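
For anyone checking which branch of that guard applies on their system, a tiny throwaway program like the one below prints the glibc version macros the compiler actually sees; it is just a diagnostic sketch, not part of llama.cpp.

```c
// Diagnostic sketch: print the libc version macros the compiler targets.
#include <stdio.h>

int main(void) {
#if defined(__GLIBC__)
    printf("glibc %d.%d\n", __GLIBC__, __GLIBC_MINOR__);
#else
    printf("not glibc (e.g. musl or bionic)\n");
#endif
    return 0;
}
```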

@github-actions github-actions bot added the stale label Apr 27, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.
