llama-server standalone #5
llama-server.yml
on: workflow_dispatch
Matrix jobs (see the trigger/matrix sketch after the job table): macos-arm64, ubuntu-hip, ubuntu-sycl, windows-cpu, windows-cuda, windows-hip

Job | Duration
---|---
ubuntu-cpu | 1m 41s
ubuntu-cuda | 11m 6s
ubuntu-vulkan | 4m 18s
ubuntu-musa | 12m 14s
windows-vulkan | 4m 14s
windows-kompute | 5m 25s
windows-sycl | 10m 59s
release | 6m 17s
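The run was started manually (on: workflow_dispatch) and fans out over several build matrices. For context, a minimal sketch of how such a trigger and matrix are typically declared in a GitHub Actions workflow; the job name and the avx512 defines are taken from this run's annotations below, while the rest is an assumption and not the contents of the real llama-server.yml.

```yaml
# Illustrative sketch only -- not the actual llama-server.yml.
# Shows the manual trigger plus one matrix build job, mirroring the
# job and variant names reported in this run.
name: llama-server standalone

on: workflow_dispatch        # runs only when started manually from the Actions UI

jobs:
  windows-cpu:
    runs-on: windows-latest
    strategy:
      matrix:
        # hypothetical entries; the real workflow defines its own variants
        include:
          - build: avx512
            defines: "-DGGML_NATIVE=OFF -DGGML_AVX512=ON"   # flags seen in the failing job below
          - build: avx2
            defines: "-DGGML_NATIVE=OFF -DGGML_AVX2=ON"
    steps:
      - uses: actions/checkout@v4
      - name: Build llama-server
        run: |
          cmake -B build ${{ matrix.defines }}
          cmake --build build --config Release --target llama-server
```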
Annotations
5 errors and 2 warnings
Severity | Job | Message
---|---|---
error | windows-cpu (avx512, -DGGML_NATIVE=OFF -DGGML_AVX512=ON) | Process completed with exit code 1.
error | ubuntu-hip (native) | Process completed with exit code 2.
error | ubuntu-hip (legacy) | The job was canceled because "native" failed.
error | ubuntu-hip (legacy) | The operation was canceled.
error | ubuntu-musa | Process completed with exit code 2.
warning | ubuntu-cuda | ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
warning | release | ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
Artifacts
Produced during runtime
Name | Size
---|---
llama-server--aarch64-apple-darwin--cpu | 2.73 MB
llama-server--aarch64-apple-darwin--metal | 2.83 MB
llama-server--x86_64-pc-windows-msvc--avx | 2.47 MB
llama-server--x86_64-pc-windows-msvc--avx2 | 2.48 MB
llama-server--x86_64-pc-windows-msvc--avx512 | 2.49 MB
llama-server--x86_64-pc-windows-msvc--cuda-11.7 | 145 MB
llama-server--x86_64-pc-windows-msvc--cuda-12.4 | 145 MB
llama-server--x86_64-pc-windows-msvc--kompute | 2.66 MB
llama-server--x86_64-pc-windows-msvc--noavx | 2.45 MB
llama-server--x86_64-pc-windows-msvc--openblas | 2.49 MB
llama-server--x86_64-pc-windows-msvc--sycl | 101 MB
llama-server--x86_64-pc-windows-msvc--vulkan | 4.72 MB
llama-server--x86_64-pc-windows-msvc--x86_64 | 2.45 MB
llama-server--x86_64-unknown-linux-gnu--cpu | 2.92 MB
llama-server--x86_64-unknown-linux-gnu--cuda-12.6 | 20.2 MB
llama-server--x86_64-unknown-linux-gnu--sycl-default | 3.53 MB
llama-server--x86_64-unknown-linux-gnu--sycl-fp16 | 3.55 MB
llama-server--x86_64-unknown-linux-gnu--vulkan | 6 MB
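Artifacts like these are usually uploaded by each build job and then collected by a later job such as the release job in this run. Below is an assumed sketch of that pairing using the standard actions/upload-artifact and actions/download-artifact actions; only the llama-server--&lt;target&gt;--&lt;backend&gt; naming comes from the table above, and the real workflow's steps may differ.

```yaml
# Illustrative sketch, not the actual upload/release steps from llama-server.yml.
jobs:
  ubuntu-cpu:
    runs-on: ubuntu-latest
    steps:
      # ...build steps...
      - name: Upload server binary
        uses: actions/upload-artifact@v4
        with:
          # artifact name pattern matches the table above
          name: llama-server--x86_64-unknown-linux-gnu--cpu
          path: build/bin/llama-server

  release:
    needs: [ubuntu-cpu]          # the real release job would depend on all build jobs
    runs-on: ubuntu-latest
    steps:
      - name: Download all artifacts from this run
        uses: actions/download-artifact@v4
        with:
          path: artifacts        # each artifact lands in its own subdirectory
      - name: List what was collected
        run: ls -R artifacts
```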