llama-server standalone #17
Workflow file: llama-server.yml
Trigger: workflow_dispatch
Jobs
Job | Duration
---|---
Matrix: macos-arm64 |
Matrix: ubuntu-cuda |
Matrix: windows-cpu |
Matrix: windows-cuda |
ubuntu-cpu | 1m 37s
ubuntu-vulkan | 0s
ubuntu-musa | 0s
windows-vulkan | 0s
windows-kompute | 0s
windows-sycl | 0s
Matrix: ubuntu-hip |
Matrix: ubuntu-sycl |
Matrix: windows-hip |
release | 13s
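The run was started manually (workflow_dispatch) and fans out into one job per platform/backend, several of them driven by a build matrix. A minimal sketch of what such a workflow can look like; the job names, matrix values, and artifact name below are taken from this run summary, while the build and upload steps are assumptions, not the actual contents of llama-server.yml:

```yaml
# Illustrative sketch only; not the real llama-server.yml.
name: llama-server standalone

on: workflow_dispatch

jobs:
  ubuntu-cpu:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build llama-server (CPU only)
        run: |
          cmake -B build
          cmake --build build --config Release --target llama-server
      - name: Upload binary
        uses: actions/upload-artifact@v4
        with:
          name: llama-server--x86_64-unknown-linux-gnu--cpu
          path: build/bin/llama-server

  windows-cuda:
    runs-on: windows-latest
    strategy:
      matrix:
        cuda: ["11.7", "12.4"]
    steps:
      - uses: actions/checkout@v4
      # CUDA toolkit setup and the CUDA-enabled build are omitted in this sketch.
```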
Annotations
6 errors and 2 warnings
Errors:
- ubuntu-cuda (12.6.2, ubuntu24.04, 89-real;90-real, ada-lovelace-hopper): Process completed with exit code 127.
- windows-cuda (11.7): The run was canceled by @anagri.
- windows-cuda (11.7): The operation was canceled.
- windows-cuda (12.4): The run was canceled by @anagri.
- windows-cuda (12.4): The operation was canceled.
- release: Resource not accessible by integration

Warnings:
- ubuntu-cuda (12.6.2, ubuntu24.04, 89-real;90-real, ada-lovelace-hopper): ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
- release: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
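The release error above, "Resource not accessible by integration", is the message GitHub Actions emits when the job's GITHUB_TOKEN lacks a permission a step needs, most often write access to repository contents when creating a release or uploading release assets. A hedged sketch of the usual fix, assuming the release job publishes via the GITHUB_TOKEN (the job body here is illustrative, not the actual workflow):

```yaml
# Sketch only: grant the release job write access to repository contents,
# the common fix for "Resource not accessible by integration" on release steps.
release:
  runs-on: ubuntu-latest
  permissions:
    contents: write
```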
Artifacts
Produced during runtime
Name | Size
---|---
llama-server--aarch64-apple-darwin--cpu | 2.73 MB
llama-server--aarch64-apple-darwin--metal | 2.83 MB
llama-server--x86_64-pc-windows-msvc--cpu | 2.45 MB
llama-server--x86_64-unknown-linux-gnu--cpu | 2.92 MB