llama-server standalone #17

Manually triggered January 27, 2025 11:45
Status: Cancelled
Total duration: 4m 18s
Artifacts: 4

llama-server.yml

on: workflow_dispatch
Matrix: macos-arm64
Matrix: ubuntu-cuda
Matrix: windows-cpu
Matrix: windows-cuda
ubuntu-vulkan (0s)
ubuntu-musa (0s)
windows-vulkan (0s)
windows-kompute (0s)
windows-sycl (0s)
Matrix: ubuntu-hip
Matrix: ubuntu-sycl
Matrix: windows-hip
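
The run page shows only the job graph, not the workflow definition. As a rough illustration of how a workflow_dispatch trigger combines matrix jobs (for example windows-cuda across CUDA versions) with single jobs such as ubuntu-vulkan, a minimal sketch follows; the matrix key, runner labels, and build steps are assumptions, not the actual contents of llama-server.yml.

name: llama-server standalone

on:
  workflow_dispatch:             # manual trigger, matching "Manually triggered" above

jobs:
  windows-cuda:
    strategy:
      matrix:
        cuda: ["11.7", "12.4"]   # versions taken from the job labels in this run
    runs-on: windows-latest      # assumed runner
    steps:
      - uses: actions/checkout@v4
      - name: Build llama-server
        run: echo "build steps for CUDA ${{ matrix.cuda }} would go here"

  ubuntu-vulkan:
    runs-on: ubuntu-latest       # single (non-matrix) job, like the 0s jobs above
    steps:
      - uses: actions/checkout@v4
      - name: Build llama-server
        run: echo "Vulkan build steps would go here"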

Annotations

6 errors and 2 warnings

Errors:
ubuntu-cuda (12.6.2, ubuntu24.04, 89-real;90-real, ada-lovelace-hopper): Process completed with exit code 127.
windows-cuda (11.7): The run was canceled by @anagri.
windows-cuda (11.7): The operation was canceled.
windows-cuda (12.4): The run was canceled by @anagri.
windows-cuda (12.4): The operation was canceled.
release: Resource not accessible by integration

Warnings:
ubuntu-cuda (12.6.2, ubuntu24.04, 89-real;90-real, ada-lovelace-hopper): ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
release: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
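
The 'Resource not accessible by integration' error on the release job is the message GitHub Actions returns when the workflow's GITHUB_TOKEN lacks the permission an API call requires; creating a release needs write access to repository contents. A minimal sketch of granting that permission in the workflow file follows; the job name and steps are assumptions about llama-server.yml, not its actual contents.

permissions:
  contents: write    # lets GITHUB_TOKEN create releases and upload assets

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # release-creation steps would follow here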

Artifacts

Produced during runtime
Name                                          Size
llama-server--aarch64-apple-darwin--cpu       2.73 MB
llama-server--aarch64-apple-darwin--metal     2.83 MB
llama-server--x86_64-pc-windows-msvc--cpu     2.45 MB
llama-server--x86_64-unknown-linux-gnu--cpu   2.92 MB