llama : remove check flash_attn with lora #18103
Triggered via pull request on January 6, 2025, 11:29
Status: Success
Total duration: 1h 23m 57s
Artifacts: –
build.yml
on: pull_request
Matrix: windows-2019-cmake-cuda
Matrix: windows-latest-cmake-hip-release
Matrix: windows-latest-cmake
macOS-latest-cmake-arm64: 12m 27s
macOS-latest-cmake-x64: 3m 25s
ubuntu-latest-cmake: 2m 38s
macOS-latest-cmake: 11m 20s
ubuntu-latest-cmake-rpc: 2m 34s
ubuntu-22-cmake-vulkan: 11m 30s
ubuntu-22-cmake-hip: 19m 27s
ubuntu-22-cmake-musa: 11m 48s
ubuntu-22-cmake-sycl: 4m 43s
ubuntu-22-cmake-sycl-fp16: 4m 47s
macOS-latest-cmake-ios: 1m 21s
macOS-latest-cmake-tvos: 1m 10s
ubuntu-latest-cmake-cuda: 11m 57s
windows-latest-cmake-sycl: 12m 41s
windows-latest-cmake-hip: 20m 1s
ios-xcode-build: 1m 18s
android-build: 6m 21s
Matrix: macOS-latest-swift
Matrix: ubuntu-latest-cmake-sanitizer
Matrix: windows-msys2
release: 0s
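The "Matrix:" entries above are jobs that build.yml expands via a matrix strategy, so a single definition fans out into several runs in this summary. A minimal sketch of such a matrix job, assuming a typical GitHub Actions layout; the runner label, matrix values, and build steps are illustrative assumptions, not the actual llama.cpp build.yml:

```yaml
# Minimal sketch, assuming a conventional GitHub Actions workflow.
# The matrix values and steps are illustrative, not the real build.yml.
name: build
on: pull_request

jobs:
  windows-latest-cmake:
    runs-on: windows-latest
    strategy:
      matrix:
        # Each matrix entry expands into its own job run in the summary above.
        build: [avx2, vulkan]
    steps:
      - uses: actions/checkout@v4
      - name: Configure and build
        run: |
          cmake -B build
          cmake --build build --config Release
```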
Annotations: 1 error and 10 warnings