[Bugfix][VLM] Fix Fuyu batching inference with max_num_seqs>1 #21664
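For context, a minimal sketch of how the affected path might be exercised: batched multimodal inference for Fuyu through vLLM's offline API with max_num_seqs > 1. The model name (adept/fuyu-8b), prompt strings, and image path below are illustrative assumptions and are not taken from this PR.

```python
# Sketch (assumptions noted above): send two image prompts through vLLM's
# offline API with max_num_seqs > 1 so the scheduler batches sequences together.
from PIL import Image
from vllm import LLM, SamplingParams

# max_num_seqs > 1 lets multiple sequences be scheduled in the same batch,
# which is the batching scenario this bugfix targets.
llm = LLM(model="adept/fuyu-8b", max_num_seqs=2)  # model name is an assumption
sampling_params = SamplingParams(temperature=0.0, max_tokens=64)

image = Image.open("example.jpg")  # placeholder image path

outputs = llm.generate(
    [
        {"prompt": "What does this image show?\n",
         "multi_modal_data": {"image": image}},
        {"prompt": "Describe the scene in one sentence.\n",
         "multi_modal_data": {"image": image}},
    ],
    sampling_params,
)

for output in outputs:
    print(output.outputs[0].text)
```

With max_num_seqs=1 each request would run alone, so the batched code path is only hit when this limit is raised above one.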
