
Commit
Remove batch size argument warning when unjustified (huggingface#35519)
* use max batch size

* revert unnecessary change

---------

Co-authored-by: Raushan Turganbay <raushan@huggingface.co>
2 people authored and elvircrn committed Feb 13, 2025
1 parent 0e66689 commit 0769d42
Showing 1 changed file with 1 addition and 1 deletion.
src/transformers/cache_utils.py (1 addition, 1 deletion)

@@ -1142,7 +1142,7 @@ def __init__(
         self.key_cache: List[torch.Tensor] = []
         self.value_cache: List[torch.Tensor] = []
         # Note: There will be significant perf decrease if switching to use 5D tensors instead.
-        cache_shape = (self.batch_size, self.num_key_value_heads, self.max_cache_len, self.head_dim)
+        cache_shape = (self.max_batch_size, self.num_key_value_heads, self.max_cache_len, self.head_dim)
         for idx in range(config.num_hidden_layers):
             if layer_device_map is not None:
                 layer_device = layer_device_map[idx]
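The change above sizes the preallocated static KV cache by `max_batch_size` (the cache's capacity) rather than `batch_size`, so that passing a smaller runtime batch no longer triggers a spurious warning. A minimal standalone sketch of that preallocation (not the actual transformers implementation; the function name and parameters here are illustrative):

```python
import torch

def build_static_cache(max_batch_size: int,
                       num_key_value_heads: int,
                       max_cache_len: int,
                       head_dim: int,
                       num_hidden_layers: int,
                       dtype: torch.dtype = torch.float32):
    """Preallocate per-layer key/value buffers for a static KV cache.

    Buffers are sized by max_batch_size (capacity), mirroring the fix in the
    diff: a smaller runtime batch simply uses a slice of the allocation.
    """
    cache_shape = (max_batch_size, num_key_value_heads, max_cache_len, head_dim)
    key_cache = [torch.zeros(cache_shape, dtype=dtype) for _ in range(num_hidden_layers)]
    value_cache = [torch.zeros(cache_shape, dtype=dtype) for _ in range(num_hidden_layers)]
    return key_cache, value_cache
```

Because the buffers are allocated once at capacity, a batch of any size up to `max_batch_size` can write into them without reallocation, which is the point of a static cache (e.g. for CUDA-graph or `torch.compile` friendliness).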
