assisted_decoding called directly inside generate, triggering a deprecation warning when it shouldn't #29860

Closed
2 of 4 tasks
ofirzaf opened this issue Mar 25, 2024 · 1 comment · Fixed by #29585

@ofirzaf
Contributor

ofirzaf commented Mar 25, 2024

System Info

  • transformers version: 4.39.1
  • Python version: 3.10.14
  • Huggingface_hub version: 0.22.0
  • Safetensors version: 0.4.2
  • Accelerate version: not installed
  • Accelerate config: not found
  • PyTorch version (GPU?): 2.2.1+cpu (False)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: No
  • Using distributed or parallel set-up in script?: No

Who can help?

@gante

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

out = model.generate(
    **inputs,
    max_new_tokens=128,
    streamer=TextStreamer(tokenizer=tokenizer, skip_special_tokens=True),
    pad_token_id=tokenizer.eos_token_id,
    prompt_lookup_num_tokens=3,
)

This triggers the following warning:

Calling `_assisted_decoding` directly is deprecated and will be removed in v4.41. Use `generate` or a custom generation loop instead.
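For context, a self-contained variant of the snippet above (the checkpoint name and prompt are assumptions on my part; the original report does not include the model/tokenizer setup):

from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

model_name = "gpt2"  # hypothetical checkpoint; any causal LM should reproduce the warning
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("The quick brown fox jumps over the", return_tensors="pt")

out = model.generate(
    **inputs,
    max_new_tokens=128,
    streamer=TextStreamer(tokenizer=tokenizer, skip_special_tokens=True),
    pad_token_id=tokenizer.eos_token_id,
    prompt_lookup_num_tokens=3,  # routes generation through assisted decoding
)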

This probably happens because of the following call inside generate, which should be changed to self._assisted_decoding(...):

result = self.assisted_decoding(
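An illustrative sketch of the mechanism (not the actual transformers source): the public method is a deprecated shim around the private implementation, so routing generate through the public name fires the warning on every call:

import warnings

class FakeGenerationMixin:
    def _assisted_decoding(self, *args, **kwargs):
        # private method holding the real implementation
        return "generated tokens"

    def assisted_decoding(self, *args, **kwargs):
        # deprecated public shim: warns, then delegates to the private method
        warnings.warn(
            "Calling `_assisted_decoding` directly is deprecated ...", FutureWarning
        )
        return self._assisted_decoding(*args, **kwargs)

    def generate(self, *args, **kwargs):
        # the bug described above: generate calls the deprecated public shim
        # instead of self._assisted_decoding(...), so the warning always fires
        return self.assisted_decoding(*args, **kwargs)

FakeGenerationMixin().generate()  # warns even though the user never called assisted_decoding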

Expected behavior

The warning shouldn't be triggered.
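Until the fix is released, one possible interim workaround (my suggestion, not from the report) is to lower the transformers log verbosity, assuming the notice is emitted through the library's logger as its deprecation messages typically are; note this hides all transformers warnings, not just this one:

import transformers

transformers.logging.set_verbosity_error()  # suppress warning-level messages from the transformers logger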

@zucchini-nlp
Member

Thanks for opening an issue. Yes, there is a PR that fixes this behavior. You can update transformers with
!pip install --upgrade git+https://github.com/huggingface/transformers.git

after the PR is merged 🤗
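A quick way to confirm the upgrade took effect (the exact dev version string will vary):

import transformers

print(transformers.__version__)  # a dev build newer than 4.39.1; re-running the snippet should no longer warn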
