Here `lookahead_generation` doesn't take `logits_warper` as input:

PainlessInferenceAcceleration/pia/lookahead/common/pretrained_model_batch.py
Lines 426 to 439 in 8015f12
`logits_warper` is used in the original `sample` to modify `next_tokens_scores`:

PainlessInferenceAcceleration/pia/lookahead/common/pretrained_model_batch.py
Lines 474 to 486 in 8015f12
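For context, the stock sampling loop applies the warper chain to the scores right before drawing the next token. A minimal runnable sketch of that pattern (simplified from transformers' `sample`; the prompt ids and logits below are stand-ins):

```python
import torch
from transformers import (
    LogitsProcessorList,
    TemperatureLogitsWarper,
    TopKLogitsWarper,
    TopPLogitsWarper,
)

# The same kind of warper chain transformers assembles from generation_config:
logits_warper = LogitsProcessorList([
    TemperatureLogitsWarper(0.7),
    TopKLogitsWarper(top_k=50),
    TopPLogitsWarper(top_p=0.9),
])

input_ids = torch.tensor([[1, 2, 3]])      # stand-in prompt ids
next_token_logits = torch.randn(1, 32000)  # stand-in model logits

# This is the step lookahead_generation skips:
next_token_scores = logits_warper(input_ids, next_token_logits)  # temperature/top_k/top_p act here
probs = torch.nn.functional.softmax(next_token_scores, dim=-1)
next_tokens = torch.multinomial(probs, num_samples=1).squeeze(1)
```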
and modifies the logits by temperature, top_k, top_p, etc.:

```python
if generation_config.temperature is not None and generation_config.temperature != 1.0:
    warpers.append(TemperatureLogitsWarper(generation_config.temperature))
if generation_config.top_k is not None and generation_config.top_k != 0:
    warpers.append(TopKLogitsWarper(top_k=generation_config.top_k, min_tokens_to_keep=min_tokens_to_keep))
if generation_config.top_p is not None and generation_config.top_p < 1.0:
    warpers.append(TopPLogitsWarper(top_p=generation_config.top_p, min_tokens_to_keep=min_tokens_to_keep))
```
https://github.com/huggingface/transformers/blob/09f9f566de83eef1f13ee83b5a1bbeebde5c80c1/src/transformers/generation/utils.py#L728-L733
This is not applied inside `lookahead_generation`, so with `do_sample=True` the temperature is effectively always 1 (and the `top_k`/`top_p` warpers are never applied either).
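A minimal sketch of the kind of fix I'd expect, assuming the sampling branch of `lookahead_generation` draws tokens from softmaxed scores (hypothetical; the actual variable names in `pretrained_model_batch.py` may differ):

```python
# Hypothetical: pass logits_warper through to lookahead_generation and
# apply it before sampling, mirroring the stock `sample` implementation.
if do_sample:
    next_token_scores = logits_warper(input_ids, next_token_scores)  # temperature/top_k/top_p
    probs = torch.nn.functional.softmax(next_token_scores, dim=-1)
    next_tokens = torch.multinomial(probs, num_samples=1).squeeze(1)
else:
    next_tokens = torch.argmax(next_token_scores, dim=-1)
```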
Thank you, we will fix it soon.
@zheyishine Hi, has there been any progress?