
[vllm] leverage passthrough for generation_config engine argument #2727

Merged
merged 1 commit into deepjavalibrary:master on Feb 6, 2025

Conversation

siddvenk (Contributor) commented on Feb 6, 2025

Description

This PR removes the generation_config parameter from the vLLM properties. Since generation_config is a vLLM engine argument, it can be leveraged via pass-through instead.

I validated this on GPU: when OPTION_GENERATION_CONFIG=auto is specified, vLLM will pick up generation_config.json if present.
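
For context, the pass-through ultimately just forwards the value to vLLM's generation_config engine argument. A minimal sketch of the equivalent direct vLLM usage (model name, prompt, and the exact vLLM version behavior are assumptions for illustration, not part of this PR):

```python
# Sketch of what the pass-through amounts to on the vLLM side.
# Assumption: a vLLM version that exposes the generation_config engine
# argument on the LLM entrypoint; model name and prompt are placeholders.
from vllm import LLM, SamplingParams

# Equivalent of OPTION_GENERATION_CONFIG=auto / option.generation_config=auto:
# vLLM reads generation_config.json from the model (if present) and uses it
# for the default sampling parameters.
llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct", generation_config="auto")

# No request-level sampling params: the generation_config.json defaults apply
# (e.g. temperature=0.36, top_p=0.9 in the logs below).
outputs = llm.chat([{"role": "user", "content": "Hello"}])

# Request-level sampling params still override those defaults, as in the
# second log excerpt below (temperature=0.6, max_tokens=512).
outputs = llm.chat(
    [{"role": "user", "content": "Hello"}],
    SamplingParams(temperature=0.6, max_tokens=512),
)
```

The log excerpts below show the same behavior end to end through the LMI handler.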

I added some logging to test:

Without sampling params in the payload:

INFO  PyProcess W-3507-08a20b291e447c9-stdout: INFO::default sampling params are default_sampling_params={'temperature': 0.36, 'top_p': 0.9}
INFO  PyProcess W-3507-08a20b291e447c9-stdout: INFO::using sampling params sampling_params=SamplingParams(n=1, presence_penalty=0.0, frequency_penalty=0.0, repetition_penalty=1.0, temperature=0.36, top_p=0.9, top_k=-1, min_p=0.0, seed=None, stop=[], stop_token_ids=[], bad_words=[], include_stop_str_in_output=False, ignore_eos=False, max_tokens=131012, min_tokens=0, logprobs=None, prompt_logprobs=None, skip_special_tokens=True, spaces_between_special_tokens=True, truncate_prompt_tokens=None, guided_decoding=None)
INFO  PyProcess W-3507-08a20b291e447c9-stdout: INFO::[RequestId=92a42946-013e-4b18-99fc-39dd6085ea0c] parsed and scheduled for inference

With sampling params overridden in the payload:

INFO  WorkerThread Starting worker thread WT-0001 for model meta_llama_Llama_3_1_8B_Instruct_-1887791483 (M-0001, READY) on device gpu(0)
INFO  ModelServer Initialize BOTH server with: EpollServerSocketChannel.
INFO  ModelServer BOTH API bind to: http://0.0.0.0:8080
INFO  PyProcess W-2859-08a20b291e447c9-stdout: INFO 02-06 22:44:17 chat_utils.py:330] Detected the chat template content format to be 'string'. You can set `--chat-template-content-format` to override this.
INFO  PyProcess W-2859-08a20b291e447c9-stdout: INFO::default sampling params are default_sampling_params={'temperature': 0.36, 'top_p': 0.9}
INFO  PyProcess W-2859-08a20b291e447c9-stdout: INFO::using sampling params sampling_params=SamplingParams(n=1, presence_penalty=0.0, frequency_penalty=0.0, repetition_penalty=1.0, temperature=0.6, top_p=0.9, top_k=-1, min_p=0.0, seed=None, stop=[], stop_token_ids=[], bad_words=[], include_stop_str_in_output=False, ignore_eos=False, max_tokens=512, min_tokens=0, logprobs=None, prompt_logprobs=None, skip_special_tokens=True, spaces_between_special_tokens=True, truncate_prompt_tokens=None, guided_decoding=None)
INFO  PyProcess W-2859-08a20b291e447c9-stdout: INFO::[RequestId=ca3db734-fbf8-4e17-83d8-5c50aea81200] parsed and scheduled for inference
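
For completeness, a request that overrides the defaults might look like the following. This is a sketch only: the endpoint path, payload schema, and client code are assumptions based on LMI's chat completions support and are not part of this PR.

```python
# Hypothetical client call against the server from the logs above
# (bound to http://0.0.0.0:8080). Endpoint path and payload schema are
# assumptions, shown only to illustrate a request-level override.
import requests

payload = {
    "messages": [{"role": "user", "content": "Hello"}],
    # Request-level values override the generation_config.json defaults,
    # matching the temperature=0.6, max_tokens=512 seen in the log above.
    "temperature": 0.6,
    "max_tokens": 512,
}
resp = requests.post("http://0.0.0.0:8080/invocations", json=payload)
print(resp.json())
```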

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

Checklist:

  • Please add the link of Integration Tests Executor run with related tests.
  • Have you manually built the docker image and verified the change?
  • Have you run related tests? Check how to set up the test environment here; one example would be pytest tests.py -k "TestCorrectnessLmiDist" -m "lmi_dist"
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?
  • Have you made corresponding changes to the documentation?

Feature/Issue validation/testing

Please describe the Unit or Integration tests that you ran to verify your changes and relevant result summary. Provide instructions so it can be reproduced.
Please also list any relevant details for your test configuration.

  • Test A
    Logs for Test A

  • Test B
    Logs for Test B

@siddvenk siddvenk requested review from zachgk and a team as code owners February 6, 2025 22:48
@siddvenk siddvenk merged commit d23acca into deepjavalibrary:master Feb 6, 2025
9 checks passed