[Feature]: Conditional Prompt Inclusion in `generate` Function for Streaming Efficiency
#8359
Closed
🚀 The feature, motivation and pitch
Title: Conditional Prompt Inclusion in `generate` Function for Streaming Efficiency

Feature Proposal:
This feature introduces a new parameter, `is_return_prompt`, to the `generate` function in `vllm/entrypoints/api_server.py`. The parameter allows users to conditionally include the prompt in the generated response, addressing inefficiencies observed in streaming scenarios.

Motivation and Pitch:
In the current implementation, the `generate` function always includes the prompt in its response, whether streaming is enabled or not. This is inefficient, especially in streaming mode, where the prompt is repeated in every token update. The repetition is redundant and slows down processing, since users typically do not need to see the prompt again after providing it to the LLM.

Proposal:
The proposed feature adds an `is_return_prompt` parameter to the `generate` function. When `is_return_prompt` is `False` (the default), the prompt is not included in the response. When it is `True`, the prompt is included as part of the output. This makes the streaming process more efficient and reduces redundancy.

Details:
- `is_return_prompt` (default: `False`)
- If `is_return_prompt` is `True`, the prompt is included in the response. Otherwise, the prompt is omitted.
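Below is a minimal sketch of how the streaming path of `generate` could honor such a flag. It is illustrative only: the engine is replaced by a dummy token stream, the FastAPI `StreamingResponse` plumbing of `vllm/entrypoints/api_server.py` is omitted, and `is_return_prompt` is the proposed parameter, not an existing one.

```python
# Sketch only: a self-contained stand-in for the /generate streaming path in
# vllm/entrypoints/api_server.py. The engine is faked with a dummy token
# stream, and "is_return_prompt" is the proposed parameter, not an existing one.
import asyncio
import json
from typing import AsyncGenerator


async def fake_engine_stream(prompt: str) -> AsyncGenerator[str, None]:
    """Stand-in for the engine: yields the cumulative generated text so far."""
    generated = ""
    for token in ["Paris", " is", " the", " capital."]:
        generated += token
        await asyncio.sleep(0)  # simulate asynchronous token arrival
        yield generated


async def stream_results(
    prompt: str, is_return_prompt: bool = False
) -> AsyncGenerator[bytes, None]:
    """Yield one JSON chunk per update, prepending the prompt only if asked to."""
    async for text in fake_engine_stream(prompt):
        payload = {"text": (prompt + text) if is_return_prompt else text}
        yield (json.dumps(payload) + "\0").encode("utf-8")


async def main() -> None:
    # With the proposed default (False), the prompt is omitted from every chunk.
    async for chunk in stream_results("What is the capital of France?"):
        print(chunk)


if __name__ == "__main__":
    asyncio.run(main())
```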
Alternatives

No response
Additional context
This feature is particularly relevant for users working with streaming responses, where including the prompt with each token update can hinder performance. The new parameter will provide greater control over the response format, making it more suitable for various use cases and improving overall efficiency.
By incorporating this feature, users can benefit from more streamlined and performant interactions with the `generate` function, especially in scenarios involving continuous or large-scale text generation.
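If the flag were added, a client could opt in or out per request. A hypothetical example against a locally running `api_server` (the `is_return_prompt` field is the proposed addition; the endpoint, payload shape, and null-byte chunk delimiter follow the demo server's `/generate` route):

```python
# Hypothetical client call: "is_return_prompt" is the proposed field and is not
# accepted by the current server.
import json

import requests

response = requests.post(
    "http://localhost:8000/generate",
    json={
        "prompt": "San Francisco is a",
        "max_tokens": 32,
        "stream": True,
        "is_return_prompt": False,  # proposed: omit the prompt from each chunk
    },
    stream=True,
)

# The demo server delimits streamed JSON chunks with a null byte.
for chunk in response.iter_lines(delimiter=b"\0"):
    if chunk:
        print(json.loads(chunk.decode("utf-8"))["text"])
```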