Feature Request: server default system prompt support like -spf in old version support gemma2 #10520

Open
4 tasks done
atozj opened this issue Nov 26, 2024 · 3 comments
Labels
enhancement New feature or request

Comments

@atozj

atozj commented Nov 26, 2024

Prerequisites

  • I am running the latest code. Mention the version if possible as well.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

It seems that -p / --prompt is not working for this.
I need the -spf (system prompt file) option in llama-server, but it has been removed.

#9811
We can probably think about reintroducing the option and using the CLI system prompt as the default value for the chat template system prompt when it is not passed by the client.

Motivation

A default system prompt makes the model behave better.

Possible Implementation

Something like -spf in the old version.
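
A rough sketch of what this could look like on the server side, assuming a reintroduced flag (called --system-prompt-file here and loaded into a plain string) and the OpenAI-compatible message array. All names are hypothetical and for illustration only; this is not the actual llama-server code:

#include <string>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

// Hypothetical helper: prepend the server's default system prompt when the
// client request does not already contain a "system" message of its own.
static json inject_default_system_prompt(json messages, const std::string & default_system_prompt) {
    if (default_system_prompt.empty()) {
        return messages;
    }
    bool has_system = false;
    for (const auto & msg : messages) {
        if (msg.value("role", std::string()) == "system") {
            has_system = true;
            break;
        }
    }
    if (!has_system) {
        // placed first so the chat template renders it in the system slot
        messages.insert(messages.begin(), json{{"role", "system"}, {"content", default_system_prompt}});
    }
    return messages;
}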

@atozj atozj added the enhancement New feature or request label Nov 26, 2024
@atozj atozj changed the title Feature Request: Feature Request: server default system prompt support like -spf in old version Nov 29, 2024
@atozj atozj changed the title Feature Request: server default system prompt support like -spf in old version Feature Request: server default system prompt support like -spf in old version support gemma2 Dec 11, 2024
@github-actions github-actions bot added the stale label Jan 11, 2025
@atozj
Author

atozj commented Jan 19, 2025

To give an example of the scenarios I want to implement, let's assume the startup prompt is 'a', the user system prompt is 'b', and the user prompt is 'c'.
1. When the user does not input a system prompt: the total input will be 'a' + 'c'.
2. When the user inputs a system prompt: the total input will be 'b' + 'c' or 'a' + 'b' + 'c' (sketched below).
The purpose of this approach is to make open-source AI easier and more enjoyable for my friends. Teaching them how to set up prompts and other configurations is often too cumbersome, and they tend to give up quickly. A startup prompt effectively simplifies the process and improves the overall user experience.
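
For reference, the choice between 'b' + 'c' and 'a' + 'b' + 'c' in scenario 2 could be a simple server-side switch. A minimal sketch, with all names invented for illustration (not existing llama-server code):

#include <string>

// 'a' = startup prompt configured on the server, 'b' = system prompt sent by the client.
// Scenario 1 (no 'b'):           effective system prompt is 'a'       -> total input 'a' + 'c'
// Scenario 2, replace behaviour: effective system prompt is 'b'       -> total input 'b' + 'c'
// Scenario 2, prepend behaviour: effective system prompt is 'a' + 'b' -> total input 'a' + 'b' + 'c'
static std::string resolve_system_prompt(const std::string & startup_prompt,
                                         const std::string & client_system_prompt,
                                         bool prepend_startup) {
    if (client_system_prompt.empty()) {
        return startup_prompt;
    }
    if (prepend_startup) {
        return startup_prompt + "\n" + client_system_prompt;
    }
    return client_system_prompt;
}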

@github-actions github-actions bot removed the stale label Jan 20, 2025
@ggerganov
Owner

I'm not sure what you mean by "startup prompt". Models today only have "system prompt" which is optionally the first message in the chat. Thanks to the prompt caching functionality, the system prompt is always reused automatically.

@atozj
Author

atozj commented Jan 24, 2025

Maybe like this:
[startup prompt]
<|im_start|>system
[user system prompt]<|im_end|>
<|im_start|>user
Hello<|im_end|>
<|im_start|>assistant

or

<|im_start|>system
[startup prompt]+[user system prompt]<|im_end|>
<|im_start|>user
Hello<|im_end|>
<|im_start|>assistant
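
And for scenario 1, where the client sends no system prompt at all, the second layout would presumably reduce to just the startup prompt in the system slot:

<|im_start|>system
[startup prompt]<|im_end|>
<|im_start|>user
Hello<|im_end|>
<|im_start|>assistant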
