
fix special not work for llama-server #8553

Merged
1 commit merged into ggerganov:master on Jul 18, 2024

Conversation

RunningLeon
Contributor

Fix --special not working for llama-server as discussed in #8506
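For context, a minimal sketch of the kind of change this is about, assuming the common llama_token_to_piece wrapper that takes a trailing bool special argument and a gpt_params::special field set by --special; the helper name below is hypothetical and not copied from the patch.

```cpp
// Hypothetical sketch (not the actual patch): forward the user's --special
// choice when the server converts a sampled token back into text, instead of
// always suppressing control tokens.
#include <string>

#include "common.h"  // assumed: llama_token_to_piece(ctx, token, bool special) and gpt_params::special
#include "llama.h"

static std::string detokenize_for_client(llama_context * ctx, llama_token tok, const gpt_params & params) {
    // params.special is true when the user launched llama-server with --special
    return llama_token_to_piece(ctx, tok, params.special);
}
```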

Owner

@ggerganov ggerganov left a comment


There are 3 more instances of llama_token_to_piece in server/utils.hpp that need to become aware of the params.special flag. If you are not sure how to update those, we can merge this PR and I'll fix this later.

@RunningLeon
Contributor Author

> There are 3 more instances of llama_token_to_piece in server/utils.hpp that need to become aware of the params.special flag. If you are not sure how to update those, we can merge this PR and I'll fix this later.

@ggerganov hi, yes. I'm not sure how to do that.
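To illustrate the suggested follow-up, here is a hedged sketch of a server/utils.hpp-style helper that joins a token range into a string, updated to accept and forward the special flag; the helper name, template shape, and exact wrapper signature are assumptions, not taken from the repository.

```cpp
// Hypothetical sketch of one of the remaining call sites: a token-range
// detokenizer that forwards `special` instead of relying on the wrapper's
// default, so control tokens appear only when --special was requested.
#include <string>

#include "common.h"  // assumed: llama_token_to_piece(ctx, token, bool special)
#include "llama.h"

template <class Iter>
static std::string tokens_to_str(llama_context * ctx, Iter begin, Iter end, bool special) {
    std::string ret;
    for (auto it = begin; it != end; ++it) {
        ret += llama_token_to_piece(ctx, *it, special);
    }
    return ret;
}
```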

@ggerganov ggerganov merged commit 3807c3d into ggerganov:master Jul 18, 2024
53 checks passed
arthw pushed a commit to arthw/llama.cpp that referenced this pull request on Jul 27, 2024