
Fix cann compilation error after llama.cpp supports dynamically loadable backends #9891

Merged · 1 commit · Oct 16, 2024

Conversation

@leo-pony (Contributor) commented Oct 15, 2024

Changes:

  1. Adapt to the changed definition of ggml_backend_event_t: an event no longer contains a backend member; instead, a ggml_backend_device member was added to support dynamically loadable backends.
  2. Adapt to the changed signature of ggml_backend_cann_event_record: the backend is no longer obtained through the event; it is now passed explicitly as a ggml_backend_t parameter.

Remarks:
1) This PR fixes the issue "Bug: [CANN] compile failure #9844".
2) Adapting the [CANN] backend to llama.cpp's dynamic backend loading mechanism will be implemented in #9862.

@github-actions bot added the ggml label (changes relating to the ggml tensor library for machine learning) on Oct 15, 2024
@leo-pony (Contributor, Author) commented:

Running inference successfully with two NPUs:

(screenshot)

@hipudding added the Ascend NPU (issues specific to Ascend NPUs) and bug (Something isn't working) labels on Oct 15, 2024
@hipudding hipudding self-requested a review October 15, 2024 02:54
@leo-pony leo-pony changed the title Fix cann compilation error after ollama supports dynamic loading of b… Fix cann compilation error after ollama supports dynamically loadable backends Oct 15, 2024
@leo-pony leo-pony force-pushed the backend_dynamic_load_adapty branch from 6a98cee to 8f0f762 Compare October 15, 2024 06:30
@leo-pony leo-pony force-pushed the backend_dynamic_load_adapty branch from 0badb30 to 2378a77 Compare October 15, 2024 07:59
@leo-pony leo-pony changed the title Fix cann compilation error after ollama supports dynamically loadable backends Fix cann compilation error after llama.cpp supports dynamically loadable backends Oct 15, 2024
@leo-pony (Contributor, Author) commented Oct 15, 2024:

(screenshot)

@ggerganov example/server/public/index.js seems to have changed in the last day. The server check fails for new PRs because index.js differs from npm.reversehttp.com, and when pushing with a new index.js downloaded from upstream, the server check passes.

If that is right, I will update index.js with the content from upstream.

@ggerganov (Owner) commented:

I've opened a PR: #9895

Will merge it after the CI is green and then you can rebase.

@hipudding hipudding merged commit becfd38 into ggerganov:master Oct 16, 2024
53 checks passed
@leo-pony leo-pony deleted the backend_dynamic_load_adapty branch October 17, 2024 02:19
drollings pushed a commit to drollings/llama.cpp that referenced this pull request Oct 18, 2024
Fix cann compilation error after merging llama.cpp supports dynamically loadable backends.
dsx1986 pushed a commit to dsx1986/llama.cpp that referenced this pull request Oct 29, 2024
Fix cann compilation error after merging llama.cpp supports dynamically loadable backends.
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Nov 15, 2024
Fix cann compilation error after merging llama.cpp supports dynamically loadable backends.
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Nov 18, 2024
Fix cann compilation error after merging llama.cpp supports dynamically loadable backends.
Labels: Ascend NPU (issues specific to Ascend NPUs), bug (Something isn't working), ggml (changes relating to the ggml tensor library for machine learning)

4 participants