Proper sentence-transformers ONNX export support #1589
Conversation
Can those extra outputs be supported directly in transformers? I just find the changes a bit hacky, and this is causing errors in the optimum neuron subpackage: aws-neuron/aws-neuron-sdk#808
I doubt this is feasible, as sentence-transformers adds quite a few features on top of transformers. For example, the sentence embeddings.
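For concreteness, a small illustration of what that means in practice; the model name and the 384-dimensional output below are just an example, not something this PR prescribes:

```python
# sentence-transformers pools per-token hidden states into a single fixed-size
# sentence embedding, which a plain transformers encoder does not output directly.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
embeddings = model.encode(["Proper ONNX export support"])
print(embeddings.shape)  # (1, 384) for this model: one vector per input sentence
```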
I don't find this to be too hacky, as the model_type is already handled in the exporter configs:
optimum/optimum/exporters/onnx/model_configs.py, lines 818 to 842 (at fc214d4)
optimum/optimum/exporters/onnx/model_configs.py, lines 872 to 883 (at fc214d4)
Happy to refactor if needed, though.
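For context, a hedged sketch of the shape of config those lines point at; the class name and dynamic-axis labels below are illustrative, not the verbatim optimum code:

```python
# Sketch of an ONNX export config whose outputs include the pooled sentence
# embedding alongside the per-token embeddings (names are illustrative).
from typing import Dict


class SentenceTransformersLikeOnnxConfig:
    @property
    def inputs(self) -> Dict[str, Dict[int, str]]:
        return {
            "input_ids": {0: "batch_size", 1: "sequence_length"},
            "attention_mask": {0: "batch_size", 1: "sequence_length"},
        }

    @property
    def outputs(self) -> Dict[str, Dict[int, str]]:
        return {
            "token_embeddings": {0: "batch_size", 1: "sequence_length"},
            "sentence_embedding": {0: "batch_size"},
        }
```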
@require_torch
@require_vision
@require_sentence_transformers
@pytest.mark.timm_test
Why is it a timm test? @fxmarty
It is a typo
As reported in #1519, a simple mapping from `sentence-transformers` to the `transformers` library allows using only a subset of the `sentence-transformers` library. This PR adds support for exporting the `sentence_embedding` output for sentence-transformers models.

Examples:
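A minimal sketch of the kind of export this enables; the `library_name` argument passed to `main_export`, the checkpoint, and the output paths below are assumptions for illustration, not the PR's exact examples:

```python
# Sketch: export a sentence-transformers model to ONNX so the graph exposes a
# pooled `sentence_embedding` output (arguments below are illustrative).
from optimum.exporters.onnx import main_export

main_export(
    model_name_or_path="sentence-transformers/all-MiniLM-L6-v2",
    output="all-MiniLM-L6-v2-onnx",
    task="feature-extraction",
    library_name="sentence_transformers",  # assumption: selects the sentence-transformers export path
)

# Inspect the exported graph to check which outputs are present.
import onnx

onnx_model = onnx.load("all-MiniLM-L6-v2-onnx/model.onnx")
print([o.name for o in onnx_model.graph.output])  # expected to include "sentence_embedding"
```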