
MBart model cannot be loaded #377

Closed
thies1006 opened this issue Sep 9, 2022 · 4 comments
Labels
bug Something isn't working onnxruntime Related to ONNX Runtime

Comments

@thies1006

System Info

optimum @ git+https://github.com/huggingface/optimum.git@3347a0a75f18b854979dd7e9f78d4c3ebb92852a
transformers==4.21.3
onnxruntime==1.12.1

Who can help?

@lewtun, @michaelbenayoun

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

from optimum.onnxruntime import ORTModelForSeq2SeqLM
from transformers import AutoTokenizer, pipeline

import os
os.makedirs("/tmp/onnx_model")

model = ORTModelForSeq2SeqLM.from_pretrained("facebook/mbart-large-en-ro", from_transformers=True)
model.save_pretrained("/tmp/onnx_model")
model = ORTModelForSeq2SeqLM.from_pretrained("/tmp/onnx_model")

Error:

Traceback (most recent call last):
  File "convert.py", line 12, in <module>
    model = ORTModelForSeq2SeqLM.from_pretrained("/tmp/onnx_model")
  File "/secondary/thies/.virtualenvs/onnx/lib/python3.8/site-packages/optimum/modeling_base.py", line 237, in from_pretrained
    return cls._from_pretrained(
  File "/secondary/thies/.virtualenvs/onnx/lib/python3.8/site-packages/optimum/onnxruntime/modeling_seq2seq.py", line 316, in _from_pretrained
    model = cls.load_model(
  File "/secondary/thies/.virtualenvs/onnx/lib/python3.8/site-packages/optimum/onnxruntime/modeling_seq2seq.py", line 213, in load_model
    decoder_session = onnxruntime.InferenceSession(str(decoder_path), providers=[provider])
  File "/secondary/thies/.virtualenvs/onnx/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/secondary/thies/.virtualenvs/onnx/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 395, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Deserialize tensor onnx::MatMul_3788 failed.Invalid fd was supplied: -1

Expected behavior

The model exported and saved with `save_pretrained` should load back via `from_pretrained` without errors.

@thies1006 thies1006 added the bug Something isn't working label Sep 9, 2022
@JingyaHuang JingyaHuang self-assigned this Sep 23, 2022
@michaelbenayoun michaelbenayoun added the onnxruntime Related to ONNX Runtime label Oct 14, 2022
@JingyaHuang
Contributor

JingyaHuang commented Oct 21, 2022

Reproducing your code, I got the following output ONNX models. It seems that the decoders are only model protos, and the external data files are missing.

[screenshot: directory listing of the exported ONNX model files]

ORTModels might need to improve the export when the model size exceeds 2GB. Will ask internally and make a fix ASAP.

@JingyaHuang
Contributor

Gently tagging @mht-sharma as you are working on the new encoder-decoder exporter #497. FYI, there was a bug when exporting seq2seq models exceeding 2GB: the external data files seem not to have been stored correctly.

@mht-sharma
Contributor

Thanks @JingyaHuang. The issue is that the external data files are not copied when the save_pretrained method is called. There is an open PR 255 from @NouamaneTazi which tries to tackle the issue.
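As a stopgap until that fix lands, the missing copy step can be sketched as follows (a hypothetical helper, not part of Optimum's API; the `.onnx_data` / `.bin` suffixes are assumptions about how the exporter names external data files):

```python
import os
import shutil


def copy_onnx_with_external_data(src_dir, dst_dir):
    """Copy ONNX protos plus any external-data sidecar files from src_dir.

    Hypothetical workaround helper: carries the sidecar data files along
    with the protos so the saved model can be deserialized later.
    """
    os.makedirs(dst_dir, exist_ok=True)
    copied = []
    for name in sorted(os.listdir(src_dir)):
        # '.onnx' protos and their external-data companions (assumed suffixes).
        if name.endswith((".onnx", ".onnx_data", ".bin")):
            shutil.copy(os.path.join(src_dir, name), os.path.join(dst_dir, name))
            copied.append(name)
    return copied
```

A proper fix inside `save_pretrained` would do the equivalent automatically whenever the exported protos reference external data.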

@fxmarty
Contributor

fxmarty commented Dec 22, 2022

Fixed in #586

@fxmarty fxmarty closed this as completed Dec 22, 2022