It appears that `onnxconverter_common.float16.convert_float_to_float16` doesn't convert external data from float to float16. If I convert a model that uses external data for its parameter values, the conversion appears to work, but inference then fails when the external data is read.
Not a proper solution, but one workaround you could try is to load all of the external data into the model proto at `onnx.load` time. The loaded model object should then have all of its external data in `TensorProto.raw_data`, so the converter might just treat it as a regular model.
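For what it's worth, a minimal sketch of that workaround (file names are placeholders; assumes the external data files sit next to `model.onnx`):

```python
import onnx
from onnxconverter_common import float16

# load_external_data=True (the default) reads every external tensor's bytes
# into TensorProto.raw_data, so the in-memory proto is self-contained.
model = onnx.load("model.onnx", load_external_data=True)

model_fp16 = float16.convert_float_to_float16(model)

# Save without external data so the converted float16 weights end up in the
# output file itself rather than pointing at the stale float32 side files.
onnx.save(model_fp16, "model_fp16.onnx")
```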
First, the case where the conversion appears to succeed but the converted model fails at inference time.
Repro code:
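A minimal sketch of this kind of repro (not the original snippet; the tiny `Add` model, the file names, and the onnxruntime inference step are my assumptions):

```python
import numpy as np
import onnx
import onnxruntime as ort
from onnx import TensorProto, helper
from onnxconverter_common import float16

# Tiny model: Y = X + W, with W forced out to an external data file.
W = helper.make_tensor("W", TensorProto.FLOAT, [4],
                       np.ones(4, dtype=np.float32).tobytes(), raw=True)
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [4])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [4])
graph = helper.make_graph([helper.make_node("Add", ["X", "W"], ["Y"])],
                          "g", [X], [Y], initializer=[W])
model = helper.make_model(graph)

# size_threshold=0 forces even this small initializer into model.data.
onnx.save(model, "model.onnx", save_as_external_data=True,
          location="model.data", size_threshold=0)

# Reload without pulling the external bytes back into the proto.
model = onnx.load("model.onnx", load_external_data=False)

# This call completes without complaint...
model_fp16 = float16.convert_float_to_float16(model)
onnx.save(model_fp16, "model_fp16.onnx")

# ...but the initializer is now declared float16 while model.data still
# holds float32 bytes, so loading the model for inference is where the
# reported failure surfaces.
sess = ort.InferenceSession("model_fp16.onnx",
                            providers=["CPUExecutionProvider"])
```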
Output: inference fails with an error while reading the external data.
In a second case, when an initializer uses external data, I instead get an error when calling `convert_float_to_float16` itself.

Repro code:
Output: an error raised from `numpy.frombuffer` inside `convert_float_to_float16`.
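For context on why external data trips this up: the converter's initializer pass only ever touches the proto's in-memory fields. A paraphrased sketch (condensed from `convert_tensor_float_to_float16` in `onnxconverter_common/float16.py`; not the verbatim source):

```python
import numpy as np
from onnx import TensorProto

def convert_tensor_sketch(tensor: TensorProto) -> TensorProto:
    # Paraphrased: the real function also converts float_data, clamps
    # values, etc. The point is that nothing here follows the tensor's
    # external_data reference, so bytes on disk are never rewritten.
    if tensor.data_type == TensorProto.FLOAT:
        tensor.data_type = TensorProto.FLOAT16
        if tensor.raw_data:
            # For an external-data tensor, raw_data is either empty (the
            # dtype silently flips while the file stays float32: the first
            # failure mode) or, presumably in this second case, holds bytes
            # that np.frombuffer cannot interpret as float32.
            floats = np.frombuffer(tensor.raw_data, dtype="float32")
            tensor.raw_data = floats.astype(np.float16).tobytes()
    return tensor
```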
If I patch `numpy.frombuffer` to work around this, I get an error during inference similar to the first example.

Repro code:
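A hypothetical sketch of such a patch, assuming the goal is just to let the converter continue past the bad buffer:

```python
import numpy as np

_orig_frombuffer = np.frombuffer

def _patched_frombuffer(buffer, dtype=float, count=-1, offset=0):
    # Hand the converter an empty array instead of the exception. The
    # initializer's dtype still gets flipped to float16 while the on-disk
    # bytes stay float32, which is why inference then fails the same way
    # as in the first example.
    try:
        return _orig_frombuffer(buffer, dtype=dtype, count=count, offset=offset)
    except ValueError:
        return np.empty(0, dtype=dtype)

np.frombuffer = _patched_frombuffer
```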
Output: an error during inference similar to the first example's.
Versions: