
[Help wanted] How to get the result of executing model_exec.py? #133

Closed
Zeus1116 opened this issue Mar 15, 2024 · 7 comments · Fixed by #134
Labels
bug Something isn't working

Comments

@Zeus1116

How do I know whether the model caused an error after executing these commands?

nnsmith.model_exec model.type=onnx backend.type=onnxruntime model.path=nnsmith_output/model.onnx

nnsmith.model_exec model.type=onnx backend.type=onnxruntime model.path=nnsmith_output/model.onnx cmp.with='{type:tvm, optmax:true, target:cpu}'

@lazycal
Collaborator

lazycal commented Mar 16, 2024

I think it will show the error in the console, like this:
[screenshot of the console error output]

Looks like you can also specify cmp.save=/path/to/folder (https://github.com/ise-uiuc/nnsmith/blob/4061eb7347cc51880329624127113c229a7f6209/nnsmith/config/main.yaml#L65) to save the bug report.
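(Editor's note) If cmp.save writes a bug report to a folder, a small script can inspect what was saved after a run. This is a hypothetical helper, not part of nnsmith: the exact file layout of a bug report is an assumption, so the sketch just lists whatever files are present.

```python
# Hypothetical helper: list the artifacts under a folder that
# nnsmith's cmp.save wrote to. The file layout shown in the demo
# (model.onnx + err.log) is an assumption, not documented behavior.
from pathlib import Path
import tempfile

def summarize_report(folder: str) -> list[str]:
    """Return the sorted relative paths of all files under a report folder."""
    root = Path(folder)
    return sorted(str(p.relative_to(root)) for p in root.rglob("*") if p.is_file())

# Demo on a stand-in directory; a real run would point at the cmp.save path.
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "model.onnx").write_bytes(b"")                  # placeholder artifact
    (Path(d) / "err.log").write_text("NOT_IMPLEMENTED: Atan(7)")
    print(summarize_report(d))  # → ['err.log', 'model.onnx']
```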

@Zeus1116
Author

> I think it will show the error in the console ... you can also specify cmp.save=/path/to/folder to save the bug report.

Thanks! Have you encountered these two types of errors when executing the model_exec command?
[screenshots of the two error messages]
In my runs there were a large number of errors such as "TorchModelCPU object has no attribute 'state_dict'" and "Could not find an implementation for Greater(13) nodes with name '/Creator'".

@ganler
Member

ganler commented Mar 19, 2024

@Zeus1116 Can you retry with a fresh install of nnsmith? Thanks.

@Zeus1116
Author

> TorchModelCPU object has no attribute 'state_dict'

Okay, I cloned the latest project and reran the command, and "TorchModelCPU object has no attribute 'state_dict'" no longer occurs. However, errors such as "onnxruntime.capi.onnxruntime_pybind11_state.NotImplemented: [ONNXRuntimeError] : 9 : NOT_IMPLEMENTED : Could not find an implementation for Atan(7) node with name '/Atan'" still occur.

This is my script to execute the command:
[screenshot of the script]

and my nnsmith conda environment configuration is:
[screenshots of the conda environment packages]

@Zeus1116
Author

> @Zeus1116 Can you retry with a fresh install of nnsmith? Thanks.

BTW, if I use TVM as the backend, the errors do not appear. I guess the error was caused by a version update of onnxruntime.

@ganler
Member

ganler commented Mar 19, 2024

Well, the world is not so perfect that every framework supports every model & operator in every data type :)

Therefore, different frameworks have their own sets of support coverage -- that's what the message is saying -- we are testing the model over ONNXRuntime which does not support this particular operator & datatype.

But no worry, we considered this when building NNSmith -- You can also specify that you want to generate models with ONNXRuntime in the command:

- nnsmith.model_gen model.type=onnx debug.viz=true
+ nnsmith.model_gen model.type=onnx debug.viz=true backend.type=onnxruntime

Meanwhile, it seems that you are interested in just doing fuzzing, then you can just try:

nnsmith.fuzz fuzz.time={TIME} model.type=onnx backend.type=onnxruntime fuzz.root=fuzz_report
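(Editor's note) The fuzz command above can also be driven from a script. A minimal sketch, assuming only that `nnsmith.fuzz` is installed as a console entry point with the flags shown in this thread; the Python wrapper itself is hypothetical, not part of nnsmith:

```python
# Sketch: build and (if available) run the nnsmith.fuzz command from
# this thread. The flags are copied from the comment above; the
# wrapper function is a hypothetical convenience, not nnsmith API.
import shutil
import subprocess

def fuzz_cmd(time_budget: str, root: str = "fuzz_report") -> list[str]:
    """Build the nnsmith.fuzz argv shown in this thread."""
    return [
        "nnsmith.fuzz",
        f"fuzz.time={time_budget}",
        "model.type=onnx",
        "backend.type=onnxruntime",
        f"fuzz.root={root}",
    ]

cmd = fuzz_cmd("4h")
print(" ".join(cmd))
# Only invoke the CLI if it is actually installed in this environment.
if shutil.which(cmd[0]):
    subprocess.run(cmd, check=False)
```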

@Zeus1116
Author

> But no worry, we considered this when building NNSmith ... you can just try: nnsmith.fuzz fuzz.time={TIME} model.type=onnx backend.type=onnxruntime fuzz.root=fuzz_report

Thank you very much for your help! I am indeed very interested in using nnsmith for model testing, but as I understand it, the fuzz command in nnsmith only retains models that trigger errors in the backend compiler. My goal is to collect all the models nnsmith generates during fuzzing, together with their run results.
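(Editor's note) One workaround sketch for keeping every generated model: instead of `nnsmith.fuzz`, drive `nnsmith.model_gen` / `nnsmith.model_exec` in a loop and archive each `nnsmith_output` directory before the next iteration overwrites it. The output directory name, the flags, and the idea of recording the exec return code are all assumptions based on the commands quoted earlier in this thread, not documented nnsmith behavior:

```python
# Sketch: archive every model generated in a gen/exec loop. Assumes the
# CLI commands from this thread and that each run writes its model to
# ./nnsmith_output -- both are assumptions, verify against your install.
import shutil
import subprocess
from pathlib import Path

def run_one(i: int, archive: Path) -> None:
    """Generate one model, execute it, and archive the output directory."""
    subprocess.run(["nnsmith.model_gen", "model.type=onnx",
                    "backend.type=onnxruntime"], check=False)
    ex = subprocess.run(["nnsmith.model_exec", "model.type=onnx",
                         "backend.type=onnxruntime",
                         "model.path=nnsmith_output/model.onnx"], check=False)
    dest = archive / f"model_{i:05d}"
    shutil.copytree("nnsmith_output", dest)
    # Record whether execution succeeded (return-code interpretation
    # is an assumption; inspect the console output for details).
    (dest / "exec_returncode.txt").write_text(str(ex.returncode))

# Only run the loop if the nnsmith CLI is installed here.
if shutil.which("nnsmith.model_gen"):
    archive = Path("all_models")
    archive.mkdir(exist_ok=True)
    for i in range(100):
        run_one(i, archive)
```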

3 participants