add mindspore native graph infer & onnx runtime infer engine #216
Conversation
@ltcs11 hi, thank you for your contribution to mindyolo. Did you run the MindIR and ONNX models on the 310 platform?
Just tried on the x86 CPU platform.
I tried on x86 CPU. MindIR works well, but when I export ONNX I get a runtime error. Can you show me how you got the ONNX file? Really appreciate it :)
Just forgot to mention: I made a custom SiLU using sigmoid and replaced all nn.SiLU with my custom SiLU.
BTW, there are some models that cannot be exported to ONNX (or fail at ONNX inference). I only tested YOLOv3.
Yeah, YOLOv3 works when using 'EdgeSiLU', but YOLOv8 gets different prediction results from the same ckpt in ONNX and MindIR format. I will note the risks in the README. Thanks for your valuable information; looking forward to seeing you again in mindyolo.
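For context, SiLU(x) = x · sigmoid(x), so a custom activation built from a sigmoid and a multiply is numerically identical to nn.SiLU while avoiding the fused SiLU op that the ONNX exporter apparently cannot handle. A minimal NumPy sketch of the equivalence (the commenter's actual replacement is presumably a MindSpore `nn.Cell`; `edge_silu` here is an illustrative name, not the PR's code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def edge_silu(x):
    # SiLU decomposed into exporter-friendly primitives: x * sigmoid(x)
    return x * sigmoid(x)

# The decomposition matches the closed-form SiLU definition elementwise
x = np.linspace(-3.0, 3.0, 7)
print(np.allclose(edge_silu(x), x / (1.0 + np.exp(-x))))
```

The same identity is why swapping the activation changes export behavior but not (in exact arithmetic) the network's outputs; any numeric drift between ONNX and MindIR results would come from elsewhere in the graph.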
@zhanghuiyao @CaitinZhao please merge this PR, thank you.
```diff
@@ -32,7 +32,7 @@ def get_parser_infer(parents=None):
     parser.add_argument("--img_size", type=int, default=640, help="inference size (pixels)")
     parser.add_argument("--seed", type=int, default=2, help="set global seed")

-    parser.add_argument("--model_type", type=str, default="MindX", help="model type MindX/Lite")
+    parser.add_argument("--model_type", type=str, default="MindX", help="model type MindX/Lite/MindIR/ONNX")
```
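The widened `--model_type` choices imply a dispatch from the flag to an inference backend. A hypothetical sketch of that wiring (function names like `select_backend` are illustrative, not the PR's actual code; only the parser fragment mirrors the diff above):

```python
import argparse

def get_parser_infer():
    # Mirrors the argument-parser fragment shown in the diff above
    parser = argparse.ArgumentParser()
    parser.add_argument("--img_size", type=int, default=640, help="inference size (pixels)")
    parser.add_argument("--seed", type=int, default=2, help="set global seed")
    parser.add_argument("--model_type", type=str, default="MindX",
                        help="model type MindX/Lite/MindIR/ONNX")
    return parser

def select_backend(model_type: str) -> str:
    # Hypothetical dispatch: map the CLI flag to a backend name
    backends = {"MindX": "mindx", "Lite": "lite",
                "MindIR": "mindir", "ONNX": "onnxruntime"}
    if model_type not in backends:
        raise ValueError(f"unsupported model_type: {model_type}")
    return backends[model_type]

args = get_parser_infer().parse_args(["--model_type", "ONNX"])
print(select_backend(args.model_type))  # onnxruntime
```

Validating the flag early like this gives a clear error for unsupported values instead of a failure deep inside model loading.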
Please add usage instructions and verified applicability in [deploy/README.md](https://github.com/mindspore-lab/mindyolo/blob/master/deploy/README.md).
DONE
deploy/README.md (outdated)
### MindSpore Test
```shell
python deploy/test.py --model_type MindIR --model_path ./path_to_mindir/weight.mindir --config ./path_to_config/yolo.yaml
# e.g.
python deploy/test.py --model_type MindIR --model_path ./yolov5n.mindir --config ./configs/yolov5/yolov5n.yaml
```
- You could create a new MINDIR.md, following the pattern of the MindX and ONNX docs.
- Does this currently support predict.py only?
deploy/ONNXRuntime.md (outdated)
```diff
@@ -0,0 +1,49 @@
+# ONNXRuntime Deployment Guide
```
Would it be better to rename ONNXRuntime.md to ONNX.md?
Thank you for your contribution to the MindYOLO repo.
Before submitting this PR, please make sure:
Motivation
Add MindSpore native graph inference and an ONNX Runtime inference engine.
Since MindSpore 2.x, MindSpore can run inference on a MindIR model in GraphMode without additional Python packages.
Test Plan
--
Related Issues and PRs
--