
add mindspore native graph infer & onnx runtime infer engine #216

Merged: 4 commits into mindspore-lab:master on Oct 18, 2023

Conversation

@ltcs11 commented Sep 27, 2023

Thank you for your contribution to the MindYOLO repo.
Before submitting this PR, please make sure:

Motivation

Add MindSpore native graph inference and an ONNX Runtime inference engine. Since MindSpore 2.x, MindSpore can run inference on MindIR models in GraphMode without any additional Python packages.
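
For context, a minimal sketch of what native MindIR inference in GraphMode looks like with MindSpore >= 2.x (an illustration, not the exact code added in this PR; the model file name, input shape, and device target are assumptions):

```python
# Hedged sketch: load an exported MindIR file and run it natively with
# MindSpore (>= 2.x) in GraphMode -- no extra inference packages required.
# The model path "yolov5n.mindir" and the 640x640 input are assumptions.
import numpy as np
import mindspore as ms
from mindspore import Tensor, nn

ms.set_context(mode=ms.GRAPH_MODE, device_target="CPU")

graph = ms.load("yolov5n.mindir")   # load the serialized MindIR graph
net = nn.GraphCell(graph)           # wrap it so it can be called like a Cell

dummy = Tensor(np.zeros((1, 3, 640, 640), dtype=np.float32))
out = net(dummy)                    # graph-mode inference
```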

Test Plan

--

Related Issues and PRs

--

@Ash-Lee233 (Collaborator)

@ltcs11 Hi, thank you for your contribution to mindyolo. Did you run the MindIR and ONNX models on the 310 platform?

@ltcs11 (Author) commented Oct 11, 2023

> @ltcs11 Hi, thank you for your contribution to mindyolo. Did you run the MindIR and ONNX models on the 310 platform?

Just tried it on an x86 CPU platform.

@Ash-Lee233 (Collaborator)

> Just tried it on an x86 CPU platform.

I tried on an x86 CPU as well. MindIR works well, but when I export ONNX I get a runtime error:

`RuntimeError: Can not find key SiLU in convert map. Exporting SiLU operator is not yet supported.`

using the command: `python ./deploy/export.py --config ./configs/yolov8/yolov8n.yaml --weight ./yolov8-n_500e_mAP372-cc07f5bd.ckpt --per_batch_size 1 --file_format ONNX --device_target CPU`

Can you show me how you got the ONNX file? Really appreciate it :)

@ltcs11 (Author) commented Oct 11, 2023

> Can you show me how you got the ONNX file? Really appreciate it :)

I forgot to mention that I made a custom SiLU using sigmoid and replaced every nn.SiLU with this custom SiLU:

```python
import mindspore.nn as nn
import mindspore.ops as ops


class EdgeSiLU(nn.Cell):
    """
    SiLU activation function: x * sigmoid(x).
    Drop-in replacement for nn.SiLU to support ONNX export.
    """

    def __init__(self):
        super().__init__()

    def construct(self, x):
        return x * ops.sigmoid(x)
```
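
As a rough illustration (not necessarily the exact change made here), the swap could be done by walking the cell tree and replacing SiLU children in place; the helper below and its use of name_cells()/insert_child_to_cell are assumptions:

```python
# Hypothetical helper (assumption, not part of this PR): replace every nn.SiLU
# child cell with EdgeSiLU so the network can be exported to ONNX.
from mindspore import nn


def replace_silu(network: nn.Cell) -> nn.Cell:
    for _, cell in network.cells_and_names():
        for child_name, child in cell.name_cells().items():
            if isinstance(child, nn.SiLU):
                cell.insert_child_to_cell(child_name, EdgeSiLU())
    return network
```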

BTW, there are some models that cannot be exported to ONNX (or that fail at ONNX inference).

I have only tested YOLOv3 so far.

@Ash-Lee233 (Collaborator)

Yeah, yolov3 works when using EdgeSiLU, but yolov8 gives different prediction results from the same ckpt in ONNX and MindIR format. I will note the risks in the README. Thanks for your valuable information, and looking forward to seeing you again in mindyolo.
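
To make the mismatch concrete, a minimal sketch of how the two formats could be compared on the same input (the file names, the 640x640 input shape, and the CPU provider are assumptions for illustration):

```python
# Hedged sketch: feed one random input to both the MindIR graph and the ONNX
# model, then print the largest absolute difference between their first outputs.
import numpy as np
import mindspore as ms
import onnxruntime as ort
from mindspore import Tensor, nn

x = np.random.rand(1, 3, 640, 640).astype(np.float32)

mindir_out = nn.GraphCell(ms.load("yolov8n.mindir"))(Tensor(x))
if isinstance(mindir_out, (tuple, list)):
    mindir_out = mindir_out[0]
mindir_out = mindir_out.asnumpy()

sess = ort.InferenceSession("yolov8n.onnx", providers=["CPUExecutionProvider"])
onnx_out = sess.run(None, {sess.get_inputs()[0].name: x})[0]

print("max abs diff:", np.abs(mindir_out - onnx_out).max())
```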

@Ash-Lee233 (Collaborator)

@zhanghuiyao @CaitinZhao please merge this PR, thank you.

```diff
@@ -32,7 +32,7 @@ def get_parser_infer(parents=None):
     parser.add_argument("--img_size", type=int, default=640, help="inference size (pixels)")
     parser.add_argument("--seed", type=int, default=2, help="set global seed")

-    parser.add_argument("--model_type", type=str, default="MindX", help="model type MindX/Lite")
+    parser.add_argument("--model_type", type=str, default="MindX", help="model type MindX/Lite/MindIR/ONNX")
```
Collaborator:

Please add the usage method and its verified applicability in deploy/README.md (https://github.com/mindspore-lab/mindyolo/blob/master/deploy/README.md).

Author:

DONE

@ltcs11 requested a review from zhanghuiyao on October 17, 2023 11:39
deploy/README.md Outdated
Comment on lines 38 to 43
### MindSpore Test
```shell
python deploy/test.py --model_type MindIR --model_path ./path_to_mindir/weight.mindir --config ./path_to_config/yolo.yaml
e.g.
python deploy/test.py --model_type MindIR --model_path ./yolov5n.mindir --config ./configs/yolov5/yolov5n.yaml
```
Collaborator:

1. Consider creating a new MINDIR.md, following the pattern of the MindX and ONNX docs.
2. Is this currently supported only by predict.py?

@@ -0,0 +1,49 @@
# ONNXRuntime Deployment Guide

Collaborator:

Would it be better to rename ONNXRuntime.md to ONNX.md?

@ltcs11 requested a review from zhanghuiyao on October 18, 2023 03:43
@CaitinZhao merged commit d1d16b3 into mindspore-lab:master on Oct 18, 2023
4 checks passed