
[Bug] Inference with a YOLOX ONNX model via the Python inference SDK fails on a new Linux server #2159

Closed
3 tasks done
PancakeAwesome opened this issue Jun 7, 2023 · 4 comments
PancakeAwesome commented Jun 7, 2023

Checklist

  • I have searched related issues but cannot get the expected help.
  • I have read the FAQ documentation but cannot get the expected help.
  • The bug has not been fixed in the latest version.

Describe the bug

The code runs until the following line:

    bboxes, labels, _ = detector(img)

and then fails with an unhandled exception while creating the ONNX Runtime backend:

[2023-06-07 10:00:27.160] [mmdeploy] [info] [model.cpp:35] [DirectoryModel] Load model: "yolox_s_fast_1xb32-300e-rtmdet-hyp/"
[2023-06-07 10:00:27.207] [mmdeploy] [error] [ort_net.cpp:205] unhandled exception when creating ORTNet: /onnxruntime_src/onnxruntime/core/platform/posix/env.cc:142 onnxruntime::{anonymous}::PosixThread::PosixThread(const char*, int, unsigned int (*)(int, Eigen::ThreadPoolInterface*), Eigen::ThreadPoolInterface*, const onnxruntime::ThreadOptions&) pthread_setaffinity_np failed

This is followed by "Failed to create Net backend: onnxruntime" and "error parsing config" errors, and finally a segmentation fault. The full log, including the backend config that failed to parse, is reproduced under Error traceback below.

Reproduction

# argparse setup omitted as in the original; `args` provides
# image_path, model_path and device_name.
import numpy as np
from PIL import Image
from mmdeploy_runtime import Detector

image = Image.open(args.image_path).convert("RGB")
image = np.array(image)[:, :, ::-1]  # RGB -> BGR
print(image.shape)
print(image)
detector = Detector(
    model_path=args.model_path, device_name=args.device_name)
print("load detector>>>>>>>>>>")
bboxes, labels, masks = detector(image)

Environment

mmdeploy 1.1.0
mmdeploy-runtime 1.1.0
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/admin/model_repository/9835a5de374c8ed1-model/tools/onnxruntime-linux-x64-1.8.1/lib:/usr/local/lib/python3.8/site-packages/mmdeploy_runtime/

Error traceback

[2023-06-07 10:00:27.160] [mmdeploy] [info] [model.cpp:35] [DirectoryModel] Load model: "yolox_s_fast_1xb32-300e-rtmdet-hyp/"
[2023-06-07 10:00:27.207] [mmdeploy] [error] [ort_net.cpp:205] unhandled exception when creating ORTNet: /onnxruntime_src/onnxruntime/core/platform/posix/env.cc:142 onnxruntime::{anonymous}::PosixThread::PosixThread(const char*, int, unsigned int (*)(int, Eigen::ThreadPoolInterface*), Eigen::ThreadPoolInterface*, const onnxruntime::ThreadOptions&) pthread_setaffinity_np failed

[2023-06-07 10:00:27.207] [mmdeploy] [error] [net_module.cpp:54] Failed to create Net backend: onnxruntime, config: {
  "context": {
    "device": "<any>",
    "model": "<any>",
    "stream": "<any>"
  },
  "input": [
    "prep_output"
  ],
  "input_map": {
    "img": "input"
  },
  "is_batched": false,
  "module": "Net",
  "name": "yolodetector",
  "output": [
    "infer_output"
  ],
  "output_map": {},
  "type": "Task"
}
[2023-06-07 10:00:27.207] [mmdeploy] [error] [task.cpp:99] error parsing config: {
  "context": {
    "device": "<any>",
    "model": "<any>",
    "stream": "<any>"
  },
  "input": [
    "prep_output"
  ],
  "input_map": {
    "img": "input"
  },
  "is_batched": false,
  "module": "Net",
  "name": "yolodetector",
  "output": [
    "infer_output"
  ],
  "output_map": {},
  "type": "Task"
}
load detector>>>>>>>>>>
Segmentation fault (段错误)
PancakeAwesome (Author) commented:

@irexyc @RunningLeon please take a look, thank you.

RunningLeon (Collaborator) commented:

@PancakeAwesome Could you post here your env by running python tools/check_env.py?

PancakeAwesome (Author) commented:

> @PancakeAwesome Could you post here your env by running python tools/check_env.py?

I'm afraid I can't provide check_env.py results at the moment, because mmcv is difficult to install in this environment. What environment information do you need? I can collect it manually.
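As a stopgap while check_env.py is unavailable, a minimal standard-library sketch could report roughly the same information; the package names probed below are assumptions about what may be importable, and the CPU-affinity line is included because a container affinity mask smaller than the visible core count is a common trigger for pthread_setaffinity_np failures:

```python
import os
import platform
import sys


def env_report():
    """Collect roughly the information tools/check_env.py would print."""
    info = {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "cpu_count": os.cpu_count(),
    }
    if hasattr(os, "sched_getaffinity"):
        # On Linux: number of CPUs this process may actually run on.
        # A mismatch with cpu_count hints at a restricted affinity mask.
        info["usable_cpus"] = len(os.sched_getaffinity(0))
    for pkg in ("mmdeploy_runtime", "onnxruntime", "numpy"):
        try:
            mod = __import__(pkg)
            info[pkg] = getattr(mod, "__version__", "unknown")
        except ImportError:
            info[pkg] = "not installed"
    return info


if __name__ == "__main__":
    for key, value in env_report().items():
        print(f"{key}: {value}")
```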

irexyc (Collaborator) commented Jun 7, 2023:

Not sure if it is related to daquexian/onnx-simplifier#171.

You could use the pure onnxruntime API to test whether the error still exists.
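A sketch of such a standalone check, assuming the SDK model directory contains an end2end.onnx file (the filename and path are assumptions, not confirmed by the log); pinning intra_op_num_threads to 1 also avoids the thread-pool affinity setup that raised pthread_setaffinity_np above:

```python
import os

import numpy as np

try:
    import onnxruntime as ort
except ImportError:
    ort = None

# Hypothetical path: the ONNX file inside the SDK model directory.
MODEL = "yolox_s_fast_1xb32-300e-rtmdet-hyp/end2end.onnx"


def run_dummy_inference(model_path):
    """Load the model with plain onnxruntime and run one dummy input."""
    opts = ort.SessionOptions()
    # A single intra-op thread sidesteps thread-affinity setup.
    opts.intra_op_num_threads = 1
    sess = ort.InferenceSession(
        model_path, opts, providers=["CPUExecutionProvider"])
    inp = sess.get_inputs()[0]
    # Replace dynamic/unknown dimensions with 1 for a smoke test.
    shape = [d if isinstance(d, int) else 1 for d in inp.shape]
    dummy = np.zeros(shape, dtype=np.float32)
    return sess.run(None, {inp.name: dummy})


if __name__ == "__main__":
    if ort is None:
        print("onnxruntime is not installed")
    elif os.path.exists(MODEL):
        outputs = run_dummy_inference(MODEL)
        print([o.shape for o in outputs])
    else:
        print("model file not found:", MODEL)
```

If this script fails with the same pthread_setaffinity_np error, the problem is in the ONNX Runtime/environment combination rather than in mmdeploy.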
