
0.7.0 can't load ONNX model: 'InferenceSession' object has no attribute 'eval' #2033

Closed
xzk-seu opened this issue Dec 13, 2022 · 4 comments · Fixed by #2034
Labels: bug (Something isn't working), triaged (Issue has been reviewed and triaged)

Comments


xzk-seu commented Dec 13, 2022

🐛 Describe the bug

0.7.0 can't load an ONNX model; the worker fails with: 'InferenceSession' object has no attribute 'eval'

Error logs

2022-12-13T02:04:52,528 [WARN ] W-9003-yolov5x6_onnx_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9003-yolov5x6_onnx_1.0-stdout
2022-12-13T02:04:52,528 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
2022-12-13T02:04:52,528 [INFO ] W-9003-yolov5x6_onnx_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9003 in 1 seconds.
2022-12-13T02:04:52,528 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - File "/home/model-server/.local/lib/python3.8/site-packages/ts/model_service_worker.py", line 154, in handle_connection
2022-12-13T02:04:52,528 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - File "/home/model-server/.local/lib/python3.8/site-packages/ts/model_service_worker.py", line 118, in load_model
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - service = model_loader.load(
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - File "/home/model-server/.local/lib/python3.8/site-packages/ts/model_loader.py", line 135, in load
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - initialize_fn(service.context)
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - File "/home/model-server/.local/lib/python3.8/site-packages/ts/torch_handler/base_handler.py", line 176, in initialize
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - self.model.eval()
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - AttributeError: 'InferenceSession' object has no attribute 'eval'
2022-12-13T02:04:52,598 [INFO ] W-9003-yolov5x6_onnx_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9003-yolov5x6_onnx_1.0-stdout
2022-12-13T02:04:52,599 [INFO ] W-9003-yolov5x6_onnx_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9003-yolov5x6_onnx_1.0-stderr

Installation instructions

No; using Docker.

Model Packaging

Yolov5x6

config.properties

No response

Versions

0.7.0

Repro instructions

No

Possible Solution

No response

msaroufim (Member) commented Dec 13, 2022

Hmm this was a miss, not sure how that eval statement snuck back - for now please use the older version if you're using ONNX - we will fix this in nightlies soon
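
For reference, a minimal sketch of the kind of guard that would avoid this crash during model initialization, assuming only that onnxruntime's InferenceSession lacks eval() while torch.nn.Module provides it. The stub classes and the maybe_eval helper below are illustrative stand-ins, not actual TorchServe or PR code:

```python
class FakeInferenceSession:
    """Stand-in for onnxruntime.InferenceSession, which has no eval()."""


class FakeTorchModule:
    """Stand-in for torch.nn.Module, whose eval() turns off training mode."""
    def __init__(self):
        self.training = True

    def eval(self):
        self.training = False
        return self


def maybe_eval(model):
    # Only call eval() when the loaded model actually provides it,
    # so ONNX sessions pass through untouched instead of raising
    # AttributeError as in the traceback above.
    eval_fn = getattr(model, "eval", None)
    if callable(eval_fn):
        eval_fn()
    return model


session = maybe_eval(FakeInferenceSession())  # no AttributeError
module = maybe_eval(FakeTorchModule())
print(module.training)  # False
```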

@msaroufim msaroufim added bug Something isn't working triaged Issue has been reviewed and triaged labels Dec 13, 2022
xzk-seu (Author) commented Dec 13, 2022

> Hmm this was a miss, not sure how that eval statement snuck back - for now please use the older version if you're using ONNX - we will fix this in nightlies soon

Thank you, but there is also an "InferenceSession" problem in version 0.6.1.

2022-12-13T05:28:18,310 [INFO ] W-9000-yolov5x6_onnx_1.0-stdout MODEL_LOG - Invoking custom service failed.
2022-12-13T05:28:18,310 [INFO ] W-9000-yolov5x6_onnx_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-13T05:28:18,310 [INFO ] W-9000-yolov5x6_onnx_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 3105
2022-12-13T05:28:18,310 [INFO ] W-9000-yolov5x6_onnx_1.0-stdout MODEL_LOG -   File "/usr/local/lib/python3.8/dist-packages/ts/service.py", line 120, in predict
2022-12-13T05:28:18,311 [INFO ] W-9000-yolov5x6_onnx_1.0-stdout MODEL_LOG -     ret = self._entry_point(input_batch, self.context)
2022-12-13T05:28:18,311 [INFO ] W-9000-yolov5x6_onnx_1.0-stdout MODEL_LOG -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/base_handler.py", line 282, in handle
2022-12-13T05:28:18,311 [INFO ] W-9000-yolov5x6_onnx_1.0-stdout MODEL_LOG -     output = self.inference(data_preprocess)
2022-12-13T05:28:18,311 [INFO ] W-9000-yolov5x6_onnx_1.0-stdout MODEL_LOG -   File "/usr/local/lib/python3.8/dist-packages/ts/torch_handler/base_handler.py", line 229, in inference
2022-12-13T05:28:18,311 [INFO ] W-9000-yolov5x6_onnx_1.0-stdout MODEL_LOG -     results = self.model(marshalled_data, *args, **kwargs)
2022-12-13T05:28:18,311 [INFO ] W-9000-yolov5x6_onnx_1.0-stdout MODEL_LOG - TypeError: 'InferenceSession' object is not callable

msaroufim (Member) commented Dec 13, 2022

That's expected: you need to override the handle or inference function in your handler. Here's an example: https://github.com/pytorch/serve/blob/master/test/pytest/onnx_handler.py

Granted, I could add a better warning for this, so I will do it in the PR linked above.

xzk-seu (Author) commented Dec 13, 2022

Thank you so much!
