0.7.0 can't load ONNX model: 'InferenceSession' object has no attribute 'eval' #2033
Comments
Hmm, this was a miss; I'm not sure how that eval statement snuck back in. For now, please use the older version if you're using ONNX. We will fix this in the nightlies soon.
Thank you, but there is also a problem with "InferenceSession" in version 0.6.1.
That's expected; you need to override the handle or inference function in your handler. Here's an example: https://github.com/pytorch/serve/blob/master/test/pytest/onnx_handler.py. Granted, I could add a better warning for this, so I will do it in the PR linked above.
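For reference, a minimal sketch of such a custom handler, assuming a TorchServe version where BaseHandler's initialize loads the .onnx file into an onnxruntime.InferenceSession stored on self.model, and that preprocess() returns a torch tensor. The class name OnnxHandler and the input handling are illustrative only; the linked onnx_handler.py is the actual example in the repo.

```python
import numpy as np
import torch
from ts.torch_handler.base_handler import BaseHandler


class OnnxHandler(BaseHandler):
    """Hypothetical handler that overrides inference() so TorchServe runs the
    ONNX InferenceSession instead of treating the model like an nn.Module."""

    def inference(self, data, *args, **kwargs):
        # self.model is assumed to be an onnxruntime.InferenceSession here,
        # so call the session directly rather than model(data).
        input_name = self.model.get_inputs()[0].name
        ort_inputs = {input_name: data.cpu().numpy().astype(np.float32)}
        ort_outputs = self.model.run(None, ort_inputs)
        return torch.from_numpy(ort_outputs[0])
```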
Thank you so much!
🐛 Describe the bug
0.7.0 can't load an ONNX model; loading fails with: 'InferenceSession' object has no attribute 'eval'
Error logs
2022-12-13T02:04:52,528 [WARN ] W-9003-yolov5x6_onnx_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9003-yolov5x6_onnx_1.0-stdout
2022-12-13T02:04:52,528 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
2022-12-13T02:04:52,528 [INFO ] W-9003-yolov5x6_onnx_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9003 in 1 seconds.
2022-12-13T02:04:52,528 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - File "/home/model-server/.local/lib/python3.8/site-packages/ts/model_service_worker.py", line 154, in handle_connection
2022-12-13T02:04:52,528 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - File "/home/model-server/.local/lib/python3.8/site-packages/ts/model_service_worker.py", line 118, in load_model
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - service = model_loader.load(
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - File "/home/model-server/.local/lib/python3.8/site-packages/ts/model_loader.py", line 135, in load
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - initialize_fn(service.context)
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - File "/home/model-server/.local/lib/python3.8/site-packages/ts/torch_handler/base_handler.py", line 176, in initialize
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - self.model.eval()
2022-12-13T02:04:52,529 [INFO ] W-9002-yolov5x6_onnx_1.0-stdout MODEL_LOG - AttributeError: 'InferenceSession' object has no attribute 'eval'
2022-12-13T02:04:52,598 [INFO ] W-9003-yolov5x6_onnx_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9003-yolov5x6_onnx_1.0-stdout
2022-12-13T02:04:52,599 [INFO ] W-9003-yolov5x6_onnx_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9003-yolov5x6_onnx_1.0-stderr
Installation instructions
No; using Docker.
Model Packaging
Yolov5x6
config.properties
No response
Versions
0.7.0
Repro instructions
No
Possible Solution
No response