This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

RuntimeError: Has not supported infering output shape from input shape for module/function: prim::TupleUnpack, .prim::TupleUnpack.344 #3467

Closed
faj2007 opened this issue Mar 22, 2021 · 3 comments

Comments

faj2007 commented Mar 22, 2021

I get an error when running model speedup. Does anyone have an idea?

Environment:

  • NNI version: 2.1
  • NNI mode (local|remote|pai):
  • Client OS: Ubuntu 18.04
  • Server OS (for remote mode only):
  • Python version: 3.6
  • PyTorch/TensorFlow version: 1.6
  • Is conda/virtualenv/venv used?: no
  • Is running in Docker?: no
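For context (the reporter's actual model was never posted), here is a hypothetical minimal sketch of the kind of model that yields a `prim::TupleUnpack` node in its TorchScript graph, the node type named in the error. NNI's speedup derives shapes from a traced/scripted graph, and whether `torch.jit.trace` keeps the node depends on the PyTorch version, so this sketch uses `torch.jit.script`, which reliably preserves it:

```python
# Hypothetical example (not the reporter's code): a submodule that returns a
# tuple, whose caller unpacks it, producing a prim::TupleUnpack graph node.
import torch
import torch.nn as nn


class Inner(nn.Module):
    def forward(self, x):
        # Returning multiple values creates a prim::TupleConstruct node.
        return x + 1, x * 2


class Outer(nn.Module):
    def __init__(self):
        super().__init__()
        self.inner = Inner()

    def forward(self, x):
        # Unpacking the returned tuple creates a prim::TupleUnpack node,
        # the operation NNI 2.1's shape inference did not support.
        a, b = self.inner(x)
        return a + b


scripted = torch.jit.script(Outer())
print("prim::TupleUnpack" in str(scripted.graph))  # the node is present
```

Models with this pattern (tuple-returning submodules, `.chunk()`/`.split()` results passed between modules, etc.) were a common trigger for this error in NNI 2.1's speedup shape inference.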
@colorjam (Contributor)

Hi @faj2007, could you please provide the code snippet of your model?

xuezu29 commented Apr 6, 2021

Hi, I encountered the same problem. Have you solved it? Thanks a lot!

@zheng-ningxin (Contributor)

Hi all, thanks for the valuable feedback, and sincerely sorry for the late reply.
There is an existing issue about TupleUnpack; please refer to it for more details: #3645
Thanks~
