Bug Description
AssertionError: to_numpy can only be called on None or a torch.Tensor, got: <tensorrt_bindings.tensorrt.ITensor object at 0x7f72c6108d30> While executing %batch_norm
This is using the new `export` workflow from the https://github.com/pytorch/TensorRT/tree/dynamo_export_refactor branch.

The issue seems to come from partitioning (reused from the `torch.compile` workflow), where all of the constants are registered as placeholders when the graph is copied. As a result, constants such as the batch norm weights and biases are now treated as ITensors, while the batch norm converter expects them to be constants.
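For illustration, here is a minimal sketch (not the actual torch_tensorrt source) of the kind of check that fails: converter utilities assume frozen constants that can be converted to numpy arrays, so a weight that now arrives as a graph input, and therefore as a TensorRT ITensor, trips the assertion.

```python
# Minimal sketch of the failing check, mirroring the assertion in the error
# message. This is an illustration, not the torch_tensorrt implementation.
import torch


def to_numpy_like(value):
    # Only None or a torch.Tensor can be folded into a TensorRT constant.
    assert value is None or isinstance(value, torch.Tensor), (
        f"to_numpy can only be called on None or a torch.Tensor, got: {value}"
    )
    return None if value is None else value.detach().cpu().numpy()


# Works for a frozen weight held as a torch.Tensor...
to_numpy_like(torch.ones(16))
# ...but after partitioning, the batch norm weight reaches the converter as a
# tensorrt ITensor (a graph input), not a torch.Tensor, and the assert fires.
```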
To Reproduce
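The original reproduction script is not preserved here; the sketch below assumes that any module containing a BatchNorm layer compiled through the dynamo/export path hits the batch_norm converter. The `ir="dynamo"` call is an assumption about how the dynamo_export_refactor branch is invoked and may not match that branch exactly.

```python
# Hedged repro sketch: a small Conv + BatchNorm module compiled with
# Torch-TensorRT through the dynamo path.
import torch
import torch_tensorrt


class BNModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.bn = torch.nn.BatchNorm2d(16)

    def forward(self, x):
        return torch.relu(self.bn(self.conv(x)))


model = BNModel().eval().cuda()
inputs = [torch.randn(1, 3, 224, 224, device="cuda")]

# Expected failure on the affected branch:
# AssertionError: to_numpy can only be called on None or a torch.Tensor,
# got: <tensorrt_bindings.tensorrt.ITensor ...> While executing %batch_norm
trt_model = torch_tensorrt.compile(model, ir="dynamo", inputs=inputs)
```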
Expected behavior
Compilation should pass without the assertion error, with the batch norm weights and biases handled as constants.
Environment
How you installed PyTorch (conda, pip, libtorch, source):

Additional context