import torch
from tvm import relay


def test_RNN_torch(num_layers: int,
                   bidirectional: bool,
                   use_bias: bool,
                   hidden_size: int,
                   input_size: int,
                   seq_len: int,
                   batch_first: bool,
                   batch_size: int):
    r'''
    Args:
        num_layers (int): num_layers to be passed to torch.nn.RNN
        bidirectional (bool): whether to build a bidirectional RNN or not
        use_bias (bool): whether to use bias terms or not
        hidden_size (int): hidden_size of the RNN cells
        input_size (int): number of input features
        seq_len (int): number of timesteps in the input data
        batch_first (bool): whether the batch dimension is the first or second
            dimension of the input tensor
        batch_size (int): batch size of the input; if 0, unbatched input is fed
            to the network
    '''
    if batch_first:
        input_shape = (batch_size, seq_len, input_size)
    else:
        input_shape = (seq_len, batch_size, input_size)
    pytorch_net = torch.nn.Sequential(
        torch.nn.RNN(input_size,
                     hidden_size,
                     batch_first=batch_first,
                     num_layers=num_layers,
                     bidirectional=bidirectional,
                     bias=use_bias)
    )
    scripted_model = torch.jit.trace(pytorch_net.eval(),
                                     torch.randn(input_shape))
    mod, params = relay.frontend.from_pytorch(scripted_model,
                                              [('input', input_shape)])
    mod = relay.transform.InferType()(mod)
    print(mod.astext())


if __name__ == "__main__":
    test_RNN_torch(1,
                   False,
                   True,
                   5,
                   5,
                   15,
                   True,
                   32)
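For context on what a converter would have to express: the missing operator implements the standard Elman-RNN recurrence `h_t = tanh(x_t @ W_ih.T + b_ih + h_{t-1} @ W_hh.T + b_hh)` per layer. A minimal NumPy sketch of a single unidirectional tanh layer (the helper name and the random weights below are illustrative, not TVM or PyTorch API):

```python
import numpy as np

def rnn_tanh_layer(x, w_ih, w_hh, b_ih, b_hh, h0):
    """One unidirectional tanh-RNN layer over a (seq_len, batch, input) array.

    Implements h_t = tanh(x_t @ w_ih.T + b_ih + h_{t-1} @ w_hh.T + b_hh),
    the per-layer recurrence of torch.nn.RNN with nonlinearity='tanh'.
    """
    h = h0
    outputs = []
    for x_t in x:  # iterate over timesteps
        h = np.tanh(x_t @ w_ih.T + b_ih + h @ w_hh.T + b_hh)
        outputs.append(h)
    # (seq_len, batch, hidden) stacked outputs, plus the final hidden state
    return np.stack(outputs), h

# Tiny example matching the report's shapes (input_size=5, hidden_size=5,
# seq_len=15, batch_size=32).
rng = np.random.default_rng(0)
seq_len, batch, in_sz, hid = 15, 32, 5, 5
x = rng.standard_normal((seq_len, batch, in_sz))
w_ih = rng.standard_normal((hid, in_sz))
w_hh = rng.standard_normal((hid, hid))
b_ih = rng.standard_normal(hid)
b_hh = rng.standard_normal(hid)
out, h_n = rnn_tanh_layer(x, w_ih, w_hh, b_ih, b_hh, np.zeros((batch, hid)))
print(out.shape, h_n.shape)  # → (15, 32, 5) (32, 5)
```

Bidirectionality and multiple layers are just this loop run again over reversed input or over the previous layer's outputs, which is why the op is in principle lowerable to existing dense/tanh primitives.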
full traceback:
Traceback (most recent call last):
File "lstm_transform.py", line 477, in <module>
test_RNN_torch(1,
File "lstm_transform.py", line 469, in test_RNN_torch
mod, params = relay.frontend.from_pytorch(scripted_model,
File "/home/catalinvladu/Mambaforge/envs/tvm-build/lib/python3.8/site-packages/tvm/relay/frontend/pytorch.py", line 4030, in from_pytorch
converter.report_missing_conversion(op_names)
File "/home/catalinvladu/Mambaforge/envs/tvm-build/lib/python3.8/site-packages/tvm/relay/frontend/pytorch.py", line 3236, in report_missing_conversion
raise NotImplementedError(msg)
NotImplementedError: The following operators are not implemented: ['aten::rnn_tanh']
If this is not the correct place to raise the issue, or if my expected result is wrong, please let me know.
You are converting a PyTorch model (or a JIT-traced model) to a TVM model. If TVM does not have the corresponding equivalent operator implemented (in your case, aten::rnn_tanh), this error occurs.
So this needs a contributor to help. I can take a stab.
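Until a built-in converter lands, one documented escape hatch is the `custom_convert_map` argument of `relay.frontend.from_pytorch`, which lets a user supply a converter per missing op. A skeleton of the registration (the converter body below is only a placeholder, not a working Relay lowering):

```python
# Sketch only: shows where a user-supplied converter for the missing op
# would be registered via from_pytorch's custom_convert_map argument.
def _rnn_tanh(inputs, input_types):
    # A real converter would unroll the recurrence into Relay dense/tanh
    # ops here, mirroring torch.nn.RNN's per-layer computation.
    raise NotImplementedError("placeholder: Relay lowering not written")

custom_map = {"aten::rnn_tanh": _rnn_tanh}

# It would then be passed to the frontend alongside the traced model and
# input shapes from the report above, e.g.:
# mod, params = relay.frontend.from_pytorch(
#     scripted_model, [('input', input_shape)],
#     custom_convert_map=custom_map)
```

This is the same hook TVM's existing LSTM/GRU conversions could be prototyped against before being upstreamed as a built-in converter.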
masahi changed the title from "[Bug] relay.frontend.from_pytorch fails for RNN networks" to "[PyTorch] Missing op aten::rnn_tanh" on Jun 24, 2022.
I built a simple RNN network in PyTorch and tried to convert it using the relay.frontend.from_pytorch interface.
Expected behavior
The network is converted.
Actual behavior
NotImplementedError: The following operators are not implemented: ['aten::rnn_tanh'] is raised.
Environment
Operating system:
Distributor ID: Ubuntu
Description: Ubuntu 18.04.6 LTS
Release: 18.04
Codename: bionic
TVM version: 0.9.dev0
TVM was built following the steps at https://tvm.apache.org/docs/install/from_source.html, with no changes to the config.cmake file.
Steps to reproduce
See the script and full traceback above.