Hi @myleott, I get an AssertionError when I run the demo code to extract features from RoBERTa: the last-layer features are not equal to `all_layers[-1]`.
The size of `last_layer_features` and the length of `all_layers` are both correct.
The outputs of `last_layer_features` and `all_layers[-1]` are shown below:
What's your environment?
- fairseq Version (e.g., 1.0 or master): 0.9.0
- PyTorch Version (e.g., 1.0): 1.4.0
- OS (e.g., Linux): Linux
- How you installed fairseq (pip, source): pip
- Build command you used (if compiling from source): -
- Python version: 3.6
- CUDA/cuDNN version: 10.0
- GPU models and configuration: -
- Any other relevant information: -
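One likely culprit, assuming the demo snippet was run exactly as shown: fairseq's RoBERTa README calls `roberta.eval()` before extracting features to disable dropout, and without that call the two `extract_features` invocations apply different dropout masks, so their outputs cannot match exactly. The effect can be reproduced with a plain dropout layer, independent of RoBERTa (a minimal sketch):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
drop = nn.Dropout(p=0.5)
x = torch.randn(16, 16)

# In train mode, each forward pass samples a fresh dropout mask,
# so two passes over the same input disagree.
drop.train()
assert not torch.equal(drop(x), drop(x))

# In eval mode, dropout is a no-op, so repeated passes are identical.
drop.eval()
assert torch.equal(drop(x), drop(x))
```

If the same applies here, calling `roberta.eval()` before the two `extract_features` calls should make the assertion pass.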
…ne Translation" (#2044)
Summary:
# Before submitting
- [ ] Was this discussed/approved via a GitHub issue? (no need for typos, doc improvements)
- [ ] Did you read the [contributor guideline](https://github.com/pytorch/fairseq/blob/master/CONTRIBUTING.md)?
- [ ] Did you make sure to update the docs?
- [ ] Did you write any new necessary tests?
## What does this PR do?
Release the code for the paper "Discriminative Reranking for Neural Machine Translation"
## PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
## Did you have fun?
Make sure you had fun coding 🙃
Pull Request resolved: fairinternal/fairseq-py#2044
Reviewed By: michaelauli
Differential Revision: D29628590
Pulled By: an918tw
fbshipit-source-id: 7a52602d495b736573187cc721829aa545d24770
❓ Questions and Help
What is your question?
As described above, the demo code raises an AssertionError: the last-layer features extracted from RoBERTa are not equal to `all_layers[-1]`.
Code

```python
import torch

roberta = torch.hub.load('pytorch/fairseq', 'roberta.large')
tokens = roberta.encode('Hello world!')
last_layer_features = roberta.extract_features(tokens)
all_layers = roberta.extract_features(tokens, return_all_hiddens=True)
assert torch.all(all_layers[-1] == last_layer_features)
```
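When the assertion fails, it helps to know whether the two tensors differ wildly (suggesting nondeterminism such as active dropout) or only by floating-point noise. Below is a small diagnostic helper, a hedged sketch (`report_mismatch` is an illustrative name, not a fairseq API) that could be applied to `last_layer_features` and `all_layers[-1]`:

```python
import torch

def report_mismatch(a, b, atol=1e-5):
    """Compare two feature tensors: exact equality, closeness, max abs diff."""
    exact = bool(torch.equal(a, b))
    close = bool(torch.allclose(a, b, atol=atol))
    max_diff = (a - b).abs().max().item()
    return exact, close, max_diff
```

A large maximum difference points to the two forward passes genuinely diverging (e.g., dropout in train mode), while a tiny one suggests the strict `torch.all(... == ...)` check is merely too tight for floating-point arithmetic.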