Thanks for your work!
I am trying to fine-tune the model on a multilabel classification task using my own data.
From my understanding of the example notebooks, the downstream architectures always take precomputed embeddings (generated by ankh-large or ankh-base) as inputs, rather than tokenized sequences directly.
How can one train the full network end-to-end (and thus update the embedding representations produced for a sequence) for a downstream task?
My question is simple and may have an obvious answer; apologies if that's the case.
Thanks a lot for your help!
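For context, the usual pattern for end-to-end fine-tuning is to wrap the pretrained encoder and the downstream head in a single module, so the loss backpropagates through both. Below is a minimal PyTorch sketch of that pattern; the `TinyEncoder` is a hypothetical stand-in (in practice you would load the real backbone, e.g. via `ankh.load_base_model()`, and feed tokenized sequences through it), and all layer sizes here are illustrative assumptions, not Ankh's actual dimensions.

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Hypothetical stand-in for the pretrained encoder.

    In practice this would be the Ankh encoder, whose weights you want
    to update, instead of using its frozen output embeddings.
    """
    def __init__(self, vocab_size=32, d_model=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)

    def forward(self, input_ids):
        return self.embed(input_ids)  # (batch, seq_len, d_model)

class MultiLabelClassifier(nn.Module):
    """Encoder + head trained jointly: encoder weights receive gradients."""
    def __init__(self, encoder, d_model, num_labels):
        super().__init__()
        self.encoder = encoder            # deliberately NOT frozen
        self.head = nn.Linear(d_model, num_labels)

    def forward(self, input_ids):
        hidden = self.encoder(input_ids)  # contextual embeddings
        pooled = hidden.mean(dim=1)       # mean-pool over sequence positions
        return self.head(pooled)          # raw logits, one per label

encoder = TinyEncoder()
model = MultiLabelClassifier(encoder, d_model=16, num_labels=5)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.BCEWithLogitsLoss()          # standard multilabel objective

input_ids = torch.randint(0, 32, (4, 10))          # toy batch of token ids
labels = torch.randint(0, 2, (4, 5)).float()       # toy multilabel targets

logits = model(input_ids)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()

# Because the encoder was not frozen, its parameters now carry gradients,
# i.e. the embedding representations themselves are being fine-tuned.
assert encoder.embed.weight.grad is not None
```

The key difference from the notebook setup is that tokenized sequences, not precomputed embeddings, are the model input, so the optimizer can update the encoder. To fine-tune only part of the backbone, one would set `requires_grad = False` on the layers to freeze.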