
AntiBERTa

This repository contains Jupyter notebooks describing how we pre-trained AntiBERTa and how we apply the model for fine-tuning on downstream tasks such as paratope prediction.
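
In outline, the pre-training is masked-language modelling over antibody sequences. Below is a minimal sketch of how such a run can be set up with the HuggingFace transformers and datasets libraries; the tokenizer path ("antibody-tokenizer"), data files, and configuration values are illustrative placeholders, not the exact settings used here (see the notebooks for those).

```python
# Minimal masked-LM pre-training sketch with HuggingFace.
# Paths, hyperparameters, and data files are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    DataCollatorForLanguageModeling,
    RobertaConfig,
    RobertaForMaskedLM,
    RobertaTokenizerFast,
    Trainer,
    TrainingArguments,
)

# Hypothetical residue-level tokenizer directory.
tokenizer = RobertaTokenizerFast.from_pretrained("antibody-tokenizer")

# Illustrative RoBERTa-style configuration for short protein sequences.
config = RobertaConfig(
    vocab_size=tokenizer.vocab_size,
    hidden_size=768,
    num_hidden_layers=12,
    num_attention_heads=12,
    max_position_embeddings=160,  # antibody chains are short
)
model = RobertaForMaskedLM(config)

# Hypothetical text files with one antibody sequence per line.
dataset = load_dataset(
    "text", data_files={"train": "train.txt", "validation": "val.txt"}
)
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=150),
    batched=True,
    remove_columns=["text"],
)

# Standard 15% token masking, as in BERT/RoBERTa.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="antiberta-mlm", num_train_epochs=1),
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    data_collator=collator,
)
trainer.train()
```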

FAQs

  • What dataset did you use? This is described in our paper, but briefly: we used a subset of the Observed Antibody Space (OAS) database (Kovaltsuk et al., 2018) for pre-training, and a snapshot of SAbDab (Dunbar et al., 2014) as of 26 August 2021 for the paratope prediction task. We've included small snippets of the OAS data used for pre-training, along with the paratope prediction datasets, under assets.

  • Why HuggingFace? We felt that the maturity of the library and its straightforward API were key advantages, and it integrates well with cloud compute platforms such as AWS. The sketch below shows how compact a typical fine-tuning setup is.
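
For example, paratope prediction can be framed as per-residue (token) classification on top of a pre-trained checkpoint. This is a minimal sketch under stated assumptions: "antiberta-mlm" is a hypothetical checkpoint directory (e.g. the output of the pre-training sketch above), the tokenizer is residue-level, and the binary label scheme is illustrative rather than the repository's exact setup.

```python
# Minimal fine-tuning sketch: paratope prediction as per-residue
# (token) classification. Checkpoint paths and the label scheme are
# illustrative assumptions, not the repository's exact setup.
import torch
from transformers import RobertaForTokenClassification, RobertaTokenizerFast

# Hypothetical residue-level tokenizer and pre-trained checkpoint.
tokenizer = RobertaTokenizerFast.from_pretrained("antibody-tokenizer")
model = RobertaForTokenClassification.from_pretrained(
    "antiberta-mlm",  # e.g. the output directory of the run above
    num_labels=2,     # 0 = non-paratope, 1 = paratope residue
)

# A truncated heavy-chain sequence, tokenized one residue at a time.
sequence = "EVQLVESGGGLVQPGGSLRLSCAAS"
inputs = tokenizer(
    list(sequence), is_split_into_words=True, return_tensors="pt"
)

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len + 2, 2)

# Per-residue paratope calls; positions 0 and -1 are special tokens.
predictions = logits.argmax(dim=-1)[0, 1:-1]
```

Training the classification head then uses the same Trainer API as in the pre-training sketch, with per-residue labels supplied for each sequence.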
