
BERT

BERT-based models and libraries for NLP downstream tasks:

BERT paper: https://arxiv.org/abs/1810.04805

BERT from: https://github.com/google-research/bert

BERT pretrained models: https://github.com/google-research/bert#pre-trained-models

PyTorch Pretrained Bert: https://github.com/huggingface/transformers/tree/v0.4.0

RoBERTa: https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/README.md

RoBERTa from Hugging Face: https://huggingface.co/docs/transformers/model_doc/roberta
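The Hugging Face checkpoints above can be loaded in a few lines with the transformers library. A minimal sketch, assuming the bert-base-uncased checkpoint as an example (any of the listed models can be substituted by name):

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load a pretrained tokenizer and encoder from the Hugging Face hub.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Encode one sentence and run it through the model.
inputs = tokenizer("BERT learns contextual embeddings.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings: (batch, sequence_length, hidden_size).
embeddings = outputs.last_hidden_state
print(embeddings.shape)  # hidden size is 768 for bert-base
```

The same `AutoModel`/`AutoTokenizer` pattern works for RoBERTa (e.g. `roberta-base`) and the domain-specific checkpoints listed below.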

KeyBERT from: https://github.com/MaartenGr/KeyBERT

DocBERT: BERT for Document Classification: https://paperswithcode.com/paper/docbert-bert-for-document-classification

SBERT (sentence-transformers) pretrained models: https://www.sbert.net/docs/pretrained_models.html

BioBERT: a domain-specific BERT model, pretrained on biomedical corpora for biomedical text mining. Ref: https://academic.oup.com/bioinformatics/article/36/4/1234/5566506

ClinicalBERT: trained on clinical notes / Electronic Health Records (EHRs). https://arxiv.org/abs/1904.03323

CancerBERT: A Cancer Domain Specific Language Model for Extracting Breast Cancer Phenotypes from Electronic Health Records https://academic.oup.com/jamia/article/29/7/1208/6554005

Geoscience Language Models: a domain-specific BERT model recently trained for geoscience downstream NLP tasks. https://github.com/NRCan/geoscience_language_models

SciBERT: a BERT model pretrained on scientific text, presented in "SciBERT: A Pretrained Language Model for Scientific Text". https://huggingface.co/allenai/scibert_scivocab_uncased
