developeravsk/Fake-News-Classification-using-BERT-algorithm

Fake-News-Classification-using-BERT-algorithm

Fake news classification using pre-trained BERT algorithm with two unsupervised learning tasks in pre training

BERT stands for Bidirectional Encoder Representations from Transformers. BERT is pre-trained with two unsupervised tasks: masked language modeling, in which randomly masked word tokens are predicted from their surrounding context, and next sentence prediction, in which the model learns whether one sentence follows another. Using BERT therefore has two stages, "pre-training" and "fine-tuning": the pre-trained model is fine-tuned on labeled news articles for the fake-news classification task.
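To make the masking step concrete, here is a minimal sketch of BERT's masked-LM corruption rule (roughly 15% of tokens are selected; of those, 80% become `[MASK]`, 10% become a random token, 10% are left unchanged). The `mlm_mask` function, the toy `VOCAB`, and the example sentence are illustrative assumptions, not code from this repository; a real pipeline would operate on tokenizer IDs rather than word strings.

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "news", "is", "fake", "real", "report"]  # toy vocabulary for random replacement

def mlm_mask(tokens, mask_prob=0.15, seed=0):
    """Apply BERT-style masked-LM corruption to a token list.

    Returns (corrupted, labels): labels[i] holds the original token at
    positions chosen for prediction, and None everywhere else.
    """
    rng = random.Random(seed)
    corrupted = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:       # select this position for prediction
            labels[i] = tok                # model must recover the original token
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK        # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted[i] = rng.choice(VOCAB)  # 10%: replace with a random token
            # remaining 10%: keep the original token unchanged
    return corrupted, labels

corrupted, labels = mlm_mask(["this", "news", "report", "is", "fake"], mask_prob=0.5)
```

During pre-training, the model is trained to predict `labels[i]` at every selected position from the full bidirectional context of `corrupted`; unselected positions contribute no loss.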
