
Commit 1cff024
add ad-hoc code section
CatIIIIIIII committed Jan 26, 2024
1 parent 68916b5 commit 1cff024
Showing 1 changed file with 1 addition and 1 deletion: README.md
@@ -316,7 +316,7 @@ We also implement other BERT-like large-scale pre-trained RNA language models fo
## Future work
We pretrained the model from scratch with an additional classification head appended to the '[CLS]' token. The total loss function is
$\mathcal{L} = \mathcal{L}_{\rm MLM}+\alpha\mathcal{L}_{\rm CLS}$
- where $\mathcal{L}_{\rm MLM}$ is mask language model loss and $\mathcal{L}_{\rm MLM}$ is classification loss and we set the balance coefficient $\alpha$ as $0.1$. Other settings are kept same with our original RNAErnie pre-training procedure.
+ where $\mathcal{L}_{\rm MLM}$ is the masked language modeling loss and $\mathcal{L}_{\rm CLS}$ is the classification loss; we set the balance coefficient $\alpha$ to $0.1$. Other settings are kept the same as in our original RNAErnie pre-training procedure.
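
For concreteness, here is a minimal sketch of this combined objective, assuming PaddlePaddle (the framework implied by the released `.pdparams` weights). The tensor names and the `-100` masking-label convention are illustrative assumptions, not the repository's actual training code:

```python
# Sketch of L = L_MLM + alpha * L_CLS, assuming PaddlePaddle.
# All variable names here are illustrative, not the repo's actual code.
import paddle.nn as nn

ALPHA = 0.1  # balance coefficient reported in the README

mlm_loss_fn = nn.CrossEntropyLoss(ignore_index=-100)  # skip unmasked positions labeled -100 (assumed convention)
cls_loss_fn = nn.CrossEntropyLoss()                   # loss for the head on the '[CLS]' token

def total_loss(mlm_logits, mlm_labels, cls_logits, cls_labels):
    """mlm_logits: [B, T, V], mlm_labels: [B, T]; cls_logits: [B, C], cls_labels: [B]."""
    # Flatten token positions so every masked token contributes one term to L_MLM.
    l_mlm = mlm_loss_fn(mlm_logits.reshape([-1, mlm_logits.shape[-1]]),
                        mlm_labels.reshape([-1]))
    l_cls = cls_loss_fn(cls_logits, cls_labels)
    return l_mlm + ALPHA * l_cls
```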

The pre-trained model can be downloaded from [Google Drive](https://drive.google.com/drive/folders/13Wmw_1hM-iPdvIhJUPxtH-sR92PUvCVr?usp=sharing); place the `.pdparams` and `.json` files in the `./output/BERT,ERNIE,MOTIF,ADHOC` folder. Moreover, the original RNAErnie weights pre-trained for 1 epoch can be obtained from [Google Drive](https://drive.google.com/drive/folders/1nHyHbnBpSgMVYvTU4T7LjTmmWDE2KiKY?usp=sharing).
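
After downloading, a quick (hypothetical) check that the files landed in the expected folder; the exact filenames inside the Drive folders are not specified here, so the snippet only globs by extension:

```python
# Hypothetical sanity check that the downloaded weights sit where the README expects;
# the folder name is taken literally from the README, the glob patterns are assumptions.
from pathlib import Path

out_dir = Path("./output/BERT,ERNIE,MOTIF,ADHOC")
params = sorted(out_dir.glob("*.pdparams"))
configs = sorted(out_dir.glob("*.json"))
print(f"found {len(params)} .pdparams and {len(configs)} .json files in {out_dir}")
assert params and configs, "download the files and place them in this folder first"
```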
