Official PyTorch implementation of our NeurIPS 2024 paper "Persistence Homology Distillation for Semi-supervised Continual Learning".
We have implemented the pre-processing of CIFAR10, CIFAR100, and imagenet100. When training on CIFAR10 and CIFAR100, the framework will download the datasets automatically. When training on imagenet100, you should specify the folder of your dataset in `utils/data.py`:
```python
def download_data(self):
    assert 0, "You should specify the folder of your dataset"
    train_dir = '[DATA-PATH]/train/'
    test_dir = '[DATA-PATH]/val/'
```
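As a hedged sketch of what the edit amounts to (the base path, the helper name, and the error handling are illustrative assumptions, not the repo's code), assuming the imagenet100 folders follow the usual `train/` and `val/` layout:

```python
import os

def set_imagenet100_dirs(base_dir):
    """Hypothetical helper: build the train/val paths expected by utils/data.py.

    `base_dir` is an assumed local dataset root; the repo itself expects you to
    hard-code your own paths inside download_data().
    """
    train_dir = os.path.join(base_dir, 'train')
    test_dir = os.path.join(base_dir, 'val')
    # Fail early with a clear message instead of the bare `assert 0` placeholder.
    for d in (train_dir, test_dir):
        if not os.path.isdir(d):
            raise FileNotFoundError(f"Expected dataset split at {d}")
    return train_dir, test_dir
```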
- Generate the label_index files based on [https://github.com/brain-research/realistic-ssl-evaluation]. (We also provide our label_index files in `./data/[DATA NAME]_labelindex`.)
- Edit the `[MODEL NAME].json` file for global settings.
- Edit the hyperparameters in the corresponding `[MODEL NAME].py` file (e.g., `models/icarl.py`).
- Run:

```
python main.py --config=./exps/[MODEL NAME]/[MODEL NAME]_[CLS]_[LABEL_NUM]_[METHOD].json --prefix [LOG_FILE]
```
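The label_index generation step in the list above can be sketched as a class-balanced random split, in the spirit of the realistic-ssl-evaluation repo. This is a generic sketch, not the repo's exact file format: the function name and the balancing scheme are assumptions.

```python
import numpy as np

def make_label_index(labels, num_labeled, seed=0):
    """Pick `num_labeled` labeled-sample indices, balanced across classes.

    Generic SSL label-split sketch; the actual label_index files in
    ./data/[DATA NAME]_labelindex may use a different format.
    """
    labels = np.asarray(labels)
    rng = np.random.RandomState(seed)
    classes = np.unique(labels)
    per_class = num_labeled // len(classes)
    chosen = []
    for c in classes:
        # All indices belonging to class c, then sample per_class of them.
        idx = np.flatnonzero(labels == c)
        chosen.append(rng.choice(idx, size=per_class, replace=False))
    return np.sort(np.concatenate(chosen))
```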
where `[MODEL NAME]` should be chosen from `icarl` and `der`, `[CLS]` is the number of classes, `[LABEL_NUM]` is the number of annotations, and `[METHOD]` should be chosen from `base`, `DSGD`, and `PsHD`.
We thank the following repositories for providing helpful components/functions used in our work.
If you find our code or paper useful, please consider giving us a star or citing:
```
@inproceedings{fan2024persistence,
  title={Persistence Homology Distillation for Semi-supervised Continual Learning},
  author={Fan, Yan and Wang, Yu and Zhu, Pengfei and Chen, Dongyue and Hu, Qinghua},
  booktitle={Advances in Neural Information Processing Systems},
  year={2024}
}
```