
Persistence Homology Distillation for Semi-supervised Continual Learning

Yan Fan, Yu Wang*, Pengfei Zhu, Dongyue Chen, Qinghua Hu

Official PyTorch implementation of our NeurIPS 2024 paper "Persistence Homology Distillation for Semi-supervised Continual Learning".

How to run PsHD?

Dependencies

  1. torch 1.8.1
  2. torchvision 0.6.0
  3. tqdm
  4. numpy
  5. scipy
  6. quadprog
  7. POT
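The dependency list above can be pinned in a `requirements.txt` (a sketch based only on the versions listed above; `torch==1.8.1` and `torchvision==0.6.0` may not be a compatible pair on every platform, so adjust them to matching releases for your CUDA setup if pip reports a conflict):

```
torch==1.8.1
torchvision==0.6.0
tqdm
numpy
scipy
quadprog
POT
```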

Datasets

We have implemented the pre-processing of CIFAR10, CIFAR100, and ImageNet100. CIFAR10 and CIFAR100 are downloaded automatically by the framework; for ImageNet100, you should specify the folder of your dataset in utils/data.py:

    def download_data(self):
        assert 0, "You should specify the folder of your dataset"
        train_dir = '[DATA-PATH]/train/'
        test_dir = '[DATA-PATH]/val/'

Run experiment

  1. Generate the label_index files based on [https://github.com/brain-research/realistic-ssl-evaluation]. (We have also provided our label_index files in ./data/[DATA NAME]_labelindex.)
  2. Edit the [MODEL NAME].json file for global settings.
  3. Edit the hyperparameters in the corresponding [MODEL NAME].py file (e.g., models/icarl.py).
  4. Run:
    python main.py --config=./exps/[MODEL NAME]/[MODEL NAME]_[CLS]_[LABEL_NUM]_[METHOD].json --prefix [LOG_FILE]

where [MODEL NAME] should be chosen from icarl and der, [CLS] should be the number of classes, [LABEL_NUM] should be the number of annotations, and [METHOD] should be chosen from base, DSGD, and PsHD.
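The config-file naming convention above can be sketched as a small helper (the function name and the concrete values in the example are illustrative, not part of the repo):

```python
def config_path(model, cls, label_num, method):
    """Build a config path following the README's naming pattern.

    model:     'icarl' or 'der'
    cls:       number of classes
    label_num: number of labeled annotations
    method:    'base', 'DSGD', or 'PsHD'
    """
    return f"./exps/{model}/{model}_{cls}_{label_num}_{method}.json"

# Hypothetical example; check ./exps/ for the config files actually shipped.
print(config_path("icarl", 10, 500, "PsHD"))  # ./exps/icarl/icarl_10_500_PsHD.json
```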

Acknowledgments

We thank the following repos for providing helpful components/functions used in our work.

CITATION

If you find our code or paper useful, please consider giving us a star or citing our work:

@inproceedings{fan2024persistence,
  title={Persistence Homology Distillation for Semi-supervised Continual Learning},
  author={Fan, Yan and Wang, Yu and Zhu, Pengfei and Chen, Dongyue and Hu, Qinghua},
  booktitle={Advances in Neural Information Processing Systems},
  year={2024}
}
