finetune experience or code #150

Open
niatzt opened this issue Sep 5, 2021 · 14 comments

Comments

@niatzt

niatzt commented Sep 5, 2021

I tried to fine-tune CLIP on my own small dataset. The results are not satisfying (probably overfitting). After checking the issues here, I noticed that other people have similar problems. Can someone share a successful fine-tuning experience or code?

@ABaldrati

Hi @niatzt
In my limited experience of fine-tuning CLIP, I have often noticed that this operation is extremely sensitive to various hyperparameters.
The advice I feel like giving you (which has worked in my case) is:

  • keep CLIP in evaluation mode even during training (i.e. keeping the normalization layers frozen)

  • use a very low learning rate (even reaching values like 1e-7 or 1e-8)

  • If you want to fine-tune CLIP together with another network trained from scratch, use different learning rates (lower for CLIP and higher for the other network). You may also consider keeping CLIP frozen for a few epochs.

I hope these tips help and that your fine-tuning is successful.
If you have more questions, do not hesitate to ask.
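
To make these settings concrete, here is a minimal sketch assuming the openai/CLIP package; the linear head, its dimensions, and the exact learning rates are illustrative placeholders rather than part of the advice above.

```python
import clip
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device, jit=False)
model.float()  # the CUDA checkpoint loads in fp16; fp32 is more stable for training
model.eval()   # tip 1: keep CLIP in evaluation mode even while fine-tuning

# tip 3: a hypothetical head trained from scratch on top of CLIP features
# (512 is the ViT-B/32 embedding size, 10 is a placeholder number of classes)
head = torch.nn.Linear(512, 10).to(device)

# tips 2 and 3: a very low learning rate for CLIP, a higher one for the new head
optimizer = torch.optim.AdamW([
    {"params": model.parameters(), "lr": 1e-7},
    {"params": head.parameters(), "lr": 1e-4},
])
```

Keeping CLIP frozen for the first few epochs can be done by leaving its parameter group out of the optimizer (or setting requires_grad to False) and adding it back later.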

@Sierkinhane

I have fine-tuned CLIP on the PASCAL VOC2012 dataset using this implementation. When I changed the weight decay from 0.2 to 0.001, it worked well on my task.
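
For context, a standard contrastive fine-tuning step with the lower weight decay looks roughly like the sketch below; this assumes the openai/CLIP package, is not the exact code from the linked implementation, and the batch handling is a placeholder.

```python
import clip
import torch
import torch.nn.functional as F

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device, jit=False)
model.float()

# weight decay lowered from 0.2 to 0.001 as suggested above
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-6, weight_decay=0.001)

def train_step(images, captions):
    # images: a batch already preprocessed with `preprocess`; captions: list of strings
    tokens = clip.tokenize(captions).to(device)
    logits_per_image, logits_per_text = model(images.to(device), tokens)
    labels = torch.arange(len(images), device=device)
    # symmetric cross-entropy over the image->text and text->image logits
    loss = (F.cross_entropy(logits_per_image, labels) +
            F.cross_entropy(logits_per_text, labels)) / 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```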

@iremonur

Hi, I am trying to fine-tune CLIP, but I only reached 30% accuracy at best with a learning rate of 5e-6 and a weight decay of 0.2. What accuracy did you get in your experiment? How large is the dataset you used for fine-tuning? I would be glad if you could help me with this.

@dongyun-kim-arch

@iremonur Hi! I am wondering how you got the accuracy metrics during training, since I can only see the loss... Is there another function or script I can refer to for plotting accuracy metrics? Thanks!
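
There is no official accuracy script in this thread, but one common option is to log in-batch retrieval accuracy from the same logits used for the loss; a rough sketch, assuming the openai/CLIP package and captions already tokenized with clip.tokenize:

```python
import torch

@torch.no_grad()
def batch_accuracy(model, images, tokens, device="cuda"):
    # In-batch image->text retrieval accuracy: the i-th image should score
    # highest against the i-th caption (the diagonal of the logit matrix).
    logits_per_image, _ = model(images.to(device), tokens.to(device))
    preds = logits_per_image.argmax(dim=-1)
    labels = torch.arange(len(images), device=device)
    return (preds == labels).float().mean().item()
```

For classification-style tasks you can instead score each image against a fixed set of class prompts and take the argmax, as in zero-shot evaluation.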

@jweihe

jweihe commented Mar 8, 2022

I have fine-tuned CLIP on the PASCAL VOC2012 dataset using this implementation. When I changed the weight decay from 0.2 to 0.001, it worked well on my task.

Could you provide your fine-tuning code? Thank you~

@Sierkinhane

Sierkinhane commented Mar 12, 2022

My code was the same as the implementation.

@jweihe

jweihe commented Mar 18, 2022

My code was the same as the implementation.

ok, thank u

@ErinZhang1998

@ABaldrati What do you mean by freezing the normalization layers?

@sanjaygunda13

My code was the same as the implementation.

I think this helps with training CLIP from scratch but not with fine-tuning. Please let me know if I am wrong; I am a noob :p at all this stuff.

@bwanglzu

bwanglzu commented Sep 22, 2022

In case you're searching for a handy solution, check out this page: https://finetuner.jina.ai/tasks/text-to-image/

It can help you get the job done.

@zzxslp

zzxslp commented Dec 20, 2022

I have fine-tuned CLIP on the PASCAL VOC2012 dataset using this implementation. When I changed the weight decay from 0.2 to 0.001, it worked well on my task.

Hi, could you let me know the batch size you are using for fine-tuning CLIP? Thanks!

@zzxslp

zzxslp commented Dec 20, 2022

Hi Alberto, could you share some advice on the batch size for fine-tuning CLIP? For pretraining, larger is better in most cases; I'm not sure whether the same holds for fine-tuning based on your experience?

@zhentingqi

Hi, any update in 2024? Is there any way to fine-tune the CLIP model using the newest Hugging Face transformers library (4.35.*)? Thanks!
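
Not an official answer, but a minimal sketch with the Hugging Face transformers CLIP classes (CLIPModel exposes its built-in contrastive loss via return_loss=True); the image path, caption, and optimizer settings below are placeholders.

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-6, weight_decay=0.001)

images = [Image.open("example.jpg")]          # placeholder batch of PIL images
captions = ["a photo of an example object"]   # placeholder captions

inputs = processor(text=captions, images=images, return_tensors="pt", padding=True)
outputs = model(**inputs, return_loss=True)   # symmetric contrastive loss
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```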

@tengshaofeng

This is my training code, everything goes well: https://github.com/tengshaofeng/finetune-jina-clip-v2
