finetune experience or code #150
Hi @niatzt
I hope these tips can help you and that your fine-tuning goes successfully. |
I have fine-tuned CLIP on the PASCAL VOC2012 dataset using this implementation. When I changed the weight decay from 0.2 to 0.001, it worked well on my task.
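The weight-decay change above is just an optimizer setting. A minimal sketch of what that setup might look like, assuming a PyTorch CLIP model (`build_finetune_optimizer` is a hypothetical helper, not part of the repo; it also keeps biases and LayerNorm weights out of weight decay, a common fine-tuning convention):

```python
import torch

def build_finetune_optimizer(model, lr=5e-6, weight_decay=1e-3):
    # Assumption: `model` is any PyTorch CLIP model already loaded elsewhere.
    # Use the reduced weight decay (0.001) reported to work in this thread,
    # instead of the 0.2 used for pretraining.
    decay, no_decay = [], []
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        # Biases and 1-D params (e.g. LayerNorm weights) are commonly
        # exempted from weight decay during fine-tuning.
        if param.ndim <= 1 or name.endswith(".bias"):
            no_decay.append(param)
        else:
            decay.append(param)
    return torch.optim.AdamW(
        [{"params": decay, "weight_decay": weight_decay},
         {"params": no_decay, "weight_decay": 0.0}],
        lr=lr,
    )
```

This is a sketch, not the exact code used by the commenter; the param-group split is an extra convention layered on top of the reported hyperparameter change.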
Hi, I aim to fine-tune CLIP, but I reached 30% accuracy at most with a learning rate of 5e-6 and a weight decay of 0.2. What accuracy did you get in your experiment, and how large is the dataset you used for fine-tuning? I would be glad if you could help me with this.
@iremonur Hi! I am wondering how you get the accuracy metrics during training, since I can only see the loss. Is there any other function or script I can refer to in order to plot accuracy metrics? Thanks!
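One common way to get an accuracy number for CLIP (the training loop itself only reports the contrastive loss) is zero-shot classification: encode one text prompt per class and pick the class whose embedding is most similar to each image. A minimal sketch, assuming image and text features have already been computed by the model's encoders:

```python
import torch

@torch.no_grad()
def zero_shot_accuracy(image_features, text_features, labels):
    # image_features: (N, D) embeddings of evaluation images
    # text_features:  (C, D) embeddings, one prompt per class
    # labels:         (N,) ground-truth class indices
    # L2-normalize so the dot product is cosine similarity, as CLIP does.
    image_features = image_features / image_features.norm(dim=-1, keepdim=True)
    text_features = text_features / text_features.norm(dim=-1, keepdim=True)
    logits = image_features @ text_features.t()   # (N, C) similarity matrix
    preds = logits.argmax(dim=-1)                 # most similar class prompt
    return (preds == labels).float().mean().item()
```

Run this on a held-out split every few epochs to plot an accuracy curve alongside the loss; the function names here are illustrative, not part of the CLIP codebase.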
Could you provide your fine-tuning code? Thank you~
My code was the same as the implementation.
ok, thank u |
@ABaldrati What do you mean that you freeze the normalization layers? |
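Presumably "freezing the normalization layers" means putting them in eval mode and excluding their parameters from gradient updates. A sketch of one way to do that in PyTorch (`freeze_norm_layers` is a hypothetical helper, not something from the CLIP repo):

```python
import torch.nn as nn

def freeze_norm_layers(model):
    # Put normalization modules in eval mode (so running stats / behavior
    # stay fixed) and stop their affine parameters from training.
    for module in model.modules():
        if isinstance(module, (nn.LayerNorm, nn.BatchNorm1d, nn.BatchNorm2d)):
            module.eval()
            for p in module.parameters():
                p.requires_grad = False
```

Note that a plain `model.train()` later in the loop would flip the modules back to train mode, so if you use this approach you need to re-apply it (or override `train()`) after every such call.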
I think this helps when training CLIP from scratch, but not when fine-tuning. Please let me know if I am wrong; I am a noob :p at all this stuff.
In case you're searching for a handy solution, check out this page: https://finetuner.jina.ai/tasks/text-to-image/ It can help you get the job done.
Hi, could you let me know the batch size you are using for fine-tuning CLIP? Thanks! |
Hi Alberto, could you share some advice on the batch size for fine-tuning CLIP? For pretraining, larger is better in most cases; I'm not sure whether the same holds for fine-tuning, based on your experience?
Hi, any update in 2024? Is there any method to fine-tune the CLIP model using the newest huggingface transformers library (4.35.*)? Thanks!
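The core of CLIP fine-tuning is its symmetric contrastive (InfoNCE) loss, which is what huggingface's `CLIPModel` computes when called with `return_loss=True`. A minimal, self-contained sketch of that loss, so it can be plugged into any training loop over matched image/text feature batches (function name and signature are illustrative):

```python
import torch
import torch.nn.functional as F

def clip_contrastive_loss(image_features, text_features, logit_scale):
    # Symmetric InfoNCE loss used by CLIP: for a batch of N matched
    # image/text pairs, the correct match for each row/column sits on
    # the diagonal of the N x N similarity matrix.
    image_features = F.normalize(image_features, dim=-1)
    text_features = F.normalize(text_features, dim=-1)
    logits = logit_scale * image_features @ text_features.t()
    targets = torch.arange(logits.size(0), device=logits.device)
    loss_i = F.cross_entropy(logits, targets)        # image -> text
    loss_t = F.cross_entropy(logits.t(), targets)    # text -> image
    return (loss_i + loss_t) / 2
```

With the transformers library, the equivalent one-liner is roughly `CLIPModel.from_pretrained(...)(**batch, return_loss=True).loss`; the sketch above is useful when you want control over the loss itself.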
This is my training code; everything goes well: https://github.com/tengshaofeng/finetune-jina-clip-v2
I tried to fine-tune CLIP on my own small dataset. The results are not satisfying (probably overfitting). After checking the issues here, I noticed other people also have similar problems. Can someone share their successful fine-tuning experience or code?