Document the usage of timm as backbone provider #544
Conversation
Thanks for updating the documentation! I have a minor comment.
Requested changes done.
@jpcbertoldo can you have a look at the issues raised by Codacy?
I'll try if I have time today.
Sure, take your time. I just ran the CI and I can see that it fails the linting checks, but we can have a call if needed.
This was integrated in #576.
I used convnextv2_huge.fcmae_ft_in22k_in1k_512 to replace the pre-trained wide_resnet50_2 backbone in PatchCore, with layers=['stages.1', 'stages.2'] for classification. The metrics are far worse than with wide_resnet50_2. Isn't ConvNeXt V2 better than wide_resnet50_2? Why would it perform so badly?
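(For reference, a minimal sketch of how the feature stages exposed by a timm backbone can be inspected before choosing the `layers` setting; the model name follows the comment above, and the names, channel counts, and strides printed come from timm's own feature metadata, not from anomalib.)

```python
# Sketch: list the feature stages a timm backbone exposes, to help pick layers.
import timm

model = timm.create_model(
    "convnextv2_huge.fcmae_ft_in22k_in1k_512",
    pretrained=False,   # set True to download the pre-trained weights
    features_only=True, # return intermediate feature maps instead of logits
)

info = model.feature_info
for name, channels, stride in zip(info.module_name(), info.channels(), info.reduction()):
    print(f"{name}: {channels} channels, stride {stride}")
```

Differences in stage depth, channel width, and the resolution the weights were fine-tuned at (512 px here) can all affect how well a backbone's intermediate features transfer to PatchCore, so a stronger classifier does not automatically give better anomaly-detection features.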
Description
Add some text in the documentation to make it clearer that timm is used to fetch pre-trained models.
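As a minimal sketch of what the documented behaviour means in practice, the snippet below shows how a pre-trained backbone is fetched through timm; the model name and `out_indices` values are illustrative, not anomalib defaults.

```python
# Sketch: fetch a pre-trained feature extractor via timm.
import timm
import torch

backbone = timm.create_model(
    "wide_resnet50_2",
    pretrained=True,     # download ImageNet-pre-trained weights
    features_only=True,  # expose intermediate feature maps
    out_indices=(2, 3),  # illustrative: two intermediate stages
)

with torch.no_grad():
    features = backbone(torch.randn(1, 3, 224, 224))
for feature_map in features:
    print(tuple(feature_map.shape))
```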
Changes
Checklist