Why the Gated CNN Blocks are not 24 layers? #5
Comments
They could also have set 8 layers, but they did not do that either. Man!
It's worth discussing, and I think it's necessary to reproduce the code and change the number of layers to test the result.
It would be meaningful work!
This issue alone stands out from all the rest. Greatness needs no words!
《changed the title from "什么罐头我说?" (a Chinese homophonic pun on Kobe's "What can I say") to "Why the Gated CNN Blocks are not 24 layers?"》hhhhhh
Thank you so much for your suggestion. We released MambaOut-Kobe model, a Kobe Memorial version with 24 Gated CNN blocks. MambaOut-Kobe achieves really competitive performance, surpassing ResNet-50 and ViT-S with much fewer parameters and FLOPs. For example, MambaOut-Kobe outperforms ViT-S by 0.2% accuracy with only 41% parameters and 33% FLOPs.
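For readers curious what "24 Gated CNN blocks" means in practice, below is a minimal numpy sketch of stacking 24 gated convolution blocks. This is a hypothetical simplification for illustration only, not the repo's actual `GatedCNNBlock` (names, projection sizes, and the 1-D depthwise-style convolution here are all assumptions; the real model operates on 2-D feature maps with normalization and MLP expansion).

```python
import numpy as np

rng = np.random.default_rng(0)

def gated_cnn_block(x, w_val, w_gate, kernel):
    """Simplified gated CNN block (illustrative, not MambaOut's exact code).

    x: (seq_len, dim). The input is projected into a "value" branch and a
    sigmoid "gate" branch; the value branch goes through a small 1-D
    convolution along the sequence and is modulated by the gate.
    """
    value = x @ w_val
    gate = 1.0 / (1.0 + np.exp(-(x @ w_gate)))  # sigmoid gating signal
    # depthwise-style 1-D convolution with "same" padding
    k = len(kernel)
    pad = k // 2
    padded = np.pad(value, ((pad, pad), (0, 0)))
    conv = sum(kernel[i] * padded[i:i + len(x)] for i in range(k))
    return x + conv * gate  # residual connection around the gated branch

dim, seq_len, depth = 16, 8, 24  # 24 blocks, as in MambaOut-Kobe
x = rng.standard_normal((seq_len, dim))
params = [(rng.standard_normal((dim, dim)) * 0.1,   # value projection
           rng.standard_normal((dim, dim)) * 0.1,   # gate projection
           rng.standard_normal(3) * 0.1)            # conv kernel (size 3)
          for _ in range(depth)]
for w_val, w_gate, kernel in params:
    x = gated_cnn_block(x, w_val, w_gate, kernel)
print(x.shape)
```

The point of the sketch is only the depth: the forward pass applies the same residual gated block 24 times, so changing the block count is a one-line change to `depth`.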
Man! Hahahaha
It's meaningful work. What can I say?
What a great suggestion!
I think it's necessary to set MambaOut to 24 layers in memory of Kobe Bryant.