🚀Add SuperSimpleNet model #2428
Conversation
force-pushed from 6acde98 to 3c65da6
This is amazing @blaz-r! Thanks a lot! I'll review it shortly.
I have fixed the linter error and formatted the files. The tests for SuperSimpleNet are still failing on ONNX and OpenVINO export locally. I'm not entirely sure how to fix this, but I'll take a look when I get the time.
Tests are failing on imgaug; it seems to be something with a numpy version incompatibility.
@blaz-r, we will need to remove …
directly predicts the anomaly map and score. The predicted anomaly map is upscaled to match the input image size
and refined with a Gaussian filter.

This implementation supports both unsupervised and supervised settings, but Anomalib currently supports only unsupervised learning.
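For illustration, a minimal sketch of the upscale-and-refine step described in the docstring above (not the PR's exact code; the helper name refine_anomaly_map and the sigma value are assumptions, and kornia's Gaussian blur is used as Anomalib does elsewhere):

import torch.nn.functional as F
from kornia.filters import gaussian_blur2d

def refine_anomaly_map(anomaly_map, image_size, sigma=4.0):
    """Upscale a low-resolution anomaly map (B, 1, h, w) to image_size (H, W) and smooth it."""
    # Upscale the predicted map to the input image size.
    anomaly_map = F.interpolate(anomaly_map, size=image_size, mode="bilinear", align_corners=False)
    # Gaussian smoothing removes the blocky artifacts introduced by upscaling.
    kernel_size = 2 * int(4.0 * sigma + 0.5) + 1  # odd kernel size derived from sigma
    return gaussian_blur2d(anomaly_map, (kernel_size, kernel_size), (sigma, sigma))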
What are the missing pieces in Anomalib needed to support the supervised setting?
Right now I believe there are no standard supervised datasets. Another problem is the Folder dataset, as it assumes that abnormal samples are always in the test set:
anomalib/src/anomalib/data/image/folder.py
Lines 174 to 175 in bcc0b43

samples.loc[(samples.label == DirType.NORMAL), "split"] = Split.TRAIN
samples.loc[(samples.label == DirType.ABNORMAL) | (samples.label == DirType.NORMAL_TEST), "split"] = Split.TEST
Another thing needed for full reproduction of the SuperSimpleNet results is the fixed flipping augmentation and frequency sampling. This is not strictly necessary, but it is needed for the best results. It's also not SuperSimpleNet specific, so it might be worth considering if other supervised models will be supported.
yeah, we would like to diversify the model pool and include more learning types than just one-class models. Thanks for the feedback.
@abc-125, you might want to be aware of this discussion as you have recently worked on this stuff
Thank you for adding me, it would be great to have supervised models and datasets in Anomalib. Recently, I looked at how to add a supervised dataset, and it certainly would require changing some base structures, such as paths to folders (I guess we will need abnormal_train_dir, and maybe renaming the rest to make it easier to understand, normal_train_dir, etc.):
anomalib/src/anomalib/data/image/folder.py
Lines 195 to 202 in bcc0b43

normal_dir (str | Path | Sequence): Path to the directory containing normal images.
root (str | Path | None): Root folder of the dataset.
    Defaults to ``None``.
abnormal_dir (str | Path | Sequence | None, optional): Path to the directory containing abnormal images.
    Defaults to ``None``.
normal_test_dir (str | Path | Sequence | None, optional): Path to the directory containing
    normal images for the test dataset.
    Defaults to ``None``.
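To make the discussion concrete, here is a hypothetical sketch of what a supervised Folder setup could look like. Note that abnormal_train_dir is an assumed new parameter from this thread, not an existing Anomalib argument, and the paths are made up:

from anomalib.data import Folder

datamodule = Folder(
    name="my_dataset",
    root="datasets/my_dataset",
    normal_dir="train/good",         # normal training images
    abnormal_train_dir="train/bad",  # hypothetical: abnormal images used during training
    abnormal_dir="test/bad",         # abnormal test images
    normal_test_dir="test/good",     # normal test images
)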
@blaz-r can you pull the latest changes again? The tests should pass after that.
Okay, I am currently at ICPR. I will try to sort this out when I get some time.
ah nice, enjoy the conference!
This is now using the updated Perlin noise code and includes the latest v2 changes.
Can you run the checks for this please, @djdameln @samet-akcay?
Codecov Report
Attention: Patch coverage is …

Additional details and impacted files

@@ Coverage Diff @@
##    release/v2.0.0    #2428      +/-   ##
==================================================
+ Coverage    78.53%   78.93%   +0.39%
==================================================
  Files          303      311       +8
  Lines        12934    13197     +263
==================================================
+ Hits         10158    10417     +259
- Misses        2776     2780       +4

Flags with carried forward coverage won't be shown.
☔ View full report in Codecov by Sentry.
@blaz-r, the tests seem to pass. I'll give it another review and then we can merge. Thanks for this great contribution!
I only have a single comment...
Anomalib's model naming convention is usually snake_case for the model directory and CamelCase for the model class implementation. If this is followed, supersimplenet would be super_simple_net, as the class name is SuperSimpleNet. Which one do you think is better in this case?
I think that …
I have now renamed the class to follow the naming convention.
Very nice and clean implementation, thanks a lot Blaž!
I only have some minor comments, mainly some naming issues.
optim = AdamW(
    [
        {
            "params": self.model.adaptor.parameters(),
            "lr": 0.0001,
        },
        {
            "params": self.model.segdec.parameters(),
            "lr": 0.0002,
            "weight_decay": 0.00001,
        },
    ],
)
sched = MultiStepLR(
    optim,
    milestones=[int(self.trainer.max_epochs * 0.8), int(self.trainer.max_epochs * 0.9)],
    gamma=0.4,
)
return [optim], [sched]
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
We prefer to use full variable names, i.e. optimizer and scheduler.
I updated this and some other names as well.
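For illustration, the body of configure_optimizers would then read roughly as follows (a sketch assuming only the variable names changed, with the hyperparameters kept from the block quoted above):

optimizer = AdamW(
    [
        {"params": self.model.adaptor.parameters(), "lr": 0.0001},
        {"params": self.model.segdec.parameters(), "lr": 0.0002, "weight_decay": 0.00001},
    ],
)
scheduler = MultiStepLR(
    optimizer,
    milestones=[int(self.trainer.max_epochs * 0.8), int(self.trainer.max_epochs * 0.9)],
    gamma=0.4,
)
return [optimizer], [scheduler]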
@property
def learning_type(self) -> LearningType:
    """Return the learning type of the model.

    Returns:
        LearningType: Learning type of the model.
    """
    return LearningType.ONE_CLASS
I guess this would change when we add support for the supervised mode to the overall Anomalib pipeline. Maybe you could add a comment about this, as it might confuse users. Just a brief mention in the docstring would be sufficient.
I added a comment clarifying this.
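As an illustration, the clarifying note could take roughly this form (the exact wording in the PR may differ):

@property
def learning_type(self) -> LearningType:
    """Return the learning type of the model.

    Note:
        SuperSimpleNet supports both unsupervised and supervised training,
        but Anomalib's pipeline currently runs it only in the unsupervised
        (one-class) mode, hence ONE_CLASS is returned here.

    Returns:
        LearningType: Learning type of the model.
    """
    return LearningType.ONE_CLASS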
def init_weights(m: nn.Module) -> None:
    """Init weight of the model.

    Args:
        m (nn.Module): torch module.
    """
    if isinstance(m, nn.Linear | nn.Conv2d):
        nn.init.xavier_normal_(m.weight)
    elif isinstance(m, nn.BatchNorm1d | nn.BatchNorm2d):
        nn.init.constant_(m.weight, 1)
Again, I would prefer a full variable name here for readability.
Suggested change:

def init_weights(module: nn.Module) -> None:
    """Init weight of the model.

    Args:
        module (nn.Module): torch module.
    """
    if isinstance(module, nn.Linear | nn.Conv2d):
        nn.init.xavier_normal_(module.weight)
    elif isinstance(module, nn.BatchNorm1d | nn.BatchNorm2d):
        nn.init.constant_(module.weight, 1)
This is now changed.
return sum(feature.shape[1] for feature in features.values())


class FeatureAdaptor(nn.Module):
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
I would prefer FeatureAdapter, since we try to use American English throughout the code base, but I guess it would cause a discrepancy with the official implementation in your own repo.
@samet-akcay @ashwinvaidya17 thoughts?
as someone based in the UK, not sure what to say here 😁
I changed this to FeatureAdapter; I think it's clear enough when compared with the official code 😄
from anomalib.data.utils.generators import generate_perlin_noise


class SSNAnomalyGenerator(nn.Module):
Not sure about the naming here. Why not simply use AnomalyGenerator? You don't use the SSN prefix for the other submodules.
This is now just AnomalyGenerator.
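For context, a minimal sketch of how a generator of this kind typically uses the imported Perlin helper; the size arguments and the 0.5 threshold are assumptions for illustration, and the exact generate_perlin_noise signature may differ in the updated code:

from anomalib.data.utils.generators import generate_perlin_noise

# Perlin noise gives smooth, blob-like patterns; thresholding turns the noise
# into an irregular binary mask into which synthetic anomalies can be pasted.
noise = generate_perlin_noise(256, 256)  # 2D noise map, values roughly in [0, 1]
mask = (noise > 0.5).float()             # binary mask marking anomaly regions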
Thank you for the review. The comments are now addressed. Let me know if anything else needs to be sorted.
Seems like all checks pass except one, which is failing due to some dead links. I think this is unrelated to my PR, or did I break something in the docs?
no worries, it is related to some new documentation URL, which is not on the readthedocs yet
Thanks for the awesome work!
Merged a304221 into openvinotoolkit:release/v2.0.0