Configurable metrics #230
Conversation
```diff
-pixel_f1 = F1(num_classes=1, compute_on_step=False, threshold=self.hparams.model.threshold.pixel_default)
-self.image_metrics = MetricCollection([image_auroc, image_f1], prefix="image_").cpu()
-self.pixel_metrics = MetricCollection([pixel_auroc, pixel_f1], prefix="pixel_").cpu()
+self.image_metrics, self.pixel_metrics = get_metrics(self.hparams)
```
Is there something planned for classification models, which do not have pixel metrics? When I removed the pixel metric key from the config file, it threw an error for padim.
This should work:

```yaml
metrics:
  image:
    - F1
    - AUROC
  pixel: []
```
Works now 🙂
Not sure why my comments disappeared from here, but ideally

```yaml
metrics:
  image:
    - F1
    - AUROC
```

should also work fine.
Works like a charm!
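For reference, a minimal sketch of the config lookup that makes both cases behave the same; the `.get`-with-default pattern is an assumption about how this could be implemented, not the actual anomalib code:

```python
from omegaconf import OmegaConf

# A config with image metrics only, as in the comment above.
config = OmegaConf.create({"metrics": {"image": ["F1", "AUROC"]}})

# Looking the keys up with a default means `pixel: []` and a missing
# `pixel` key both resolve to "no pixel metrics" instead of raising.
image_names = config.metrics.get("image", [])
pixel_names = config.metrics.get("pixel", [])
print(list(image_names), list(pixel_names))  # ['F1', 'AUROC'] []
```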
```python
from .min_max import MinMax
from .optimal_f1 import OptimalF1

__all__ = ["AUROC", "OptimalF1", "AdaptiveThreshold", "AnomalyScoreDistribution", "MinMax"]


def get_metrics(config: Union[ListConfig, DictConfig]) -> Tuple[AnomalibMetricCollection, AnomalibMetricCollection]:
```
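A rough sketch of what the body of `get_metrics` could look like, reusing the defaulting lookup shown earlier and assuming metric names are resolved on the `torchmetrics` namespace; `_metrics_from_names` is a hypothetical helper, and the plain `MetricCollection` stands in for the `AnomalibMetricCollection` wrapper:

```python
from typing import List, Tuple, Union

import torchmetrics
from omegaconf import DictConfig, ListConfig
from torchmetrics import Metric, MetricCollection


def _metrics_from_names(names: List[str]) -> List[Metric]:
    # Hypothetical helper: resolve each name (e.g. "AUROC") to a
    # torchmetrics class and instantiate it with default arguments.
    return [getattr(torchmetrics, name)() for name in names]


def get_metrics(config: Union[ListConfig, DictConfig]) -> Tuple[MetricCollection, MetricCollection]:
    image_metrics = MetricCollection(_metrics_from_names(config.metrics.get("image", [])), prefix="image_")
    pixel_metrics = MetricCollection(_metrics_from_names(config.metrics.get("pixel", [])), prefix="pixel_")
    return image_metrics, pixel_metrics
```

A module would then consume it exactly as in the diff above: `self.image_metrics, self.pixel_metrics = get_metrics(self.hparams)`.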
We might need to modify this for LightningCLI.
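One possible direction, sketched here purely as an assumption about the eventual design: accept plain name lists so that LightningCLI can bind them as ordinary `__init__` arguments instead of needing the whole OmegaConf object:

```python
from typing import List, Tuple

from torchmetrics import MetricCollection


def get_metrics(
    image_metric_names: List[str],
    pixel_metric_names: List[str],
) -> Tuple[MetricCollection, MetricCollection]:
    # Stub only: same construction logic as above, but the inputs are
    # primitives that LightningCLI can parse from the command line or YAML.
    ...
```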
Might be out of scope for this PR, but we can have a look at Lightning-AI/torchmetrics#709, which updates the base metrics only once. Maybe we can find a way to automatically group metrics.
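For reference, later torchmetrics releases expose this kind of grouping as the `compute_groups` flag on `MetricCollection`; this sketch assumes a release that includes it, along with the newer task-based metric classes:

```python
from torchmetrics import MetricCollection
from torchmetrics.classification import BinaryAUROC, BinaryF1Score

# With compute_groups enabled, metrics that share internal state are
# updated once per batch instead of once per metric in the collection.
image_metrics = MetricCollection(
    [BinaryF1Score(), BinaryAUROC()],
    prefix="image_",
    compute_groups=True,
)
```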
Can you fix the torchmetrics version in this PR as well?