
MetricTracker of a single metric should not accept a list of maximize #1428

Closed
ValerianRey opened this issue Jan 6, 2023 · 3 comments · Fixed by #1430
Labels
bug / fix (Something isn't working) · help wanted (Extra attention is needed)

Comments

@ValerianRey
Contributor

ValerianRey commented Jan 6, 2023

🐛 Bug

It can be confusing that MetricTracker can be instantiated with a single metric and still accept a list of booleans as maximize. Worse, when maximize is a list and metric is a single Metric, tracker.best_metric will always maximize the metric, even if maximize is [False], which is clearly not the expected behavior. The cause is these two lines in MetricTracker's best_metric method:

if isinstance(self._base_metric, Metric):
    fn = torch.max if self.maximize else torch.min

Here, if self.maximize is any non-empty list, the condition will be True.
Raising an error at initialization in such a scenario would be safer.
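A quick, dependency-free illustration of why the check misfires: any non-empty Python list is truthy, so the ternary selects torch.max regardless of the list's contents.

```python
# Any non-empty list is truthy in Python, so a check like
# `torch.max if self.maximize else torch.min` picks torch.max
# even when maximize is [False].
print(bool([False]))  # True: non-empty list, contents irrelevant
print(bool([]))       # False: only the empty list is falsy
```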

To Reproduce

Initialize a MetricTracker with a single metric as metric and a list of booleans as maximize.

Code sample

>>> import torch
>>> from torchmetrics import MetricTracker, MeanSquaredError
>>> _ = torch.manual_seed(42)
>>> tracker = MetricTracker(MeanSquaredError(), maximize=[False])
>>> for epoch in range(5):
...     tracker.increment()
...     for batch_idx in range(5):
...         preds, target = torch.randn(100), torch.randn(100)
...         tracker.update(preds, target)
... 
>>> best_acc, which_epoch = tracker.best_metric(return_step=True)
>>> print(best_acc)
2.2481114864349365
>>> print(which_epoch)
4
>>> print(tracker.compute_all())
tensor([1.8218, 2.0268, 1.9491, 1.9800, 2.2481])

=> The metric has been maximized despite maximize being [False]

Expected behavior

A ValueError should be raised at the initialization of MetricTracker, indicating that maximize must be a single bool when metric is a single Metric.

Environment

  • TorchMetrics version: 0.12.0dev
  • Python & PyTorch Version: Python 3.10.6, torch 1.13.1+cu117
  • Any other relevant information such as OS (e.g., Linux): Ubuntu 20.04

Additional context

With the added support for MultioutputWrapper that I am working on (#1409), this becomes even more confusing: a MultioutputWrapper is a single Metric, yet a user could be tempted to pass a list of booleans as maximize.

@ValerianRey ValerianRey added bug / fix Something isn't working help wanted Extra attention is needed labels Jan 6, 2023
@Borda
Member

Borda commented Jan 6, 2023

good catch, would you be interested in sending a fix? 🐶

@ValerianRey
Contributor Author

Yes, I coded a fix already, just need to make a proper pull request!

@ValerianRey
Contributor Author

I'm also curious about why we use a maximize value to begin with.
Couldn't we simply use the metric.higher_is_better instead (and [m.higher_is_better for m in metric.values()] when metric is a MetricCollection)?
It seems that maximize is redundant with higher_is_better.
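A hypothetical sketch of that idea (`infer_maximize` is an invented name; real torchmetrics metrics may also have higher_is_better set to None, which this sketch ignores, and a plain namespace and a dict stand in for Metric and MetricCollection):

```python
from types import SimpleNamespace

def infer_maximize(metric):
    """Derive `maximize` from higher_is_better instead of a separate argument.

    A MetricCollection is dict-like and exposes .values(); a single Metric
    does not, which is how the two cases are told apart here.
    """
    if hasattr(metric, "values"):
        return [m.higher_is_better for m in metric.values()]
    return metric.higher_is_better

# Stand-ins: SimpleNamespace for a Metric, a dict for a MetricCollection.
mse = SimpleNamespace(higher_is_better=False)
acc = SimpleNamespace(higher_is_better=True)
print(infer_maximize(mse))                       # False
print(infer_maximize({"mse": mse, "acc": acc}))  # [False, True]
```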

ValerianRey added a commit to ValerianRey/metrics that referenced this issue Jan 6, 2023
* Add test case to `test_raises_error_on_wrong_input` to check that the initialization of a MetricTracker with a single `metric` and a list `maximize` raises a ValueError
ValerianRey added a commit to ValerianRey/metrics that referenced this issue Jan 6, 2023
* Add check on maximize to be a single bool when the metric is a single Metric
ValerianRey added a commit to ValerianRey/metrics that referenced this issue Jan 6, 2023
Borda pushed a commit that referenced this issue Jan 9, 2023
Borda pushed a commit that referenced this issue Jan 30, 2023