
cleanup prototype transforms functional test #5668

Merged (2 commits) Mar 23, 2022

Conversation

pmeier (Collaborator) commented Mar 23, 2022

The goal of this PR is to promote best practices in our tests. Under normal circumstances these changes would be considered nitpicks, but in light of #5520, and since new contributors usually (and rightfully) copy-paste code from similar tests, I think it is beneficial to establish proper usage early on.

There are two patterns I found:

  1. The constructor of the new features accepts a device kwarg. Thus, something like features.Image(...).to(device) is an anti-pattern and should be replaced by features.Image(..., device=device).
  2. torch.testing.assert_close is able to handle nested sequence comparison. For example, in the bounding box reference tests, we want to check the output tensor of a kernel against a two-level list of values computed with another library. Instead of writing custom comparison logic for this scenario, we can simply do torch.testing.assert_close(tensor.tolist(), list_of_lists).
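For illustration only (this is not torchvision code, and the function name is hypothetical): the nested-sequence handling that pattern 2 relies on can be sketched in pure Python roughly as follows. This is the kind of custom comparison logic that torch.testing.assert_close lets us delete:

```python
import math

def assert_nested_close(actual, expected, rel_tol=1.3e-6, abs_tol=1e-5):
    """Recursively compare nested sequences of numbers elementwise,
    mimicking (in spirit) torch.testing.assert_close's sequence handling.
    Default tolerances are illustrative, not assert_close's actual defaults."""
    if isinstance(actual, (list, tuple)):
        assert len(actual) == len(expected), (
            f"length mismatch: {len(actual)} vs {len(expected)}"
        )
        for a, e in zip(actual, expected):
            assert_nested_close(a, e, rel_tol=rel_tol, abs_tol=abs_tol)
    else:
        assert math.isclose(actual, expected, rel_tol=rel_tol, abs_tol=abs_tol), (
            f"values differ: {actual} vs {expected}"
        )

# A two-level list of box coordinates, as in the bounding box reference tests:
assert_nested_close([[0.0, 1.0], [2.0, 3.0]], [[0.0, 1.0 + 1e-7], [2.0, 3.0]])
```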

cc @vfdev-5

facebook-github-bot commented Mar 23, 2022

💊 CI failures summary and remediations

As of commit 373feaf (more details on the Dr. CI page):


None of the CI failures appear to be your fault 💚



🚧 3 ongoing upstream failures:

These were probably caused by upstream breakages that are not fixed yet.


This comment was automatically generated by Dr. CI (expand for details).


vfdev-5 (Collaborator) commented Mar 23, 2022

@pmeier torch.testing is in beta and may change without warning in future PyTorch releases. How problematic could this be to maintain?

pmeier (Collaborator, Author) commented Mar 23, 2022

As of pytorch/pytorch#73348, merged roughly a month ago, torch.testing is considered stable.

NicolasHug (Member) left a comment

Thanks @pmeier, I agree it's worth encouraging good test practices, and that is best done by example. Just a random question, but LGTM.

    - assert len(output_boxes) == len(expected_bboxes)
    - for a_out_box, out_box in zip(expected_bboxes, output_boxes.cpu()):
    -     np.testing.assert_allclose(out_box.cpu().numpy(), a_out_box)
    + torch.testing.assert_close(output_boxes.tolist(), expected_bboxes)
NicolasHug (Member) commented:

assert_close refuses to compare a list to a tensor?
IIRC the numpy testing utils allow comparing array-likes with array-likes, right?

pmeier (Collaborator, Author) commented Mar 23, 2022

Yes and yes. The numpy testing functions coerce everything into an np.ndarray first and then compare. We consciously opted against that, mainly for two reasons:

  1. If you have tests parametrized over operators, as PyTorch core does, coercion makes it impossible to see whether the returned type is correct. For example

    >>> expected = 1.0
    >>> np.testing.assert_allclose(np.array(expected), expected)
    >>> torch.testing.assert_close(torch.tensor(expected), expected)
    TypeError: No comparison pair was able to handle inputs of type <class 'torch.Tensor'> and <class 'float'>.
  2. It allows us to compare sequences of tensors with different shapes elementwise

    >>> shapes = [(2, 3), (2, 4)]
    >>> np.testing.assert_allclose(*([np.ones(shape) for shape in shapes],) * 2)
    ValueError: could not broadcast input array from shape (2,3) into shape (2,)
    >>> torch.testing.assert_close(*([torch.ones(shape) for shape in shapes],) * 2)

    This aligns nicely with the ability of torch.testing.assert_close to compare mappings elementwise, which is not possible with the numpy functions.
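To illustrate that last point with a hypothetical sketch (this is not the actual assert_close implementation, and the function name is made up): comparing mappings elementwise by key, which coercion-based numpy functions cannot do, looks conceptually like this:

```python
import math

def assert_mapping_close(actual, expected, rel_tol=1.3e-6, abs_tol=1e-5):
    # Compare two mappings key by key, in the spirit of how
    # torch.testing.assert_close handles mappings. np.testing.assert_allclose
    # would instead try to coerce the dicts into arrays and fail.
    assert actual.keys() == expected.keys(), "key sets differ"
    for key in actual:
        a, e = actual[key], expected[key]
        assert math.isclose(a, e, rel_tol=rel_tol, abs_tol=abs_tol), (
            f"values for key {key!r} differ: {a} vs {e}"
        )

# e.g. per-metric outputs compared against reference values:
assert_mapping_close({"iou": 0.5, "loss": 1.0}, {"iou": 0.5, "loss": 1.0 + 1e-8})
```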

@pmeier pmeier merged commit 56fb0bf into pytorch:main Mar 23, 2022
@pmeier pmeier deleted the cleanup-functional-test branch March 23, 2022 18:45
facebook-github-bot pushed a commit that referenced this pull request Apr 5, 2022
Summary: (Note: this ignores all push blocking failures!)

Reviewed By: datumbox

Differential Revision: D35216775

fbshipit-source-id: cc4aa86ec81c3d2b88b601d190ed763f2b4f3b91
4 participants