
Introducing array-like sequence methods __contains__ #17733

Closed
wants to merge 2 commits into from

Conversation

bhushan23
Contributor

for tensor

Fixes: #17000

@bhushan23
Contributor Author

@colesbury where should the documentation for this go? (_torch / _tensor)_docs?

Also, I don't think we need an ATen implementation for reversed and contains, since these are high-level functions and it should be okay for the internal functions they call to be ATen.

@vadimkantorov
Contributor

vadimkantorov commented Mar 7, 2019

hmm, strange, I thought reversed was already added in #9216

@soumith
Member

soumith commented Mar 7, 2019

@bhushan23 the docs should go into _tensor_docs, I think

vishwakftw changed the title from "Introducing array-like sequence methods __contains__ and __reversed__" to "Introducing array-like sequence methods __contains__" on Mar 7, 2019
@bhushan23
Contributor Author

@pytorchbot retest this please

torch/tensor.py Outdated
@@ -426,6 +426,10 @@ def __array_wrap__(self, array):
            array = array.astype('uint8')
        return torch.from_numpy(array)

    def __contains__(self, val):
        r"""Check if `val` is present"""
        return (val == self).any().item()
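
For context, a minimal illustration of the semantics of the diff above (illustration only, not part of the PR; outputs shown for a recent PyTorch):

>>> import torch
>>> t = torch.arange(10)
>>> 3 in t     # (3 == t) broadcasts element-wise, .any() checks for any match
True
>>> 42 in t
False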
Collaborator

shouldn't you make sure that val is a PyTorch/Python (or NumPy?) scalar first, and return NotImplemented otherwise?

Contributor

What about the case where we need to check whether a row exists in a matrix?

Contributor Author

@ssnl raising NotImplementedError if it is not a tensor or scalar.
@vishwakftw NumPy also does not check whether a row exists in a matrix.
Current behavior: any one element being present returns True.
Certainly, checking whether a row exists would be good to have; the question is whether to make it the default or put it behind some option.
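
To make the two semantics concrete, a minimal sketch (illustration only, not part of the PR; outputs shown for a recent PyTorch, older versions may print 1/0):

>>> m = torch.tensor([[1, 2], [3, 4]])
>>> row = torch.tensor([1, 4])
>>> (row == m).any().item()               # current behavior: True if any element matches anywhere
True
>>> (m == row).all(dim=1).any().item()    # a possible whole-row membership check
False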

Collaborator

@ssnl left a comment

also, a test is needed

for tensor

Test added:
1. test_contains
torch/tensor.py Outdated
"""
if torch.is_tensor(element) or isinstance(element, Number):
return (element == self).any().item()
raise NotImplementedError(
Collaborator

hmm I think you should return NotImplemented rather than raising NotImplementedError so that the default behavior using __iter__ can be invoked.

Contributor Author

Strange observation:

>>> a = torch.arange(10)
>>> 'b' in a   # this returns True, because of the fallback to __iter__

Collaborator

string 'b'? that doesn't look right...

Contributor Author

Yeah. I checked NotImplemented: it looks like the result is being cast to bool somewhere, and bool(NotImplemented) is True. That's why it's not falling back to the __iter__-based method.

By the way, why should we fall back to the basic method at all? It will still lead to an error if the argument is not a tensor or scalar.

And isn't a specific error from us preferable to relying on the standard __iter__-based method to raise some arbitrary error?

ref: https://medium.com/@s16h/pythons-notimplemented-type-2d720137bf41
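
The root cause is reproducible in plain Python: the in operator coerces whatever __contains__ returns to bool, NotImplemented is truthy (using it in a boolean context is deprecated since Python 3.9), and __iter__ is only consulted when __contains__ is not defined at all. A minimal sketch, not code from the PR:

>>> bool(NotImplemented)       # truthy; deprecated in a boolean context since Python 3.9
True
>>> class Fake:
...     def __contains__(self, x):
...         return NotImplemented    # `in` coerces this to True
...     def __iter__(self):
...         return iter([])          # never consulted, because __contains__ is defined
...
>>> 'b' in Fake()
True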

torch/tensor.py Outdated
            element (Tensor or scalar): element to be checked
                for presence in current tensor"
        """
        if torch.is_tensor(element) or isinstance(element, Number):
Collaborator

@ssnl Mar 8, 2019

use isinstance(element, (torch.Tensor, Number)) instead

Contributor Author

done.

- returning NotImplemented instead of Error
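
Putting the isinstance suggestion and the commit above together, a hedged sketch of roughly what the method ends up as (the exact merged code may differ in details):

from numbers import Number

import torch

# Shown standalone for illustration; in the PR this is a method on torch.Tensor.
def __contains__(self, element):
    r"""Check if `element` is present in current tensor

    Args:
        element (Tensor or scalar): element to be checked
            for presence in current tensor
    """
    if isinstance(element, (torch.Tensor, Number)):
        # Element-wise comparison; truthy if any position matches.
        return (element == self).any().item()
    # Per the commit above; note the caveat discussed earlier that `in`
    # coerces NotImplemented to True instead of falling back to __iter__.
    return NotImplemented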
Contributor

@facebook-github-bot left a comment

@soumith is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.


Successfully merging this pull request may close these issues.

Tensor class should implement __contains__