
[BUG]: nb::c_contig constraint is not enforced for non-contiguous PyTorch arrays #278

Closed
bathal1 opened this issue Aug 23, 2023 · 6 comments


bathal1 commented Aug 23, 2023

Problem description

When binding a function that processes tensors, one can specify contiguity flags such as nb::c_contig. Nanobind is expected to ensure that the input tensors are contiguous in memory before calling the function.

However, passing a non-contiguous PyTorch tensor to such a function raises an error instead of making the tensor contiguous under the hood.

Adding the following function definition to the nanobind example project reproduces the issue.

Function binding

m.def("add", [](nb::ndarray<float, nb::c_contig> a, nb::ndarray<float, nb::c_contig> b) { return 0; }, "a"_a, "b"_a);

Reproducer

from nanobind_example import add
import drjit as dr
import torch
a = torch.ones((3,3))
b = dr.ones(dr.llvm.TensorXf, (3,3))

# DrJit tensors, works as expected
print(add(b,b))       # Contiguous tensors, returns 0
print(add(b,b[::-1])) # Non-contiguous, converted implicitly as expected

# PyTorch tensors, fails
print(add(a,a))  # Contiguous tensors, returns 0
print(add(a,a.T)) # a.T is non-contiguous, this fails

This raises the following error:

Traceback (most recent call last):
  File "<stdin>", line 11, in <module>
TypeError: add(): incompatible function arguments. The following argument types are supported:
    1. add(a: ndarray[dtype=float32, order='C'], b: ndarray[dtype=float32, order='C']) -> int

Invoked with types: torch.Tensor, torch.Tensor
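
A possible workaround in the meantime (a sketch, assuming the add binding above) is to materialize a contiguous copy on the Python side before the call:

# torch.Tensor.contiguous() returns the tensor itself if it is already
# contiguous in memory, and a C-contiguous copy otherwise
print(a.T.is_contiguous())       # False: a.T is a non-contiguous view
print(add(a, a.T.contiguous()))  # succeeds and returns 0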

wjakob commented Aug 23, 2023

Just double-checking: wasn't the error in the cholespy repository about a crash, rather than the TypeError mentioned here?


bathal1 commented Aug 23, 2023

The bug report mentioned a crash, but the reproducer produced this error without crashing.


wjakob commented Aug 23, 2023

Does that mean that the code in the original issue didn't have the annotation? Otherwise it sounds like there are two separate issues.


bathal1 commented Aug 23, 2023

The bindings in cholespy already had the nb::c_contig annotations before this issue was filed.


wjakob commented Aug 23, 2023

Fixed in 23ea320.

wjakob closed this as completed Aug 23, 2023
wjakob added a commit that referenced this issue Aug 24, 2023
nanobind ndarrays provide the ``nb::c_contig`` and ``nb::f_contig``
annotations to specify that input arrays must be represented by
contiguous memory blocks in C or Fortran-style ordering. When this is
not the case, nanobind will by default attempt an implicit conversion.

This conversion previously failed in some cases: when no underlying
scalar type was specified, and when converting from PyTorch. Those
issues are addressed by this commit.

Fixes issue #278.
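
With the fix, the failing call from the reproducer should go through (a sketch, assuming nanobind built from 23ea320 or later):

import torch
from nanobind_example import add

a = torch.ones((3, 3))
assert not a.T.is_contiguous()  # a.T is a non-contiguous view
print(add(a, a.T))              # implicitly converted to a contiguous copy, prints 0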

wjakob commented Aug 24, 2023

@bathal1 : I released nanobind v1.5.2 -- you might want to re-release cholespy with those fixes.
