-
Is it possible for Python extensions built with pybind11 and extensions built with nanobind to interoperate? For example, library A may expose …

As always, thanks for the great work on both pybind11 and nanobind!
-
It's a hard limitation. nanobind can access a pybind11 object as a generic Python object (in the sense of duck typing) and vice versa, but you won't be able to safely unpack it and make C++ calls. A tighter coupling would require sharing internal data structures between the libraries, and that would defeat the point of nanobind (which is to avoid the complexity of pybind11's internal data structures). I will also point out that interop is impossible even between different pybind11 versions and different compiler versions, which illustrates how hard/messy this is.
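To illustrate the distinction above, here is a pure-Python analogy (not actual pybind11/nanobind code; all names are invented for illustration). Each binding library keeps its own private registry mapping Python types to C++ layouts, so one library cannot "unwrap" an object bound by the other, while plain Python-level attribute access (duck typing) still works fine:

```python
class PybindVector:
    """Stand-in for a C++ vector type bound by pybind11."""
    def __init__(self):
        self._items = []
    def push_back(self, x):
        self._items.append(x)
    def size(self):
        return len(self._items)

# Registry that "nanobind" consults to unwrap a Python object back to C++.
# PybindVector is absent: it was registered with the *other* library.
NANOBIND_TYPE_REGISTRY = {}

def nanobind_unwrap(obj):
    """Fails for foreign-bound types: their C++ layout is unknown here."""
    cls = type(obj)
    if cls not in NANOBIND_TYPE_REGISTRY:
        raise TypeError(f"{cls.__name__} was not bound with this library")
    return NANOBIND_TYPE_REGISTRY[cls](obj)

def nanobind_duck_typed(obj):
    """Works: only generic Python-level calls, no C++ unwrapping."""
    obj.push_back(42)
    return obj.size()

v = PybindVector()
print(nanobind_duck_typed(v))  # generic Python calls succeed → 1
try:
    nanobind_unwrap(v)
except TypeError as e:
    print("unwrap failed:", e)
```

The real situation is analogous: a nanobind function declared to take a generic Python object can receive a pybind11-bound instance and call its methods through the interpreter, but asking nanobind to cast it to a C++ pointer cannot work safely.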
-
As a follow-up to this question: is it possible to do interop only with return types? In other words, I have a library …
-
I'm sorry to stray off-topic from nanobind, but, Will, I'm not sure Wenzel was promising that different compiled modules would necessarily be able to share C++ <-> Python mappings just because they _are_ both pybind11 modules.
Around pybind11 ~2.1, I asked about this on the pybind11 forums, and someone clarified that the type-mapping system is not necessarily compatible across pybind11 versions, and so it is (was?) hidden in an ABI-versioned C++ namespace. You may find that C++/Python type mappings are only shared between modules built with the same pybind11 version. I believe there is some discussion of this sort of thing in the pybind11 docs, though, and you should be able to get quick answers on the pybind11 Gitter channel.
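A minimal sketch of that ABI-versioning idea, in pure Python (the names here are invented for illustration; they are not pybind11's actual symbols). The shared internals live under a key that encodes the library version and compiler ABI, so modules built with incompatible versions see separate, empty registries instead of corrupting each other's state:

```python
# Stands in for interpreter-wide shared state (pybind11 actually stashes
# its internals in a capsule under a versioned name).
shared_state = {}

def get_internals(abi_tag):
    """Return the type-mapping registry for one ABI flavor.

    Modules with the same tag share one map; a version or compiler
    mismatch changes the tag, yielding a fresh, independent map.
    """
    key = f"__binding_internals_{abi_tag}__"
    return shared_state.setdefault(key, {})

a = get_internals("v2_gcc_cxx17")   # module built with one version/compiler
b = get_internals("v3_gcc_cxx17")   # module built with another
a["MyType"] = "caster-for-MyType"
print("MyType" in b)                # different ABI tag → no shared mapping
```

This is why two pybind11 modules only exchange wrapped C++ objects when their internals keys match exactly.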
To bring this back to nanobind, though: since nanobind embeds its extra state in the PyObject itself, are separately compiled (and distributed) modules able to share type mappings when exchanging nb::objects? That's super cool!
-
I have implemented mostly-complete nanobind/pybind11 interop in a way that doesn't require each library to understand the other's internals. (This was a prerequisite for embarking on a port of a large set of interdependent extension modules from pybind11 to nanobind at my employer, a set too large to port in one fell swoop.) It seems to work well, and I'm wondering if there would be appetite for upstreaming it. The basic approach is to extend each library with the ability to "export" a … for its own types, and "import" one for a foreign (other-library-bound) type. These go in a new internals map (… If you'd be interested in supporting an interface like this, let me know; I'll clean up what I have and submit a PR.