build Python wheel #521
Conversation
Codecov Report
Patch coverage:
Additional details and impacted files

@@           Coverage Diff           @@
##           master     #521   +/-   ##
=======================================
  Coverage   91.50%   91.50%
=======================================
  Files          35       35
  Lines        4521     4521
=======================================
  Hits         4137     4137
  Misses        384      384

☔ View full report in Codecov by Sentry.
Force-pushed from 0e979d0 to 2f63a62 (Compare)
@elalish maybe you can have a look at this later? Both Windows and macOS have trouble finding the installed package. You can use setup.py to build and inspect the generated wheel file.
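As a sketch of that inspection step: a wheel is just a zip archive, so its contents can be listed with the standard library. The snippet below fabricates a tiny stand-in wheel in a temp directory (file and package names are illustrative); with the real project you would point it at the wheel produced by setup.py instead.

```python
# A wheel is just a zip archive, so the stdlib can inspect what got
# packaged. We fabricate a minimal stand-in wheel here; in practice
# you would open the wheel produced by `python setup.py bdist_wheel`.
import os
import tempfile
import zipfile

tmp = tempfile.mkdtemp()
wheel_path = os.path.join(tmp, "demo-1.0-py3-none-any.whl")

# Fabricate a minimal "wheel" containing one extension-like file.
with zipfile.ZipFile(wheel_path, "w") as z:
    z.writestr("demo/_core.so", b"")
    z.writestr("demo-1.0.dist-info/WHEEL", b"Wheel-Version: 1.0\n")

# The actual inspection step: list the archive contents to check
# whether the extension module landed where the importer expects it.
with zipfile.ZipFile(wheel_path) as z:
    names = z.namelist()

print(names)
```

If the shared library ends up at the wrong path inside the archive, the installed package will import on one platform and fail on another, which matches the Windows/macOS symptom described above.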
Could it be this?
I don't think so; we only have a dynamic library file in there, and we should have no Python files.
I don't know; it seems like package installation didn't go well. Did it work on Linux? Did you ever look into scikit-build: #362 (comment)? It seems to be designed specifically to streamline our type of build process. A wild guess says that if it were easy, they wouldn't have bothered making a package for it...
No, I haven't tried scikit-build, but I tried another CMake extension for setuptools (https://github.com/diegoferigo/cmake-build-extension) without success.
Did you see this, regarding the mac issue?
Yeah, I saw that, but we are not calling cibuildwheel from a Python script, so this should not be an issue; we are running the tests from within cibuildwheel (so the virtualenv should be the same).
I guess I should try scikit-build later.
Force-pushed from 4fd2c17 to 13ae702 (Compare)
OK, it seems scikit-build works.
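For context, a scikit-build setup typically declares its build requirements in pyproject.toml so that pip can drive CMake through setuptools. The fragment below is an illustrative sketch only; the entries are assumptions, not the PR's actual configuration.

```toml
# Illustrative pyproject.toml fragment for a scikit-build based build.
# scikit-build hooks into setuptools and invokes CMake to produce the
# extension module that gets packaged into the wheel.
[build-system]
requires = ["setuptools", "scikit-build", "cmake", "ninja"]
build-backend = "setuptools.build_meta"
```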
@elalish per https://cibuildwheel.readthedocs.io/en/stable/deliver-to-pypi/, we probably want to run the build-wheel actions on tag/release only as well, since they are too time-consuming.
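One common way to do that (a sketch; the trigger names are standard GitHub Actions syntax, but the pattern is an assumption, not the repo's actual workflow) is to run the wheel-building workflow only on version tags:

```yaml
# Illustrative workflow trigger: build wheels only when a version tag
# like v1.2.3 is pushed, instead of on every commit.
on:
  push:
    tags:
      - "v*"
```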
musllinux_1_1 is too old
the cibuildwheel errors here are weird... will have a look at them later |
Thanks so much for your work on this; it would be great to use this tool from Python! The same was done in the following PR for a different library: BerkeleyAutomation/python-fcl#70
Thanks a lot!
I found that in order to make the Python source distribution work, we have to put everything in the top-level directory.
Oh, and we can switch to the build backend as well, which verifies the sdist for us.
Ah, I have no idea why the macOS build keeps running for two hours. I guess it is stuck somewhere.
I am not entirely sure whether my current way of using a MANIFEST.in file to specify source files is idiomatic. It seems to be mainly intended for data files, but it works.
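For reference, a MANIFEST.in along these lines (illustrative paths, not the PR's actual file) pulls the C++ sources and CMake files into the sdist so a source build can succeed:

```
# Illustrative MANIFEST.in: include the native sources and build files
# that the sdist needs for a from-source build.
recursive-include src *.h *.cpp
include CMakeLists.txt
```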
speed up compilation a bit
@@ -291,7 +291,7 @@ std::shared_ptr<const PathImpl> CrossSection::GetPaths() const {
   if (transform_ == glm::mat3x2(1.0f)) {
     return paths_;
   }
-  paths_ = shared_paths(transform(paths_->paths_, transform_));
+  paths_ = shared_paths(::transform(paths_->paths_, transform_));
What does :: do?
It searches for the transform function defined in the anonymous namespace. By default (without the leading ::), lookup would find the one in the current namespace, i.e. the transform function defined in par.
I found this error when I was experimenting with precompiled headers (minor compile time improvement)
@@ -12,10 +12,6 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.

-#include <thrust/count.h>
-#include <thrust/logical.h>
-#include <thrust/transform_reduce.h>
👍
 auto m4 = Manifold::Cube({4, 1, 1}).Transform(
-    glm::translate(glm::mat4(1.0f), glm::vec3(3.0f, 0, 0)));
+    glm::mat4x3(glm::translate(glm::mat4(1.0f), glm::vec3(3.0f, 0, 0))));
Is there a reason we're using Transform instead of Translate directly?
Yeah, I found that we actually set GLM_FORCE_EXPLICIT_CTOR in public.h, and compiling with a precompiled header then causes errors. I think it is something related to include order, which lets it pass compilation in normal builds but fail when I tried using a precompiled header.
      - uses: pypa/cibuildwheel@v2.15.0
      - uses: actions/upload-artifact@v3
        with:
          path: ./wheelhouse/*.whl
So, we upload the artifact to the wheelhouse, but what do we do with it? Should we add a release action? Should we try TestPyPI first?
Excellent work! I can't wait to see this published as a proper Python package!
I also looked at nanobind, which says it can compile ABI3 wheels (while pybind11 explicitly said they will not support ABI3) and claims to be faster in both compilation time and runtime. The port is quite simple, and it seems their API is a bit more flexible than the one provided by pybind11, although a bit less ergonomic (it does not perform list <-> array conversion automatically). The build is also very fast.
I don't know Python well enough to have an opinion on ABI3 or pybind11 (what is ABI3?). I trust your judgement here. Not sure if @Eric-Vin has any thoughts to add?
https://pyo3.rs/v0.14.5/building_and_distribution#py_limited_apiabi3 has a good summary.
So basically the idea is forward compatibility: old wheels can be used with newer Python versions without recompiling. As Python keeps evolving, we could then build only a limited set of wheels (otherwise we need one for every supported Python version, multiplied by the supported OSes). But the problem is that while nanobind can target abi3, it can only target abi3-py312, which is not even released yet...
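For illustration, the difference shows up directly in the wheel filename tags: per-version wheels carry a CPython-specific ABI tag, while a stable-ABI wheel carries abi3 and works on that version and newer. The filenames below are hypothetical examples, not actual releases.

```python
# Hypothetical wheel filenames showing per-version vs. abi3 tagging.
per_version = [
    "manifold3d-2.2.0-cp38-cp38-manylinux_2_17_x86_64.whl",
    "manifold3d-2.2.0-cp39-cp39-manylinux_2_17_x86_64.whl",
    "manifold3d-2.2.0-cp310-cp310-manylinux_2_17_x86_64.whl",
]
# One abi3 wheel covers CPython >= 3.8 on this platform:
abi3 = "manifold3d-2.2.0-cp38-abi3-manylinux_2_17_x86_64.whl"

def parse_tags(filename):
    """Split a wheel filename into (python_tag, abi_tag, platform_tag)."""
    stem = filename[: -len(".whl")]
    parts = stem.split("-")
    # The last three dash-separated fields are the compatibility tags.
    return tuple(parts[-3:])

print(parse_tags(abi3))
```

This is why the wheel-count multiplication mentioned above goes away with abi3: one ABI tag replaces the cp38/cp39/cp310 row per platform.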
Another downside of using nanobind is that it only supports Python 3.8+ (although 3.7 has already reached EOL).
I have no way to judge how much work it would be to switch to nanobind, but if you think it's worthwhile, I'm fine with it. I'm not too concerned about the long builds, but short is nice. It's also funny that it says "By default Python extension modules can only be used with the same Python version they were compiled against." Does that mean they can in fact work against multiple versions as-is? It seems hard to believe a minor version bump would break every module.
It depends on the ABI. By default there is no guarantee, so yes, a version bump can break every module. I think there are also performance reasons that make libraries want to use unstable APIs. For Python 3.12, it seems they introduced some very efficient stable APIs that nanobind can target. (Just my guess; I don't know enough about Python internals for this.)
For pybind11 vs. nanobind, I will just open another PR and see what others think. I think nanobind will be nicer in the long run, but I am not sure whether the subtle differences in the API will break existing code that depends on our binding.
Attempt at #362.
Now the problematic thing is Windows...