Can the PyTorch/XLA wheel for a release branch be built with cxx_abi disabled? #5325
Hey, first some background information.

Building process
…
To answer your first question, cxx_abi is explicitly enabled for the XLA computation client library. It's not explicitly set for PyTorch_XLA, but I see in the logs that it's set anyway (search for …).
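One way to double-check this from the built artifact itself, rather than from the build logs, is to look at the symbol mangling of the compiled extension. Below is a minimal sketch; the `.so` path is passed on the command line, since the exact filename and location of torch_xla's native extension vary between releases:

```python
# Heuristic check: a library compiled with -D_GLIBCXX_USE_CXX11_ABI=1 exports
# std::string/std::list symbols mangled with "__cxx11". Pass the path to the
# wheel's native extension (e.g. the _XLAC*.so that torch_xla ships).
import subprocess
import sys

so_path = sys.argv[1]
symbols = subprocess.run(
    ["nm", "-D", so_path], capture_output=True, text=True, check=True
).stdout

# Presence of "__cxx11" strongly suggests the new ABI; absence suggests the
# library was built with _GLIBCXX_USE_CXX11_ABI=0 (pre-cxx11 ABI).
print("uses cxx11 ABI:", "__cxx11" in symbols)
```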
Applying patches
Sure, it's easy to apply any patches with Ansible. Example of applying TF …

Testing your changes locally
You can test your changes locally (assuming you have docker installed) by … In the …
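As a rough illustration of that patch-then-build-in-docker flow, here is a hedged sketch driven from Python rather than Ansible; the repo path, patch file, image name, and build command are all placeholders, not the project's actual values:

```python
# Sketch only: apply a local patch to a pytorch/xla checkout, then run the
# wheel build inside a container so the toolchain matches the CI image.
# Every path/name below is a placeholder assumption.
import subprocess

REPO = "/workspace/xla"            # hypothetical checkout of pytorch/xla
PATCH = "/workspace/my_fix.patch"  # hypothetical patch file
IMAGE = "my-build-image:latest"    # hypothetical build image

# Apply the patch to the checkout (same effect as running `git apply` by hand).
subprocess.run(["git", "-C", REPO, "apply", PATCH], check=True)

# Build the wheel inside the container; the exact build command depends on the
# branch and tooling, so treat this one as illustrative.
subprocess.run(
    [
        "docker", "run", "--rm",
        "-v", f"{REPO}:/src",
        "-w", "/src",
        IMAGE,
        "python", "setup.py", "bdist_wheel",
    ],
    check=True,
)
```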
Hope it helps,
Thanks @mateuszlewko. I'll give it a try.
It looks like I need to update the tag v2.0.0, which I did, but the build still failed. Log: …
Notice commit …
cc: @ManfeiBai
I'm able to create a new r2.0 wheel now.
Hi @mateuszlewko,
I wonder if the new wheel build process (with Ansible) can disable cxx_abi when it builds a torch_xla wheel for a release branch (such as r2.0). We recently built a torch_xla wheel (on pt/xla branch r2.0, CUDA 11.8, Python 3.10). From the log, it seems it still enables cxx_abi: I see `-D_GLIBCXX_USE_CXX11_ABI=1` in the log above, which makes me think cxx_abi is enabled (please correct me if I'm wrong). Building an official torch_xla wheel with cxx_abi enabled makes it incompatible with torch's wheel. What we used to do on the release branch was to first apply a torch patch (as in this PR), then disable cxx_abi (as in this PR). So my question is: …
Thanks.
cc: @JackCaoG @miladm
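For reference, the torch side of the mismatch can be checked directly; `torch.compiled_with_cxx11_abi()` is an existing torch API, while the torch_xla side has to be inferred (for example with the symbol-inspection sketch earlier in this thread):

```python
# Report which C++ ABI the installed torch wheel was built with. If this prints
# False (i.e. _GLIBCXX_USE_CXX11_ABI=0) while the torch_xla wheel was compiled
# with _GLIBCXX_USE_CXX11_ABI=1, the two wheels are ABI-incompatible, which is
# the mismatch described in this issue.
import torch

print("torch built with cxx11 ABI:", torch.compiled_with_cxx11_abi())
```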