META: Python 3.12 package status #6421
@hgy59 @mreid-tt @SynoCommunity/developers Considering the following, I strongly suggest migrating to Python 3.12, which would be a lot less impactful, and waiting another year or so before switching to a newer Python version, considering the impact on older DSM6 archs. python311:
python312:
python313:
I've added a first set of relatively "easy" wins for early testing:
Are we considering a big-bang approach where we prepare all the packages for Python 3.12 and release them simultaneously? Or should we prioritize quick wins, releasing simpler packages first and gradually tackling the more complex ones? Regarding the current FlexGet PR I have, should I proceed with that change now, or wait until all the packages are ready for release? The main advantage of a big-bang approach is that users would only need to have both Python 3.11 and 3.12 installed temporarily, rather than having to maintain them for an extended period. However, this approach could introduce more risk and delays, especially with more complex packages. I’d appreciate your thoughts on this.
No, I'm not. I'd rather have a few easy wins first, like the one I started with.
I would be tempted to suggest you release as-is first using Python 3.11, and in a week or two, once issues (if any) have been found with your 3.11 release (or the easy wins above), you migrate "as-is" to py312. That's what I would recommend.
@th0ma7 I have taken the octoprint package. Tested on VirtualDSM 7.2.2.
@th0ma7, thanks for the feedback. I'll start with those I've touched before.
I just tested with the deluge PR on my armv7 NAS. First off, Python runs OK; basic testing showed it to be functional. When installing deluge, this came up in the logs:
That can only mean one thing: it either failed to build on GitHub or failed to be copied over to the wheelhouse directory... Looking further into the GitHub logs, it was built successfully:
So is the file there or not? Indeed it is!
So why wasn't it able to install it? Something is odd, maybe you're hitting a similar issue? It may just be the newer pip? EDIT: Getting the exact same error on my x64 NAS. EDIT2: Was missing
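The failure mode described here (wheel builds fine on GitHub but is absent at install time) comes down to whether the file actually landed in the wheelhouse that pip is pointed at. A minimal sketch of that check follows; the directory layout and wheel filename are made up for illustration and are not the actual spksrc paths:

```shell
# Hypothetical sketch: verify a built wheel actually landed in the
# wheelhouse directory before an offline (--no-index) install is attempted.
# Paths and the wheel name are placeholders, not real spksrc locations.
WHEELHOUSE="$(mktemp -d)/wheelhouse"
mkdir -p "$WHEELHOUSE"

# Simulate a successfully built wheel being copied over.
touch "$WHEELHOUSE/deluge-2.1.1-py3-none-any.whl"

# pip invoked with --no-index --find-links "$WHEELHOUSE" can only install
# what is present here, so a missing file at this point would explain an
# install-time failure despite a green build log.
if ls "$WHEELHOUSE"/deluge-*.whl >/dev/null 2>&1; then
    echo "wheel present in wheelhouse"
else
    echo "wheel missing: build or copy step failed"
fi
```

Running the equivalent `ls` against the real wheelhouse on the NAS is the quickest way to tell a build failure apart from a copy failure.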
Hmm, I've just completed a test build of FlexGet on DSM 6 in #6427 and everything seemed to install and upgrade fine. I'm attaching the install and upgrade logs to the PR for your review if it would help.
@mreid-tt that's great news! In the last round of wheel build updates I didn't change "anything" in how builds are being made, I just changed how the makefiles are being processed. That being said, I did make one change that I recall: removal of the EDIT: I just noticed I had not updated the
@th0ma7 I am working on homeassistant first, trying to build homeassistant 2023.7.3 with python312 (before updating HA I want to know whether it works at runtime - or has issues like octoprint or deluge)
Is there an early PR I could chime in on?
@hgy59 when invoking
@th0ma7, building for Build Log
Should I mark
It could well be. Can you check whether cython is also installed?
My problems seem to be related to my local environment(s). It could be one of the following:
So I will restart with a "clean" environment... Sorry for the noise, but I didn't want to create a PR before knowing whether the current version runs with py312. PS:
This is not the problem
Since the
WARNING: I just found that the python312-wheel package (and maybe others) did not include numpy, greenlet etc.
python311-wheels is not affected, only python312-wheels and python313-wheels |
@mreid-tt thanks for your involvement in this, it is really appreciated (all the more as it allows others to learn how this works). Although with @hgy59's finding and my recent new understanding #6430 (comment), we should pause just a little in order to get this last tidbit fixed. Hopefully I may be able to fix this within a week or so (being a bit optimistic).
@th0ma7, understood. I’ve been testing a Bazarr branch with these changes for a PR: master...mreid-tt:spksrc:bazarr-update. Everything appeared to build successfully: https://github.com/mreid-tt/spksrc/actions/runs/13002372904/job/36263335064. However, when I installed and ran the software, I noticed that
It appears that it's not being included in the build process at all during the run.
FWIW I'm fiddling with the bazarr upgrade to Python 3.12 (here)
@hgy59, I'm currently using the script for requirements generation. The latest run is:
I encountered dependency issues with the following plugins: autobpm, bpd, metasync, and replaygain. I've documented these findings in my branch here. Regarding the suggested dependencies:
I guess the cross/dbus dependency builds OK (it depends on cross/libexpat) and only the wheel build fails? EDIT:
I'm not sure this is the case, see the log extract below: Build Log
As an update to Beets, when I ran a test with the compiled plugins enabled, I realised that some of them were broken and/or deprecated. I've amended the PR to remove these based on this output:
With
Hey @th0ma7, I’ve been working on updating The requirements generation script worked as expected, and I used it as a basis for the updates. However, during the GitHub build, I encountered the following error:
This seemed odd because, in my test environment, After looking into it, I found some suggestions online to install
However, when I tried this in our GitHub build, it failed with:
Any thoughts on how to resolve this? Would it be best to manually add
The usage of Here's what I did as a workaround:
I haven't tested, but maybe the default crossenv would have worked as well?
Hey @th0ma7, thanks for the feedback! I'll give your approach a shot, but it looks like we might need to add Looking at the
This seems to function similarly to our Also, reviewing the
To maintain consistency, I'd create a
Hey @th0ma7, thanks for this! Would this solution work for my branch as-is, or does PR #6437 need to be merged first? Also, I'm unsure what references the new
It won't work as-is, since the capacity to associate build vs cross environments under the generated crossenv in my other PR used for building wheels needs to be merged first. Although you could in theory remove the That being said, one of the key changes is the ability to create a per-wheel specific crossenv. This allows handling exceptions and avoids needing to include every possible dependency in the default crossenv that normally suits most use cases. Take numpy as an example: older numpy versions require really old build dependencies that are no longer applicable to other wheels, thus the need for its own crossenv definition file, whereas previously we only had one default to rule them all. In your case, atom requires yet another dependency that other wheels normally do not, thus the creation of a python312/crossenv/requirement-atom.txt definition file. To your question: for crossenv definition files it first looks for foo-version; if that is not found it falls back to foo, then to the default.
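The lookup order described here (wheel-with-version, then wheel, then default) can be sketched as a small shell helper. The file naming scheme and directory layout below are assumptions for illustration only, not the actual spksrc implementation:

```shell
# Hypothetical sketch of the per-wheel crossenv definition lookup:
# requirements-<wheel>-<version>.txt, then requirements-<wheel>.txt,
# then requirements-default.txt. Names and layout are made up.
CROSSENV_DIR="$(mktemp -d)"
touch "$CROSSENV_DIR/requirements-default.txt"
touch "$CROSSENV_DIR/requirements-numpy-1.21.6.txt"

select_crossenv_def() {
    # Echo the first matching definition file for a given wheel/version.
    local wheel="$1" version="$2" candidate
    for candidate in \
        "$CROSSENV_DIR/requirements-$wheel-$version.txt" \
        "$CROSSENV_DIR/requirements-$wheel.txt" \
        "$CROSSENV_DIR/requirements-default.txt"
    do
        if [ -f "$candidate" ]; then
            echo "$candidate"
            return 0
        fi
    done
    return 1
}

select_crossenv_def numpy 1.21.6   # picks the version-specific file
select_crossenv_def atom 1.0.0     # falls back to the default file
```

The point of the fallback is that only exceptional wheels (numpy with its pinned old build deps, atom with its extra dependency) need their own file; everything else rides on the default.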
@th0ma7, thanks for the clarification. I see that the EDIT: Why I ask is that when I was working on Beets I came across this issue:
Then, looking into the file at
This suggests that once we have
The ordering won't matter, as each wheel is built individually. The version will, though, so it doesn't conflict at install time.
If reviewing the build logs, these are the wheels in question:
- dbus-python-1.3.2
- numpy-2.1.3
- pygobject-3.50.0
- scikit_learn-1.6.1
- scipy-1.15.1
- soxr-0.5.0.post1
No, this is not the case. @th0ma7 I guess the
Exactly, and numpy is a dependency in cross most likely (and not build)
And hopefully everything that uses meson-python will also be solved by the numpy fixes... Next.
Hey @th0ma7, thanks for the recent merge! I've rebased my However, all ARM and PPC builds fail with the following errors:
I've tested multiple compiler flags (as seen in the reverted commit) but haven't had any success. I'm attaching the detailed error logs in case you'd like to take a closer look. Any guidance on resolving this would be greatly appreciated. error_armv7-7.1.txt
Was it compiling previously?
No, sorry, this is a new branch. Previously, I couldn't build EDIT: So I downgraded to v2.2.4 but saw the same errors.
@th0ma7 @hgy59, the more I analyze this Given this, I came across a related issue where similar errors were encountered. In this post (arduino/Arduino#7997 (comment)), the author noted:
Could it be that removing
If, behind the scenes, pip wheel uses Meson to build, it won't work, as wrong includes such as your example get into the mix. That's the case for numpy, for instance.
@th0ma7, I was wondering: can we pass arguments to Meson in our spksrc builds? I came across the
Would it be possible to pass an argument like this to override the default behavior? --config-settings=setup-args="-Dimplicit_include_directories=false" This might help as a workaround for the issue we're facing. Let me know your thoughts.
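For reference, the full pip invocation this would translate into might look like the sketch below. The wheel name is illustrative, and whether a given build backend (meson-python supports `setup-args` via `--config-settings`) actually honors the flag would need testing; the helper just assembles the command line:

```shell
# Hypothetical helper that assembles a pip wheel command forwarding a
# Meson setup argument via --config-settings. The wheel name ("numpy")
# is only an example; nothing here is the actual spksrc invocation.
build_pip_wheel_cmd() {
    echo "pip wheel $1 --config-settings=setup-args=-Dimplicit_include_directories=false"
}

cmd=$(build_pip_wheel_cmd numpy)
echo "$cmd"
```

In spksrc the extra `--config-settings` argument would presumably be appended to the existing wheel-build recipe rather than typed by hand.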
Yes we can, although one must also pass the cross and native files. But that might help indeed. The issues are: we're not currently using a native file for Meson, and a Meson file must be generated on a per sub-build basis, like we do for CMake, to ensure build flags are properly set.
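For context, a minimal Meson cross file of the kind being discussed could be generated per sub-build roughly as follows. The toolchain triplet and field values are placeholders, not the actual spksrc-generated content:

```shell
# Sketch: generate a minimal Meson cross file for an armv7 target.
# Toolchain names are placeholders; spksrc would substitute its own
# per-arch toolchain and flags here.
CROSS_FILE="$(mktemp -d)/meson-cross-armv7.ini"
cat > "$CROSS_FILE" <<'EOF'
[binaries]
c = 'arm-unknown-linux-gnueabi-gcc'
cpp = 'arm-unknown-linux-gnueabi-g++'
ar = 'arm-unknown-linux-gnueabi-ar'
strip = 'arm-unknown-linux-gnueabi-strip'
pkg-config = 'pkg-config'

[host_machine]
system = 'linux'
cpu_family = 'arm'
cpu = 'armv7'
endian = 'little'
EOF

# Meson would consume this with, e.g.:
#   meson setup --cross-file "$CROSS_FILE" builddir
echo "cross file written to $CROSS_FILE"
```

The `[binaries]` and `[host_machine]` sections are the standard Meson cross-file format; the open question in the thread is generating one of these (plus a native file) per sub-build, as is already done for CMake toolchain files.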
Okay, do you have an example of how we might do this such that I can test it? I don't have any idea how to implement this. Would it be as simple as adding something to the
I can confirm that it is a cross-compilation issue. Working: Failing: still failing with
It either does not use the correct toolchain or does not support cross-compilation. Another solution would be to build such wheels outside of spksrc by using native environments like manylinux docker images (or building them natively under DSM 7.1).
Exactly, my preliminary patchset works toward building those with proper Meson definition files to enforce using the proper includes and libs, as done for other Meson builds.
Can't we disable sanity checks (mesonbuild/meson#12881)? Just adding
to Or is the Meson definition file something else?
@th0ma7, I did some experimentation in my branch for duplicity and tried passing the argument Reflecting on @hgy59’s comment about the toolchain, I noticed an interesting pattern:
Looking deeper into the logs, the execution parameters were nearly identical, except for the expected DSM version difference. However, one key distinction stood out:
I wonder: if we were to change the platform for aarch64-6.2.4 to the one ending in
Python 3.12 package status
checkmark: ✔️
xmark: ❌️
Packages formerly using python 3.x
2025.1.4 fails to cross-compile some wheels for python312 - those are installed from the index and not included in the package:
Framework clean-up