
META: Python 3.12 package status #6421

Open · 5 of 6 tasks
th0ma7 opened this issue Jan 26, 2025 · 69 comments
Labels: status/help-wanted, update (request to update existing package)

th0ma7 commented Jan 26, 2025

Python 3.12 package status

checkmark: ✔️
xmark: ❌️

Packages formally using Python 3.x

| PACKAGE | Build | Install | Working | Published | COMMENT |
| --- | --- | --- | --- | --- | --- |
| bazarr | | | | | needs numpy (and TC_GCC > 5) |
| beets | | | | | #6447 |
| borgbackup | | | | | #6444 |
| deluge | | | | | #6423 |
| domoticz | | | | | |
| duplicity | | | | | |
| ffsync | | | | | #6429 |
| fishnet | | | | | |
| flexget | | | | | #6427 |
| haproxy | | | | | |
| homeassistant | ✔️ | ✔️ | | | #6453, see note below |
| mercurial | | | | | #6422 |
| octoprint | | | | | |
| plexpy-custom | | | | | Ready for migration, but BUG with the package install; to remove? |
| rdiff-backup | | | | | |
| rutorrent | | | | | #6404 update to python312 pending |
| sabnzbd | | | | | #6431 |
| sickchill | | | | | |
| salt-master | | | | | |
| salt-minion | | | | | |
| tvheadend | | | | | #6424 |
| znc | | | | | #6425 |

Note on homeassistant: 2025.1.4 fails to cross-compile some wheels for python312; those are installed from index and not included in the package:
  • av==13.1.0
  • awscrt==0.23.5
  • deebot_client==11.0.0
  • numpy==2.2.0
  • pandas==2.2.3
  • pillow==11.0.0 => use cross/pillow_ha instead

Framework clean-up

th0ma7 commented Jan 26, 2025

@hgy59 @mreid-tt @SynoCommunity/developers Considering the following, I strongly suggest migrating to Python 3.12, which would be a lot less impactful, and waiting another year or so before switching to a newer Python version, given the impacts on older DSM6 archs.

python311:

  • newer pillow now finally compiles on all archs!

python312:

  • newer numpy finally compiles, at the cost that the older version compatible with gcc-4.9.x no longer works (now DSM7 only)
  • armv5 no longer supported
  • impacted packages: bazarr, homeassistant

python313:

  • greenlet only works with a newer gcc (now DSM7 only)
  • armv5 no longer supported
  • impacted packages: bazarr, ffsync, flexget, homeassistant, sickchill

th0ma7 added the update (request to update existing package) and status/help-wanted labels Jan 26, 2025
th0ma7 commented Jan 26, 2025

I've added a first set of relatively "easy" wins for early testing: deluge, mercurial, tvheadend, znc

mreid-tt commented:

Are we considering a big-bang approach where we prepare all the packages for Python 3.12 and release them simultaneously? Or should we prioritize quick wins, releasing simpler packages first and gradually tackling the more complex ones?

Regarding the current FlexGet PR I have, should I proceed with that change now, or wait until all the packages are ready for release? The main advantage of a big-bang approach is that users would only need to have both Python 3.11 and 3.12 installed temporarily, rather than having to maintain them for an extended period. However, this approach could introduce more risk and delays, especially with more complex packages. I'd appreciate your thoughts on this.

th0ma7 commented Jan 26, 2025

Are we considering a big-bang approach where we prepare all the packages for Python 3.12 and release them simultaneously? Or should we prioritize quick wins, releasing simpler packages first and gradually tackling the more complex ones?

No, I'm not; I'd rather have a few easy wins first, like the ones I started with.

Regarding the current FlexGet PR I have, should I proceed with that change now, or wait until all the packages are ready for release? The main advantage of a big-bang approach is that users would only need to have both Python 3.11 and 3.12 installed temporarily, rather than having to maintain them for an extended period. However, this approach could introduce more risk and delays, especially with more complex packages. I'd appreciate your thoughts on this.

I would be tempted to suggest you release as-is first using Python 3.11, and then, in a week or two, once any issues with your 3.11 release (or the easy wins above) have been found, you migrate "as-is" to py312. That's what I would recommend.

hgy59 commented Jan 26, 2025

@th0ma7 I have taken the octoprint package.
It successfully builds the current and the updated version (1.10.1 and 1.10.3) with python312 but it does not run. There is no error shown, it just terminates and the pid file is gone.
I have the same issue with python311 (only the already published package runs).

Tested on VirtualDSM 7.2.2.

mreid-tt commented Jan 26, 2025

@th0ma7, thanks for the feedback. I'll start with those I've touched before (flexget, ffsync), then perhaps look at sabnzbd and bazarr, and move on from there.

@hgy59 hgy59 pinned this issue Jan 26, 2025
th0ma7 commented Jan 26, 2025

@th0ma7 I have taken the octoprint package. It successfully builds the current and the updated version (1.10.1 and 1.10.3) with python312 but it does not run. There is no error shown, it just terminates and the pid file is gone. I have the same issue with python311 (only the already published package runs).

Tested on VirtualDSM 7.2.2.

I just tested the deluge PR on my armv7 NAS. First off, python runs OK; basic testing showed it to be functional. When installing deluge, this came up in the logs:

2025/01/26 15:19:28     ERROR: Could not find a version that satisfies the requirement libtorrent==2.0.10 (from versions: none)
2025/01/26 15:19:28     ERROR: No matching distribution found for libtorrent==2.0.10
2025/01/26 15:19:30     ERROR: Python package installation failed

That can only mean one thing: it either failed to build on GitHub or failed to be copied over to the wheelhouse directory...

Looking further into the GitHub logs, it was built successfully:

2025-01-26T17:22:49.7727317Z Collecting git+https://github.com/arvidn/libtorrent.git@v2.0.10
2025-01-26T17:22:49.7734237Z   Cloning https://github.com/arvidn/libtorrent.git (to revision v2.0.10) to /tmp/pip-req-build-oyf8bquz
2025-01-26T17:22:49.7754763Z   Running command git clone --filter=blob:none --quiet https://github.com/arvidn/libtorrent.git /tmp/pip-req-build-oyf8bquz
2025-01-26T17:22:51.6474808Z   Running command git checkout -q 74bc93a37a5e31c78f0aa02037a68fb9ac5deb41
2025-01-26T17:22:52.0212517Z   Resolved https://github.com/arvidn/libtorrent.git to commit 74bc93a37a5e31c78f0aa02037a68fb9ac5deb41
2025-01-26T17:22:52.0215494Z   Running command git submodule update --init --recursive -q
2025-01-26T17:22:52.5070351Z   Preparing metadata (setup.py): started
2025-01-26T17:22:52.9431697Z   Preparing metadata (setup.py): finished with status 'done'
2025-01-26T17:22:52.9461118Z Building wheels for collected packages: libtorrent
2025-01-26T17:22:52.9487882Z   Building wheel for libtorrent (setup.py): started
2025-01-26T17:23:53.1898128Z   Building wheel for libtorrent (setup.py): still running...
2025-01-26T17:24:53.6521239Z   Building wheel for libtorrent (setup.py): still running...
2025-01-26T17:25:53.7688651Z   Building wheel for libtorrent (setup.py): still running...
2025-01-26T17:26:53.9019848Z   Building wheel for libtorrent (setup.py): still running...
2025-01-26T17:27:00.7920473Z   Building wheel for libtorrent (setup.py): finished with status 'done'
2025-01-26T17:27:00.7981113Z   Created wheel for libtorrent: filename=libtorrent-2.0.10-cp311-cp311-linux_arm.whl size=5034100 sha256=77e258ce31e09bddb25ab91caca247550f86f67f41d6c698246a131a74122301
2025-01-26T17:27:00.7982868Z   Stored in directory: /tmp/pip-ephem-wheel-cache-7rpjsi4i/wheels/67/90/66/613c0360108c8f1fabe47bc78d60c59f6520ebb48ae96c8592
2025-01-26T17:27:00.8002042Z Successfully built libtorrent
2025-01-26T17:27:00.9452691Z ===>  Installing wheel [libtorrent], version [2.0.10], type [crossenv]
2025-01-26T17:27:00.9453510Z ===>  Adding libtorrent==2.0.10 to wheelhouse/requirements-crossenv.txt

So is the file there or not? Indeed it is!

root@DS115j-armv7:/var/packages/deluge/target/share/wheelhouse# ls -la /var/packages/deluge/target/share/wheelhouse/
total 10116
drwxr-xr-x 2 sc-deluge synocommunity    4096 Jan 26 12:27 .
drwxr-xr-x 3 sc-deluge synocommunity    4096 Jan 26 12:27 ..
-rw-r--r-- 1 sc-deluge synocommunity  188352 Jan 26 12:27 cffi-1.17.1-cp311-cp311-linux_armv7l.whl
-rw-r--r-- 1 sc-deluge synocommunity 1624032 Jan 26 12:27 cryptography-44.0.0-cp37-abi3-linux_armv7l.whl
-rw-r--r-- 1 sc-deluge synocommunity 3146827 Jan 26 12:27 deluge-2.1.1.dev127-py3-none-any.whl
-rw-r--r-- 1 sc-deluge synocommunity   30302 Jan 26 12:27 GeoIP-1.3.2-cp311-cp311-linux_armv7l.whl
-rw-r--r-- 1 sc-deluge synocommunity 5034100 Jan 26 12:27 libtorrent-2.0.10-cp311-cp311-linux_armv7l.whl
-rw-r--r-- 1 sc-deluge synocommunity   14496 Jan 26 12:27 MarkupSafe-3.0.2-cp311-cp311-linux_armv7l.whl
-rw-r--r-- 1 sc-deluge synocommunity   65767 Jan 26 12:27 rencode-1.0.6-cp311-cp311-linux_armv7l.whl
-rw-r--r-- 1 sc-deluge synocommunity      15 Jan 26 12:27 requirements-abi3.txt
-rw-r--r-- 1 sc-deluge synocommunity     125 Jan 26 12:27 requirements-crossenv.txt
-rw-r--r-- 1 sc-deluge synocommunity     276 Jan 26 12:27 requirements-pure.txt
-rw-r--r-- 1 sc-deluge synocommunity     416 Jan 26 12:27 requirements.txt
-rw-r--r-- 1 sc-deluge synocommunity  211536 Jan 26 12:27 zope.interface-7.2-cp311-cp311-linux_armv7l.whl

So why wasn't it able to install it? Something is odd; maybe you're hitting a similar issue? It may just be with a newer pip?

EDIT: Getting the exact same error on my x64 NAS.

EDIT2: The PYTHON_PACKAGE variable was missing its python312 definition, which I'm pretty sure was the issue. Tested on both x64 and armv7 and fully functional. Will now test with the #egg= changes reverted.
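
For illustration, the kind of change involved would be something like this in the package Makefile (a minimal sketch; the variable name comes from the comment above, its exact placement is an assumption):

    # spk/deluge/Makefile (illustrative)
    # without this, wheels are cross-compiled against python311 and advertise cp311 tags
    PYTHON_PACKAGE = python312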

mreid-tt commented Jan 26, 2025

Hmm, I've just completed a test build of FlexGet on DSM 6 in #6427 and everything seemed to install and upgrade fine. I'm attaching the install and upgrade logs to the PR for your review if it would help.

th0ma7 commented Jan 26, 2025

@mreid-tt that's great news! In the last round of wheel build updates I didn't change "anything" in how builds are made; I just changed how the makefiles are processed.

That being said, I did make one change that I now recall: removal of the #egg= portion of URL-based wheels, which is set for deprecation in future pip releases. The wheel that failed to install is a URL-type wheel, so it may be just that; to be confirmed.

EDIT: I just noticed I had not updated the PYTHON_PACKAGE variable... thus the libtorrent wheel was advertising a python311 version (actually all cross-compiled wheels failed to install, starting with libtorrent): libtorrent-2.0.10-cp311-cp311-linux_armv7l.whl

hgy59 commented Jan 26, 2025

@th0ma7 I am working on homeassistant

First, trying to build homeassistant 2023.7.3 with python312 (before updating HA I want to know whether it works at runtime, or has issues like octoprint or deluge).
Initially some cross wheels failed (added to the list above).
Then I updated the failing wheels and found versions that build successfully (but I guess HA will not accept those).
But I still have an issue with numpy.
Even when I use numpy==1.26.4 (the same version is included in python312-wheels), and even though the default crossenv has Cython==3.0.11, I get the following error.

../meson.build:37:2: ERROR: Problem encountered: NumPy requires Cython >= 0.29.34

th0ma7 commented Jan 26, 2025

@th0ma7 I am working on homeassistant

First, trying to build homeassistant 2023.7.3 with python312 (before updating HA I want to know whether it works at runtime, or has issues like octoprint or deluge). Initially some cross wheels failed (added to the list above). Then I updated the failing wheels and found versions that build successfully (but I guess HA will not accept those). But I still have an issue with numpy. Even when I use numpy==1.26.4 (the same version is included in python312-wheels), and even though the default crossenv has Cython==3.0.11, I get the following error.

../meson.build:37:2: ERROR: Problem encountered: NumPy requires Cython >= 0.29.34

Do you have an early PR I could chime in on?

th0ma7 commented Jan 27, 2025

@hgy59 when invoking make WHEELS="numpy==1.26.4" wheel-<arch>-<tcversion>, what does that give? Assuming all cross deps were previously completed...
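
For instance, with the placeholders filled in (the arch and toolchain version here are illustrative):

    make WHEELS="numpy==1.26.4" wheel-aarch64-7.1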

mreid-tt commented Jan 27, 2025

@th0ma7, building for ffsync under #6429 I get an error for the hi3535-6.2.4 build. The issue seems to stem from greenlet as the logs show:

Build Log
  ===>  Compiling wheel [greenlet], version [1.1.3], type [crossenv]
  ===>  make WHEEL_NAME="greenlet" WHEEL_VERSION="1.1.3" crossenv-hi3535-6.2.4
  ===>  make ARCH="hi3535" TCVERSION="6.2.4" WHEEL_NAME="greenlet" WHEEL_VERSION="1.1.3" crossenv
  make[5]: 'crossenv' is up to date.
  ===>  pip build [greenlet], version: [1.1.3]     
  ===>  crossenv: [/github/workspace/spk/ffsync/work-hi3535-6.2.4/crossenv-default]
  ===>  pip: [/github/workspace/spk/ffsync/work-hi3535-6.2.4/crossenv-default/bin/cross-pip]
  ===>  maturin: [/github/workspace/native/python312/work-native/install/usr/local/bin/maturin]
  ===>  _PYTHON_HOST_PLATFORM="arm-cortexa9-linux-gnueabi" PATH=:/github/workspace/native/python312/work-native/install/usr/local/bin:/github/workspace/spk/ffsync/work-hi3535-6.2.4/crossenv-default/bin:/github/workspace/distrib/cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin CMAKE_TOOLCHAIN_FILE= MESON_CROSS_FILE= cross-pip wheel --disable-pip-version-check --no-binary :all: --find-links /github/workspace/distrib/pip --cache-dir /github/workspace/spk/ffsync/work-hi3535-6.2.4/pip --no-deps --wheel-dir /github/workspace/spk/ffsync/work-hi3535-6.2.4/wheelhouse --no-index --no-build-isolation greenlet==1.1.3
  Looking in links: /github/workspace/distrib/pip
  Processing /github/workspace/distrib/pip/greenlet-1.1.3.tar.gz
    Preparing metadata (setup.py): started
    Preparing metadata (setup.py): finished with status 'done'
  Building wheels for collected packages: greenlet
    Building wheel for greenlet (setup.py): started
    Building wheel for greenlet (setup.py): finished with status 'error'
    error: subprocess-exited-with-error
    
    × python setup.py bdist_wheel did not run successfully.
    │ exit code: 1
    ╰─> [160 lines of output]
        running bdist_wheel
        running build
        running build_py
        creating build/lib.linux-arm-cpython-312/greenlet
        copying src/greenlet/__init__.py -> build/lib.linux-arm-cpython-312/greenlet
        creating build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_weakref.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_gc.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_throw.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_leaks.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_generator_nested.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_extension_interface.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_generator.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_greenlet.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_tracing.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_cpp.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_contextvars.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_stack_saved.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/__init__.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        copying src/greenlet/tests/test_version.py -> build/lib.linux-arm-cpython-312/greenlet/tests
        running egg_info
        writing src/greenlet.egg-info/PKG-INFO
        writing dependency_links to src/greenlet.egg-info/dependency_links.txt
        writing requirements to src/greenlet.egg-info/requires.txt
        writing top-level names to src/greenlet.egg-info/top_level.txt
        ERROR setuptools_scm._file_finders.git listing git files failed - pretending there aren't any
                                                     ^
        src/greenlet/greenlet.c:560:55: error: ‘_PyCFrame’ has no member named ‘use_tracing’
                 ts__g_switchstack_use_tracing = tstate->cframe->use_tracing;
                                                               ^
        src/greenlet/greenlet.c:621:23: error: ‘_PyCFrame’ has no member named ‘use_tracing’
                 tstate->cframe->use_tracing = ts__g_switchstack_use_tracing;
                               ^
        src/greenlet/greenlet.c:624:15: error: ‘PyThreadState’ has no member named ‘recursion_remaining’
                 tstate->recursion_remaining = (tstate->recursion_limit
                       ^
        src/greenlet/greenlet.c:624:46: error: ‘PyThreadState’ has no member named ‘recursion_limit’
                 tstate->recursion_remaining = (tstate->recursion_limit
                                                      ^
        src/greenlet/greenlet.c: In function ‘g_calltrace’:
        src/greenlet/greenlet.c:105:51: error: ‘_PyCFrame’ has no member named ‘use_tracing’
         #define TSTATE_USE_TRACING(tstate) (tstate->cframe->use_tracing)
                                                           ^
        src/greenlet/greenlet.c:654:5: note: in expansion of macro ‘TSTATE_USE_TRACING’
             TSTATE_USE_TRACING(tstate) = 0;
             ^
        src/greenlet/greenlet.c:105:51: error: ‘_PyCFrame’ has no member named ‘use_tracing’
         #define TSTATE_USE_TRACING(tstate) (tstate->cframe->use_tracing)
                                                           ^
        src/greenlet/greenlet.c:657:5: note: in expansion of macro ‘TSTATE_USE_TRACING’
             TSTATE_USE_TRACING(tstate) =
             ^
        src/greenlet/greenlet.c: In function ‘g_initialstub’:
        src/greenlet/greenlet.c:903:49: error: ‘PyThreadState’ has no member named ‘recursion_limit’
             self->recursion_depth = (PyThreadState_GET()->recursion_limit
                                                         ^
        src/greenlet/greenlet.c:904:51: error: ‘PyThreadState’ has no member named ‘recursion_remaining’
                                      - PyThreadState_GET()->recursion_remaining);
                                                           ^
        error: command '/github/workspace/toolchain/syno-hi3535-6.2.4/work/arm-cortexa9-linux-gnueabi/bin/arm-cortexa9-linux-gnueabi-gcc' failed with exit code 1
        [end of output]
    
    note: This error originates from a subprocess, and is likely not a problem with pip.
    ERROR: Failed building wheel for greenlet
    Running setup.py clean for greenlet
  Failed to build greenlet

Should I mark ARMv7L_ARCHS as unsupported?


th0ma7 commented Jan 27, 2025

It could well be. Can you check whether there is also a cython installed?

hgy59 commented Jan 27, 2025

My problems seem to be related to my local environment(s).

It could be one of the following

  • I share the /spksrc/distrib folder between all my environments (> 20 clones active)
  • I do not run as root but as regular user in the spksrc container (id = 1000, gid=1000)
  • also the home folder of this user and the toolchain, toolkit and kernel folders are shared between the environments

So I will restart with a "clean" environment...

Sorry for the noise, but I didn't want to create a PR before knowing whether the current version runs with py312.

PS:
I already tested octoprint with a non-shared environment after encountering a problem. Since this did not solve the problem there, I assumed that it is not related to the shared environments.

hgy59 commented Jan 27, 2025

When adding the output of maturin list-python to the cross-compile-wheel-% target, it shows:

===>  maturin: [/spksrc/spk/homeassistant/work-x64-7.1/../../../native/python312/work-native/install/usr/local/bin/maturin]
🐍 2 python interpreter found:
 - CPython 3.11 at /usr/bin/python3.11
 - CPython 3.12 at /spksrc/native/python312/work-native/install/usr/local/bin/python3.12

Can it be that maturin uses the host python (3.11) in this case?

EDIT: this is not the problem; successful builds of python312-wheels have the same output.

mreid-tt commented:

@th0ma7, building for ffsync under #6429 I get an error for the hi3535-6.2.4 build. The issue seems to stem from greenlet as the logs show:

Build Log

Should I mark ARMv7L_ARCHS as unsupported?

Since the python312-wheels package example lacked support for greenlet 1.1.3, which is used by hi3535, I’ve marked ARMv7L_ARCHS as unsupported in the PR. Additionally, I revisited the requirements based on the source repository, cleaned up, and updated some wheels. Everything now builds cleanly.
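
For reference, marking an arch unsupported is typically a one-liner in the spk Makefile (a minimal sketch, assuming spksrc's usual UNSUPPORTED_ARCHS convention; the comment is illustrative):

    # spk/ffsync/Makefile (illustrative)
    # greenlet 1.1.3 does not build against Python 3.12 on the old hi3535 toolchain
    UNSUPPORTED_ARCHS = $(ARMv7L_ARCHS)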

hgy59 commented Jan 27, 2025

WARNING: I just found that the python312-wheels package (and maybe others) did not include numpy, greenlet, etc.
Fix in progress...

hgy59 commented Jan 27, 2025

WARNING: I just found that the python312-wheels package (and maybe others) did not include numpy, greenlet, etc. Fix in progress...

python311-wheels is not affected, only python312-wheels and python313-wheels.
Started #6430 to fix this.

mreid-tt commented:

@th0ma7, I've submitted the PRs and completed testing for ffsync (#6429) and SABnzbd (#6431). Full logs are attached to each PR for your review. Let me know if anything else is needed.

th0ma7 commented Jan 28, 2025

@mreid-tt thanks for your involvement in this, it is really appreciated (all the more as it allows others to learn how this works). Although with @hgy59's finding and my recent new understanding #6430 (comment), we should pause just a little in order to get this last tidbit fixed. Hopefully I may be able to fix this within a week or so (being a bit optimistic).

mreid-tt commented:

@th0ma7, understood. I’ve been testing a Bazarr branch with these changes for a PR: master...mreid-tt:spksrc:bazarr-update. Everything appeared to build successfully: https://github.com/mreid-tt/spksrc/actions/runs/13002372904/job/36263335064.

However, when I installed and ran the software, I noticed that numpy wasn’t included:

Installation log: /var/log/packages/bazarr.log
Starting bazarr ...
tail: /var/packages/bazarr/var/bazarr.log: file truncated
Mon Jan 27 22:56:16 -04 2025
Starting bazarr command env LANG=en_US.UTF-8 LC_ALL=en_US.utf8 /volume1/@appstore/bazarr/env/bin/python3 /volume1/@appstore/bazarr/share/bazarr/bazarr.py --no-update --config /volume1/@appstore/bazarr/var/data
Traceback (most recent call last):
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/main.py", line 43, in <module>
    from app.server import webserver, app  # noqa E402
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/app/server.py", line 13, in <module>
    from api import api_bp
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/api/__init__.py", line 7, in <module>
    from .episodes import api_ns_list_episodes
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/api/episodes/__init__.py", line 4, in <module>
    from .episodes_subtitles import api_ns_episodes_subtitles
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/api/episodes/episodes_subtitles.py", line 12, in <module>
    from subtitles.upload import manual_upload_subtitle
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/subtitles/upload.py", line 25, in <module>
    from .sync import sync_subtitles
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/subtitles/sync.py", line 10, in <module>
    from subtitles.tools.subsyncer import SubSyncer
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/subtitles/tools/subsyncer.py", line 6, in <module>
    from ffsubsync.ffsubsync import run, make_parser
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/../libs/ffsubsync/__init__.py", line 21, in <module>
    from .ffsubsync import main  # noqa
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/volume1/@appstore/bazarr/share/bazarr/bazarr/../libs/ffsubsync/ffsubsync.py", line 12, in <module>
    import numpy as np
ModuleNotFoundError: No module named 'numpy'
Stopping bazarr ...
Mon Jan 27 22:58:45 -04 2025
Stopping bazarr service : python3 (31711)

It appears that it isn't being included in the build process at all.

smaarn commented Feb 9, 2025

FWIW I'm fiddling with the bazarr upgrade to Python 3.12 (here).

mreid-tt commented:

@hgy59, I'm currently using the script for requirements generation. The latest run is:

./generate_requirements.sh "beets[absubmit,aura,beatport,chroma,discogs,docs,embedart,embyupdate,fetchart,import,kodiupdate,lastgenre,lastimport,lyrics,mpdstats,plexupdate,reflink,scrub,sonosupdate,thumbnails,web]" 312

I encountered dependency issues with the following plugins: autobpm, bpd, metasync, and replaygain. I've documented these findings in my branch here.

Regarding the suggested dependencies:

  • cross/cairo: This only works on x86/x64. On other architectures, it fails with a "sanity test executable" error, similar to what we see with numpy.
  • cross/dbus: This fails to build entirely, throwing an "expat not found" error.

hgy59 commented Feb 10, 2025

  • cross/dbus: This fails to build entirely, throwing an "expat not found" error.

I guess the cross/dbus dependency builds OK (it depends on cross/libexpat) and only the wheel build fails?
Maybe a compiler flag to find libexpat could help...

EDIT:
Sorry, reading your PR, it says expat not found when building cross/dbus.
So this must be an issue with the reuse of prebuilt python312 (cross/python312 itself depends on cross/libexpat).
@th0ma7 can it be that the reuse of libexpat needs additional handling (or must it be forced to build locally, like zlib)?

mreid-tt commented Feb 10, 2025

  • cross/dbus: This fails to build entirely, throwing an "expat not found" error.

I guess the cross/dbus dependency builds OK (it depends on cross/libexpat) and only the wheel build fails? Maybe a compiler flag to find libexpat could help...

I'm not sure this is the case, see the log extract below:

Build Log
make[3]: Entering directory '/github/workspace/cross/dbus'
===>  Downloading files for dbus
===>    File dbus-1.13.22.tar.xz already downloaded
===>  Verifying files for dbus
===>    Checking sha1sum of file dbus-1.13.22.tar.xz
===>    Checking sha256sum of file dbus-1.13.22.tar.xz
===>    Checking md5sum of file dbus-1.13.22.tar.xz
/github/workspace/cross/dbus/../../distrib/dbus-1.13.22.tar.xz
===>  Processing dependencies of dbus
make[4]: Entering directory '/github/workspace/cross/libexpat'
make[4]: Nothing to be done for 'default'.
make[4]: Leaving directory '/github/workspace/cross/libexpat'
===>  Extracting for dbus
tar -xJpf /github/workspace/cross/dbus/../../distrib/dbus-1.13.22.tar.xz -C /github/workspace/spk/beets/work-x64-7.1  
===>  Patching for dbus
patch -p0 < patches/000-remove-warnings-not-known-by-gcc.patch
patching file CMakeLists.txt
===>  Generating /github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22/x64-toolchain.cmake
env make --no-print-directory cmake_pkg_toolchain > /github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22/x64-toolchain.cmake 2>/dev/null;
===>  Configuring for dbus
===>  - Configure ARGS:
===>  - Install prefix: /var/packages/beets/target
===>  - CMake configure
===>  - Dependencies = cross/libexpat
===>  - Optional Dependencies =
===>  - Use Toolchain File = ON [/github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22/x64-toolchain.cmake]
===>  - Use NASM = 0
===>  - Use DESTDIR = 1
===>  - CMake = /usr/bin/cmake [3.25.1]
===>  - Path DESTDIR = /github/workspace/spk/beets/work-x64-7.1/install
===>  - Path BUILD_DIR = /github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22/build
===>  - Path CMAKE_SOURCE_DIR = /github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22
cd /github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22 && env -u AR -u AS -u CC -u CPP -u CXX -u LD -u NM -u OBJDUMP -u RANLIB -u READELF -u STRIP -u CFLAGS -u CPPFLAGS -u CXXFLAGS -u LDFLAGS  PKG_CONFIG=/usr/bin/pkg-config PKG_CONFIG_LIBDIR=/github/workspace/spk/beets/work-x64-7.1/install//var/packages/beets/target/lib/pkgconfig WORK_DIR=/github/workspace/spk/beets/work-x64-7.1 INSTALL_PREFIX=/var/packages/beets/target TC_WORK_DIR=/github/workspace/toolchain/syno-x64-7.1/work TC_GCC=$(eval $(echo /github/workspace/spk/beets/work-x64-7.1/../../../toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/bin/x86_64-pc-linux-gnu-gcc -dumpversion) 2>/dev/null || true) TC_GLIBC=2.26 TC_KERNEL=4.4.180+ rm -rf CMakeCache.txt CMakeFiles
cd /github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22 && env -u AR -u AS -u CC -u CPP -u CXX -u LD -u NM -u OBJDUMP -u RANLIB -u READELF -u STRIP -u CFLAGS -u CPPFLAGS -u CXXFLAGS -u LDFLAGS  PKG_CONFIG=/usr/bin/pkg-config PKG_CONFIG_LIBDIR=/github/workspace/spk/beets/work-x64-7.1/install//var/packages/beets/target/lib/pkgconfig WORK_DIR=/github/workspace/spk/beets/work-x64-7.1 INSTALL_PREFIX=/var/packages/beets/target TC_WORK_DIR=/github/workspace/toolchain/syno-x64-7.1/work TC_GCC=$(eval $(echo /github/workspace/spk/beets/work-x64-7.1/../../../toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/bin/x86_64-pc-linux-gnu-gcc -dumpversion) 2>/dev/null || true) TC_GLIBC=2.26 TC_KERNEL=4.4.180+ cmake -S /github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22 -B /github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22/build -DDBUS_BUILD_TESTS=OFF -DDBUS_ENABLE_XML_DOCS=OFF -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/var/packages/beets/target -DCMAKE_INSTALL_LOCALSTATEDIR=/var/packages/beets/var -G Ninja -DCMAKE_TOOLCHAIN_FILE=/github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22/x64-toolchain.cmake  
-- The C compiler identification is GNU 8.5.0
-- The CXX compiler identification is GNU 8.5.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /github/workspace/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/bin/x86_64-pc-linux-gnu-gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /github/workspace/toolchain/syno-x64-7.1/work/x86_64-pc-linux-gnu/bin/x86_64-pc-linux-gnu-g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
check_include_file(minix/config.h     HAVE_MINIX_CONFIG_H)
#cmakedefine HAVE_MINIX_CONFIG_H
check_include_file(wchar.h     HAVE_WCHAR_H)
#cmakedefine HAVE_WCHAR_H
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE  
-- Found PkgConfig: /usr/bin/pkg-config (found version "1.8.1") 
-- Checking for module 'libsystemd>=209'
--   Package 'libsystemd', required by 'virtual:world', not found
-- Checking for modules 'libsystemd-login>=32;libsystemd-daemon>=32;libsystemd-journal>=32'
--   Package 'libsystemd-login', required by 'virtual:world', not found
--   Package 'libsystemd-daemon', required by 'virtual:world', not found
--   Package 'libsystemd-journal', required by 'virtual:world', not found
-- Checking for module 'systemd'
--   Package 'systemd', required by 'virtual:world', not found
-- Could NOT find EXPAT (missing: EXPAT_LIBRARY EXPAT_INCLUDE_DIR) 
-- Could NOT find X11 (missing: X11_X11_INCLUDE_PATH X11_X11_LIB) 
-- Could NOT find GLIB2 (missing: GLIB2_LIBRARIES GLIB2_MAIN_INCLUDE_DIR) 
-- Looking for alloca.h
-- Looking for alloca.h - found
-- Looking for byteswap.h
-- Looking for byteswap.h - found
-- Looking for crt/externs.h
-- Looking for crt/externs.h - not found
-- Looking for dirent.h
-- Looking for dirent.h - found
-- Looking for dlfcn.h
-- Looking for dlfcn.h - found
-- Looking for execinfo.h
-- Looking for execinfo.h - found
-- Looking for errno.h
-- Looking for errno.h - found
-- Looking for expat.h
-- Looking for expat.h - found
-- Looking for grp.h
-- Looking for grp.h - found
-- Looking for inttypes.h
-- Looking for inttypes.h - found
-- Looking for io.h
-- Looking for io.h - not found
-- Looking for locale.h
-- Looking for locale.h - found
-- Looking for memory.h
-- Looking for memory.h - found
-- Looking for signal.h
-- Looking for signal.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stdlib.h
-- Looking for stdlib.h - found
-- Looking for stdio.h
-- Looking for stdio.h - found
-- Looking for string.h
-- Looking for string.h - found
-- Looking for strings.h
-- Looking for strings.h - found
-- Looking for syslog.h
-- Looking for syslog.h - found
-- Looking for 3 include files stdint.h, ..., sys/event.h
-- Looking for 3 include files stdint.h, ..., sys/event.h - not found
-- Looking for sys/inotify.h
-- Looking for sys/inotify.h - found
-- Looking for sys/random.h
-- Looking for sys/random.h - found
-- Looking for sys/resource.h
-- Looking for sys/resource.h - found
-- Looking for sys/stat.h
-- Looking for sys/stat.h - found
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for sys/uio.h
-- Looking for sys/uio.h - found
-- Looking for sys/prctl.h
-- Looking for sys/prctl.h - found
-- Looking for sys/syslimits.h
-- Looking for sys/syslimits.h - not found
-- Looking for sys/time.h
-- Looking for sys/time.h - found
-- Looking for sys/wait.h
-- Looking for sys/wait.h - found
-- Looking for time.h
-- Looking for time.h - found
-- Looking for ws2tcpip.h
-- Looking for ws2tcpip.h - not found
-- Looking for unistd.h
-- Looking for unistd.h - found
-- Looking for sys/inotify.h
-- Looking for sys/inotify.h - found
-- Looking for backtrace
-- Looking for backtrace - found
-- Looking for getgrouplist
-- Looking for getgrouplist - found
-- Looking for getpeerucred
-- Looking for getpeerucred - not found
-- Looking for nanosleep
-- Looking for nanosleep - found
-- Looking for getpwnam_r
-- Looking for getpwnam_r - found
-- Looking for setenv
-- Looking for setenv - found
-- Looking for unsetenv
-- Looking for unsetenv - found
-- Looking for clearenv
-- Looking for clearenv - found
-- Looking for writev
-- Looking for writev - found
-- Looking for setrlimit
-- Looking for setrlimit - found
-- Looking for socketpair
-- Looking for socketpair - found
-- Looking for setlocale
-- Looking for setlocale - found
-- Looking for localeconv
-- Looking for localeconv - found
-- Looking for poll
-- Looking for poll - found
-- Looking for strtoll
-- Looking for strtoll - found
-- Looking for strtoull
-- Looking for strtoull - found
-- Looking for pipe2
-- Looking for pipe2 - found
-- Looking for accept4
-- Looking for accept4 - found
-- Looking for inotify_init1
-- Looking for inotify_init1 - found
-- Looking for SCM_RIGHTS
-- Looking for SCM_RIGHTS - found
-- Looking for prctl
-- Looking for prctl - found
-- Looking for raise
-- Looking for raise - found
-- Looking for getrandom
-- Looking for getrandom - found
-- Looking for getrlimit
-- Looking for getrlimit - found
-- Looking for prlimit
-- Looking for prlimit - found
-- Looking for vasprintf
-- Looking for vasprintf - found
-- Looking for vsnprintf
-- Looking for vsnprintf - found
-- Looking for MSG_NOSIGNAL
-- Looking for MSG_NOSIGNAL - found
-- Looking for environ
-- Looking for environ - found
-- Looking for LOG_PERROR
-- Looking for LOG_PERROR - found
-- Performing Test HAVE_CMSGCRED
-- Performing Test HAVE_CMSGCRED - Failed
-- Performing Test DBUS_HAVE_LINUX_EPOLL
-- Performing Test DBUS_HAVE_LINUX_EPOLL - Success
-- Performing Test HAVE_VA_COPY
-- Performing Test HAVE_VA_COPY - Success
-- Performing Test HAVE___VA_COPY
-- Performing Test HAVE___VA_COPY - Success
-- Performing Test DBUS_USE_SYNC
-- Performing Test DBUS_USE_SYNC - Success
-- Performing Test HAVE_DIRFD
-- Performing Test HAVE_DIRFD - Failed
-- Performing Test HAVE_DDFD
-- Performing Test HAVE_DDFD - Failed
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of short
-- Check size of short - done
-- Check size of int
-- Check size of int - done
-- Check size of long
-- Check size of long - done
-- Check size of long long
-- Check size of long long - done
-- Check size of __int64
-- Check size of __int64 - failed
-- Check size of socklen_t
-- Check size of socklen_t - done
-- effectively used warnings for 'WARNINGS_CFLAGS': all;array-bounds;cast-align;char-subscripts;declaration-after-statement;extra;float-equal;format-nonliteral;format-security;format=2;implicit-function-declaration;init-self;logical-op;missing-declarations;missing-format-attribute;missing-include-dirs;missing-noreturn;missing-prototypes;nested-externs;no-error=missing-field-initializers;no-error=unused-label;no-error=unused-parameter;no-missing-field-initializers;no-unused-label;no-unused-parameter;old-style-definition;packed;pointer-arith;pointer-sign;redundant-decls;return-type;shadow;sign-compare;strict-aliasing;strict-prototypes;switch-default;switch-enum;undef;write-strings
-- effectively used disabled warnings for 'WARNINGS_CFLAGS': error=overloaded-virtual;error=missing-field-initializers;error=unused-parameter;unused-parameter
-- effectively used warnings for 'WARNINGS_CXXFLAGS': all;array-bounds;cast-align;char-subscripts;declaration-after-statement;extra;float-equal;format-nonliteral;format-security;format=2;implicit-function-declaration;init-self;logical-op;missing-declarations;missing-format-attribute;missing-include-dirs;missing-noreturn;missing-prototypes;nested-externs;no-error=missing-field-initializers;no-error=unused-label;no-error=unused-parameter;no-missing-field-initializers;no-unused-label;no-unused-parameter;old-style-definition;packed;pointer-arith;pointer-sign;redundant-decls;return-type;shadow;sign-compare;strict-aliasing;strict-prototypes;switch-default;switch-enum;undef;write-strings
-- effectively used disabled warnings for 'WARNINGS_CXXFLAGS': error=overloaded-virtual;error=missing-field-initializers;error=unused-parameter;unused-parameter
CMake Error at CMakeLists.txt:499 (message):
  expat not found!


-- Configuring incomplete, errors occurred!
See also "/github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22/build/CMakeFiles/CMakeOutput.log".
See also "/github/workspace/spk/beets/work-x64-7.1/dbus-1.13.22/build/CMakeFiles/CMakeError.log".
make[3]: *** [../../mk/spksrc.cross-cmake.mk:159: cmake_configure_target] Error 1
make[3]: Leaving directory '/github/workspace/cross/dbus'
make[2]: *** [../../mk/spksrc.depend.mk:54: depend_target] Error 2

mreid-tt commented Feb 10, 2025

As an update on Beets: when I ran a test with the compiled plugins enabled, I realised that some of them were broken and/or deprecated. I've amended the PR to remove these based on this output:

$ beet -v version
user configuration: /var/services/homes/mreid/.config/beets/config.yaml
data directory: /var/services/homes/mreid/.config/beets
plugin paths: 
** error loading plugin docs:
Traceback (most recent call last):
  File "/volume1/@appstore/beets/env/lib/python3.12/site-packages/beets/plugins.py", line 268, in load_plugins
    namespace = __import__(modname, None, None)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'beetsplug.docs'

** error loading plugin import:
Traceback (most recent call last):
  File "/volume1/@appstore/beets/env/lib/python3.12/site-packages/beets/plugins.py", line 268, in load_plugins
    namespace = __import__(modname, None, None)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'beetsplug.import'

** error loading plugin reflink:
Traceback (most recent call last):
  File "/volume1/@appstore/beets/env/lib/python3.12/site-packages/beets/plugins.py", line 268, in load_plugins
    namespace = __import__(modname, None, None)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'beetsplug.reflink'

lyrics: Disabling google source: no API key configured.
absubmit: This plugin is deprecated.
error: No extractor command found: please install the extractor binary from https://essentia.upf.edu/

With absubmit, docs, import and reflink removed, we now have a clean response:

$ beet -v version
user configuration: /var/services/homes/mreid/.config/beets/config.yaml
data directory: /var/services/homes/mreid/.config/beets
plugin paths: 
lyrics: Disabling google source: no API key configured.
artresizer: method is ImageMagick
thumbnails: using ImageMagick to write metadata
thumbnails: using Python Pathlib to compute URIs
fetchart: google: Disabling art source due to missing key
fetchart: lastfm: Disabling art source due to missing key
Sending event: pluginload
library database: /var/services/homes/mreid/.config/beets/library.db
library directory: /var/services/homes/mreid/Music
Sending event: library_opened
beets version 2.2.0
Python version 3.12.8
plugins: aura, beatport, chroma, discogs, embedart, embyupdate, fetchart, kodiupdate, lastgenre, lastimport, lyrics, mpdstats, plexupdate, scrub, sonosupdate, thumbnails, web
Sending event: cli_exit

mreid-tt commented Feb 12, 2025

Hey @th0ma7, I’ve been working on updating duplicity in my branch (master...mreid-tt:spksrc:duplicity-update) and ran into an interesting issue while cross-compiling atom.

The requirements generation script worked as expected, and I used it as a basis for the updates. However, during the GitHub build, I encountered the following error:

ModuleNotFoundError: No module named 'cppy'
...
RuntimeError: Missing setup required dependencies: cppy. Installing through pip as recommended ensure one never hits this issue.

This seemed odd because, in my test environment, cppy was never explicitly downloaded or installed.

After looking into it, I found some suggestions online to install cppy via pip, but I wanted to avoid modifying the requirements file in that way. I then considered whether it could be declared as a prerequisite. The Requirement Specifiers documentation suggests using a format like:

atom[cppy]==0.10.5

However, when I tried this in our GitHub build, it failed with:

ERROR: Could not find a version that satisfies the requirement atom[cppy]==0.10.5

Any thoughts on how to resolve this?

Would it be best to manually add cppy to the requirements file? If so, how can I ensure it gets installed before atom is compiled?
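
Worth noting: the [cppy] extras syntax can only select optional runtime extras that the project itself declares; it cannot inject a build prerequisite. Build requirements come from the [build-system] table of the project's pyproject.toml, and because the crossenv invocations above run pip with --no-build-isolation, those requirements must already be present in the environment rather than being fetched automatically. A hypothetical illustration:

    # what atom would have to declare for atom[cppy]==0.10.5 to be installable (it doesn't):
    [project.optional-dependencies]
    cppy = ["cppy>=1.2.0"]

    # where cppy actually lives, in atom's own pyproject.toml (quoted further below):
    [build-system]
    requires = ["setuptools>=61.2", "wheel", "setuptools_scm[toml]>=3.4.3", "cppy>=1.2.0"]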

th0ma7 commented Feb 12, 2025

The usage of atom[cppy]==0.10.5 will not work. Also, it seems that the atom wheel may have wrong metadata within its package, as it fails to download and then use the downloaded file; weird, but I did see this previously (and may check whether there is a workaround for that).

Here's what I did as a workaround:

  1. I created a crossenv definition file specifically for atom, such as:

pip==24.3.1
setuptools==75.8.0
wheel==0.45.1
#
build:poetry==1.8.5
build:setuptools-scm==8.1.0
#
cross:cppy==1.3.1

I haven't tested, but maybe the default crossenv would have worked as well?
EDIT: I just tested and it works with the default crossenv, so no need to define one specifically for atom; as it probably uses poetry, it generates its own venv. Extract from the logs:

* Creating isolated environment: venv+pip...
* Installing packages in isolated environment:
- cppy>=1.2.0
- setuptools>=61.2
- setuptools_scm[toml]>=3.4.3
- wheel
* Getting build dependencies for wheel...

  2. Created a cross/atom directory as a workaround:

PKG_NAME = atom
PKG_VERS = 0.11.0
PKG_EXT = tar.gz
PKG_DIST_NAME = $(PKG_NAME)-$(PKG_VERS).$(PKG_EXT)
PKG_DIST_SITE = https://files.pythonhosted.org/packages/source/a/atom
PKG_DIR = $(PKG_NAME)-$(PKG_VERS)

DEPENDS =

HOMEPAGE = https://github.com/nucleic/atom
COMMENT  = Atom is a framework for creating memory efficient Python objects with enhanced features such as dynamic initialization, validation, and change notification for object attributes.
LICENSE  = Revised BSD

include ../../mk/spksrc.python-wheel.mk

  3. Did a make digests to update the digests file.
  4. Added DEPENDS += cross/atom to python312-wheels/Makefile.
  5. Launched make spkclean && make arch-aarch64-7.1 and got:
$ ls -la work-aarch64-7.1/wheelhouse/
total 224
drwxr-xr-x 1 spksrc spksrc    126 Feb 12 22:06 .
drwxr-xr-x 1 spksrc spksrc  19136 Feb 12 22:10 ..
-rw-r--r-- 1 spksrc spksrc 214694 Feb 12 22:06 atom-0.11.0-cp312-cp312-linux_aarch64.whl
-rw-r--r-- 1 spksrc spksrc     13 Feb 12 22:06 requirements-cross.txt
$ cat work-aarch64-7.1/wheelhouse/requirements-cross.txt
atom==0.11.0

mreid-tt commented Feb 12, 2025

Hey @th0ma7, thanks for the feedback! I'll give your approach a shot, but it looks like we might need to add DEPENDS += cross/atom, since cppy appears to be only a build requirement.

Looking at the pyproject.toml for atom-0.10.5, we see:

[build-system]
  requires = ["setuptools>=61.2", "wheel", "setuptools_scm[toml]>=3.4.3", "cppy>=1.2.0"]
  build-backend = "setuptools.build_meta"

This seems to function similarly to our BUILD_DEPENDS. If that's the case, is there a way to express this cppy build dependency natively without explicitly adding DEPENDS += cross/atom?

Also, reviewing the requirements.txt in duplicity-3.0.3.2, I see why it's not using the latest atom:

atom==0.10.5 ; python_version <= "3.12"
atom>=0.11.0 ; python_version >= "3.13"

To maintain consistency, I'd create a cross/atom package using version 0.10.5. Let me know your thoughts.

th0ma7 commented Feb 13, 2025

@mreid-tt I was able to cross-compile atom using pip, without the need for a cross/atom. I have now included it in a386c3c. Reading your thread above, it might fail on py313 as it requires atom>=0.11.0; I'll fix it afterwards as needed.

mreid-tt commented:

Hey @th0ma7, thanks for this! Would this solution work for my branch as-is, or does PR #6437 need to be merged first? Also, I'm unsure what references the new requirements-atom.txt file. Does simply naming a file requirements-foo.txt, where foo matches the wheel name, automatically apply its contents to configure the wheel build?

th0ma7 commented Feb 13, 2025

It won't work as-is, since the capacity to associate build vs cross environments under the generated crossenv (from my other PR used for building wheels) needs to be merged first. Although you could in theory remove the cross: and build: prefixes from the entries in the atom crossenv requirement definition file, and it should in theory work.

That being said, one of the key changes is having the ability to create a per-wheel crossenv. This allows handling exceptions and avoids needing to include every possible dependency in the default crossenv that normally suits most use cases. Take numpy as an example: older numpy versions require really old build dependencies that are no longer applicable to other wheels, thus the need for its own crossenv definition file... whereas previously we only had one default to rule them all.

In your case atom requires yet another dependency that other wheels normally do not, thus the creation of a python312/crossenv/requirements-atom.txt definition file.

To your question: for crossenv def files it first looks for foo-version; if not found, it falls back to foo, then default.
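
To illustrate that lookup order with atom (file names follow the pattern described above and are otherwise hypothetical):

    python312/crossenv/requirements-atom-0.11.0.txt   # 1. wheel name + version
    python312/crossenv/requirements-atom.txt          # 2. wheel name only
    python312/crossenv/requirements-default.txt       # 3. default fallback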

mreid-tt commented Feb 13, 2025

@th0ma7, thanks for the clarification. I see that the requirements-atom.txt file is located in the python312/crossenv path. Does this imply that environment prerequisites must be defined at the Python level, not just for any new package we create?

EDIT: Why I ask is that when I was working on Beets I came across this issue:

#numba==0.61.0                          # ModuleNotFoundError: No module named 'numpy'

Then, looking into the file at numba-0.61.0/numba.egg-info/requires.txt, I found this:

llvmlite<0.45,>=0.44.0dev0
numpy<2.2,>=1.24

This suggests that once we have numpy working, we may also have a problem with numba if it tries to build before numpy.

th0ma7 commented Feb 13, 2025

The ordering won't matter, as each wheel is built individually. The version will, though, so it doesn't conflict at install time.

mreid-tt commented:

If reviewing the pyproject.toml files for packages I couldn't cross-compile for Beets would be helpful, I've extracted them below for reference.

dbus-python-1.3.2:

[build-system]
build-backend = 'mesonpy'
requires = [
    'meson-python>=0.8.1',
    'meson>=0.60.0',
    'ninja',
    'patchelf',
    'setuptools',
    'wheel',
]

numpy-2.1.3:

[build-system]
build-backend = "mesonpy"
requires = [
    "meson-python>=0.15.0",
    "Cython>=3.0.6",  # keep in sync with version check in meson.build
]

pygobject-3.50.0:

[build-system]
build-backend = "mesonpy"
requires = ["meson-python>=0.12.1", "pycairo>=1.16"]

scikit_learn-1.6.1:

[build-system]
build-backend = "mesonpy"
# Minimum requirements for the build system to execute.
requires = [
    "meson-python>=0.16.0",
    "Cython>=3.0.10",
    "numpy>=2",
    "scipy>=1.6.0",
]

scipy-1.15.1:

[build-system]
build-backend = 'mesonpy'
requires = [
    # The upper bound on meson-python is pre-emptive only (looser
    # on purpose, since chance of breakage in 0.18/0.19 is low with
    # 0.17.1 working at time of writing)
    "meson-python>=0.15.0,<0.20.0",
    # we need at least Cython 3.1.0a1 for free-threaded CPython;
    # for other CPython versions, the regular pre-emptive
    # Cython version bounds policy applies
    "Cython>=3.0.8,<3.1.0",
    # The upper bound on pybind11 is pre-emptive only
    "pybind11>=2.13.2,<2.14.0",     # when updating version, also update check in scipy/meson.build
    # The upper bound on pythran is pre-emptive only; 0.17.0
    # is released/working at time of writing
    "pythran>=0.14.0,<0.18.0",

    # numpy requirement for wheel builds for distribution on PyPI - building
    # against 2.x yields wheels that are also compatible with numpy 1.x at
    # runtime.
    # Note that building against numpy 1.x works fine too - users and
    # redistributors can do this by installing the numpy version they like and
    # disabling build isolation.
    # NOTE: need numpy>=2.1.3 for free-threaded CPython 3.13 support
    "numpy>=2.0.0,<2.5",
]

soxr-0.5.0.post1:

[build-system]
requires = [
    "scikit-build-core >=0.9.0",
    "nanobind >=2",
    "setuptools>=45",
    "setuptools_scm[toml]>=6.2",
    "typing-extensions; python_version < '3.11'",
]
build-backend = "scikit_build_core.build"

hgy59 commented Feb 14, 2025

This suggests that once we have numpy working, we may also have a problem with numba if it tries to build before numpy.

No, this is not the case.
numba requires numpy in the crossenv at build time and does not depend on the numpy wheel created for a package.
But we will need a numba-specific crossenv that includes numpy.

@th0ma7 I guess the crossenv/requirements-numba.txt will need an entry like cross:numpy==<vers> (or build:numpy==<vers>?)
How can we clarify what needs the build: prefix and what needs cross: in crossenv requirements?

th0ma7 commented Feb 14, 2025

Exactly, and numpy is most likely a dependency in cross (and not build).
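
A requirements-numba.txt along the lines of the atom example above would presumably then look like this (a sketch; the pins are assumptions, with numpy chosen to satisfy numba's numpy<2.2,>=1.24 constraint quoted earlier):

    # python312/crossenv/requirements-numba.txt (hypothetical)
    pip==24.3.1
    setuptools==75.8.0
    wheel==0.45.1
    #
    # numba imports numpy while building, so it goes on the cross side
    cross:numpy==1.26.4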

th0ma7 commented Feb 14, 2025

And hopefully everything that uses mesonpy will also be solved with the numpy fixes... Next.

mreid-tt commented:

Hey @th0ma7, thanks for the recent merge! I've rebased my duplicity work in my branch (master...mreid-tt:spksrc:duplicity-update), and the x86/x64 packages compile successfully.

However, all ARM and PPC builds fail with the following errors:

  • AArch64-6.2.4, ARMv7-6.2.4, Comcerto2K, and QorIQ-6.2.4:
    error: unknown type name ‘__float128’
    
  • ARMv7-7.1 (different error):
    error: ‘_Float128’ is not supported on this target  
    error: ‘_Float64x’ is not supported on this target
    

I've tested multiple compiler flags (as seen in the reverted commit) but haven’t had any success. I'm attaching the detailed error logs in case you’d like to take a closer look.

Any guidance on resolving this would be greatly appreciated.

error_armv7-7.1.txt
error_comcerto2k-7.1.txt
error_aarch64-6.2.4.txt
error_armv7-6.2.4.txt
error_qoriq-6.2.4.txt

th0ma7 commented Feb 16, 2025

Was it compiling previously?

mreid-tt commented Feb 16, 2025

Was it compiling previously?

No, sorry, this is a new branch. Previously, I couldn't build atom, so I never got past that stage before your merge. The significant version bump from v1.2.3 might have introduced an incompatibility at the source. I could try an earlier version to pinpoint where things started failing.

EDIT: So I downgraded to v2.2.4 but saw the same errors.

mreid-tt commented Feb 16, 2025

@th0ma7 @hgy59, the more I analyze this __float128 error, the more it seems tied to our build environment. Many of the error messages reference files like /usr/include/stdlib.h and /usr/include/bits/floatn.h, both of which reside in /usr/include. Notably, this path is explicitly included in the compilation command with -I/usr/local/include -I/usr/include.

Given this, I came across a related issue where similar errors were encountered. In this post (arduino/Arduino#7997 (comment)), the author noted:

C_INCLUDE_PATH is set to /usr/include... Worth noting is CPATH: /usr/include

EDIT: I cleared the CPATH and C_INCLUDE_PATH variables, logged out and in and now the Arduino IDE now compiles correctly!

Could it be that removing -I/usr/local/include -I/usr/include from our build configuration would allow successful compilation on ARM and PPC?

th0ma7 commented Feb 16, 2025

If, behind the scenes, pip wheel uses meson to build, it won't work, as wrong includes such as in your example get into the mix. That's the case for numpy, for instance.

mreid-tt commented:

@th0ma7, I was wondering: can we pass arguments to Meson in our spksrc builds? I came across the implicit_include_directories option in the documentation, which states:

Description: Controls whether Meson adds the current source and build directories to the include path
Default: true

Would it be possible to pass an argument like this to override the default behavior?

--config-settings=setup-args="-Dimplicit_include_directories=false"

This might help as a workaround for the issue we're facing. Let me know your thoughts.
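
In the cross-pip wheel invocations quoted earlier, that setting would presumably be appended as an extra pip argument (illustrative, untested):

    cross-pip wheel --no-build-isolation \
      --config-settings=setup-args="-Dimplicit_include_directories=false" \
      numpy==2.2.0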

th0ma7 commented Feb 16, 2025

Yes we can, although one must also pass the cross and native files. But that might help indeed.

Issues are: we're not currently using a native file for meson, and a meson file must also be generated on a per-sub-build basis, like we do for cmake, to ensure build flags are properly set.
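
For context, a Meson cross file is a small ini file along these lines (a sketch only; the binary paths and CPU values are borrowed from the hi3535 toolchain paths quoted earlier, not from spksrc's actual generated file):

    # hi3535-cross.ini (illustrative)
    [binaries]
    c = '/github/workspace/toolchain/syno-hi3535-6.2.4/work/arm-cortexa9-linux-gnueabi/bin/arm-cortexa9-linux-gnueabi-gcc'
    cpp = '/github/workspace/toolchain/syno-hi3535-6.2.4/work/arm-cortexa9-linux-gnueabi/bin/arm-cortexa9-linux-gnueabi-g++'

    [host_machine]
    system = 'linux'
    cpu_family = 'arm'
    cpu = 'cortexa9'
    endian = 'little'

It would then be passed to meson through meson-python with something like --config-settings=setup-args="--cross-file=hi3535-cross.ini".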

mreid-tt commented:

Yes we can, although one must also pass the cross and native files. But that might help indeed.

Issues are: we're not currently using a native file for meson, and a meson file must also be generated on a per-sub-build basis, like we do for cmake, to ensure build flags are properly set.

Okay, do you have an example of how we might do this so that I can test it? I have no idea how to implement this. Would it be as simple as adding something to the Makefile?

hgy59 commented Feb 16, 2025

I can confirm that it is a cross-compilation issue.

Working:
make WHEELS="numpy==2.2.0" wheel-x64-7.1
make WHEELS="numpy==2.2.0" wheel-evansport-7.1

Failing:
make WHEELS="numpy==2.2.0" wheel-aarch64-7.1
make WHEELS="numpy==2.2.0" wheel-armv7-7.1

Still failing with sanitycheckc.exe:

      ../meson.build:1:0: ERROR: Could not invoke sanity test executable: [Errno 8] Exec format error: '/tmp/pip-wheel-72iobxq2/numpy_4b3915609c4a426c8b9c1a36c571a3a4/.mesonpy-de2ie2ru/meson-private/sanitycheckc.exe'.

It either does not use the correct toolchain or does not support cross compilation.

Another solution would be to build such wheels outside of spksrc using native environments like manylinux docker images (or building them natively under DSM 7.1).
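
As a sketch of that alternative (the image tag and versions are illustrative; a manylinux image builds natively for its own arch rather than cross-compiling):

    docker run --rm -v "$(pwd)":/io quay.io/pypa/manylinux2014_aarch64 \
      /opt/python/cp312-cp312/bin/pip wheel numpy==2.2.0 -w /io/wheelhouse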

th0ma7 commented Feb 16, 2025

Exactly; my preliminary patchset works toward building those with proper meson definition files to enforce using the proper includes and libs, as done for other meson builds.

hgy59 commented Feb 16, 2025

Exactly; my preliminary patchset works toward building those with proper meson definition files to enforce using the proper includes and libs, as done for other meson builds.

Can't we disable sanity checks (mesonbuild/meson#12881)?

Just adding

[properties]
skip_sanity_check = true

to tc_vars.meson didn't change anything.

Or is the meson definition file something else?

mreid-tt commented:

@th0ma7, I did some experimentation in my branch for duplicity and tried passing the argument --config-settings=setup-args="-Dimplicit_include_directories=false". However, this didn’t change the situation.

Reflecting on @hgy59’s comment about the toolchain, I noticed an interesting pattern:

  • The Build (aarch64-7.1) succeeded.
  • The Build (aarch64-6.2.4) failed.

Looking deeper into the logs, the execution parameters were nearly identical, except for the expected DSM version difference. However, one key distinction stood out:

  • aarch64-7.1: _PYTHON_HOST_PLATFORM="aarch64-unknown-linux-gnu"
  • aarch64-6.2.4: _PYTHON_HOST_PLATFORM="aarch64-unknown-linux-gnueabi"

I wonder: if we were to change the platform for aarch64-6.2.4 to the one ending in gnu, would that resolve the compilation issue?
