Add python3.12 in the CI

Merged jvoisin requested to merge python12 into master

Activity

  • assigned to @georg

  • Developer

    @jvoisin This needs a new image; the images are built via https://0xacab.org/georg/mat2-ci-images. In the latest version, as of now, python3.11 is pulled from Debian unstable, and python3.12 hasn't been uploaded there yet. This leaves at least two options:

    • we wait until python3.12 hits Debian unstable (I can't tell right now when that will be), or
    • we rely on the upstream Python images from https://hub.docker.com/_/python, where even python3.13 is already available; a rough sketch of that option is below.
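
    A minimal sketch of what such a job could look like in .gitlab-ci.yml, assuming we switch the base image to the upstream one; the job name, the $SYSTEM_DEPS placeholder, and the test command are illustrative, not the actual configuration:

    # Hypothetical job based on the upstream image from Docker Hub.
    tests:python3.12:
      image: python:3.12
      script:
        # mat2 also needs a set of system packages; the exact list would be
        # carried over from the current Debian-based image, abbreviated here.
        - apt-get update && apt-get install -y --no-install-recommends $SYSTEM_DEPS
        - pip install .
        - python3 -m unittest discover -v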

    Any preference?

  • Author Owner

    Python upstream sounds like the way to go, since these runs test Python version compatibility, not Debian's.

  • Developer

    @jvoisin What's your preferred ETA for this change?

  • Author Owner

    I'd like to have it done ASAP, since some distributions are already shipping Python 3.12 :/

  • Developer

    @jvoisin I've prepared building the relevant image via georg/mat2-ci-images!4 (merged). However, there seem to be problems with the CI: the runner didn't pick up the job and currently appears to be unreachable. I'll debug what's going on and find out whether there are network problems at the datacenter. Will keep you posted.

  • Developer

    @jvoisin Network fixed, CI works again. The python3.12 container is now available.

    However, as per https://0xacab.org/jvoisin/mat2/-/jobs/506593, the test suite fails with

    ModuleNotFoundError: No module named 'cairo'

    I've debugged this a bit and made the corresponding pip install invocation verbose, which might give a hint about what's going wrong:

    197.3     _handle_missing_dynamic(dist, project_table)
    197.3   /tmp/pip-build-env-ied_slen/overlay/lib/python3.12/site-packages/setuptools/config/_apply_pyprojecttoml.py:75: _MissingDynamic: `dependencies` defined outside of `pyproject.toml` is ignored.
    197.3   !!
    197.3 
    197.3           ********************************************************************************
    197.3           The following seems to be defined outside of `pyproject.toml`:
    197.3 
    197.3           `dependencies = ['mutagen', 'PyGObject', 'pycairo']`
    197.3 
    197.3           According to the spec (see the link below), however, setuptools CANNOT
    197.3           consider this value unless `dependencies` is listed as `dynamic`.
    197.3 
    197.3           https://packaging.python.org/en/latest/specifications/declaring-project-metadata/
    197.3 
    197.3           To prevent this problem, you can list `dependencies` under `dynamic` or alternatively
    197.3           remove the `[project]` table from your file and rely entirely on other means of
    197.3           configuration.
    197.3           ********************************************************************************
    197.3 

    See https://0xacab.org/georg/mat2-ci-images/-/jobs/506634 for details, near the bottom, starting at line 1835; it seems a direct link to that line doesn't work as expected.
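
    For reference, the warning itself offers two ways out; a sketch of both, expressed in pyproject.toml (illustrative only, not necessarily what the eventual fix will look like):

    [project]
    name = "mat2"
    # (rest of the [project] table omitted)

    # Option 1: keep the dependency list in setup.py/setup.cfg and declare
    # it as dynamic, so setuptools is allowed to pick it up from there:
    dynamic = ["dependencies"]

    # Option 2 (mutually exclusive with option 1): move the list into
    # pyproject.toml directly, using the values from the warning above:
    # dependencies = ["mutagen", "PyGObject", "pycairo"]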

  • jvoisin mentioned in commit 05d1ca58

  • Author Owner

    I just sent 05d1ca58 to fix the issue, kudos for finding the root cause!

  • Developer

    Thanks -- now the CI runner itself seems to have problems: jobs are just getting stuck; see https://0xacab.org/georg/mat2-ci-images/-/jobs/508596 as an example.

    No clue what's going on, yet.

  • Developer

    It seems the jobs are not stuck, but container image downloads (and uploads) are super slow.

  • georg added 6 commits

  • Developer

    @jvoisin That's now finally fixed, and ready to be merged.

    There are quite a few errors logged, but it seems this has been happening for a while already; two examples:

    test_avi (tests.test_corrupted_files.TestCorruptedFiles.test_avi) ... ERROR:root:Something went wrong during the processing of ./tests/data/clean.avi: Command '['/usr/bin/ffmpeg', '-i', './tests/data/clean.avi', '-y', '-map', '0', '-codec', 'copy', '-loglevel', 'panic', '-hide_banner', '-map_metadata', '-1', '-map_chapters', '-1', '-disposition', '0', '-fflags', '+bitexact', '-flags:v', '+bitexact', '-flags:a', '+bitexact', './tests/data/clean.cleaned.avi']' returned non-zero exit status 1.
    ok
    test_zip (tests.test_corrupted_files.TestCorruptedFiles.test_zip) ... ERROR:root:Unable to parse /tmp/tmpu8w37r8n/docProps/dirty.xml: not well-formed (invalid token): line 1, column 18
    WARNING:root:Something went wrong during deep cleaning of docProps/dirty.xml in /tmp/tmplewz2ot3/tests/data/embedded_corrupted.docx
    ERROR:root:Unable to parse /tmp/tmpu8w37r8n/word/document.xml: not well-formed (invalid token): line 1, column 18
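
    These come from the corrupted-files tests, where the errors are expected and the tests still pass ("ok"). If the log noise ever becomes bothersome, something along these lines could silence it at the module level (a sketch assuming the unittest-based layout the test names suggest, not a change proposed in this MR):

    import logging

    def setUpModule():
        # The corrupted fixtures are supposed to trigger ERROR/WARNING logs;
        # disable everything up to and including ERROR for this test module.
        logging.disable(logging.ERROR)

    def tearDownModule():
        # Restore normal logging once the module's tests have finished.
        logging.disable(logging.NOTSET)
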
  • jvoisin approved this merge request

  • merged
