In CI, verify that locked/pinned dependencies are installable on tier-1 platforms + Python versions
Categories: Firefox Build System :: Mach Core, enhancement, P2
Tracking: Not tracked
People: Reporter: mhentges; Unassigned
References: Depends on 1 open bug; Blocks 1 open bug
Details
In CI, test installation of locked/pinned dependencies on Windows/Mac/Linux with each supported Python version (3.6 through 3.9, at the moment).
This is probably going to be tricky, since CI generally has a single Python version per platform.
When chatting with glandium some months ago, he recommended that we make other Python versions available as Taskcluster artifacts (toolchain, IIRC?). Kind of like Rust and Node, I believe.
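Sketching the per-version check in Python (a rough illustration, not the actual CI task definition): for each candidate interpreter, create a throwaway virtualenv and try to install the pinned requirements. The interpreter names and the requirements path below are placeholders, not real paths from the tree.

import subprocess
import sys
import tempfile
from pathlib import Path

# Placeholders: the real task would get these from the toolchain artifacts / the tree.
INTERPRETERS = ["python3.6", "python3.7", "python3.8", "python3.9"]
REQUIREMENTS = "requirements.txt"  # the locked/pinned requirements file

def check_installable(python, requirements):
    """Create a throwaway venv with `python` and try to install `requirements`."""
    with tempfile.TemporaryDirectory() as tmp:
        venv_dir = Path(tmp) / "venv"
        subprocess.run([python, "-m", "venv", str(venv_dir)], check=True)
        bin_dir = "Scripts" if sys.platform == "win32" else "bin"
        venv_python = venv_dir / bin_dir / "python"
        # --no-deps: a complete lockfile should already list every transitive dependency.
        result = subprocess.run(
            [str(venv_python), "-m", "pip", "install", "--no-deps", "-r", requirements]
        )
        return result.returncode == 0

if __name__ == "__main__":
    failures = [py for py in INTERPRETERS if not check_installable(py, REQUIREMENTS)]
    if failures:
        sys.exit("pinned requirements failed to install with: " + ", ".join(failures))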
Comment 1 • Reporter • 3 years ago
One tricky part of this is that (from an initial investigation) it appears that Windows and Mac workers only use our internal PyPI mirror (https://pypi.pub.build.mozilla.org/pub/) instead of PyPI by default.
Since our internal mirror is manually managed, package-by-package, we should hit real PyPI instead.
This has two concerns:
- When running the test, we need to ask the workers to override their config and fetch from PyPI directly (see the sketch after this list).
- We'll have increased traffic to PyPI, but:
- It shouldn't be that significant? We'd fetch, say, 100 packages on each push, which happens every ~5 minutes? I don't think this'll register as a blip on PyPI's radar.
- We could resolve this issue by relying on a package-caching proxy in the future (so, make the mirror automatically cache, rather than being manually populated).
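For the "override their config" point above, a minimal sketch of the mechanism: pip honours both the --index-url flag and the PIP_INDEX_URL environment variable, so the verification task could point at pypi.org explicitly regardless of the worker's default mirror configuration. The requirements path is a placeholder.

import os
import subprocess
import sys

# Force pip to resolve against the public index rather than the worker's configured mirror.
env = dict(os.environ, PIP_INDEX_URL="https://pypi.org/simple")
subprocess.run(
    [sys.executable, "-m", "pip", "install", "-r", "requirements.txt"],
    env=env,
    check=True,
)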
Comment 2 • Reporter • 3 years ago
As part of this work, we should verify that the different platforms don't change the lockfile due to using Python code (instead of environment markers) to conditionally change requirements.
For example, for the following setup.py:
import sys
import setuptools
setuptools.setup(
    name='package',
    version='0.0.1',
    install_requires=["six==1.16.0" if sys.version_info < (3, 7) else "noop==1.0"],
)
The dependent project's pyproject.toml:
[tool.poetry]
name = "poetry-test"
description = ""
version = "0"
authors = []
[tool.poetry.dependencies]
python = "^3.6"
package = { path = "../package" }
Doing poetry lock will result in different lockfiles, depending on the version of Python in use.
Fortunately, this kind of setup.py behaviour is becoming less common, especially with the proliferation of environment markers.
With environment markers, that faulty setup.py could be fixed by doing:
import sys
import setuptools
setuptools.setup(
    name='package',
    version='0.0.1',
    install_requires=[
        "six==1.16.0; python_version < '3.7'",
        "noop==1.0; python_version >= '3.7'",
    ],
)
This gives us a universal, correct lockfile:
...
[package.dependencies]
noop = {version = "1.0", markers = "python_version >= \"3.7\""}
six = {version = "1.16.0", markers = "python_version < \"3.7\""}
...
Note: Poetry also has bugs in how it handles transitive dependencies: for example, using environment markers to conditionally set a single package's version is not properly locked.
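A rough sketch of how the "lockfile shouldn't depend on the Python version" check could look, assuming poetry env use is enough to switch the interpreter Poetry consults while locking (it may instead be the interpreter Poetry itself runs under that matters). The project directory and interpreter names are placeholders.

import hashlib
import subprocess
from pathlib import Path

PROJECT = Path("poetry-test")  # directory containing pyproject.toml (placeholder)
INTERPRETERS = ["python3.6", "python3.7", "python3.8", "python3.9"]

digests = set()
for python in INTERPRETERS:
    # Re-lock under each interpreter and fingerprint the resulting poetry.lock.
    subprocess.run(["poetry", "env", "use", python], cwd=PROJECT, check=True)
    subprocess.run(["poetry", "lock"], cwd=PROJECT, check=True)
    digests.add(hashlib.sha256((PROJECT / "poetry.lock").read_bytes()).hexdigest())

if len(digests) > 1:
    raise SystemExit("poetry.lock differs depending on the Python version used to lock")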
Comment 3 • 3 years ago
Historically I think the main objection to using "real" PyPI is that "PyPI is down means that our entire CI is down". I don't know what the past reliability of PyPI looks like, but obviously the more external services we depend on the more likely it is that one of them is down at any given time.
Comment 4 • Reporter • 3 years ago
Good point, and there's been some recent build discussion on this too, with some additional points worth considering:
- Downstream Firefox builders sometimes build with no network, which means that there are cases when we can't download packages
- It makes sense that we want to reduce our dependencies for release-critical tasks: for example, we don't want a PyPI.org outage blocking us from doing a chemspill
- [background note]: to enable using unvendorable python packages (zstandard, pypi), we're pre-populating workers at machine-setup-time by installing these packages to the system environment.
The fallout of this, and the discussion that's currently ongoing:
- Outside of release-critical tasks, can we officially allow communication with PyPI.org? (Note: we're already doing it in a lot of tasks, but unofficially.)
- To enable us to use pinned versions of unvendorable python packages in release-critical tasks, perhaps we can depend on our internal PyPI repo and download pinned packages from there (a rough sketch follows below).
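A rough sketch of that last option, reusing the mirror URL from comment 1 (the exact simple-index path may differ) together with pip's --require-hashes mode, so that release-critical tasks only install fully pinned, hashed requirements. The requirements path is a placeholder.

import subprocess
import sys

# Internal mirror from comment 1; whether this is the exact index path is an assumption.
INTERNAL_INDEX = "https://pypi.pub.build.mozilla.org/pub/"

subprocess.run(
    [
        sys.executable, "-m", "pip", "install",
        "--index-url", INTERNAL_INDEX,
        "--require-hashes",          # reject anything that isn't pinned with a hash
        "-r", "requirements.txt",    # placeholder path
    ],
    check=True,
)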
Comment 5 • 3 years ago
Not sure if it will help, but I have a Renovatebot running.
This friendly bot created this PR: