I have a mostly-stable system with no explicit PYTHON_TARGETS set. Over the weekend, I emerged @world with --changed-use. This uninstalled all libraries for the default Python (3.5) and reinstalled them for Python 3.6. Afterwards, all my cron jobs started failing because of missing libraries. This seems like a suboptimal way of handling these upgrades.
Do you have a proposal for how this could be handled better? I'm not sure how we could figure out what the user actually wants.
I think this is only an issue for scripts written/managed outside of portage. Scripts that are installed using an ebuild would be installed in /usr/lib/python-exec/${EPYTHON}/, and the /usr/bin/python-exec wrapper will iterate over all installed python versions until it finds one that is valid for the script being executed.
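To make that lookup concrete, here is a rough illustration. The real python-exec wrapper is a compiled program; this shell sketch only simulates the /usr/lib/python-exec/${EPYTHON}/ layout in a temporary directory, and the script and interpreter names are made up:

```shell
#!/bin/sh
# Simulate the /usr/lib/python-exec/${EPYTHON}/ layout in a temp dir.
root=$(mktemp -d)
mkdir -p "${root}/python3.6" "${root}/python3.5"
printf 'print("hello")\n' > "${root}/python3.6/mytool"

# Try each interpreter in preference order; print the first installed
# variant of the wrapped script (the real wrapper would exec it).
find_variant() {
    for py in python3.6 python3.5; do
        if [ -e "${root}/${py}/$1" ]; then
            printf '%s\n' "${root}/${py}/$1"
            return 0
        fi
    done
    return 1
}

variant=$(find_variant mytool)
echo "${variant}"
rm -rf "${root}"
```

Since only the python3.6 variant exists here, the lookup skips nothing and prints the python3.6 path; if that file were removed, it would fall through to python3.5 and then fail.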
This should happen if you leave python-exec.conf blank.
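For reference, the file lives at /etc/python-exec/python-exec.conf. A minimal non-blank version might look like this (syntax as I understand it: one interpreter per line, most preferred first; an empty file falls back to the built-in preference order):

```shell
# /etc/python-exec/python-exec.conf
# One interpreter per line, most preferred first.
python3.6
python3.5
```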
If a new version of Python is installed because it is now in PYTHON_TARGETS, and no other version of the same major series is in PYTHON_TARGETS (I'm guessing this won't be exercised anytime soon, if ever, for anything other than 3.x), then the new version should become the default as soon as it is installed. That probably still leaves a short window where it has no libraries installed, but it definitely seems better than the current process. And yes, I'm not talking about Portage-managed scripts; I have many little utility scripts that do things like scraping feeds or testing API availability.
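As a stopgap on the user side (illustrative make.conf snippet; the version names are just examples), keeping both the old and new target enabled during the transition installs libraries for both interpreters and closes that window:

```shell
# /etc/portage/make.conf
# Keep both interpreters enabled until the migration is complete,
# so cron scripts can run under either one.
PYTHON_TARGETS="python3_5 python3_6"
PYTHON_SINGLE_TARGET="python3_6"
```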
Maybe we could write a small ebuild that would have all python versions in PYTHON_COMPAT, and would simply make sure that the currently selected system python version is enabled in PYTHON_TARGETS.
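For concreteness, a rough sketch of what such an ebuild could look like. This is a hypothetical package: the name, version, and the pkg_pretend check are all invented, and it relies on the (since-deprecated) `eselect python` to discover the selected interpreter:

```shell
# Hypothetical virtual/python-target-sync-0.ebuild (sketch only).
EAPI=8

PYTHON_COMPAT=( python3_{9..12} )  # would list every supported version
inherit python-r1

DESCRIPTION="Ensure the selected system Python is enabled in PYTHON_TARGETS"
SLOT="0"
REQUIRED_USE="${PYTHON_REQUIRED_USE}"
RDEPEND="${PYTHON_DEPS}"

pkg_pretend() {
    # Invented check: compare the eselect-selected interpreter
    # (e.g. "python3.11") against the enabled PYTHON_TARGETS flags.
    local sel=$(eselect python show 2>/dev/null)
    local flag="python_targets_${sel/./_}"   # python3.11 -> python3_11
    use "${flag}" || die "system python ${sel} is not in PYTHON_TARGETS"
}
```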
I'm curious if this is still relevant after the switch to python3_11 as default.
(In reply to Jack from comment #6)
> I'm curious if this is still relevant after the switch to python3_11 as
> default.

I don't think so, especially given the python-exec change we made a few years ago to make "Python targets follow it" when we deprecated eselect-python.