Hacker News | zahlman's comments

> Lastly, the GP is correct: liquid nitrogen is incredibly cheap. It's basically the cost of drinking water.

Hold on, what? How?? Reaching that temperature seems like a difficult task; and what's used as a source of pure nitrogen, anyway? Is there some clever trick to separating it out from the air?


"install name == import name" cannot work in Python, because when you `pip install foo`, you may get more than one top-level package. Or you may get a single-file module. Or you may, validly per the spec, get no Python code whatsoever. (For example, you could publish large datasets separately from your data science library, as separate wheels which could be listed as optional dependencies.)

The lack of good namespacing practice is a problem. Part of the reason for it, in my estimation, is that developers have cargo-culted around a mistaken understanding of `__init__.py`.


> There was a proposal for a directory-based node_modules analogue which was unfortunately rejected.

There were many problems with the proposal. The corresponding discussion (https://discuss.python.org/t/_/963) is worth looking through, despite the length.

Installers like Pip could help by offering an `--in-new-environment` option for a project's first install. Brett Cannon (a core developer) has also done some work on a universal (i.e. not just Windows) "launcher" (https://github.com/brettcannon/python-launcher), which can automatically detect and use the project's venv if you create it in the right place (i.e., what you'd have to do with `__pypackages__` anyway).


You'll be interested in relevant upcoming packaging standards PEPs: https://peps.python.org/pep-0751/ for lock files (still in discussion - have your say at https://discuss.python.org/t/_/69721), and (recently accepted, but I can't immediately point at existing tool support) https://peps.python.org/pep-0735/ for listing groups of dependencies in pyproject.toml.
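For illustration, a hypothetical pyproject.toml excerpt in the PEP 735 style (the group names and version bounds here are invented placeholders):

```toml
[dependency-groups]
test = ["pytest >= 8", "coverage"]
docs = ["sphinx"]
# A group can include other groups:
all = [{include-group = "test"}, {include-group = "docs"}]
```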

A few lines of shell script remedy that rather nicely. And there will be times when you need to choose between installing from fully resolved, pinned dependencies and installing from abstract immediate ones.

uv solves this better, including supporting Windows, and with significantly better error reporting. It even installs Python for you. uv also has platform-independent lockfiles and a number of other features. Oh, and it's a lot faster as well.

Have higher standards for your devtools.


People who are new to programming have a long way to go before even the concept of "managing dependencies" could possibly be made coherent for them. And the "unsoundness" described (i.e. not having lockfile-driven workflows by default) really just doesn't matter a huge percentage of the time. I've been writing Python for 20 years, and what I write nowadays will still just work across multiple Python versions and a wide range of versions of my dependencies - if it even has any dependencies at all.

But nowadays people seem to put the cart before the horse, and try to teach about programming language ecosystems before they've properly taught about programming. People new to programming need to worry about programming first. If there are any concepts they need to learn before syntax and debugging, it's how to use a command line (because it'll be harder to drive tools otherwise; IDEs introduce greater complexity) and how to use version control (so they can make mistakes fearlessly).

Educators, my plea: if you teach required basic skills to programmers before you actually teach programming, then those skills are infinitely more important than modern "dependency management". And for heaven's sake, you can absolutely think of a few months' worth of satisfying lesson plans that don't require wrapping one's head around full-scale data-science APIs, or heaven forbid machine-learning libraries.

If you need any more evidence of the proper priorities, just look at Stack Overflow. It gets flooded with zero-effort questions dumping some arcane error message from the bowels of TensorFlow, ultimately caused by NumPy 2-D arrays - used as matrices - having the wrong shape. And it'll be posted by someone who has no concept of debugging, no idea of any of the underlying ML theory, and very possibly no idea what matrix multiplication is or why it's useful. What good is it to teach "dependency management" to a student who's miles away from understanding the actual dependencies being managed?

For that matter, sometimes they'll take a screenshot of the terminal instead of copying and pasting an error message (never mind proper formatting). Sometimes they even use a cell phone to take a picture of the computer monitor. You're just not going to teach "dependency management" successfully to someone who isn't properly comfortable with using a computer.


> You need Python 3.10 to create a Python 3.10 venv.

Yes, but this doesn't need to cause a problem for those of us using bare-bones tooling. Speaking for myself, I just... run venv with that version of Python. Because I have a global installation of everything from 3.5 to 3.13 inclusive, plus 2.7 (just so I can verify what I post online about how things used to work in 2.x). And I never install anything to my base Python versions, because my system Python is Apt's territory and the ones I built from source are only and specifically there to make venvs with.
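In concrete terms, a sketch of "just run venv with that version of Python" - here `sys.executable` stands in for whichever source-built interpreter you want (e.g. a hypothetical `/opt/python/3.10/bin/python3.10`), and `--without-pip` just keeps the example fast:

```python
import pathlib
import subprocess
import sys
import tempfile

# The interpreter that runs `-m venv` is the interpreter the venv will use.
target = pathlib.Path(tempfile.mkdtemp()) / "venv310"
subprocess.run(
    [sys.executable, "-m", "venv", "--without-pip", str(target)],
    check=True,
)

# The venv records which installation it was made from in its config file:
print((target / "pyvenv.cfg").read_text())
```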

Compiling Python from source on Linux is about as straightforward as compiling from source ever gets, honestly. (The Python dev guide has some useful tips - https://devguide.python.org/getting-started/setup-building/i... .)


I have multiple versions of Python built from source. If I want to test what my code will do on a given version, I spin up a new venv (near instantaneous using `--without-pip`, which I achieve via a small Bash wrapper) and try installing it (using the `--python` option to Pip, through another wrapper, allowing me to reuse a single global copy).
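The workflow described above can be sketched roughly as follows, with the current interpreter standing in for one of the source-built Pythons (the pip invocation is shown as a comment rather than run, since it needs a real project to install):

```python
import pathlib
import subprocess
import sys
import tempfile

# 1) Near-instant venv creation: skip bundling pip with --without-pip.
env_dir = pathlib.Path(tempfile.mkdtemp()) / "testenv"
subprocess.run(
    [sys.executable, "-m", "venv", "--without-pip", str(env_dir)],
    check=True,
)

# 2) A single, shared pip installation can then install *into* that venv
#    via pip's `--python` option, e.g.:
#
#        pip --python testenv/bin/python install .
#
bin_dir = "Scripts" if sys.platform == "win32" else "bin"
print(env_dir / bin_dir / "python")
```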

No matter what tooling you have, that kind of test is really the only way to be sure anyway.

If something doesn't work, I can play around with dependency versions and/or do the appropriate research to figure out what's required for a given Python version, then give the necessary hints in my `pyproject.toml` (https://packaging.python.org/en/latest/specifications/pyproj...) as environment markers on my dependency strings (https://peps.python.org/pep-0508/#environment-markers).
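For example, a hypothetical pyproject.toml excerpt (the package names and bounds are just plausible placeholders for this kind of per-version hint):

```toml
[project]
name = "example"
version = "0.1"
dependencies = [
    # Only needed on Pythons without tomllib in the stdlib:
    "tomli >= 1.1.0 ; python_version < '3.11'",
    # Newer Pythons need a release that actually supports them:
    "numpy >= 1.26 ; python_version >= '3.12'",
]
```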

"Mysterious errors" in this area are usually only mysterious to end users.


In principle, yes. In practice, there are a lot of issues with that workflow.

For example, `pip install --ignore-installed --dry-run --quiet --report` will build sdists (and run arbitrary code from `setup.py` or other places specified by a build backend) - just so that it can confirm that the downloaded sdist would produce a wheel with the right name and version. Even `pip download` will do the same. I'm not kidding. There are multiple outstanding issues on the tracker that are all ultimately about this problem, which has persisted through multiple versions of the UI and all the evolving packaging standards, going back almost the entire history of Pip.

See for example https://github.com/pypa/pip/issues/1884 ; I have a long list of related reports written down somewhere.

A security researcher was once infamously bitten by this (https://moyix.blogspot.com/2022/09/someones-been-messing-wit...).


Of the list you offer, only Poetry, Hatch, PDM and uv actually do "Python dependency management" - in the sense of offering an update command to recalculate locked versions of dependencies, a wrapper to install or upgrade those dependencies, and a way to keep track of which environment they'll be installed into.

pipenv, micropipenv and pip-tools are utilities for creating records of dependencies, but don't actually "manage" those dependencies in the above sense.

Your list also includes an installer (Pip), a build backend (Setuptools - although it has older deprecated use as something vaguely resembling a workflow tool similar to modern dependency managers), a long-deprecated file format (egg) which PyPI hasn't even accepted for a year and a half (https://packaging.python.org/en/latest/discussions/package-f...), two alternative sources for Python itself (ActiveState and homebrew - and I doubt anyone has a good reason to use ActiveState any more), and two package management solutions that are orthogonal to the Python ecosystem (Conda - which was created to support Python, but its environments aren't particularly Python-centric - and Linux system package managers).

Any system can be made to look complex by conflating its parts with other vaguely related but ultimately irrelevant objects.


> You seem to know a lot about it, but you're really just reinforcing the point.

I don't see how. There aren't as many options as GP represents, because most of the things in the list are simply non sequitur.
