As both a long-time Perl and long-time Python user who has seen both used in production, here are some of my thoughts:
Perl
- Was an easy jump from bash to Perl
- Perl never felt like it "got in the way"
- was WAY too easy to write "write-only code"
- that being said, I learned Java first and most people found MY Perl code to be very legible
- regexes as first-class citizens were amazing
- backwards compatibility is GREAT for older systems still running Perl (looking at you, banks and some hedge funds)
Python
- Forced indentation made it MUCH easier to read other people's code
- "everything is an object" from day one was much better than "bless" in Perl
- no standard way of setting up an environment for MANY years hurt
- sklearn and being taught in universities were real game changers
> no standard way of setting up an environment for MANY years hurt
Serious question: is that solved? I still see a forest of options, some of which depend on each other, and at last count my laptop has 38 python binaries. What's the standard way?
There's no "de jure" standard, but uv sure looks like it's on its way to becoming the "de facto" standard.
uv.
https://docs.astral.sh/uv/
It's hard to call that a standard; it's just the latest HN Rust-craze idolisation.
Sure, but it’s demonstrably better than Poetry, which was the best until uv.
If uv isn’t a standard, it’s because not enough people have tried it. It is obscenely good at its job.
uv is young, unstable and still lacking in details. It has multiple updates per month, and nearly every week brings significant enhancements and bug fixes. It's not mature enough to be a standard yet, even though what it already offers is excellent. But let it grow; change needs time.
"Prepare to die."
False dichotomy. Been using pipenv for 8 years. At first it was a bit too slow, but these days it gets the job done.
uv is an excellent piece of software regardless of the language used to write it. Really, if you do python, it's worth giving it a try, especially script mode.
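If you haven't seen script mode: a single file declares its own dependencies with inline metadata (the PEP 723 syntax, if I'm remembering it right), and uv builds a cached, isolated environment for it on the fly:

    # demo.py
    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests

    print(requests.get("https://example.com").status_code)

Then `uv run demo.py` just works, with no venv to manage by hand.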
Sure, but its quality, or your/my opinion of it, doesn't make it 'standard', even if it will be some day in the future.
Eh, I’m not so sure.
We didn’t see adoption nearly this fast for poetry, pipenv, or conda (or hatch or PDM, but I never saw those as even reaching critical mass in the first place).
Those tools got pretty popular, but it took a long time and most folks found them to have a lot of tradeoffs (miles better than Python’s first party tooling, but still).
I’m not seeing that with “uv”. Other than concerns about Astral’s stewardship model (which could be valid!), I’m not seeing widespread “it works but is hard to use” dissatisfaction with uv the way I do with, say, Poetry.
Couple that with uv durably solving the need for pyenv/asdf/mise by removing the pain of local interpreter compilation entirely, and I do think that adds up to uv being fundamentally different in popularity or approach compared to prior tools. Is that “different” the same as “better”? Time will tell.
As to being written in Rust? Shrug. A ton of shops for whom uv has been transformative don’t even know or care what language it’s written in. Being Rust provides, in my opinion, two benefits: a) avoiding chicken-and-egg problems by writing the tool for managing a programming language environment in a different language that is b) not bash.
> avoiding chicken-and-egg problems by writing the tool for managing a programming language environment in a different language
I've heard this a lot, but I don't really understand the use case. It seems that people want to do development in Python, want to install and manage third-party Python packages, and know how to use command-line tools, but somehow they don't already have Python installed and would find it challenging to install directly? Building from source on Linux is a standard "get dev packages from the system package manager; configure, make and install" procedure that I've done many times (yes, putting it beside the system Python could break things, but you can trivially set an alternate install prefix, and anyway the system Python will usually be a version that meets the basic needs of most developers). Installing on Windows is a standard Windows installer experience.
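(For reference, the alternate-prefix build I mean is just the usual routine, run from an unpacked CPython source tarball; the prefix path here is arbitrary:)

    ./configure --prefix="$HOME/opt/python3.13"
    make -j
    make install   # or "make altinstall" to skip the unversioned python3 links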
Aside from that, people seem to imagine "chicken-and-egg" scenarios with pip making itself available in the environment. But this is a thoroughly (if inefficiently) solved problem. First off, for almost three years now pip has been able to install cross-environment (albeit with an ugly hack; I detail some of this in https://zahlman.github.io/posts/2025/01/07/python-packaging-...). Second, the standard library `venv` defaults to bootstrapping pip into new environments — taking advantage of the fact that pre-built Python packages are zip archives, and that Python has a protocol for running code from zip archives, which the pip package implements.
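A sketch of both points, assuming pip's `--python` option is the cross-environment mechanism in question:

    # venv bootstraps pip into the new environment by default
    python3 -m venv .venv

    # or create the environment bare and install into it from an existing pip
    python3 -m venv --without-pip .venv-bare
    pip install --python .venv-bare/bin/python requests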
The real bootstrapping issue I've heard about is https://github.com/pypa/packaging-problems/issues/342 , but this affects very few people — basically, Linux distro maintainers who want to "build" an entire Python toolchain "from source" even though it's all Python code that the runtime could bytecode-compile on demand anyway.
Whether or not you like it, and whether or not it is better, it is not in the standard lib, and is not the way the vast majority of people install python libraries.
You forgot to update your HN craze list. Zig is chic, Rust is out.
No, it’s not. Everywhere I look, uv is being adopted.
We're like a year into the uv hype cycle. It needs enough time to make sure it solves the issues of its predecessors.
So what if uv is everywhere you look? So were Poetry, pipenv, and so on. Give it time.
Tried all of them, dropped all of them. Stuck with uv, I’ll take my chances.
I'm using uv but in corporate places I'm seeing just Conda.
Yes. Try uv and never look back.
You still need pip-tools in a uv environment.
What for (honest question)? Doesn't uv handle locking?
uv for project management and pipx for user-/system-wide tool installation.
uv handles that too with "uv tool".
But does it create completely isolated, updatable tools with all of pipx's functionality?
Yep!
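The pipx-style workflow maps over roughly like this (going from the uv docs, so double-check the exact flags):

    uv tool install ruff      # each tool gets its own isolated environment, exposed on PATH
    uvx ruff check .          # run without installing; uvx is short for "uv tool run"
    uv tool upgrade --all     # keep installed tools up to date
    uv tool list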
What is Astral's business model?
> Serious question: is that solved?
It depends on what "setting up" means.
Creating an environment, given that the Python binary it will use is already installed, is trivial (standard library functionality since late 2012). So is choosing which environment to use. So is installing pre-built packages, and even legacy source packages are pretty easy (but slow, and installation runs arbitrary code which is entirely needless for these) when they only contain pure Python code. Even dependency resolution is usually not too bad.
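For concreteness, the trivial part is just this, with nothing beyond the standard library and pip (a sketch; the names are arbitrary):

    # the venv module has shipped with Python since 3.3 (late 2012)
    python3 -m venv .venv

    # "choosing which environment to use" is just picking which interpreter you invoke
    .venv/bin/pip install requests
    .venv/bin/python my_script.py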
The big problems are things like
* building multi-language packages from source locally, because this is expected to set up temporary local build environments (and build tools have to avoid recursion there)
* using external non-Python dependencies (essentially unsolved, and everyone works around this by either vendoring stuff or by not declaring the dependency and failing at runtime) — see https://pypackaging-native.github.io/ for an overview of the problems and https://peps.python.org/pep-0725/ for what they're trying to standardize to deal with it
* dealing with metadata for source packages; in the really general case you have to build the source to get this (although the package-building API now provides a hook so that build backends can specifically prepare metadata). This is mainly because some packages have dependencies that depend on very particular platform details that (apparently) can't be expressed with the "environment marker" scheme in standard metadata (https://peps.python.org/pep-0508/#environment-markers). A sketch of what those markers look like follows this list.
* and, of course, figuring out which packages need to be in your environment (Python won't decide for you what your direct dependencies are) and managing that environment over time. The reason all these other tools popped up is because Pip only installs the packages and offers very basic environment inspection; it's only now starting to do anything with lockfiles, for example, now that there is finally a standard for them (https://peps.python.org/pep-0751/).
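As promised above, here's what dependency specifiers with environment markers look like (PEP 508 syntax; the package names are just illustrative), whether in a requirements file or in package metadata:

    dataclasses; python_version < "3.7"                                 # only on old interpreters
    pywin32>=300; sys_platform == "win32"                               # only on Windows
    uvloop>=0.17; sys_platform != "win32" and python_version >= "3.8"   # combined markers

The limitation mentioned in the bullet above is that markers can only branch on that fixed set of variables, which isn't always enough for packages whose dependencies hinge on finer platform details.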
But if you mean, is there a standard toolchain that does everything and will be officially blessed by the core language developers, then no, you should not ever expect this. There is no agreement on what "everything" entails, and Python users (a large fraction of whom don't fit the traditional image of a "developer" at all) have widely varying workflows and philosophical/aesthetic preferences about that. Besides which, the core language team doesn't generally work on or care about the problem; they care about the interpreter first and foremost. Packaging is an arms-length consideration. Good news, though: the Python Packaging Authority (not at all authoritative, and named with tongue firmly in cheek, but a lot of people didn't get that) is stepping up and working on official governance (see https://peps.python.org/pep-0772/).
> at last count my laptop has 38 python binaries
Something has gone very wrong (unless you're on Windows, such that admin rights would be needed to create symlinks and by default `venv` doesn't try). To be clear, I mean with your setup, not with the tooling. You should only need one per distinct version of Python that your various environments use. I'd be happy to try to help if you'd like to shoot me an email (I use that Proton service, with the same username as here) and give more details on how things are currently set up and what you're trying to accomplish that way.
> - no standard way of setting up an environment for MANY years hurt
I will say, coming from years of Perl, that Python had a refreshing amount of "batteries included" via the standard library.
It was only rarely that my code needed "outside help", usually something like requests or numpy.
I suspect this is because I used Python in the same environment as Perl, automating unixy kinds of things.
I suspect the "setting up an environment" pain is because Python has been so successful, becoming an enormously broad, general-purpose language.
Awww, come on. cgi-lib.pl was all you really needed! <smirk>
Perl was my first scripting language. I occasionally need to run some of those old scripts (15-20 years old), and they always run. Python scripts last 6-12 months.
So if job security was a high priority for you, ...
> Python scripts last 6-12 months.
Not if you know what you are doing.
"You're holding it wrong" ;-)
Every Python version is officially supported for 5 years. Tons of useful scripts don't require any dependencies outside the standard library, and I have code from very many years ago that has not been hit by any standard library deprecations (which have been in relatively obscure corners, honestly) and would work fine under current Python. (Of course, it helps that I was a relatively early 3.x adopter.) Conversely, what I write today should generally be backwards-compatible to 3.6 (f-strings are really neat, and `pathlib` semantics greatly improved in that version).
I mean, if you are not able to make a reproducible virtual environment, by using one of the popular package managers, like Poetry or uv, then yes, you are holding it wrong. If you are relying on requirements.txt merely pinning versions and not transitive dependencies, then yes, you are holding it wrong. In that case it would be a skill issue.
> If you are relying on requirements.txt merely pinning versions and not transitive dependencies, then yes, you are holding it wrong.
Even then, if you have the original environment, you may very well not need to re-create it. If you do need to re-create an equivalent for a newer Python version (EOL versions of Python don't just spontaneously vanish, but maybe you upgraded your system or something, or better yet maybe you're planning in advance for that) then you generally can easily create the needed requirements.txt (or, for that matter, a PEP 751 lockfile) from the old environment.
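Concretely, something like this usually does it (the paths here are made up):

    # snapshot the old environment, transitive dependencies and all
    old-env/bin/pip freeze > requirements.txt

    # rebuild an equivalent environment on a newer interpreter
    python3.13 -m venv new-env
    new-env/bin/pip install -r requirements.txt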
I never did much Perl; what is the standard way to set up a Perl environment? CPAN?
Articles that pooh-pooh one language kind of have a dated all-or-nothing perspective.
Most languages have a decent enough framework or two, so the differences between using them for different use cases may be smaller than many folks realize, whatever we hear through the grapevine about the new hotness.
A mess can be made in a lot of languages, and a long time ago it was even easier; still, some of that code worked and didn't get touched for a long time.
Toolbox vs. religious zealotry. Sometimes, one has to maintain .bat files, Tcl, F77, or COBOL. Professionals don't whine or wince; they do the job.