I gotta say, I feel pretty vindicated. After years of hearing that Python’s tooling was just fine, that you should just use virtualenv with pip, and that JS must be worse, Python devs finally got a taste of npm/cargo/bundler in their own ecosystem and they freaking love it. Because yes, npm has its issues, but lock files and consistent installs are amazing.
There is nothing I dread more within the general context of software development, broadly, than trying to run other people's Python projects. Nothing. It's shocking that it has been so bad for so long.
Never underestimate cultural momentum, I guess. NBA players shot long 2-pointers for decades before people realized 3 > 2. Doctors refused to wash their hands before doing procedures. There are so many things that seem obvious in retrospect but took a long time to become accepted.
Hey and you can use both lanes in a zip merge!
Isn't that the law anyway?
Moral: follow the rules.
>NBA players shot long 2 pointers for decades before people realized 3 > 2
And the game is worse for it :')
This is a fundamental problem in sports. Baseball is going the same way. Players are incentivized to win, and the league is incentivized to entertain. Turns out these incentives are not aligned.
> Players are incentivized to win, and the league is incentivized to entertain.
Players are incentivized to win due to specific decisions made by the league.
In Bananaball the league says, "practice your choreographed dance number before batting practice." And those same athletes are like, "Wait, which choreographed dance number? The seventh inning stretch, the grand finale, or the one we do in the infield when the guy on stilts is pitching?"
Edit: the grand finale dance number I saw is both teams dancing together. That should be noted.
Sure. There's a market for that. But the NBA sells a lot more tickets than the Harlem Globetrotters.
But that's a matter of scale. When I was a child, the Harlem Globetrotters were far more famous than any 3-4 NBA teams combined. They were in multiple Scooby Doo movies/episodes. They failed to scale the model, but wrestling didn't.
Would be very curious about, say, the worst MLB team's ticket sales vs. the Savannah Bananas.
This isn't right - the league can change the rules. NFL has done a wonderful job over the years on this.
Baseball has done a terrible job, but at least seems to have turned the corner with the pitch clock. Maybe they'll move the mound back a couple feet, make the ball 5.5oz, reduce the field by a player and then we'll get more entertainment and the players can still try their hardest to win.
I wonder if anyone has made an engine for simulating MLB play with various rule changes.
Personally, I think it'd be interesting to see how the game plays if you could only have two outfielders (but you could shift however you choose.)
It's a good thought.
I'd guess MLB The Show video game wouldn't be a bad place to start. They should have a decent simulator built in.
And the ongoing gambling scandal gives credence to a third incentive I've long suspected. Only half joking
Something Derek Thompson has written about https://archive.ph/uSgNd
Is it ? I, for one, enjoy watching the 3s raining down!
They did wash their hands. Turns out that soap and water wasn't quite enough. Lister used carbolic acid (for dressing and wound cleaning) and Semmelweis used chlorinated lime (for hand washing).
And Semmelweis is a perfect case against being an asshole who's right: he was more right than wrong (he didn't fully understand why what he was doing helped, but it did), but he had such a horrible personality and such an amazing gift for pissing people off that it probably cost lives by delaying the uptake of his ideas.
But this is getting a bit off topic, I suppose.
Or you could say it the other way around: even leading scientists are susceptible to letting emotions get the best of them and doubling down to defend their personal investment in things.
"A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it." - Max Planck.
Was soap often used prior to the mid 1800s?
That was later; earlier in history doctors (or "doctors" if you so insist) did not wash their hands.
I was mainly pushing back on the idea that something as seemingly obvious as hand washing was the thing that made surgery safe. It took quite a bit more than just simple hand washing.
People paid 100x more for their hosting when using AWS cloud until they realized they never needed 99.97% uptime for their t-shirt business. Oh wait, too soon. Saving this post for the future.
People paid only 100x more than self hosting to use AWS until they realized that they could get a better deal by paying 200x for a service that is a wrapper over AWS but they never have to think about since it turns out that for most businesses that 100x is like 30 bucks a month.
People spent half their job figuring out self-hosted infrastructure until they realized they'd rather just have some other company deploy their website when they make a commit.
kubernetes
Usually when someone comes at me with that argument, I ask them to pick any weekday in the past year, and then I take a random item from my calendar on that day; I give them the time and address of where I need to be, as well as the address of my home, and I ask them how long it's going to take me and how much it's going to cost. That's usually enough to bring them down a notch from "trains work" to "sometimes trains work". (But they tend to forget very often; they need to be reminded regularly for some reason.) Do you want to play that game with me to get your reality check in order?
Western Europe in a VERY dense city BTW.
> I give them the time and address of where I need to be [...] That's usually enough to bring them down a notch from "trains work" to "sometimes trains work" [...] Do you want to play that game with me to get your reality check in order?
I don't think the implied claim is that there should be specifically a train to every particular address, if that's what you're counting as failure in the game, but rather that with good public transport (including trains) and pedestrian/cyclist-friendly streets it shouldn't be the case that most people need to drive.
Cars are so flexible. It's the answer to so many questions outside "how to move one or two people from A to common destination B".
Need to move 3 or 4 people? Driving the car may be cheaper.
Don't want to get rained on? Or get heatstroke? Or walk through snow? Or carry a bunch of stuff, like a week of groceries for the family, or whatever else? Or go into the countryside/camping? Or move a differently-abled person? Or go somewhere outside public transport hours? Or, or... or.
Are there many cases where people should take public transport or ride a bike instead of their car? Obviously yes. But once you have a car to cover the exigent circumstances it is easy to use them for personal comfort reasons.
> Cars are so flexible.
They’re also a joke when it comes to moving large numbers of people. I can’t imagine the chaos if everyone leaving a concert at Wembley Stadium decided to leave by car.
You wouldn't have to imagine it if you visited Dallas. AT&T stadium has roughly the same capacity as Wembley, and no public transit at all.
Dallas would look very different if they emphasized public transport. Outside of downtown it is so sparse, many of the suburbs suffer from crumbling infrastructure because it turns out pipes made to last 30 years do poorly after 40 to 50 years when all the low density suburbs have aged out and there is no remaining land to subsidize the infrastructure ponzi scheme.
Fort Worth is worse for this!
Strongtowns is definitely worth a listen.
Are they crap during peak hour traffic or mass public events? Sure are! They're not some miracle device.
But people claiming that you can live a life without cars don't seem to realise the very many scenarios where cars are often easier and sometimes the only answer.
Until everyone wants to go from A to B, when a traffic jam happens. If that happens quite often, it might be more convenient to use a bicycle, an umbrella or snow boots.
Yes, cars are flexible. They are jacks of all trades, but masters of none. At long distances, trains win handily. At short distances, bikes do.
> Need to move 3 or 4 people? Driving the car may be cheaper.
That's the issue―the average car occupancy is <1.5. Our goal should be to raise it, by offering alternatives to cars in cases where they're not appropriate.
> Are there many cases where people should take public transport or ride a bike instead of their car? Obviously yes.
Not many, most. Cars are a niche, they're only economical when transporting a few people with cargo over medium distances. Everything else is more efficiently covered by another mode of transport.
And "obviously", huh? Look outside. It's all roads.
> But once you have a car to cover the exigent circumstances it is easy to use them for personal comfort reasons.
You'd be surprised. The Netherlands is the best example of this―the Dutch own almost as many cars per person as Americans do, yet they cycle orders of magnitude more.
It's a matter of designing our built environment to make the most efficient mode of transportation for the situation the most convenient option.
My initial post had the context of "life as it is now". To further these thoughts though:
> > Need to move 3 or 4 people? Driving the car may be cheaper.
> That's the issue―the average car occupancy is <1.5. Our goal should be to raise it, by offering alternatives to cars in cases where they're not appropriate.
When I said this, I meant in terms of $ to the individual making the choice. Apart from city parking costs and congestion charges, with modern phones being used a lot for transport these days, could we do dynamic group discounts? I.e. my transport app shows a QR code, my friends who are coming with me scan it with their transport app, and by travelling together (tapping on and off at the same locations within the same timeslot) we get a discount?
> Not many, most. Cars are a niche, they're only economical when transporting a few people with cargo over medium distances. Everything else is more efficiently covered by another mode of transport.
I agree, in the context of city planning and public transport being a lot better than it is now. Otherwise, the last-mile problem is a hard one to get past. As soon as you walk or ride a bike to the station/bus stop you've introduced constraints on cargo, physical fitness and weather. All mostly easier with a car. Also, a car provides freedom/flexibility for midday decisions like "I'll do the groceries on the way from work" or "my wife had an issue at work, so I'll go pick up the kids this afternoon" or similar - harder to do if you've committed to public transport in the morning.
> And "obviously", huh? Look outside. It's all roads.
Where I am, public transport is buses. Bicycles are meant to ride on the road. So the roads are still used even if the car isn't.
> You'd be surprised. The Netherlands is the best example of this―the Dutch own almost as many cars per person as Americans do, yet they cycle orders of magnitude more.
This is one thing I find frustrating. But not everyone has a "default active" lifestyle. Many are quite sedentary. Also, a significant chunk of car costs - purchase/depreciation, yearly insurance and registration - are not mileage based. But it is frustrating that other options are not even considered. Again though, urban planning and current public transport shape the society we live in for generations. Maybe we'd all be more active if it was better done.
> It's a matter of designing our built environment to make the most efficient mode of transportation for the situation the most convenient option.
So much this. But there is a lot to overcome. Individualism, NIMBYs and cars themselves as a status symbol of freedom and "go anywhere, go anytime" flexibility. I don't see how to do it - but I'd support smart attempts to try.
The argument there is a little dishonest, given that if you only had the option of riding public transit that your schedule would indeed be well conformed to using public transit. I think everyone understands VERY well that they could get from point A to point B faster by using a dedicated vehicle which is solely concerned with getting them from point A to point B, that's not really debatable.
In the states at least if you're using public transit it's generally as an intentional time / cost tradeoff. That's not a mystery and taking a point-to-point schedule and comparing that against public transit constraints doesn't really prove much.
The average European mind can't comprehend freedom of movement across vast amounts of open nature.
I live in Canada, which is similar to the US in this regard, and I can't believe how enslaved we are to the private automobile.
If you want the freedom to move across vast amounts of open nature, then yeah the private automobile is a good approximation for freedom of mobility. But designing urban areas that necessitate the use of a private vehicle (or even mass transit) for such essentials as groceries or education is enslavement. I don't buy the density argument either. Places that historically had the density to support alternative modes of transportation, densities that are lower than they are today, are only marginally accessible to alternative forms of transportation today. Then there is modern development, where the density is decreased due to infrastructure requirements.
To me, "urban planning" has a lot to answer for. They seem to have the foresight of a moth. However, they are probably constrained by politics which is similar.
“enslaved,” really?????
Can you reasonably get by without a car? For most Americans the answer is no. Therefore, yes you are enslaved. You don't have the freedom to choose how you get to work, you have to spend money on a car.
Do you understand what enslavement is? Because it’s not “i can’t reasonably get by,” it’s “I am not recognized as human, I am legally property and have no rights.”
These things are different.
I’m pretty sure we can comprehend it, we just usually enjoy said freedom of movement in nature on our feet rather than sat in an SUV.
Heard an anecdote about a German engineer who was in California (I think San Francisco, but if it was Los Angeles then the distances involved would be even larger) for meetings with American colleagues, and thought he would drive up to Oregon for a day trip. His American colleagues asked him to take another look at the scale on the bottom right of the map, and calculate the driving time. Once he ran the numbers, he realized that his map-reading instincts, trained in Germany, were leading him astray: the scale of maps he was used to had him thinking it was a 2- or 3-hour drive from San Francisco to Oregon. But in fact it's a 6-hour drive just to get to the Oregon border from SF, and if you want to head deeper into the interior then it's probably 9 to 10 hours depending on where you're going.
So no, I don't think Europeans who haven't been in America have quite absorbed just how vast America is. It stretches across an entire continent in the E-W direction, and N-S (its shortest border) still takes nearly a full day. (San Diego to Seattle is about 20 hours, and that's not even the full N-S breadth of the country since you can drive another 2.5 hours north of Seattle before reaching the Canadian border). In fact, I can find a route that goes nearly straight N-S the whole way, and takes 25 hours to drive, from McAllen, TX to Pembina, ND: https://maps.app.goo.gl/BpvjrzJvvdjD9vdi9
Train travel is sometimes feasible in America (I am planning Christmas travel with my family, and we are planning to take a train from Illinois to Ohio rather than fly, because the small Illinois town we'll be in has a train station but no airport; counting travel time to get to the airport, the train will be nearly as fast as flying but a lot cheaper). But there are vast stretches of the country where trains just do not make economic sense, and those whose only experience is in Europe usually don't quite realize that until they travel over here. For most people, they might have an intellectual grasp of the vastness of the United States, but it takes experiencing it before you really get it deep down. Hence why the very smart German engineer still misread the map: his instincts weren't quite lined up with the reality of America yet, and so he forgot to check the scale of the map.
> there are vast stretches of the country where trains just do not make economic sense
There are plenty of city pairs where high speed trains do make economic sense and America still doesn't have them. [1] is a video "56 high speed rail links we should've built already" by CityNerd. And that's aside from providing services for the greater good instead of for profit - subsidizing public transport to make a city center more walkable and more profitable and safer and cleaner can be a worthwhile thing. The US government spends a lot subsidizing air travel.
> So no, I don't think Europeans who haven't been in America have quite absorbed just how vast America is
China had some 26,000 miles of high speed rail two years ago, almost 30,000 miles now connecting 550 cities, and is adding another couple of thousand miles by 2030. A hundred-plus years ago America had train networks coast to coast. Now all Americans have is excuses for why the thing you used to have and tore up is impossible, infeasible, unaffordable, unthinkable. You have reusable space rockets that can land on a pillar of fire. If y'all had put as much effort into it as you have into special pleading about why it's impossible, you could have had it years ago.
[1] https://www.youtube.com/watch?v=wE5G1kTndI4
Personally, I'd blame California for American voters' distaste for subsidizing high-speed rail. They look at the massive budget (and time) overruns of California's celebrated high-speed rail, and say "I don't want that waste of money happening in MY state, funded with MY state taxes" and then vote against any proposed projects.
This is, of course, a massively broad generalization, and there will be plenty of voters who don't fit that generalization. But the average American voter, as best I can tell, recoils from the words "high-speed rail" like Dracula would recoil from garlic. And I do believe that California's infamous failure (multiple failures, even) to build the high-speed rail they have been working on for years has a lot to do with that "high-speed rail is a boondoggle and a waste of taxpayer dollars" knee-jerk reaction that so many voters have.
Good luck reaching the good remote spots from a train.
Focusing on remote spots is largely a different topic. If the majority of driving was to remote spots then we'd have 90% less driving and cars wouldn't be a problem.
Honestly, people really just don't understand how far apart things are. And yeah, the good remote spots are a 4-hour drive from the city (and you aren't even halfway across the state at that point).
The forests and wilderness of the PNW are much, much, much, much more remote and wild than virtually anywhere you’d go in Europe. Like not even close.
It seems like people are just talking past each other here. The fact is that 99% of driving is not done by people in the process of visiting remote nature destinations.
Also, the USA is not the only big country in the world... I live in a small city in Patagonia. The nearest towns are 60 km, 90 km, and 480 km away. But you can still live without a car in the city.
They also don't realize that a country that ditches personal vehicles can invest in buses or more trains to "remote places". Nor do they realize the vehicle industry is one of the biggest sources of microplastic pollution, which screws up that "remote nature" as well as our health.
Great so train to major destinations and then rent a car from there.
In the future, I hope this becomes a thing. As cars become more commoditised and self-driving taxis can be ordered easily, maybe there'll be competing mass fleets?
Or have a "car-cabin-without-engine-and-wheels" and treat it like a packet on a network of trains and "skateboard car platforms".
I believe Russians have something to say on that, though.
Is this satire? In the nordics we have allemansrätten, the right to use even private land to camp as long as you're not too close to where someone lives, not to mention huge national parks. In the US you have the right to get shot if you enter private land.
The average american mind can't comprehend this works out to a huge number of them having to commute by car 1-2 hours per day to get to work in some ungodly urban sprawl while living an alienated existence in crappy suburbs, and destroying the environment while doing so. At the same time working far more, slaving year round with laughable paid vacation time or sick day provisions, while being subjected to far worse homicide rates, and being treated as subjects by cops.
Such "freedom"...
No I love being stuck in traffic every day of the week for hours, its totally worth it because I can drive to an empty patch of grassland that no one wants to go to and there's nothing there. That's why cars are so amazing and freedom granting. Trains can't take you to the middle of nowhere to do nothing for the 1% of the time you don't want to be near other civilization so cars are better
lol, yeah. Meanwhile they can't even comprehend that it's a false dilemma: Europeans have cars just fine, even several per family.
They just don't have to use them all the time since they can take the more efficient public transport, and they can buy one after college even, they don't need to drive one from 16 yo just to be able to get around...
Are you arguing that trains are infeasible (due to cost or duration) for certain trips?
I'm curious how this changes (in your mind) if "trains" can be expanded to "trains, buses, bicycle", or if you consider that to be a separate discussion.
I live in Atlanta.
The Atlanta Metro has 6.5 million people across TWENTY THOUSAND square kilometers.
Trains just don't make sense for this. Everything is too spread out. And that's okay. Cites are allowed to have different models of transportation and living.
I like how much road infra we have. That I can visit forests, rivers, mountains, and dense city all within a relatively short amount of time with complete flexibility.
Autonomous driving is going to make this paradise. Cars will be superior to trains when they drive themselves.
Trains lack privacy and personal space.
The German metro area "Rheinland" has a population of 8.7 million people across 12 thousand square kilometers: ~700/sqkm vs the 240/sqkm population density of the Atlanta metro. Train and metro travel in this metro area is extremely convenient and fast. It's not that Atlanta (or anywhere else in the United States for that matter) couldn't do it because of vastness; there's just no political and societal will behind the idea. In a society that glamorizes everyone driving the biggest trucks and carrying the largest rifles, of course convenient train systems are "not feasible".
> The German metro area "Rheinland" has a population of 8.7 million people across 12 thousand square kilometers. ~700/sqkm vs the 240/sqkm population density of Atlanta metro. Train and metro travel in this metro area is extremely convenient and fast. It's not that Atlanta (or anywhere else in the United States for that matter) couldn't do it because of vastness
Did you forget to support yourself? You're saying Rheinland has three times the population density of Atlanta, with convenient passenger rail, and that demonstrates that low population density isn't an obstacle to passenger rail in Atlanta?
I'm not following your logic. Having nearly triple the population density in Rheinland makes trains way _more_ feasible, not _less_. That means on average you have a train 1/3 the distance away from you. That's a big difference.
I live in NYC which has 29,000/sqkm in Manhattan and 11,300/sqkm overall. Public transportation is great here and you don't need a car.
but at 240/sqkm, that's really not much public trans per person!
Rule 35 of the internet? Every discussion will eventually devolve into the United States' horrible usage (or lack thereof) of public transportation.
And it loses money. And doesn't it have time reliability issues?
The exact same comment could be made of Atlanta's roads.
How did we get here from the post about uv?
This did veer very far from uv!
I'm so stoked for what uv is doing for the Python ecosystem. requirements.txt and the madness around it has been a hell for over a decade. It's been so pointlessly hard to replicate what the authors of Python projects want the state of your software to be in.
uv has been much needed. It's solving the single biggest pain point for Python.
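For anyone who hasn't tried it yet, the basic workflow is roughly this (standard uv commands; project and package names are just examples):

    $ uv init myproject && cd myproject
    $ uv add requests        # recorded in pyproject.toml, whole tree pinned in uv.lock
    $ uv run python -c "import requests; print(requests.__version__)"
    $ uv sync                # on another machine: recreate that exact environment

The lock file is created and updated automatically, which is exactly the npm/cargo behavior people upthread are celebrating.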
Roads also lose a lot of money, and that's fine. Public infrastructure doesn't need to make money.
Is your car a profitable investment?
Public transport is to move people around, not to make money.
Having replied in good faith already, I also want to call out that your jab about trucks and rifles adds nothing to the conversation and is merely culture-war fuel.
> Please don't use Hacker News for political or ideological battle. It tramples curiosity.
> Eschew flamebait. Avoid generic tangents. Omit internet tropes.
It seems like a fair point to me. You can't bring your rifle on the train but you can bring it in your truck. Whether or not that shapes Atlantans' choice of transport I can't say though.
Fair point perhaps, but was clearly intended as sarcasm:
> a society that glamorizes everyone driving the biggest trucks and carrying the largest rifles
Yep, driving in Atlanta is so great, historians write whole books about how bad the traffic is and what caused it:
https://www.nytimes.com/interactive/2019/08/14/magazine/traf...
> and how much it's going to cost
Depending on how expensive gasoline is in your country, people underestimate the cost of a car trip by a factor of two to five, because they don't count the depreciation of their vehicle's value and the maintenance cost (and sometimes even the insurance price) driven by the kilometres covered during the trip.
By that logic "cars work" also turns into "sometimes cars work". Ever heard of traffic jams? And have you compared the number of fatal car accidents vs fatal train accidents? Not to mention the negative effect on air quality with many cars in dense cities. Cars' main advantage is flexibility, and that's it. For trips where the place and time usually stay the same, like work, trains are a valid option.
I'll happily play your game with a bicycle.
Great lets pick Canada in January. Bring a shovel.
Don't need one in Toronto within a ½ day or so of the snow stopping for the major bicycle routes (including the MGT).
Calgary apparently also does a good job of clearing its bike lanes.
And I do my Costco shopping by bike year-round. I think I've used the car for large purchases at Costco twice in the last year.
I _rarely_ drive my car anywhere in Toronto, and find the streets on bike safer than most of the sidewalks in January -- they get plowed sooner than most homeowners and businesses clear the ice from their sidewalks.
And in Toronto we're rank amateurs at winter biking. Look at Montreal, Oslo, or Helsinki for even better examples. Too bad we've got an addle-brained carhead who doesn't understand public safety, or doing his own provincial job, as our premier.
Just to add a less opinionated take: https://www.citymonitor.ai/analysis/why-winter-is-a-poor-arg...
Personally I've also biked to work (and everywhere, really) in sub-zero degrees many times, because the bicycle lanes are cleared and salted. It's really not too bad. It actually gets a bit too hot even, because you start out by wearing so much.
In cold weather, one should always dress for 5℃ warmer than the temperature outside when you have a bike ride longer than 5 km. Runners pretty much have to do the same. Your body heat and good layering will take care of everything else.
> Personally I've also biked to work (and everywhere, really) in sub-zero degrees many times, because the bicycle lanes are cleared and salted.
I used to bike to work in just-above-freezing temperatures. That wasn't so bad.
The one time it started to rain mid-journey, that was bad.
They don't clear snow from cycle paths in Canada? If not then it's an infrastructure problem, not a weather problem.
Sounds like a German saying:
> there are no weather problems, only clothing problems
Love losing?
Really? This is your go-to argument? And nobody's pointed out to you before how bad it is? Well then.
Do the opposite thought experiment for me: Pick any two points of interest on the map and see how well connected they are with roads. Keep doing it until you find somewhere not accessible via car. See the issue yet?
We've paved over the entire planet to the point that you can get anywhere you'd like with a car. We have not done so whatsoever for any other mode of transportation. Pedestrian walkways come close but we prioritize vehicles over those too. The investment into public transport & cycling infrastructure is a statistical error in comparison to roadways.
So no shit it's more convenient for you to take a car than a train, that's the entire point―it shouldn't be.
A 20 lane highway should be a train track, intra-city roads should be dedicated to bikes, not cars.
skill issue
People in Europe spent years with people dying due to heat stress before they discovered ACs....
This isn't really true. Heat stress deaths in Europe are comparatively rare, or were until urbanization and climate change became bigger factors.
I mean, more Europeans die from heat issues than Americans from guns. https://www.chicagotribune.com/2025/07/28/opinion-us-heat-de...
> This content is not available in your region
I guess Europeans will never find out how great the US is :-)
Good article, thanks (I've actually read it before, but forgot the overall stats).
Urbanisation.
python uv
They still do. More Europeans die every year from heat-related injuries than Americans do from guns.
I've wanted an AC, but I cannot get one. This apartment is just not equipped for it, it literally cannot handle multiple ACs (the cables are old, yadda yadda).
And you have to get lots of permits to have an AC installed legally. If you do not have a permit, you will have to pay a really hefty fee when the inspectors come.
So yeah, buying an AC is what most people would do, but they do not because of the damn permits they most likely will not get. It is a shitty situation.
Spaniard here. Don't lecture Southern Europeans on surviving heat when the church of the village of my parents predates America itself (and it's pretty fresh inside in Summer).
Always sucks when you're arguing with someone and it turns out the buildings in their town are older than yours. Sometimes you just gotta take the L.
This is not people’s fault individually, but rather in aggregate (ie government). The places that have good train infrastructure that is legitimately an alternative to driving are very few and far between in the US. It’s just not an option for most people. And people can’t just all move to the places where it is an option, because housing and jobs are already strained in those places negating many of the benefits.
Have you considered that the repeated attempts to reinvent what's basically trains are not, in fact, evidence that people don't know about trains, but evidence that people like the advantages of trains but that the downsides suck so bad that people will pay literally tens of thousands of dollars a year to avoid them?
Yeah all you need to do is raze and rebuild every city in America and it will work great!
Well, there is precedent...
Wrong kind of cheek my friend
People in Europe spent years walking to the store everyday for food until they discovered that mechanical refrigeration exists...
Something I think that goes underappreciated: in many parts of the world, the food supply chain is shorter and the food is fresher to begin with. You're not meant to shop for 14 days at a time; you're meant to go more frequently and get what you need, fresh.
Bad example. Walking to the store everyday for fresh food would be a drastic improvement for most Americans.
The refrigerator is a relatively modern invention. There's always been a refrigerator for me, but as a child my mother sometimes stayed with people who didn't own one and for her mother they were a new invention many people didn't have.
Actually this idea of just buying things at "the store" is relatively new too. Historically people would make more things themselves, and more food would be purchased directly from farmers who had grown it.
So many times I have come across a library or tool that would fix my problem, and then realized "oh crap, it's in Python, I don't want to spend a few hours building a brittle environment for it only for that env to break the next time I need to use it" - and went to look for a worse solution in a better language.
I really don't get this. I can count on no hands the number of times I've had problems simply going "pip install cool-thing-i-found".
Sure, this is just my experience, but I use Python a lot and use a lot of tools written in Python.
If you can install it with `pip install program-name` it's usually packaged well enough to just work. But if it's a random GitHub repository with a requirements.txt with no or very few version numbers, chances are that just running `pip install -r requirements.txt` will lead you down an hour+ rabbit hole of downgrading both your venv's Python version and various packages until you get a combination that is close enough to the author's venv to actually work.
Usually happens to me when I find code for some research paper. Even something that's just three months old can be a real pain to get running
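To make the contrast concrete (package names and versions purely illustrative), the difference between a requirements.txt that says

    numpy
    transformers

and one that says

    numpy==1.26.4
    transformers==4.39.3

is the difference between "whatever happens to resolve today" and "what the author's venv actually contained".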
I don't disagree with you, but in my experience even having a requirements.txt file is a luxury when it comes to scientific Python code: A lot of the time I end up having to figure out dependencies based purely on whatever the script is importing
If they can't be bothered to make a requirements.txt file, I'm not seeing how uv will be of much help...
uv basically makes that a default. You don’t need to be bothered. Just uv add your dependencies and they are in your pyproject.toml.
Or use `uv add --script`. Then dependencies get recorded in the script itself, which is great for single-file scripts. But obviously, that won't help if the author can't be bothered to take those steps.
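What `uv add --script` writes at the top of the file is the standard inline metadata block (PEP 723). A minimal sketch, with requests as a stand-in dependency:

    # /// script
    # requires-python = ">=3.12"
    # dependencies = [
    #     "requests",
    # ]
    # ///
    import requests
    print(requests.get("https://example.com").status_code)

`uv run script.py` then sets up a matching throwaway environment before executing it.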
Ah, I get it now! The problem occurs when someone publishes something without version pinning, because package versions can become incompatible over time. I don't think I've ever installed something outside of what's available on PyPI, which is probably why I've never run into this issue.
Still, I would think it's rare that package versions of different packages become incompatible?
Seconded. Python, even with virtualenv stuff, has never been bad. There have been a few things that have been annoying especially when you need system libraries (e.g. libav for PyAV to work, etc.), but you have the same issue with every other ecosystem unless the packages come with all batteries included.
To be fair to the GP comment, this is how I feel about Ruby software. I am not nearly as practiced at installing and upgrading in that ecosystem so if there was a way to install tools in a way that lets me easily and completely blow them away, I would be happier to use them.
I still have nightmares about nokogiri gem installs from back in the day :/
Shudder. I'm guessing it was the always breaking libxml2 compilation step right?
This mentality is exactly what many people do wrong in Python. I mean, for a one-off, yes you can have setup instructions like that. But if you want things to work for other people, on other machines, you better include a lock file with checksums. And `pip install whatever` simply does not cut it there.
Except I'm saying my experience is the opposite of the problem you purport. I (as the consumer) have always done "pip install whatever", and have never run into issues.
One of the commentors above explained what the problem really is (basically devs doing "pip install whatever" for their dependencies, instead of managing them properly). That's more a problem of bad development practices though, no?
Nah. That's a dumb UI. If you do cargo add whatever, it does something completely different from cargo install whatever, and there's no way to inadvertently use the wrong one. If pip install whatever leaves your project in a state that may be unusable for other people, but usable for you, that's just design that invites confusion and works-on-my-machine syndrome.
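For anyone who hasn't used cargo, roughly (crate names just examples):

    $ cargo add serde        # records serde under [dependencies] in Cargo.toml
    $ cargo install ripgrep  # builds a standalone binary and drops it in ~/.cargo/bin

Two different verbs for two different jobs, whereas `pip install` is the same verb for both, which is exactly how the works-on-my-machine state happens.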
Recently (like for several years), with most packages providing wheels for most platforms, it tends to be less of a problem of things actually working, except for dependencies where the platform specifiers used by Python are insufficient to select the right build of the dependency, like PyTorch.
I know, this is just how it is, I guess. Those of us mystified about what the big problem is with virtualenv and pip, and why we all have to use a tool distributed by a for-profit company that's not even written in Python, will just have to start a little club or something.
I guess this is mostly about data science code and maybe people who publish software in those communities are just doing very poor packaging, so this idea of a "lock file" that freezes absolutely everything with zero chance for any kind of variation is useful. Certainly the worst packaged code I've ever seen with very brittle links to certain python versions and all that is typically some ML sort of thing, so yeah.
This is all anathema to those of us who know how to package and publish software.
Recently I've been playing with Chatterbox and the setup is a nightmare. It specifically wants Python 3.11. You have 3.12? TS. Try to do pip install and you'll get an error about pkg-config calling a function that no longer exists, or something like that.
God, I hate Python. Why is it so hard to not break code?
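For what it's worth, this "only works on Python 3.11" situation is the kind of thing uv handles without touching the system interpreter. A minimal sketch, assuming the project ships a requirements.txt:

    $ uv python install 3.11
    $ uv venv --python 3.11
    $ uv pip install -r requirements.txt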
I experienced that recently - just curious, since you're digging into voice synth, what are open-source voice synth (specifically text-to-speech) which have been working for you. Recently, I have tried PiperTTS (I found the voices very flat, and accented), Coqui (in the past - it wasn't great, and doesn't seem to be supported). I spent a ton of time trying to get Chatterbox to work (on Debian Linux 13) - and ultimately couldn't get the right mix of Python versions, libraries etc. At this moment, I'm using AWS Polly and ElevenLabs (and occasionally MacOS `say`), but would love to have an open-source TTS which feels quality, and I can psychologically invest in. Thanks for any perspective you can share.
>I spent a ton of time trying to get Chatterbox to work (on Debian Linux 13)
Exactly my case. I had to move back to Debian from Ubuntu, where I had installed Chatterbox without much difficulty, and it was hell. You pretty much need Anaconda. With it, it's a cinch.
>what are open-source voice synth which have been working for you.
I tried a few, although rather superficially. Keeping in mind that my 3090 is on my main (Windows) machine, I was constrained to what I could get running on it without too much hassle. Considering that:
* I tried Parler for a bit, although I became disillusioned when I learned all models have an output length limit, rather than doing something internally to split the input into chunks. What little I tried with it sounded pretty good if it stayed within the 30-second window, otherwise it became increasingly (and interestingly) garbled.
* Higgs was good. I gave it one of Senator Armstrong's lines and made it generate the "mother of all omelettes" one, and it was believable-ish; not as emphatic but pretty good. But it was rather too big and slow and required too much faffing around with the generation settings.
* Chatterbox is what I finally settled with for my application, which is making audiobooks for myself to listen to during my walks and bike rides. It fits in the 3070 I have on the Linux machine and it runs pretty quick, at ~2.7 seconds of audio per second.
These are my notes after many hours of listening to Chatterbox:
* The breathing and pauses sound quite natural, and generally speaking, even with all the flaws I'm about to list, it's pleasing to listen to, provided you have a good sample speaker.
* If you go over the 40-second limit, it handles it somewhat more gracefully than Parler (IMO). Instead of generating garbage it just cuts off abruptly. In my experience splitting text at 300-350 characters works fairly well, and keeping paragraphs intact where possible generates the best results (a rough sketch of how I do the splitting is at the end of these notes).
* If the input isn't perfectly punctuated it will guess at the sentence structure to read it with the correct cadence and intonation, but some things can still trip it up. I have one particular text where the writer used commas in many places where a period should have gone, and it just cannot figure out the sentence structure like that.
* The model usually tries to guess emotion from the text content, but it mostly gets it wrong.
* It correctly reads quoted dialogue in the middle of narration, by speaking slightly louder. If the text indicates a woman is speaking the model tries to affect a high pitch, with varying degrees of appropriateness in the given context. Honestly, it'd be better if it kept a consistent pitch. And, perplexingly, no matter how much the surrounding text talks about music, it will read "bass" as "bass", instead of "base".
* Quite often the model inserts weird noises at the beginning and end of a clip which will throw you off until you learn to ignore them. It's worse for short fragments, like chapter titles and the like. Very rarely it inserts what are basically cut-off screams, like imagine a professional voice actor is doing a recording and just before he hit stop someone was murdered inside the booth.
* It basically cannot handle numbers more than two digits long. Even simple stuff like "3:00 AM" it will read as complete nonsense like "threenhundred am".
* It also has problems with words in all caps. It's a tossup if it's going to spell it out, yell it, or something in between. In my particular case, I tried all sorts of things to get it to say "A-unit" (as in a unit with the 'A' designation) properly, but sometimes it still manages to fuck it up and go "ah, ah, ah, ah, ah, ah unit".
* Sometimes it will try to guess the accent it should use based on the grammar. For example, I used a sample from a Lovecraft audiobook, with a British speaker, and the output will sometimes turn Scottish out of nowhere, quite jarringly, if the input uses "ya" for "you" and such.
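A rough sketch of that splitting heuristic, since it comes up a lot (this is my own helper, nothing Chatterbox-specific; the 350-character limit is just the rule of thumb from these notes, and a single sentence longer than the limit will still come out oversized):

    # Split text into <=350-character chunks, keeping paragraphs intact where
    # possible and otherwise packing whole sentences up to the limit.
    import re

    def chunk_text(text: str, limit: int = 350) -> list[str]:
        chunks: list[str] = []
        for para in text.split("\n\n"):
            para = para.strip()
            if not para:
                continue
            if len(para) <= limit:
                chunks.append(para)          # whole paragraph fits, keep it intact
                continue
            current = ""
            for sent in re.split(r"(?<=[.!?])\s+", para):
                if current and len(current) + 1 + len(sent) > limit:
                    chunks.append(current)   # flush the chunk before it overflows
                    current = sent
                else:
                    current = (current + " " + sent).strip()
            if current:
                chunks.append(current)
        return chunks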
Thank you - this is helpful. I didn't realize how much I was going to value consistency over voice quality, but then when you've got to go back and listen to everything for quality control... I guess that is the drawback of this phase of "generative" voice synth.
Yeah, in that way it's a lot like image generation. Maybe a single output is good in isolation, but if you want to generate a series maintaining some kind of consistent style, it's very much like a lottery. The models don't have dials to control emphasis, cadence, emotiveness, accent, etc., so they guess from the content. For example, imagine a serious scene that calls for a somber tone, but then one of the characters makes a dark or ironic joke. A human would maintain the same reading voice, but these models would instead switch to a much more chipper register for that one line.
> pip install cool-thing-i-found
This is the entire problem. You gonna put that in a lock file or just tell your colleagues to run the same command?
I meant I'm running that command as the consumer, and have never had problems. When I make my own packages, I ensure that anyone doing the same thing for my package won't have issues by using version pinning.
Having packages in a package manager is the problem?
like democracy, it's the worst programming language except vs everything else...
This comment is pithy, but I reject the sentiment.
In 2025, the overall developer experience is much better in (1) Rust compared to C++, and (2) Java/DotNet(C#) compared to Python.
I'm talking about type systems/memory safety, IDEs (incl. debuggers & compilers), package management, etc.
Recently, I came back to Python from Java (for a job). Once you take the drug of a virtual machine (Java/DotNet), it is hard to go back to native binaries.
Last, for anyone unfamiliar with this quote, the original is from Winston Churchill: "Democracy is the worst form of government, except for all those other forms that have been tried from time to time."
How come it's easier if the tool is in another language? What are the technical (or cultural) reasons? Do most C programs use static linking, or just not have deps?
When I need to build an established project written [mostly] in C or C++, even if I don't have the dependencies installed, it's typically just a matter of installing my distro's packages for the deps and then running configure and make, or whatever. It usually works for me. Python almost never does until I've torn half my hair out wrapping my brain around whatever new band-aid bullshit they've come up with since last time, still not having understood it fully, and muddled through to a working build via ugly shortcuts I'm sure are suboptimal at best.
I don't really know why this is, at a high level, and I don't care. All I know is that Python is, for me, with the kinds of things I tend to need to build, the absolute fucking worst. I hope uv gets adopted and drives real change.
My last dance with Python was trying to build Ardupilot, which is not written in Python but does have a build that requires a tool written in Python, for whatever reason. I think I was on my Mac, and I couldn't get this tool from Homebrew. Okay, I'll install it with Pip—but now Pip is showing me this error I've never seen before about "externally managed environments", a concept I have no knowledge of. Okay, I'll try a venv—but even with the venv activated, the Ardupilot makefile can't find the tool in its path. Okay, more googling, I'll try Pipx, as recommended broadly by the internet—I don't remember what was wrong with this approach (probably because whatever pipx does is totally incomprehensible to me) but it didn't work either. Okay, what else? I can do the thing everybody is telling me not to do, passing `--break-system-packages` to plain old Pip. Okay, now the fucking version of the tool is wrong. Back it out and install the right version. Now it's working, but at what cost?
This kind of thing always happens, even if I'm on Linux, which is where I more usually build stuff. I see errors nobody has ever posted about before in the entire history of the internet, according to Google. I run into incomprehensible changes to the already incomprehensible constellation of Python tooling, made for incomprehensible reasons, and by incomprehensible I mean I just don't care about any of it, I don't have time to care, and I shouldn't have to care. Because no other language or build system forces me to care as much, and as consistently, as Python does. And then I don't care again for 6 months, a year, 2 years, until I need to do another Python thing, and whatever I remember by then isn't exactly obsolete but it's still somehow totally fucking useless.
The universe has taught me through experience that this is what Python is, uniquely. I would welcome it teaching me otherwise.
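For reference, the two commands usually recommended for the "I just need this one Python CLI tool" case (tool name illustrative; as described above, even this doesn't always save you when a build system expects the tool on its own PATH):

    $ pipx install some-tool   # one isolated venv per tool, exposed on your PATH
    $ uvx some-tool            # run it in a throwaway environment, nothing installed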
I agree with you wholeheartedly. Besides not preferring dynamic programming languages, I would in the past have given Python more of a look because of its low barrier to entry... but I have been repulsed by how horrific the development UX story has been and how incredibly painful it is to then distribute the code in a portable-ish way.
uv is making me give Python a chance for the first time since a Ren'Py project I did for fun in 2015.
That's because many people don't pay attention to reproducibility of their developed software. If there is no lock file in a repo that nails the exact versions and checksums, then I already know it's likely gonna be a pain. That's shoddy work of course, but that doesn't stop people from not paying attention to reproducibility.
One could argue that this is one difference between npm and such, and what many people use in the Python ecosystem: npm and cargo and so on automatically create lock files. Even people who don't understand why that is important might commit them to their repositories, while in the Python ecosystem people who don't understand it think that committing a requirements.txt alone (without checksums) is OK.
However, it is wrong to claim that in the Python ecosystem we didn't have the tools to do it right. We did have them, well before uv. It took more care though, which is apparently already too much for many people.
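pip-tools is one concrete example of those pre-uv options. A minimal sketch (package name illustrative):

    $ echo "requests" > requirements.in
    $ pip-compile --generate-hashes requirements.in    # writes requirements.txt with exact pins and sha256 hashes
    $ pip install --require-hashes -r requirements.txt

It works; it just was never the default, and defaults are most of what uv changes.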
The lock file shouldn't be in the repository. That forces the developers into maintenance that's more properly the responsibility of the CI/CD pipeline. Instead, the lock file should be published with the other build artifacts—the sdist and wheel(s) in Python's case. And it should be optional so that people who know what they're doing can risk breaking things by installing newer versions of locked dependencies should the need arise.
It absolutely should be. Otherwise you don’t have reproducible builds.
You can reproduce the release just fine using the lock file published alongside the release. Checking it in creates unnecessary work for devs, who should only be specifying version constraints when absolutely necessary.
> Checking it in creates unnecessary work for devs, who should only be specifying version constraints when absolutely necessary.
The unnecessary work of a `git commit`?
Having the file be versioned creates no requirement to update its contents any more frequently than before, and it streamlines "publishing alongside the release". The presence of the lockfile in the repo doesn't in any way compel devs to use the lockfile.
You aren’t kidding. Especially if it’s some bioinformatics software that is just hanging out there on GitHub older than a year…
Do you think bioinformatics libs written in C++ do not have the same issues?
They’re weren’t that many that weren’t pre compiled for Linux in the c++ world. Python is bad, but others have issues too.
C/C++ often had to compile used “make” which I’ll admit to being better at the conda/pip.
I suspect this is because the c/c++ code was developed by people with a more comp Sci background. Configure/make/make install..I remember compiling this one.
https://mafft.cbrc.jp/alignment/software/source.html
If the software made it biogrids life was easier
https://biogrids.org/
But a lot of the languages had their own quirks and challenges (Perl cpan, Java…). Containerization kinda helps.
I mean, I think this is par for the course for anything written by a grad student. Be thankful it's not written in MATLAB.
The only thing I dreaded more was trying to run other people's C++ projects.
vcpkg seems to help a lot there, at least for Windows code and msbuild/Visual Studio.
Which means you’re already generally in worse shape than Python. At least Python’s half baked packaging systems try to be multi-platform.
vcpkg is also multi-platform (Linux, macOS). I just haven't used it for any of those yet.
I was into Python enough that I put it into my username but this is also my experience. I have had quasi-nightmares about just the bog of installing a Python project.
I used to think this sentiment was exaggerated. Then I tried installing Dots OCR. What a nightmare, especially when NVIDIA drivers are involved.
Same! And Python was my first, and is currently my second-highest-skill language. If someone's software's installation involves Python, I move on without trying. It used to be that it would require a Python 2 interpreter.
Honorable mention: Compiling someone else's C code. Come on; C compiles to a binary; don't make the user compile.
There's a lot more involved in distributing C (and C++) programs than just compiling them:
I'm assuming a Linux-based system here, but consider the case where you have external dependencies. If you don't want to require that the user installs those, then you gotta bundle them or link them statically, which is its own can of worms.
Not to mention that a user with an older glibc may not be able to run your executable, even if they have your dependencies installed. Which you can, for example, solve by building against musl or a similar glibc alternative. But in the case of musl, the cost is a significant overhead if your program does a lot of allocations, due to it lacking many of the optimizations found in glibc's malloc. Mitigating that is yet another can of worms.
There's a reason why tools like Snap, AppImage, Docker, and many more exist, each of which are their own can of worms
Yeah, def. I think Linux's ABI diaspora and the way it handles dependencies is a pain, and the root cause behind both those distro methods you mention, and why software is distributed as source instead of binaries. I contrast this with Rust. (And I know you can do this with C and C++, but it's not the norm.)
Another nice thing is that, if you can live with the slower musl malloc, then building a "universal" Linux binary with Cargo takes just two commands:
$ rustup target add x86_64-unknown-linux-musl
$ cargo build --target x86_64-unknown-linux-musl --release
Similarly for cross-compiling for Windows
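Something like this, assuming the MinGW-w64 toolchain is installed so a linker is available (the usual caveats about crates with C dependencies apply):

    $ rustup target add x86_64-pc-windows-gnu
    $ cargo build --target x86_64-pc-windows-gnu --release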
It may be fixed now, but the devil's in the details. As one example, musl has (or had) chronic issues with its DNS resolver and large responses.
Definitely. I haven't tried building anything that requires DNS using musl, but I've had to work around musl's much, much slower malloc implementation
The musl wiki lists a number of differences between it and glibc that can have an impact:
https://wiki.musl-libc.org/functional-differences-from-glibc...
I should try that!
> C compiles to a binary; don't make the user compile.
C compiles to many different binaries depending on the target architecture. The software author doesn't necessarily have the resources to cross-compile for your system.
Incidentally, this is probably exactly the thing that has made most of those Python installations problematic for you. Because when everything is available as a pre-built wheel, very much less can go wrong. But commonly, Python packages depend on included C code for performance reasons. (Pre-built results are still possible that Just Work for most people. For example, very few people nowadays will be unable to install Numpy from a wheel, even though it depends on C and Fortran.)
> Honorable mention: Compiling someone else's C code. Come on; C compiles to a binary; don't make the user compile.
Unless you’re on a different architecture, then having the source code is much more useful.
Or often just the same architecture with a slightly different OS version.
The python community was in profound denial for a very long time.
I dread running my own Python projects if I haven't worked with them in a while.
Couldn't agree more. I have a project at work from 2016 that builds multiple different HMIs (C++) along with 2 embedded systems (C). They all have to play nicely with each other as they share some structures and can all be updated in the field with a single file on a USB stick. So there is a bash script that builds everything from a fresh clone, makes update files, and some other niceties. Then, there is a single python script that generates a handful of tables from a json file.
Guess which part of the build I spent time fixing the other day... It wasn't the ~200,000 lines of C/C++ or the 1000+ line bash script. No. It was 100 lines of Python that was last touched 2 years ago. Python really doesn't work as a scripting language.
How about shipping one? Like even just shipping some tools to internal users is a pain
I really don't understand this. I find it really easy.
Just stick to what's in your linux distribution and you've got no problems.
No need - run Python in a container. No need to mix with what's installed on the host OS.
https://hub.docker.com/_/python
This manages to be even worse. Since its setup has to be full of holes to be usable (e.g. reaching out to the filesystem), you get the worst of random binaries without isolation, plus the dead end for updates you get in practice when dealing with hundreds of containers outside of a professionally managed cluster.
Not even trying to compile/build other people's C/C++ projects on *nix?
pfff... "other people's projects"... I was not even able to run my own projects until I started using Conda.
I have used
pip freeze > requirements.txt
pip install -r requirements.txt
Way before "official" lockfile existed.
Your requirements.txt becomes a lockfile, as long as you accept to not use ranges.
Having this in a single tool etc why not, but I don't understand this hype, when it was basically already there.
That works for simple cases. Now, update a transitive dependency used by more than one dependency. You might get lucky and it'll just work.
Not sure how uv helps here, because I am not very familiar with it.
With pip, you update a dependency: it won't install if the versions aren't compatible, and it will if they are. Not sure where the issue is?
> it won't work if it's not compatible
This is very new behavior in pip. Not so long ago, imagine this:
You `pip install foo`, which depends on `bar==1.0`. It installs both of those packages. Now you run `pip install baz`, which depends on `bar==2.0`. It installs baz and updates bar to 2.0. Better hope foo's compatible with the newer version!
I think pip only changed in the last year or two to resolve conflicts, or die noisily explaining why it couldn't be done.
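To make that concrete, a sketch of the old failure mode with those hypothetical packages (pip check is real and reports broken requirements after the fact):
$ pip install foo        # pulls in bar==1.0
$ pip install baz        # older pip would happily replace bar with 2.0
$ pip check              # e.g. "foo 1.0 has requirement bar==1.0, but you have bar 2.0."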
Simple for simple cases - but then you update a dependency, and that updates a transitive dependency that only accepts a narrow window of versions because one version had a security issue, which causes you to downgrade three other packages.
It can get complicated. The resolver in uv is part of its magic.
https://docs.astral.sh/uv/reference/internals/resolver/
JavaScript has truly rotted the brains of software developers.
You include the security patch of whatever your dependencies are into your local vetted pypi repository. You control what you consider liabilities and you don't get shocked by breakages in what should be minor versions.
Of course you have to be able to develop software and not just snap Legos together to manage a setup like that. Which is why uv is so popular.
You can make it a language flame war, but the Python ecosystem has had no problem making this bed for themselves. That's why people are complaining about running other people's projects, not setting up their own.
Sensible defaults would completely sidestep this, that's the popularity of uv. Or you can be an ass to people online to feel superior, which I'm sure really helps.
You're implying that I have to run a local Pypi just to update some dependencies for a project? When other languages somehow manage without that? No way I'm doing that.
Some organizations force you to use their internal dependency repos because the "IT department" or similar has blessed only certain versions in the name of "security" (or at least security theater.)
Inevitably, these versions are out-of-date. Sometimes, they are very, very out of date. "Sorry, I can only install [version from 5 years ago.]" is always great for productivity.
I ran into this recently with a third-party. You'd think a 5 year old version would trigger alarm bells...
This is the norm at all big tech companies now AFAIK.
I use 30 year old software regularly. Newer doesn't mean working.
Sure. I do a lot of retrocomputing and that's fine. I have OSes from the 80's running in emulators.
But when you're developing software, you want the newer stuff. Would you use MySQL 5.0 from 2005? No, you'd be out of your mind.
I'm wondering if people like you are getting paid to vet other people's libraries? Because with every modern project I have ever seen, you couldn't do much else for the rest of the day with the amount of library updates you would have to be vetting.
He's a consultant. Making everyone else sound incompetent is part of the gig.
Cool so how does that work when you’re writing a library that you want to distribute to other people?
> Not sure how uv helps here, because I am not very familiar with it.
Which makes you part of the people the GP is referring to? Try using it in anger for a week; you'll come to understand.
It's like Sisyphus rolling a cube up a hill and being offered a sphere instead: "no thanks, I just push harder when I have to overcome the edges."
Can you elaborate on this? How is npm/cargo/etc better than pip on this regard?
As far as I know, files like requirements.txt, package.json, cargo.toml are intended to be used as a snapshot of the dependencies in your project.
In case you need to update dependency A that also affects dependency B and C, I am not sure how one tool is better than other.
Open a requirements.txt and a package-lock.json next to each other and compare. Then you will know the answer to the question of what npm, cargo, and others are doing better than pip. Oh, did I sneak a ".lock" in there? Damn right I did.
They will resolve a version that works for all dependencies if it exists.
I remember advocating for running nightly tests on every project/service I worked on, because inevitably one night one of the transitive dependencies would update and shit would break. At least the nightly test forced it to break early, versus when you needed to do something else like an emergency bug fix and ran into it then.
it won’t work of course, no one is that lucky :)
Even more importantly, uv forces you to do it right like npm always did
npm did not always do it right, and IMO still does not do it completely right (nor does pnpm, my preferred replacement for npm -- but it has `--frozen-lockfile` at least that forces it to do the right thing) because transitive dependencies can still be updated.
cargo can also update transitive dependencies (you need `--locked` to prevent that).
Ruby's Bundler does not, which is preferred and is the only correct default behaviour. Elixir's mix does not.
I don't know whether uv handles transitive dependencies correctly, but lockfiles should be absolute and strict for reproducible builds. Regardless, uv is an absolute breath of fresh air for this frequent Python tourist.
npm will not upgrade transitive dependencies if you have a lockfile. All the `--frozen-lockfile` or `npm ci` commands do is prevent installs if you have incompatible versions specified inside of `package.json`, which should never happen unless you have manually edited the `package.json` dependencies by hand.
(It also removes all untracked dependencies in node_modules, which you should also never have unless you've done something weird.)
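For reference, the commands that enforce this (both real):
$ npm ci                          # installs exactly what package-lock.json records, nothing newer
$ pnpm install --frozen-lockfile  # same idea for pnpm; fails instead of silently updating the lockfile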
I'm not sure when that behaviour might have changed, but I have seen it do so. Same with yarn when not specifying a frozen lockfile.
I switched to pnpm as my preferred package manager a couple of years ago because of this, and even that still requires explicit specification.
It was an unpleasant surprise, to say the least.
That works, more or less. But now you have a requirements.txt file with 300 dependencies. Which ones do you actually care about, and which are just transitive things that your top-level deps brought along for the ride? And a year later, when GitHub's Dependabot is telling you have a security vulnerability in some package you've never heard of, do you remember if you even care about that package in the first place, or if it's left over cruft from that time you experimented with aiohttp instead of httpx?
I always just used pip-tools. Your requirements.in is the file that is human-readable and -writable, and sets your top-level deps and the version ranges you want. requirements.txt is your lockfile that you generate from .in with pip-compile. pip-compile writes out comments specifying from where each package in requirements.txt is being required.
uv does it a lot faster and generates requirements.txts that are cross-platform, which is a nice improvement.
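A minimal sketch of that flow (the package name is just an example; uv's drop-in equivalent is shown for comparison):
$ echo "httpx>=0.27" > requirements.in
$ pip-compile requirements.in                          # writes pinned requirements.txt with "via" comments
$ uv pip compile requirements.in -o requirements.txt   # uv's equivalent, much faster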
As a "pip is mostly fine" person: we would direct the output to a separate lock file, so you could still keep your direct deps readable, and then pin transitives and update them.
Pip's solver could still cause problems in general on changes.
UV having a better solver is nice. Being fast is also nice. Mainly, though, the feeling that it's a tool that is maintained and can be improved without ripping one's hair out is a godsend.
Totally agree, UV's solver speed is exciting
This is way less than what uv and other package managers do:
- dev dependencies (or other groups)
- distinguishing between direct and indirect dependencies (useful if you want to cut some fat from a project)
- dependencies with optional extra dependencies (if you remove the main one, it will delete the orphans when relevant)
It's not unachievable with pip and virtualenvs, but verbose and prone to human error.
Like C: if you're careful enough, it can be memory safe. But teams would rather rely on memory safe languages.
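A rough sketch of how the first and last of those points look with uv (package names are just examples):
$ uv add requests        # direct dependency, recorded in pyproject.toml
$ uv add --dev pytest    # lands in a dev dependency group instead
$ uv remove requests     # on the next sync, orphaned transitive deps are cleaned up too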
It also manages the runtime, so you can pin a specific interpreter version to a project. It is very useful and worth investigating.
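For example (real uv subcommands; the version is just an example):
$ uv python install 3.12    # fetch a standalone interpreter
$ uv python pin 3.12        # writes .python-version for the project
$ uv run python --version   # runs against the pinned interpreter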
I think it's a great modern tool, don't get me wrong.
But the main reason shouldn't be the "lockfile". I was replying to the parent comment mainly for that particular thing.
I am in the same boat. I like uv for its speed, the other niceties it brings, and being a single tool to manage different things. But the lockfile is not that big a deal. I never got Poetry either. Tried it in a project once and the lockfile was a pain with merges. I didn't spend much time on it, so maybe I didn't understand the tool and workflow or whatever, but pip and pip-tools were just fine working with requirements.txt.
Hey, just so you know, newer lockfiles are meant to fully replace old ones, you shouldn't bother with solving merge conflicts on these files, just accept all the new changes always.
What you SHOULD solve are conflicts in the packages/project file. Once solved, just create a new lockfile and replace the old one.
This applies to lockfiles on any project python or non-python.
Thank you for the tip. I don't run into them these days. The projects have matured and my area of work has shrunk, so changes to the dependencies are rare and my involvement in them is even rarer. But I will keep this in mind for the future.
The canonical way to do this with pip was using Constraints Files [1]. When you pollute your main requirements.txt it gets harder to see which package is an actual dependency of your project, and which ones are just sub-dependencies. Constraint files also let you not install a package if it's no longer a sub-dependency.
That being said, the uv experience is much nicer (also insanely fast).
[1] https://pip.pypa.io/en/stable/user_guide/#constraints-files
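A minimal sketch of that split, assuming you keep direct deps in requirements.txt and full pins in constraints.txt:
# requirements.txt: direct dependencies only (e.g. "flask")
# constraints.txt: exact pins for everything, transitives included
$ pip install -r requirements.txt -c constraints.txt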
I've never even understood the virtual env dogma. I can see how version conflicts _could_ happen, but they never have. Admittedly, I'm surprised I never have issues installing globally, especially since others keep telling me what a terrible idea it is and how they had nightmare-scenario-X happen to them.
I write Python code for a living and no two projects I work on have the exact same dependencies. This is especially true when working with microservices, or working for multiple customers.
How do you work with multiple projects with different versions of the same dependencies? If you are using the “system python” for everything?
Not system Python (not least because that's a hassle to do these days anyway, with all the safeguards OS vendors have put in), but _my_ version of globally. My (user) global PyEnv version, for example.
Now having said that, I suspect PyEnv is doing some voodoo behind the scenes, because I occasionally see messages like "Package X wants version N, but you have version N1". I've never investigated them though, since both old and new packages seem to work just fine regardless.
I see. I think your Python usage is very limited. Because even when I was using Pyenv, I still needed pyenv-virtualenv to make sure my repositories didn't cross-install different chains of dependencies into the same version.
> How do you work with multiple projects with different versions of the same dependencies?
You don't… you use the same versions for everything :)
it's very common for different projects to have different requirements, especially for fast moving libraries like transformers. if you rarely run python stuff it might not be a big deal, but i'd rather not have to reinstall stuff (especially big stuff like pytorch builds) every time i switch projects.
That's exactly it. Imagine your company has multiple Python repos, and one depends on foo>=1.0,<2.0, and another depends on foo>=2.0. Venvs let you configure completely isolated environments for each so that they can peacefully coexist. I would not for a moment consider using Python without virtualenvs, though I'm not opinionated about which tool manages them. Uv? Great. Poetry? Fine. `python -m venv`? Whatever. They all get the job done.
Honestly, I can't think of a single good reason not to want to use a venv for Python.
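A sketch with the hypothetical foo from above (the project directory names are made up):
$ cd service-a && python -m venv .venv && .venv/bin/pip install "foo>=1.0,<2.0"
$ cd ../service-b && python -m venv .venv && .venv/bin/pip install "foo>=2.0"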
Using the same version of everything lets you have a much easier time when a vulnerability is discovered?
How so? That hasn’t been my experience.
Do you monitor CVEs?
They happen /all the time/.
For a long time there were even incompatibilities between the RHEL host Python version and the Python version the Red Hat Ansible team were shipping.
> They happen /all the time/.
So I keep hearing ;)
Meanwhile, on my machines ...
Sure, but if you manage a diverse fleet of apps that could be a decade or two old across thousands of servers, some of those being python projects, you really quickly can run into issues between host and application.
I only ever had it become a problem with large, poorly maintained projects from work. You know the kind that have two web frameworks required in the same project, and two ORMs, etc. ;-) That one I definitely put into a venv. But my own stuff, no.
And then you're sunk the moment anyone else needs to run your code, or even if you just need to run your own code on another machine.
Never happened.
I salute you for never needing a new computer, ever.
Honestly, this feels like the difference between CMake and cargo: sure, CMake does work and you can get it to do everything you need, you just need discipline, knowledge, and patience. On the other hand, you could just have a tool that does it all for you so you can get back to doing the actual work.
I don’t get the hype either. Every time I’ve tried to use tools like pyenv or pipenv they fall down when I try to install anything that doesn’t provide wheels (GDAL), so I give up and stick to pip and virtualenv. Does uv let me install GDAL without hassle?
Pyenv's a different animal. It's meant for installing multiple Python versions at once so that you're not stuck with whatever dog your base OS happens to ship.
Pipenv tried to be what uv is, but it never did seem to work right, and it had too many weird corner cases ("why is it suddenly taking 3 hours to install packages? why it is literally impossible to get it to upgrade one single dependency and not all the others?") to ever be a contender.
Oops, you forgot to source your venv's activate script and now your env is messed up.
It is indeed fairly simple to implement it, which is why it's so weird that it's never been implemented at a language level
Good luck if you need cross-platform `requirements.txt` files.
This is a good use case. Not sure how this is typically solved, I guess "requirements-os-version.txt"? A bit redundant and repetitive.
I would probably use something like this: https://stackoverflow.com/questions/17803829/how-to-customiz...
But then you have to m x n x o it for different combinations of Python version, OS, CPU architecture, GPU make/model... uv will solve it for you in milliseconds.
How does uv solve that? Like, if you use dependencies that do not cross platforms very well?
uv finds a dependency resolution that works for all platforms by default, and can do things like fork the resolution and choose different versions based on platform or python version requirements.
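For example (to the best of my knowledge both are real uv commands; the compile flag emits one requirements.txt with environment markers instead of one file per platform):
$ uv lock                                                 # uv.lock is cross-platform by default
$ uv pip compile --universal requirements.in -o requirements.txt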
Webdev since 1998 here. Tabling the python vs JS/etc to comment on npm per se. PNPM is better than npm in every way. Strongest possible recommendation to use it instead of npm; it's faster, more efficient, safer, and more deterministic. See https://pnpm.io/motivation
I've gone all-in on Bun for many of the same reasons. Blazingly fast installs too.
https://bun.sh/
I think at this point everyone on hacker news with even a passing interest in JS has heard of bun, it's promoted relentlessly
I avoided JS for the longest time because i wanted nothing to with node or npm. With bun, i'm finally enjoying javascript.
I'm still meeting devs who haven't heard of it and get their minds blown when they replace npm in their projects. Every day is a chance to meet one of the lucky 10000: https://xkcd.com/1053/
Bun still segfaults way too often for my comfort but I’m crossing my fingers waiting for it to mature. It is definitely nice to have an alternative runtime to Node.
Did you experience any compatibility problems with Bun?
Deno is pretty sweet too... shell scripts that don't need a package.json or a node_modules directory for dependencies.
Yeah, Deno 2 is pretty compelling.
I find pnpm annoying to type, that's why I don't use it
IME after years of using pnpm exclusively having to type `pnpm install` instead of `npm install` is easily the single biggest drawback of replacing `npm` with `pnpm`, so yes.
FWIW I use zsh with auto-completion-as-you-type, so just hitting `p` on an empty command line will recall the most recent command starting with `p` (which was likely `pnpm`), and you can refine with further keystrokes and accept longer prefixes (I always do that with `git add` to choose between the typical ways to complete that statement). IMO people who don't use auto-completion are either people who have a magical ability to hammer text into their keyboards at the speed of light, or people who don't know about anything and hence don't know about auto-completion, or terminally obsessive types who believe that only hand-crafting each line is worthwhile.
I don't know which type of person you are but since typing `pnpm` instead of `npm` bothers you to the degree you refuse to use `pnpm`, I assume you must be of the second type. Did you know you can alias commands? Did you know that no matter your shell it's straightforward to write shell scripts that do nothing but replace obnoxious command invocations with shorter ones? If you're a type 3 person then of course god forbid, no true hacker worth their salt will want to spoil the purity of their artisanal command line incantations with unnatural ersatz-commands, got it.
Command alias? Even Windows can do them these days.
alias it to “p”
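E.g., a one-liner in ~/.zshrc or ~/.bashrc:
alias p='pnpm'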
Might be worth noting that npm didn’t have lock files for quite a long time, which is the era during which I formed my mental model of npm hell. The popularity of yarn (again importing bundled/cargo-isms) seems like maybe the main reason npm isn’t as bad as it used to be.
npm has evolved, slowly, but evolved, thanks to yarn and pnpm.
It even has some (I feel somewhat rudimentary) support for workspaces and isolated installs (what pnpm does)
Lock files are only needed because of version ranging.
Maven worked fine without semantic versioning and lock files.
Edit: Changed "semantic versioning" to "version ranging"
> Maven worked fine without semantic versioning and lock files.
No, it actually has the exact same problem. You add a dependency, and that dependency specifies a sub-dependency against, say, version `[1.0,)`. Now you install your dependencies on a new machine and nothing works. Why? Because the sub-dependency released version 2.0 that's incompatible with the dependency you're directly referencing. Nobody likes helping to onboard the new guy when he goes to install dependencies on his laptop and stuff just doesn't work because the versions of sub-dependencies are silently different. Lock files completely avoid this.
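And to even notice the drift, you end up reaching for something like this (dependency:tree is a real Maven goal; "bar" is the hypothetical artifact above):
$ mvn dependency:tree -Dincludes=:bar    # shows which version of bar actually got resolved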
It is possible to set version ranges, but it is hard to see this in the real world. Everyone is using pinned dependencies.
Version ranges are a really bad idea, as we can see in npm.
My apologies I should have said "version ranging" instead of "semantic versioning".
Before version ranging, maven dependency resolution was deterministic.
Always using exact versions avoids this (your pom.xml essentially is the lock file), but it effectively meant you could never upgrade anything unless every dependency and transitive dependency also supported the new version. That could mean upgrading dozens of things for a critical patch. And it's surely one of the reasons log4j was so painful to get past.
I’ve been out of the Java ecosystem for a while, so I wasn’t involved in patching anything for log4j, but I don’t see why it would be difficult for the majority of projects.
Should just be a version bump in one place.
In the general case, Java and Maven don't support multiple versions of the same library being loaded at once (not without tricks at least, custom class loaders or shaded deps), so it shouldn't matter what transitive dependencies depend on.
Right, that's the problem. Let's say I rely on 1.0.1. I want to upgrade to 1.0.2. Everything that also relies on 1.0.1 also needs to be upgraded.
It effectively means I can only have versions of dependencies that rely on the exact version that I'm updating to. Have a dependency still on 1.0.1 with no upgrade available? You're stuck.
Even worse, let's say you depend on A which depends on B, and B has an update to 1.0.2; if A doesn't support the new version of B, you're equally stuck.
Maven also has some terrible design where it will allow incompatible transitive dependencies to be used, one overwriting the other based on “nearest wins” rather than returning an error.
There are a small number of culprits, from logging libraries to Guava and Netty, that can cause these issues. For those you can use the Shade plugin: https://maven.apache.org/plugins/maven-shade-plugin/
If in some supply chain attack someone switches out a version's code under your seating apparatus, then good luck without lock files. I for one prefer being notified about checksums of things suddenly changing.
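In pip land, the usual hedge against that is hash pinning (both flags are real):
$ pip-compile --generate-hashes requirements.in        # records a sha256 for every pinned package
$ pip install --require-hashes -r requirements.txt     # refuses anything whose hash changed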
Maven releases are immutable
Sounds like the Common Lisp approach, where there are editions or whatever they call them, and those are sets of dependencies at specific versions.
But the problem with that is when you need another version of a library that is not in that edition. For example, when a backdoor or CVE gets discovered that you have to fix asap, you might not want to wait for the next Maven release. Furthermore, Maven is Java ecosystem stuff, where things tend to move quite slowly (enterprisey), and it comes with its own set of issues.
I've been using pip-tools for the best part of a decade. uv isn't the first time we got lock files. The main difference with uv is how it abstracts away the virtualenv and you run everything using `uv run` instead, like cargo. But you can still activate the virtualenv if you want. At that point the only difference is it's faster.
Yeah, python's tooling for dependency management was definitely not just fine, it was a disaster.
Coming from ruby. However, I think uv has actually now surpassed bundler and the ruby standard toolset for these things. Definitely surpassed npm, which is also not fine. Couldn't speak for cargo.
poetry gave us lock files and consistent installs for years. uv is much, much faster however.
I used poetry professionally for a couple of years and hit so many bugs, it was definitely not a smooth experience. Granted that was probably 3-4 years ago.
I always loved poetry but then I’d always run into that bug where you can’t use repos with authentication. So I’d always go somewhere else eventually.
Some time ago I found out it does work with authentication, but their ASCII counter animation just covers it up… the bug has been open for years now…
The very first time I tried to use Poetry I ran into a bug where it couldn't resolve some simple dependencies.
uv actually works.
I started using poetry about 4 years ago and definitely hit a lot of bugs around that time, but it seems to have improved considerably. That said, my company has largely moved to uv, as it does seem easier to use (particularly for devs coming from other languages).
I've occasionally run into performance issues and bugs with dependency resolution / updates. Not so much recently, but at a previous company we had a huge monorepo and I've seen it take forever.
There was pipenv before that too, which also had a lockfile.
Funny how these things get forgotten to history. There's lots of prior art when it comes to replacing pip.
edit: here's an HN thread about pipenv, where many say the same things about it as they are about UV and Poetry before https://news.ycombinator.com/item?id=16302570
Except pipenv was never anywhere near as good. It meant well but never delivered.
Exactly. I jumped onto pipenv, poetry, and pyenv as soon as I heard about them, and though they provided advantages, they all had significant flaws which kept me from giving them a full-throated endorsement as the solution to Python environments.
However, I have zero reservations about uv. I have not encountered bugs, and the features that are present are ready for complete adoption. Plus there are massive speed improvements. There is zero downside to using uv in any application where it can be used, and there are real advantages.
and pip-compile before that.
Agree that uv is way way way faster than any of that and really just a joy to use in the simplicity
I have to agree that there were a lot of good options, but uv's speed is what sets it apart.
Also the ability to have a single script with deps declared as TOML in the headers, super easily.
Also the ability to use a random Python tool in effectively seconds with no faffing about.
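For instance (uvx is the real shorthand for "uv tool run"; ruff is just an example tool):
$ uvx ruff check .    # fetches ruff into a cached, isolated env and runs it, no venv to manage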
To be fair, Poetry has done everything uv does for about a decade. uv is much faster, which is great, but lock files, integrated venv management, etc.
Yep, coming from poetry, uv is a pure speed increase with the same feature set.
Python might have been better at this but the community was struggling with the 2 vs 3 rift for years. Maybe new tooling will change it, but my personal opinion is that python does not scale very well beyond a homework assignment. That is its sweet spot: student-sized projects.
Imo the community should've rejected Python 3 and said, find a way to improve things without breaking everyone. JS managed to do it.
The community basically did reject Python 3, at first. Almost nobody used 3.0 / 3.1 / 3.2, to the point where I’ve seen them retconned as beta releases.
Even then though, the core developers made it clear that breaking everyone’s code was the only thing they were willing to do (remember Guido’s big “No 2.8” banner at PyCon?), which left the community with no choice.
Yeah, I was in the middle of that and had to keep switching between versions because of how many things were Py2-only or Py3-only. Extra annoying how v3 changed random things like integer division that'd mess you up if you forgot.
Tooling like npm, cargo, and others existed well before uv came up. I have used poetry years ago, and have had reproducible virtual environments for a long time. It's not like uv, at least in that regard, adds much. The biggest benefit I see so far, and that is also why I use it over poetry, is that it is fast. But the benefit of that is small, since usually one does not change the dependencies of a project that often, and when one does, one can also wait a few seconds longer.
I've been saying this for years! JS gets a lot of hate for dependency hell.
Why?
It’s almost too easy to add one compared to writing your own functions.
Now compare that to adding a dependency to a c++ project
the thing is I never had issues with virtual environments -- uv just allows me to easily determine what version of python that venv uses.
you mean you can't just do `venv/bin/python --version`?
he means "choose", not "check"
Yes sorry you’re correct. It allows me to specify a version of Python.
> that when Python devs finally get a taste of npm/cargo/bundler in their ecosystem, they freaking love it. Because yes, npm has its issues but lock files and consistent installs are amazing
I think it's more like Rust devs using Python and thinking what the fuck why isn't this more like rustup+cargo?
I tried Python for the first time after I’d been coding with multiple other languages for about 15 years.
The environment, dependency experience created so much friction compared to everything else. Changed my perspective on Docker for local dev.
Glad to hear it seems to finally be fixed.
Why did it take this long? Why did so many prior solutions ultimately fall flat after years and years of attempts? Was Python package/environment management such a hard problem that only VC money could have fixed it?
It's not fixed quite yet because the default recommended way is still pip. And that's the same reason past attempts didn't work.
It didn't, though? Poetry was largely fine, it's just that uv is so much faster. I don't think uv is that much different from Poetry in the day-to-day dependency management, I'm sure there are some slight differences, but Poetry also brought all the modern stuff we expected out of a package manager.
I don't know, Poetry's existed for years, and people still use requirements.txt. Uv is great but isn't exactly unique in Python-land.
Yeah I use poetry, uv and requirements.txt - all great tools for their respective niches.
But you are just using virtualenv with pip. It doesn't change any of the moving pieces except that uv is virtualenv aware and will set up / use them transparently.
You've been able to have the exact same setup forever with pyenv and pyenv-virtualenv except with these nothing ever has to be prefixed. Look, uv is amazing and I would recommend it over everything else but Python devs have had this flow forever.
> But you are just using virtualenv with pip.
No, you aren't.
> It doesn't change any of the moving pieces
It literally does. Though it maintains a mostly-parallel low-level interface, the implementation is replaced with an improved one (in speed, in dependency solving, and in other areas). You are using virtual environments (but not venv/virtualenv) and the same sources that pip uses (but not pip).
> You've been able to have the exact same setup forever with pyenv and pyenv-virtualenv except with these nothing ever has to be prefixed.
Yes, you can do a subset of what uv does with those without prefixes, and if you add pipx and hatch (though with hatch you’ll be prefixing for much the same reason as in uv) you’ll get closer to uv’s functionality.
> Look, uv is amazing and I would recommend it over everything else but Python devs have had this flow forever.
If you ignore the parts of the flow built around modern Python packaging standards like pyproject.toml, sure, pieces of the flow have been around and supported by the right constellation of other standard and nonstandard tools for a while.
Right, it's a reimplementation, a really good one, but it doesn't change how Python packaging and dev environments work mechanically. The PEP that would have actually changed Python packaging to be more like NPM (https://peps.python.org/pep-0582/) was rejected.
> If you ignore the parts of the flow
I don't get this, pip has worked with pyproject.toml since its standardization https://peps.python.org/pep-0621/. You don't need any constellation of tools, the only pieces that aren't provided by upstream is the version manager and the virtualenv manager. The new packaging flow has also worked with the authoritative pypa tools since their standardization https://peps.python.org/pep-0517/ https://peps.python.org/pep-0518/ https://peps.python.org/pep-0751/
Again, uv is great, I just think people are giving this one tool too much credit for the standardization process (that uv is an implementation of) that actually addressed Python packaging issues. Like for example uv run, that's all https://peps.python.org/pep-0751/
Yep, working with bundler and npm for a decade plus has made me appreciate these tools more than you can know. I had just recently moved to Python for a project and was delighted to learn that Python had something similar, and indeed uv is more than just a package manager like bundler. It's like bundler + rbenv/rvm.
And inspired by uv, we now have rv for RoR!
Personally I never thought it was fine, but the solutions were all bad in some way that made direct venv and requirements files preferable. Poetry started to break this but I had issues with it. uv is the first one that actually feels good.
I feel a little like this too.
My default answer to using Python in more ways than I already did was no, because the tooling wasn't there for others to handle it, no matter how easy it was for me.
I feel uv will help python go even more mainstream.
> but lock files and consistent installs are amazing
Yes, though poetry has lock files, and it didn't create the same positive feelings uv does :)
I would dread cloning a Python project more than I would a C++ one, and that was the sole reason I made a real effort to avoid the language entirely.
What weird shadow-universe do you inhabit where you found Python developers telling you the tooling was just fine? I thought everyone has agreed packaging was a trash fire since the turn of the century.
Hackernews and also the official Python maintainers
>finally get a taste of npm
good god no thank you.
>cargo
more like it.
cargo is better than npm, yes, but npm is better than pip (in my experience)
As someone who moved from Python to NodeJS/npm ~10yrs ago, I can fully support that statement. Dissatisfaction with Python's refusal to get its dependency/package-management act together, and seeing how reasonably the task is being dealt with by `npm` (even with all its flaws), made me firmly stay with NodeJS. Actually, virtualenv was for me another reason to keep my fingers out of whatever they're doing now over there in Python-land, but maybe `uv` can change that.
Yeah but I want uv to be default first
Both of these work
other than being much slower than uv, conda has worked great for years
I do prefer uv but it's not like sane python env management hasn't existed
there are severe problems with npm as well. It is not a model I hope is replicated.
conda was great to me
conda ruined my shell and never successfully worked for me. I guess YMMV
All my experience with Conda is from helping my friend nuke it off his laptop
have you tried pixi for this?
same here; I now prefer uv but conda served us very well, and allowed us to maintain stable reproducible environments; being able to have multiple environments for a given project is also sometimes handy vs a single pyproject.toml
miniconda
yup
I somehow had quite enough problems going from bundler 1.13 to 1.16 to 2.x some years ago. I’m glad we have killed that codebase with fire.
pip lock?
> you should just use virtualenv with pip
This is the most insulting take in the ongoing ruination of Python. You used to be able to avoid virtualenvs and install scripts and dependencies directly runnable from any shell. Now you get endlessly chastised for trying to use Python as a general purpose utility. Debian was a bastion of sanity with the split between dist_packages and site_packages but that's ruined now too.
Unless all python dependencies you ever used were available in your distro (and then at that point, you're no longer using pip, you're using dpkg...), this never worked well. What solves this well is PEP 723 and tooling around it.
With PEP 723 and comfortable tooling (like uv), you now get scripts that are "actually directly runnable", not just "fake directly runnable, oops forgot to apt-get install something, sorta runnable", and they work reliably even when stuff around you is updated.
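A minimal sketch of such a script (the inline metadata block is the PEP 723 format; the dependency and URL are just examples):
$ cat fetch.py
# /// script
# dependencies = ["httpx"]
# ///
import httpx
print(httpx.get("https://example.com").status_code)
$ uv run fetch.py    # uv builds a throwaway env with httpx and runs the script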
> You used to be able to avoid virtualenvs and install scripts and dependencies directly runnable from any shell.
This wasn't really the case; in principle anything you installed in the system Python environment, even "at user level", had the potential to pollute that environment and thus interfere with system tools written in Python. And if you did install it at system level, that became files within the environment your system package manager is managing, that it doesn't know how to deal with, because they didn't come from a system package.
But it's worse now because of how many system tools are written in Python — i.e., a mark of Python's success.
Notably, these tools commonly include the system package manager itself. Since you mentioned Debian (actually this is Mint, but ya know).
> Now you get endlessly chastised for trying to use Python as a general purpose utility.
No, you don't. Nothing prevents you from running scripts with the system Python that make use of system-provided libraries (including ones that you install later with the system package manager).
If you need something that isn't packaged by your distro, then of course you shouldn't expect your distro to be able to help with it, and of course you should expect to use an environment isolated from the distro's environment. In Python, virtual environments are the method of isolation. All reasonable tooling uses them, including uv.
> Debian was a bastion of sanity with the split between dist_packages and site_packages but that's ruined now too.
It's not "ruined". If you choose to install the system package for pip and to use it with --break-system-packages, the consequences are on you, but you get the legacy behaviour back. And the system packages still put files separately in dist-packages. It's just that... doing this doesn't actually solve all the problems, fundamentally because of how the Python import system works.
Nowadays pip also defaults to installing to the users home folder if you don't run it as root.
Basically the only thing missing from pip install being a smooth experience is something like npx to cleanly run modules/binary files that were installed to that directory. It's still futzing with the PATH variable to run those scripts correctly.
> Nowadays pip also defaults to installing to the users home folder if you don't run it as root.
This could still cause problems if you run system tools as that user.
I haven't checked (because I didn't install my distro's system package for pip, and because I use virtual environments properly) but I'm pretty sure that the same marker-file protection would apply to that folder (there's no folder there, on my system).
> Python as a general purpose utility
This ideology is what caused all the problems to begin with: the base Python is built as if it's the only thing in the entire operating system's environment, when its entire packaging system is also built in a way that makes that impossible to do without manually having to juggle package conflicts/incompatibilities.
it's because so many essential system tools now rely on python, and if you install arbitrary code outside of a venv it can clobber the global namespace and break the core OS' guarantees.
I do agree it is annoying, and what they need to do is just provide an automatic "userspace" virtualenv for anything a user installs themselves... but that is a pandoras box tbh. (Do you do it per user? How does the user become aware of this?)
What they needed to do is allow side-by-side installs of different versions of the same distribution package and allow specifying or constraining versions at import time, then you wouldn't have the problem at all.
But that's probably not practical to retrofit given the ecosystem as it is now.
That couldn't happen with Debian's dist_packages, which was explicitly for the system tools managed by apt.
pipx solves this perfectly.
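For example (black and pycowsay are just example CLI tools from PyPI):
$ pipx install black      # isolated venv under the hood; "black" lands on your PATH
$ pipx run pycowsay moo   # or run a tool ad hoc without installing it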
For "applications" (which are distributed on PyPI but include specified entry points for command-line use), yes. For development — installing libraries that your own code will use — you'll still generally need something else (although the restriction is really quite arbitrary).
Agreed! Sorry, my read was for apps. You can use --user with pip to install into the user site rather than the system site; however, it still causes overlap, which can be problematic.
This is very true! I was highly surprised when I installed Python from source and found out that the entire problem has been solved for decades. You can have different Python versions in the same prefix just fine; you just need to pick a default one to install with `make install` and install all the others with `make altinstall`.
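A sketch of that from-source flow (the prefix is just an example):
$ ./configure --prefix=/usr/local
$ make -j"$(nproc)"
$ sudo make altinstall    # installs e.g. python3.12 without overwriting the default "python3"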