Uv is so good. I'm a curmudgeon about adopting new tooling, and tried uv with a lot of skepticism, but it was just better in every way. And even if it wasn't so polished and reliable, the raw speed makes it hard to go back to any other tool.

Uv combined with type hints reaching critical mass in the Python ecosystem, and how solid PyLance is in VSCode, feels so good it has made me consider investing in Python as my primary language for everything. But then I remember that Python is dog slow compared to other languages with comparable ergonomics and first-class support for static typing, and...idk it's a tough sell.

I know the performance meta in Python is to...not use python (bind to C, Rust, JVM) - and you can get pretty far with that (see: uv), but I'd rather spend my limited time building expertise in a language that isn't constantly hemorrhaging resources unless your code secretly calls something written in another language :/

There are so many good language options available today that compete. Python has become dominant in certain domains though, so you might not have a choice - which makes me grateful for these big steps forward in improving the tooling and ecosystem.

I think Python has a place in many developers' toolkits. I've never met anyone who hates Python (though I'm sure they exist), whereas for pretty much any other language one could mention there are much more polarizing viewpoints. (As the saying goes, "Python is everyone's second favorite programming language.")

The Python team need not feel any pressure to change to compete; Python has already done quite well and found its niche.

*raises hand* I hate python.

I am a user of pip binaries. Every few years one of them breaks.

As far as I understand, developers never cared about pinning their dependencies and python is fast to deprecate stuff.

  $ uvx remt
      Built pygobject==3.54.5
      Built remt==0.11.0
      Built pycairo==1.28.0
  Installed 12 packages in 9ms
  Traceback (most recent call last):
    File "/home/user/.cache/uv/archive-v0/BLXjdwASU_oMB-R4bIMnQ/bin/remt", line 27, in <module>
      import remt
    File "/home/user/.cache/uv/archive-v0/BLXjdwASU_oMB-R4bIMnQ/lib/python3.13/site-packages/remt/__init__.py", line 20, in <module>
      import pkg_resources
  ModuleNotFoundError: No module named 'pkg_resources'

  $ uvx maybe
  × Failed to build `blessings==1.6`
  ├─▶ The build backend returned an error
  ╰─▶ Call to `setuptools.build_meta:__legacy__.build_wheel` failed (exit status:
      1)

      [stderr]
      /home/user/.cache/uv/builds-v0/.tmpsdhgNf/lib/python3.13/site-packages/setuptools/_distutils/dist.py:289:
      UserWarning: Unknown distribution option: 'tests_require'
        warnings.warn(msg)
      /home/user/.cache/uv/builds-v0/.tmpsdhgNf/lib/python3.13/site-packages/setuptools/_distutils/dist.py:289:
      UserWarning: Unknown distribution option: 'test_suite'
        warnings.warn(msg)
      error in blessings setup command: use_2to3 is invalid.

      hint: This usually indicates a problem with the package or the build
      environment.
  help: `blessings` (v1.6) was included because `maybe` (v0.4.0) depends on
        `blessings==1.6`
I've heard from computer vision developers that even the libraries deprecate things that fast.

Be careful of attributing to python what is really the fault of python lib developers.

Having said that, our team is having to do a bunch of work to move to a new python version for our AWS serverless stuff, which is not something I'd have to worry about with Go (for example). So I agree, there is a problem here.

> Be careful of attributing to python what is really the fault of python lib developers.

If so, you also cannot attribute to Python the virtues of Python lib developers either (in particular, a large library ecosystem).

Yep. What you are talking about is the language ecosystem.

I've been mostly not using Python since its arrival in the early 90s. Meaning I tinker with it but don't use it professionally.

Well, I hate Python. Main points for me: the scoping (or lack of it), the lack of declaration of new identifiers (linked to the lack of scoping), the lack of update operators (like `a ||= (i > 0)`), the general lack of conciseness, the `a if b else c` confusion, the weirdness of comprehensions (particularly nested `for`s), the exceptions that are raised for non-exceptional cases. The heaviness of handling exceptions, like trying to delete a non-existing dict entry (need an additional `if` or a `try` block) or access a non-existing map entry (need to use `.get()` instead of `[]` or a `try` block).

Also, the syntax of layout is bad. I am not talking about layout itself; I do like Haskell syntax (despite it being weird about parens). But if I write `a = b +` in Python on one line, then I get a syntax error, although the parser could instead assume that the expression is not terminated and must (obviously) continue on the next (indented) line. I hate that I need to use `\` or `(...)` to make this clear to the parser. I have written parsers myself, and I know that the parser knows what needs to follow, and Python itself shows me that it knows: by raising a completely unnecessary syntax error.

It feels to me that the Python language design confuses `simple and easy` with `primitive`. It feels like a design made without knowledge of programming language research and ergonomics. It feels to me like a dangerous toy language, and I am never sure which of my stupid mistakes will be found by the compiler/interpreter, and which will just be silently misinterpreted. And which of my perfectly valid requests will be rejected with an exception. In some aspects it feels less safe than C, particularly due to the lack of scoping and the danger of reuse of variables or introduction of new function local variables when actually, outer 'scope' variables were intended to be written.

This is not really meant as a rant, but it is a personal opinion, and I try to lay out my reasons. I am not trying to shame anyone who loves Python, but I just want to clarify that there are people who hate Python.

> the lack of update operators (like `a ||= (i > 0)`)

What would that even do? Is that the equivalent of `a = a or (i > 0)`? Python does not have a "||" operator.

> the `a if b else c` confusion

I'll agree that Python really dropped the ball on implementing a ternary operator, but I guess Guido really didn't want the C version.

> the weirdness of comprehensions (particularly nested `for`s)

If a comprehension is getting weird because of nesting, then I'd change it to not be a comprehension. I'd rather have nested `for` loops than nested comprehensions.

> the exceptions that are raised for non-exceptional cases

I'd be interested in an example of this.

> like trying to delete a non-existing dict entry (need an additional `if` or a `try` block) or access a non-existing map entry (need to use `.get()` instead of `[]` or a `try` block)

I suggest thinking more about the Zen of Python. Specifically, explicit is better than implicit, and errors should never pass silently. If you're trying to delete a non-existing dict entry, or trying to access a non-existing entry, then in most cases, you have a bug somewhere. Basically, Python believes that your code is expecting that dict entry to exist. Forcing you to use .get or use an `if` is a measure to make your code explicitly declare that it's expected that the dict entry might not exist.
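
To make that concrete, here's a minimal sketch (made-up dict and keys) of the explicit styles Python nudges you toward:

  config = {"timeout": 30}

  # Reading: declare that the key may be absent by supplying a default.
  retries = config.get("retries", 3)   # -> 3, no KeyError

  # Deleting: either guard with `if` ...
  if "timeout" in config:
      del config["timeout"]

  # ... or use pop() with a default, which removes the key without raising.
  config.pop("retries", None)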

> But if I write `a = b +` in Python on one line, then I get a syntax error [..]

Yeah, the parser could certainly be written to handle this case, but it was deliberately written not to.

> It feels to me like a dangerous toy language

Toy language, I could see. Dangerous? Not at all. I could call it opinionated, though.

> and I am never sure which of my stupid mistakes will be found by the compiler/interpreter, and which will just be silently misinterpreted.

Meanwhile, I'd look at C and think "I'm not sure which of my mistakes will lead to a memory leak or an exploitable buffer overflow."

> In some aspects it feels less safe than C, particularly due to the lack of scoping and the danger of reuse of variables or introduction of new function local variables when actually, outer 'scope' variables were intended to be written.

I'd argue the exact opposite: It's more dangerous to allow you to accidentally clobber a global variable when you meant to create a local one.

> This is not really meant as a rant, but it is a personal opinion, and I try to lay out my reasons.

I think the core issue is that Python tries to adopt a different paradigm than languages like C, and it's a paradigm that you just strongly disagree with. Personally, I love it. What's funny is that when I first saw Python, I was like "This language sucks, it makes things too easy and it holds your hand." After using it extensively at work though, I find myself saying "This language is great! It makes things so easy and holds your hand!"

Yes, all your points are valid for yourself. It's OK. I did not try to convince you that Python is bad. I only tried to explain why I hate it. In the hope that maybe you better understand why. It's too easy to say 'I hate XYZ', so I tried to explain myself. I am perfectly fine with you loving it. Let's see whether I can explain myself better:

> ... the Zen of Python ...
> ... was deliberately written not to ...
> ... Guido really didn't want ...

Yes, sure, I do not question that. I am merely saying that I hate many of the outcomes of that. I also hate discussions about that -- it so quickly becomes sacrilege to hate Python. I know that it's futile to discuss religion, and I am not trying to do that. But someone said that no-one hates Python -- well, I do, and this is why!

> ... explicit is better than implicit ...

Except for variable declarations, apparently. I would very much like them to be explicit, and it would be a start to fix the scoping problem.

> ... Dangerous? Not at all ...

I perceive a danger when the compiler/interpreter does not tell me that I am doing unintended things. Obviously, it cannot know what I intend. But proper and smaller scopes are generally regarded as helping humans to track the visibility of a variable. I certainly cope better with explicit variable declarations and with smaller scopes. I feel a danger that I accidentally misuse (or reuse) variables, e.g. by missing a `nonlocal` or `global` declaration or by using a variable after an `if` that was meant to be only valid local to that `if` block.
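
As a tiny sketch of the kind of accident I mean (made-up names):

  counter = 0

  def record_hit():
      # Intended to update the module-level `counter`, but without a
      # `global counter` declaration this just binds a new local name;
      # the outer variable never changes and no error is raised.
      counter = 1

  record_hit()
  print(counter)   # still 0

  # Names bound inside an `if` also leak into the enclosing scope:
  if counter == 0:
      label = "cold"
  print(label)     # visible here; would be unbound if the branch hadn't run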

> ... Meanwhile, I'd look at C and think "I'm not sure which of my mistakes will lead to a memory leak or an exploitable buffer overflow." ...

Of course C is dangerous, that's why I cited it. But in at least one aspect, the scoping and the related lack of explicit syntax to introduce new variables, I think Python is more dangerous than C. In most other aspects, C is definitely worse.

Comparing another language, and how it is possible to improve: JavaScript managed to fix scoping by introducing `let`. If you don't use `var`, then this particular problem is solved in JavaScript.

> ... the exceptions that are raised for non-exceptional cases ...

E.g., 'file not found' in open(). That's not exceptional, because it potentially happens with every usage of open(). You can never seriously use `open` without handling the case that the file is not found. Therefore, it's not exceptional, but it should be encoded in the return value.
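
For illustration (made-up filename), every realistic caller ends up with one of these:

  from pathlib import Path

  # Exception style: the `try` block is effectively mandatory.
  try:
      with open("settings.toml") as f:
          text = f.read()
  except FileNotFoundError:
      text = ""

  # The closest "encode it in the value" style is to check first, which is
  # racy: the file can disappear between the check and the read.
  p = Path("settings.toml")
  text = p.read_text() if p.exists() else ""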

> ... then I'd change it to not be a comprehension. ...

That's not the point. I don't want imperative, I want functional. I like list comprehensions. I am not complaining about imperative `for`; I am complaining about the confusing syntax of nested `for` in comprehensions. I am complaining about being forced to use imperative `for` by the weirdly confusing syntax of functional `for`.
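
The classic stumbling block, as a sketch, is flattening a list of lists:

  matrix = [[1, 2], [3, 4]]

  # The `for` clauses read left to right in the same order as the
  # equivalent nested loops, but the produced expression comes first,
  # which is exactly what trips people up.
  flat = [x for row in matrix for x in row]   # [1, 2, 3, 4]

  # Equivalent imperative version:
  flat = []
  for row in matrix:
      for x in row:
          flat.append(x)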

*raises hand* I hate python.

Every python codebase i’ve had to look after has rotted to the point it doesn’t even build and is a maintenance nightmare.

I also hate whitespace instead of {}

i resist saying that i hate python because that implies that i don't hate aspects of basically all alternatives (or all that are popular anyway)

like with everything else these days, it's about living with it and trying to make the best of the good parts of it

i remember getting told in the 00s that i would get used to and love the whitespace-based block definition, and boy i hate it now more than ever with 1000s of hours spent looking at and coding in python

but it is what it is; for whatever reason it has become a must in many industries, a lot like Java took over some earlier on (although that seems to be fading), and javascript is a must in others

and it really isn't just with programming languages that, these days, you either learn to live with some massive annoyances and practices you may hate, or withdraw entirely from society

Oh, don’t get me wrong, i don’t refuse to work with it or anything that extreme. These days I end up writing code in any of a handful of languages.

Given the choice though, I typically don’t reach for python unless there’s an obvious reason to (some good library for my task, or a team i am helping is only python people / devops, etc)

These issues are true of most legacy codebases I've worked on in other languages, but I think language design can be a factor here. Do you have any thoughts on if and how Python has led to this rot?

Another comment has explained this already, but the lack of dependency pinning and the general "it's just a script" attitude isn't conducive to long-term stability. During the early days of the llama local LLM runner, I was shocked to discover that releases weren't buildable mere days after being tagged in Git! Days!

Other platforms like Java and .NET enjoy one to two decades of life for source before it becomes mildly challenging to build.

> Other platforms like Java and .NET enjoy one to two decades of life for source before it becomes mildly challenging to build.

Java enjoys months of life for source before it becomes impossible to build, because some prehistoric version of Gradle with shitty Groovy script stopped working.

I think that is a totally fair criticism. There is something to be said about a language being too easy to hack something together with; it's something that makes backend JavaScript coding a long-term pain as well, I think.

uv helps a lot with the first problem. Not so much with the second, though.

I hate python and I use it every day because I work in the data space. It's a toy scripting / glue language that has gotten used for far too much that it was not designed for. The usual suspects are also really annoying, such as white space instead of {}, no types, it's so damn slow, and all projects use so many packages and the packages use packages etc. That last one could just be a personal preference thing, I will admit, but the rest are just almost objectively bad. Especially when building infrastructure like a data platform.

"white space instead of braces" is "just almost objectively bad"?

Why?

usually packages use packages in any worthwhile language with useful packages and desire for code reuse...

It's also funny to hear this about Python which, arguably, has the biggest standard library and the most batteries included out of the box.

Python's stdlib is not really something to brag about. Very inconsistent, multiple ways of doing the same thing, a lot of stuff in there that people recommend against actually using...so much cruft acquired over decades.

It's why projects use so many packages in the first place.

Indeed. The standard library was primarily designed in, and for, an era where you couldn't just download multiple megabytes of third-party code on a whim.

I wish we had a language that had the syntax of Python (notably including operator overloading, which is absolutely critical for neural networks, ML, data science and numerical computations), the performance, compile times and concurrency support of Go, the type system flexibility of Typescript, and the native platform integration of C/C++.

Then you would probably like the Nim[1] programming language. It has the syntax of Python, but transpiles to C/C++, and it has a good type system. The main problem would probably be the compile times, because you are basically just compiling C/C++ code. And of course the ecosystem is much, much smaller than Python's.

[1] https://nim-lang.org/

Nim compile+link times can be sub-100 milliseconds with tcc [1] (just saw 93 ms on my laptop) which can yield pretty reasonable edit-compile-test cycles.

The grandparent's scientific/Go interests suggest a need for a large, working ecosystem, and there are probably several places in ArrayMancer which need some love for "fallback cases" to work with tcc (or elsewhere in the, as you note, much smaller Nim ecosystem).

EDIT: E.g., for ArrayMancer you need to make stb_image.h work with tcc by adding a nim.cfg/config.nims `passC="-DSTBIW_NO_SIMD -DSTBI_NO_SIMD"` directive, though. And, of course, to optimize compile times, just generally speaking, you always want to import only exactly what you really need which absolutely takes more time/thought.

[1] https://github.com/TinyCC/tinycc

I agree with this sentiment.

I try to learn the basics of a new programming language regularly by writing a small Lisp-like interpreter in it, giving myself a maximum of 2 days to work on it. It covers things like string handling, regexp, recursion, lambdas, garbage collection, ... and I run the result through a tiny test suite.

In Python and JS, it was easy to do and the code was still very readable. In C++, the language I earn my money with, I had a bug I was not able to fix within the given time frame, happening only with gcc, not clang, presumably due to some undefined behavior. In C, I was able to add my own garbage collector with much less work than I expected ... but

Nim really impressed me: it really felt almost like writing Python, but produced an executable which runs on its own and is quite a bit faster.

I work mostly in the embedded world, where the ecosystem matters somewhat less. If any employer ever gave me the chance to choose a language myself, I would definitely try to write a first prototype in Nim.

I think Nim is probably the best fit.

It also compiles to JS.

Rust doesn't quite hit all of those, but it hits a lot of them.

Its syntax is significantly different from python, but it does have operator overloading.

Its performance is comparable to go's, and it has good concurrency support, although it is different from go's, and there are still some rough edges with "async" code. Compile times aren't as good as go's, though.

The type system is excellent, although I'm not really sure what you mean by "flexible".

And FFI support is great.

Rust’s compile times are crippling and its type system is easily one of the most rigid of all type systems (lifetimes are part of the type!). The latter is one of Rust’s main selling points because it allows encoding business rules into affine types, but that’s very very far from flexible especially when compared to Typescript (or Python or Haskell and their many ways of polymorphism). Traits add an orthogonal axis of flexibility but they’re still limited by lifetimes (see async_trait and generic associated types and specialization).

“Flexible” means the range from gradual typing (‘any’) to Turing complete conditional types that can do stuff like string parsing (for better or for worse). Structural typing vs instanceof and so on.

There’s really no comparison between Typescript’s type system and Rust’s. It’s worth noting though that Typescript is a bolted on typesystem that has explicitly traded soundness for flexibility. That’s the real tradeoff between Rust and TS IMHO. Rust is sound and expressive but not flexible, while Typescript is expressive and flexible but not sound.

> “Flexible” means the range from gradual typing (‘any’) to Turing complete conditional types that can do stuff like string parsing (for better or for worse).

So the flexibility means one gets to pretend they are doing typing, but in reality they get to sprinkle the code with void casts, because expressing ideas is apparently hard? For better or worse, that is probably the main pillar Rust is designed on.

Also, some prominent projects migrated away from TypeScript to JSDoc type comments due to the transpile times in TypeScript. The type checking takes more time the more complex the type-level expressions are. Haskell can also take a long time to compile if you turn on a few extensions and move toward dependent types.

Rust compiles fast if your translation units don’t need too much macro expansion. You add something like Diesel, and you can call for the lunch break.

It’s also worth mentioning Scala with Scala Native and maybe Kotlin with Kotlin/Native. OpenJDK Project Panama FFM now gives a better FFI experience than JNI.

> Rust’s compile times are crippling

It's kind of a meme here on HN but while Rust compilation times are indeed higher than I wished they were, calling them “crippling” is a massive exaggeration.

My daily driver is a mid-range desktop computer from 2018 and I regularly code on a laptop from 2012, and even then it's completely manageable: cargo check is nigh instant on both, incremental compilation in debug mode is more than fast enough, and even incremental rebuilds in release mode and full debug builds are OK-ish (especially since they don't happen as often as the others above). Only full builds in release mode are arguably slow on the 2012 laptop (though on the project where it's the biggest problem, the majority of the time is spent compiling a C++ dependency), but then again it's a very obsolete piece of hardware and this isn't supposed to happen more than every six weeks, when you update your compiler toolchain.

I welcome you to try to work on a cxx-qt project to understand what crippling compile times look like.

I’m not memeing here, I’ve struggled with this issue on a variety of different projects since I first started using Rust seven years ago.

Everything about cxx-qt is wrong. The Rust ecosystem deserves better. Not QML. Not any markup-like things like Slint. No immediate mode. And definitely not GTK. So much wasted potential.

So... what do you think GUI on rust should look like?

This really depends on what you are working on. As an example, compiling the protobuf stuff can get insanely slow for some reason.

For any larger project I would recommend working with a cargo workspace to compile as little as possible on each check/test/run/build.

Then you can build a DAG of crates and stick e.g. the Protobuf stuff in its own separate corner where it only needs to be recompiled on full rebuilds or when you work on it.

Feels a bit shitty to have to resort to managing crates instead of modules simply due to the compile times, but it is what it is.

https://doc.rust-lang.org/cargo/reference/workspaces.html

Yeah, second that. This is just release management best practice.

And it makes total sense to me, it’s a way of organizing your dependency graph by the lifetimes of your components.

This will also simplify testing, development, and audit. You won’t need to recompile autogen schemas as often as the business logic implementation anyway. Depending on artifacts pushed through a release pipeline is even more reliable. You can QA everything in guaranteed isolation while keeping it conveniently a workspace monorepo.

If only someone had a way of automating this! Say perhaps using some kind of digital computer...

I wish rustc could automatically build a DAG of “smallest group of modules with cyclic references” and use that as a compilation unit, but that's unfortunately not the case so far and it won't until someone steps up to build such a system.

On compile times for Typescript: when projects have large unioned or conditional types, Typescript's compilation isn't all that fast, and is sometimes even slower than Rust's, especially when Rust is compiling incrementally (I write both Rust and Typescript extensively).

Worse, typescript may even run out of its allocated memory sometimes.

Yeah

Go feels like C with training wheels.

Rust feels like riding a bike where one leg pedals the front wheel and another one pedals the back wheel, and you have one handlebar for each wheel as well and a very smart alarm system but it is very clunky to ride (and they tell you it's "flexible")

Pretty normal bike for those who used to C++ bikes before.

Yes, you pedal the C++ bike by squeezing the brake handles and brake by pedaling backwards

I feel like if Python’s lack of typing is going to be considered a drawback, then the solution needs to be true typing. Rust’s strict typing is an advantage, not a drawback.

Among other things, a flexible type system is one where you can write a function that takes objects with keys in snake_case, and then (recursively) converts them into camelCase. In Typescript, you can make such a function generic, and it'll all type check. Your code editor will even offer autocomplete for the camelCase keys, even though the only explicit type definition uses snake_case.

rust hits none.

This looks like exactly what I'm trying to do with SPy -- although SPy is still "not there" and it's WIP. I literally wrote an intro post about it yesterday: https://antocuni.eu/2025/10/29/inside-spy-part-1-motivations...

SPy looks really interesting! I've run across projects like MyPyc before, but as you say they kill a lot of the "joy" of Python by removing things like decorators and operator overloading.

One question, more out of curiosity than any genuine need: do you (or do you plan to) support any kind of trait/interface based polymorphism? E.g. it's a pretty common idiom to have a function that works on "any iterable type" and that sort of thing, which seems like it would be best modeled as an interface. I guess you could argue that's at odds with Python's tradition of "duck typing" but then again, so is static typing in general so I'm not sure.

I haven't fully decided which route to take in that direction, but I think I'd like to support something similar to Go: you declare an interface (or, in Python typing terms, a Protocol) and then all the types which have the right methods automatically implement that interface/protocol.
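
In today's Python typing terms, that idea looks roughly like this (a sketch using `typing.Protocol`, not SPy syntax):

  from collections.abc import Iterator
  from typing import Protocol

  class IntIterable(Protocol):
      def __iter__(self) -> Iterator[int]: ...

  def total(xs: IntIterable) -> int:
      return sum(xs)

  # Lists, generators, ranges, etc. all satisfy the protocol structurally,
  # without declaring anywhere that they implement it.
  total([1, 2, 3])
  total(range(10))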

This is an enlightening post and I am very excited about SPy.

I only wish discussions happened elsewhere than Discord, e.g. Zulip, where you can have web-public channels, which is great for searchable documentation, and you can interact with channels by email if you so desire.

thank you for the feedback! I wasn't aware of these features of Zulip; there are way too many competing technologies in this space and I miss the days in which everybody was on IRC and that was it.

Mojo is aiming at that. I've decided it's this year's Advent of Code language for me and I'm kinda looking forward to learning more about it.

I think Julia largely accomplishes these goals except for the platform integration.

It also has poor tooling when compared to Python. Julia's package manager is good, and so are its tools for performance optimisation, but for type checking, app/cli creation, semver checking and IDE integration, the tooling is quite bad. Also, the compile times are shit, and the type system makes it very hard to make a type checker in the first place.

All except maybe the compile times. I read the beginning of the article about the features of Uv; Julia has all that built in.

There's Mojo, but it's been a while since I've heard anything about it.

https://www.modular.com/mojo

Is it ready to use? Does it have libraries, tools and documentation? Is it usable for anything else beside AI? Does it work on all major platforms?

There’s been a lot of hype around Mojo, but is anyone actually using it?

Does it deliver on the bold claims of its designers?

It is still in active development

So is V-lang

Kotlin on a modern JVM comes pretty close.

* Performance - the JVM is very competitive with C/C++ performance.

* Compile times - Not go fast, but not C/C++/Rust slow.

* Concurrency - Virtual threads (finalized in 21) bring the concurrency capabilities of go to the JVM

* Type System Flexibility - Kotlin isn't quite as flexible as Typescript, but it's pretty close. It's more flexible than java but not as flexible as scala. I think it strikes a good middle ground.

* Native platform integration - This is probably the weakest part of the JVM but it's gotten a lot better with the work done on Project Panama (mostly delivered in 22). Jextract makes it a lot easier to make native integrations with Java. Definitely not as painful as the JNI days.

There's also kotlin native that you could play around with (I haven't).

I feel so much more comfortable with Go's type system than Typescript's

Why would operator overloading be absolutely critical? Just use methods. The pipe operator already has around 15 uses and counting, it doesn’t need more

Because writing a.add(b).divide(c) is miserable

That's a fine approach for "plumbing" type work, you know, "join this thing to that thing then call that thing" - and that is most of the code in the world today, but it falls apart in math-heavy code.

You really just want operators when you're performing tons of operations; it's an absolute wall of text when it's all method calls.

Instead of writing 2+2, you're suggesting writing 2.plus(2) or plus(2,2).

Also, case matching of Scala <3

Free Pascal with better tooling?

You may be interested in Mojo, it's a project by Chris Lattner. It aims to have Python-like syntax and smooth integration with Python but allow Rust-like low-level control (I believe it has a borrow checker). Unfortunately, I believe it's proprietary.

> But then I remember that Python is dog slow compared to other languages with comparable ergonomics and first-class support for static typing, and...idk it's a tough sell.

Case in point: uv itself is not written in Python. It's a Rust tool.

It always amazes me when people work on an ecosystem for a language but then don't buy enough into that to actually use it to do the work.

Avoidance of dogfooding is a big red flag to me.

There’s this thing where you work to requirements instead of picking things on vibes, it’s called engineering.

I understand the argument, but the language used for uv (Rust) and Python don't have the same goals.

Python aims to be simple, not particularly fast (though it is getting faster)

I don't see a problem with that. Pick the language adapted to your problem. Python isn't aiming at solving every problem and that's okay.

> Python aims to be simple

Well, it wildly missed the mark there. Nothing about modern Python is simple. It's a very complex language hiding behind friendly syntax.

Hard disagree. Python is so simple anyone can get up and running with coding in a few lines in the REPL.

Simple != easy. Rich Hickey has the details:

https://www.youtube.com/watch?v=SxdOUGdseq4

Python performance is poor. Python achieves performance by having as little python code as possible. The runtime and all performant libraries are not in python.

It's ok for IO bound but not for CPU bound work.

And that's how languages start optimizing towards being a better language to write compilers in ;-)

It's completely fair for a language to have a niche different from 'quick start-up and runtime'.

Different tools for different use cases.

Would you write an assembler IDE in assembler?

I use Python for >90% of my code these days. I love uv for its ux and speed. I 0% care it wasn't written in Python. In fact, making it fully independent of any existing Python environment is a plus for the Python user. No weird bootstrap problems.

It does not make me switch to Rust for my own code.

Well, the Python language should be seen as a UI layer for C++/C. It's not out of character to use Rust for the "heavy lifting".

In a similar vein, the Microsoft TypeScript team picked Go for the new TypeScript compiler.

That's like every language. The language you work in (typically) has all sorts of libraries that are lower level. And even if not, it's not like your code is 'running'; it's compiled to lower level code that actually runs, and the compilers that do that work are usually a black box unless you're a compiler engineer.

The power of Python is that it's high level and very powerful and has a great community and ecosystem of tools/libraries. There's absolutely zero problem and totally a good thing if there are core libraries written in faster languages.

> It always amazes me when people work on an ecosystem for a language but then don't buy enough into that to actually use it to do the work.

> Avoidance of dogfooding is a big red flag to me.

I'm making PAPER for a reason.

Interpreted languages are not the best choice for a variety of software. For example, a vast amount of the performance issues in the typescript cli boil down to it being written in typescript, and the team is porting it to golang now.

Tools, specifically CLI tools, are best written in statically typed compiled languages.

Bash is written in what?

the python interpreter is not written in python :-)

it can be, though, and pypy is

My main question is: how good is it when it breaks? Because with most build/package tools that's when the misery starts.

> I know the performance meta in Python is to...not use python (bind to C, Rust, JVM) - and you can get pretty far with that (see: uv), but I'd rather spend my limited time building expertise in a language that isn't constantly hemorrhaging resources unless your code secretly calls something written in another language :/

In case it encourages you: a lot of uv's performance benefits come from things that are not the implementation language. In particular, it has a much more intelligent system for caching downloaded package artifacts, and when asked to pre-compile bytecode it can use multiple cores (this is coming soon to pip, to my understanding; actually the standard library already has a primitive implementation).
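
For reference, the standard library's parallel byte-compilation lives in `compileall`; a rough sketch (the path is just an example):

  import compileall

  # workers=0 means "use os.cpu_count() worker processes"
  # (equivalent to `python -m compileall -j 0 <dir>` on the command line).
  compileall.compile_dir(
      ".venv/lib/python3.13/site-packages", workers=0, quiet=1
  )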

On performance: 3.13 removed the GIL and added experimental first-party JIT (like PyPy).

In two years I bet we’ll be seeing v8 level performance out of CPython.

The “Faster CPython” team were let go from Microsoft because they could only produce a 1.5x speedup in four years instead of the planned 5x.

It’s wildly optimistic to now expect a 10x speedup in two years, with fewer resources.

Them being let go "because they didn't meet 5x" is hearsay. The only source for that is in social-media commentary and opinion pieces. Microsoft described the layoffs as "organizational adjustments".

That sounds awfully like a scene by scene replay of Google's Unladen Swallow experience.

https://peps.python.org/pep-3146/#rationale-implementation

At least Microsoft hired a bunch of experienced Python core developers for their effort. IIRC Unladen Swallow was just a couple of interns.

Wow, now you make me curious about the business processes at Microsoft. Did they see that they would earn more money if the interpreter had a 5x speedup, that they wouldn’t see with 1.5x? Or was it trust broken?

Instead of generating more revenue, it would drive down costs. You will need less computers to do the same amount of work if the work can be done faster.

Lower costs could open new markets, as it would allow you to charge less and still make a profit.

Have you checked out what PyPy is already capable of?

Depends if they are the right resources.

Depends if it’s possible.

Python is slow due to design decisions in the language. For example operator dispatch is slow without some kind of static analysis. But this is hindered by how dynamic the language is.

It's hard to make Python run fast when it pervasively uses duck typing. It makes types only resolvable at runtime. JIT is the only thing that can work here at the moment, but I think that needs to make very similar assumptions to a branch predictor, plus it needs to identify lexical regions (is that what they're called?). People here have criticised PyPy, but I've forgotten why.
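
A small illustration of why (plain Python, made-up function): the same bytecode has to mean completely different things depending on the runtime types, so every call pays for dynamic dispatch unless a JIT can speculate on what it will see.

  def add(a, b):
      # Nothing here pins down the types: at runtime the interpreter looks up
      # type(a).__add__ (and may fall back to type(b).__radd__) on every call.
      return a + b

  add(1, 2)        # int addition
  add("a", "b")    # string concatenation
  add([1], [2])    # list concatenation -- same bytecode, three meanings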

I'd be surprised if we saw anything more than the ~4x speedup that compiling Python with something like Nuitka/mypyc/etc. can bring.

I also believe the JITs in v8 and Python are different, the latter relying on copy-and-patch while v8 uses a bunch of different techniques together.

Obviously dumb microbenchmark, but here's ~17x on my machine:

  $ time python -c 'sum(range(1_000_000_000))'

  real 0m19.997s
  user 0m19.992s
  sys 0m0.005s

  $ time pypy -c 'sum(range(1_000_000_000))'

  real 0m1.146s
  user 0m1.126s
  sys 0m0.020s

I think relatively simple math like that JITs and compiles nicely, but when you start using other parts of the language heavily, like you would in a real project, it averages out to about ~4x due to the object, VM and locking model, I believe. It's been a while since I've looked into this.

I would be surprised to see performance as good as V8's, although that would be great. As I recall, the V8 team performed exceptionally well in a corporate environment that badly wanted JS performance to improve, and maybe inherited some Hotspot people at the right time.

I'd be quite delighted to see, say, 2x Python performance vs. 3.12. The JIT work has potential, but thus far little has come of it, though in fairness it's still early days for the JIT. The funding is tiny compared to V8's. I'm surprised someone at Google, OpenAI et al. isn't sending a little more money that way. Talk about shared infrastructure!

pypy is probably faster. Let's put effort into that. BUT the dynamic features that make python lovely are always going to limit its performance.

If you're using python because you have to then you might not like all that and might see it as something to toss out. This makes me sad.

I bet we’ll be seeing python compiled to the JVM and getting JVM levels of performance. Much better than v8

There have for a long time been IronPython (CLR) and Jython (JVM).

But, they don't have the full compatibility with CPython, so nobody really picks them up.

Jython seems to be effectively dead though - it only has 2.7 compatibility.

You are right: GraalPy (https://www.graalvm.org/python/) is where it's at these days.

JVM Python exists for the longest time now, where "exists" is purely technical. It's very cursed and bad, keeping in line with the rest of Java-adjacent stack.

Yet this "Java-adjacent stack" wipes the floor with Python and its ilk w.r.t performance and is what's actually running the world outside of some silicon valley ephemeral unicorns.

Has something changed that allows a more relaxed refcounting / less eager "gc"? Py_DECREF was what murdered any hope of performance back when we hooked up 3.3 to OMR... Well that and the complete opacity of everything implemented in C

It didn't "remove the GIL". It added an experimental free-threading mode which removes it, but is still considered experimental and not widely used in production yet.

Am I the only one that's sad that poetry happened before pdm? Otherwise we might have had pdm as a standard instead of uv, addressing many of the things uv addresses without all the extra bells and whistles that make it cumbersome. I don't like the wedding between package manager and install manager.

... but then again neither pdm nor uv would have happened without poetry.

I think in Python specifically, an install manager is absolutely the right call. There's far too much breakage between Python versions.

I recently had to downgrade one of our projects to 3.12 because of a dependency we needed. With uv, I can be sure that everybody will be running the project on 3.12, it just all happens automatically. Without uv, I'd get the inevitable "but your changes crashed the code, have you even tested them?"

How do extra bells and whistles bother you? You had the option to not use them. Like you said yourself, they’re “extra”.

What is the distinction between "package manager and install manager"?

One installs Python packages into a Python installation and the other manages Python installations.

Honestly I think poetry was a bigger development than uv. I used pipenv before it, and requirements before that, and I can't imagine going back. I've yet to fully embrace uv and migrate away from poetry for that reason (even though it seems inevitable at this point, there's just no need).

Just profile the slow parts and rewrite them in Rust, easy.

> But then I remember that Python is dog slow compared to other languages with comparable ergonomics and first-class support for static typing, and...idk it's a tough sell.

Posts like these aptly describe why companies are downsizing in favor of AI assistants, and they are not wrong for doing so.

Yes, Python is "slow". The thing is, compute is cheap these days and development time is expensive. $1000 per month is considered expensive as hell for an EC2 instance, but no developer would work for $12000 a year.

Furthermore, in modern software dev, most of the bottleneck is network latency. If your total end-to-end operation takes 200ms, mostly because of network calls, it doesn't matter as far as compute goes whether your code runs in 10ms or 5ms.

When it comes to development, the biggest uses of time are

1. Interfacing with some API or tool, for which you have to write code.

2. Making a change, testing a change, fixing bugs.

Python has both covered better than any other language. Just today, it took me literally 10 minutes to write code for a menu bar for my Mac using the rumps python library, so I have my most commonly used commands available without typing into a terminal, and that is without using an LLM. Go ahead and try to do the same in Java or Rust or C++ and I promise you that unless you have experience with Mac development, it's going to take you way more time.

Python has additional things like just putting breakpoint() where you want the debugger, Jupyter notebooks for prototyping, and lazy imports, where you use import inside a function so large modules only get loaded when they run. No compilation step, no complex syntax. Multiprocessing is very easy to use as a replacement for threading; I really dunno why people want to get rid of the GIL so much. Functionally the only differences are the overhead of launching a thread vs launching a process, and shared memory. But with the multiprocessing API, you simply spin up a worker pool and send data over Pipes, and it's pretty much just as fast as multithreading.
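
A minimal sketch of that worker-pool pattern (using a Pool rather than raw Pipes; the numbers are made up):

  from multiprocessing import Pool

  def crunch(chunk):
      # CPU-bound work runs in a separate process, so it isn't serialized
      # by the parent interpreter's GIL.
      return sum(x * x for x in chunk)

  if __name__ == "__main__":
      chunks = [range(i, i + 1_000_000) for i in range(0, 8_000_000, 1_000_000)]
      with Pool() as pool:               # one worker per core by default
          results = pool.map(crunch, chunks)
      print(sum(results))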

In the end, the things that matter are results. If LLMs can produce code that works, no matter how stringy it is, that code can run in production and start making the company money, while they don't have to pay you for multiple months to write the code yourself. Likewise, if you are able to develop things fast and the company has to spend a bit more on compute, it's a no-brainer to use Python.

Meanwhile, things like strong typing, speed, the GIL, and other popular complaints that get mentioned are all just echoes of the bullshit education that you learned in CS, and people repeat them without actually having any real world experience. So what if you have weak typing and make mistakes - the code fails to run or generates incorrect results, you go and fix the code, and problem solved. People act like failing code makes your computer explode or something. There is no functional difference between a compilation failure and a runtime failure. And as far as production goes, there has never been a case of a strongly typed language getting deployed without any bugs, because those bugs are all logic bugs within the actual code. And consequently, with Python, it's way easier to fix those bugs.

Youtube, Uber, and a bunch of other well used services all run Python backends for a good reason. And now with skilled LLM usage, a single developer can write services in days that would take a team of engineers to write in weeks.

So TL;DR: if you actually want to stay competitive, use Python. The next set of LLMs are all going to be highly specialized smaller models, and being able to integrate them into services with PyTorch is going to be a very valuable skill, and nobody who is hiring will give a shit how memory safe Rust is.

I write Python all day for work and I run into its issues all the time, be it performance or weak typing or just lack of modern language features. If you’re just vibecoding all day and that’s ok for what you want it to do, all the more power to you, but do at least take a moment to understand that when people want things you see no value in maybe they just have different requirements than you do.

Notice how you start your reply with a personal experience? That's the big difference between what people say and what I say. It's not about misunderstanding. You can't find a single flaw in what I said, so your only reply is "well, in my experience Python sucks".

So it's a you problem, not a python problem.

Strong typing doesn't make code better. Proof: Linux is written in C, without strong typing (unless you consider basic types strong typing), and it works. Good developers make good code, while bad developers make bad code in strongly typed languages.

Python doesn't lack modern language features. It potentially lacks things that you may like, but you aren't the authority on modern language features.

If you are writing code that needs to run fast, of course Python is a poor choice. But it's likely that only a portion of your code needs to run fast, at which point you can easily use Python's native code interfaces, or, even easier, compile a small binary written in C and launch that from Python.

You argued that people who disagree with you lack real-world experience, then when I give you mine you turn around and call me out for it? It's possible that my concerns are a "me" problem, but your problem seems to be that you forgot what I was replying to.

In my experience, Python is often bad for the things I have needed it for, which includes PyTorch just FYI, even though I think it's patently absurd that the people running the coding models of the future are going to be directly talking to PyTorch. You claim that this is because I am up in my ivory tower, and that I am not an authority on language design. Those might be true. But you most definitely are not an authority on software engineering or language design either, and your statements can only leave one with the conclusion that you have never thought deeply about the examples you're pulling out.

There is a huge difference between bugs that are caught during development and those that are found out "in the wild", including the fact that sometimes you don't even get to fix things after they have shipped. In fact, in a past life we were bitten by trivial bugs in the Linux kernel all the time, where we found it did not in fact work but the code had already been deployed to a hundred million phones in Asia that were never going to be updated. I think the fact that we needed a few dozen very good people on staff to resolve the issues caused by the kind of simple bugs that strongly typed languages make hard to write is a bad place to be, actually.

The irony here is that uv is written in Rust.

That's not exactly for slowness reasons. The creators of uv have stated that if pip followed the same algorithms they'd see similar performance benefits. People greatly overstate the Python performance penalty.

... Actually, could you show me where they said that? I've been explaining it from first principles for a while now and it would be nice to be able to go "but don't take my word for it" as well.

(Python does incur a hefty performance penalty for things that are actually CPU bound. But that doesn't describe most of the process of installing Python packages; and the main part that is CPU bound is implemented by CPython in C.)

Python is bad for large projects, and it's not just because of speed.

I see it shine for scripts and AI but that's it.

Do you not consider Youtube or Uber large projects?

>$1000 per month is considered expensive as hell for an EC2 instance, but no developer would work for $12000 a year.

If using Python instead of what we use, our cloud costs would be more than double.

And I can't go to CEO and CFO and explain to them that I want to double the cloud costs (which are already seen as high).

Then, our development speed won't really improve because we have large projects.

That being said, I think using Python for scripting is great in our case.

>If using Python instead of what we use, our cloud costs would be more than double.

You won't double your costs and I can safely say that without knowing anything about your service. Because if you use PyPy, the speed up is generally 2x. So right there, your argument is defeated.

But even without PyPy, I am willing to bet that your service has a dominant component of either network calls, or data processing. In the former case, you can use Python with things like uvloop and multiprocessing to basically structure your service so that you are filling the processing time with waiting for network calls, spending minimal time in actual compute, which means you don't get any slower. In the latter case, its pretty straightforward to write a data cruncher in a more optimized language like C and run it from Python - even if you don't wanna deal with native interfaces, an LLM can write you a small udp server that can listen on a unix pipe for messages to process, and from python you just launch it with subprocess and send data to it.
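
As a rough sketch of that "fill the waiting time" structure (asyncio with a sleep standing in for the network call; uvloop would just be a drop-in event loop):

  import asyncio

  async def call_backend(i):
      # Stand-in for a network call: while this request waits,
      # the event loop runs the other 99.
      await asyncio.sleep(0.1)
      return i

  async def main():
      results = await asyncio.gather(*(call_backend(i) for i in range(100)))
      print(len(results))   # 100 "calls" complete in roughly 0.1s, not 10s

  asyncio.run(main())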

So unless you run a highly specialized service with a very niche use case, you would be absolutely fine with Python.

And even then, I'm willing to bet, based on the average use case, that your cloud costs aren't even close to optimal. In the case of AWS, so many companies use managed services like DynamoDB when they could easily run a reserved small EC2 instance and run their own version of Mongo or something similar. At my current company, the only AWS services we use are S3 and EC2, where things like database software or caching software are all manually installed and run on EC2. If you know what you are doing, it's not really that hard to manage. We even have our own version of AWS Lambda.

Don't you know most slow things are only from apis! /s

For someone writing a whole essay on why python's speed is not a problem, one would expect them to have worked on workloads where performance actually matters. For most people in this industry, that would be web APIs. Though it's kinda fine for low traffic APIs.

GP comment reeks of textbook "performance doesn't matter" rhetoric.
