"Fixing python packaging" is somewhat harder than AGI.
I was commiserating with my brother over how difficult it is to set up an environment to run one LLM or diffusion model, let alone multiple or a combination. It's 5% CUDA/ROCm difficulties and 95% Python difficulties. We have a theory that anyone working with generative AI has to tolerate output that is only 90% right, and so is totally fine working with a language and environment that only works 90% of the time.
Why is Python so bad at that? It's less kludgy than Bash scripts, but even those are easier to get working.
This is a generic problem.
JS/TS/npm is just as bad with probably more build tools/frameworks.
Rust is a mess.
Go, well.
Even Perl was quite complicated.
Yeah, but that's easily solved with directives, headers, or makefiles that specify which language standard to follow. Better yet, different language standards use different syntax, so it's clear which one applies. If a compiler can automatically figure out whether I'm compiling C or C++, why can't a Python interpreter figure out whether I'm running version two or three of the same language?
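Python can't sniff the version from syntax alone, but a script can at least declare what it needs these days. A minimal sketch using PEP 723 inline script metadata (the requests dependency is just an illustration), which uv honors when you launch the file with uv run:

    # example.py -- PEP 723 inline metadata; tools like uv read this block
    # /// script
    # requires-python = ">=3.11"
    # dependencies = ["requests"]
    # ///
    import requests
    print(requests.__version__)

Running uv run example.py resolves the dependency into a throwaway environment and picks a matching interpreter, downloading one if needed.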
> JS/TS/npm is just as bad with probably more build tools/frameworks.
This is flat-out wrong. NPM packages are local to a directory by default, and I haven't seen a package rely on a specific minor version of Node in literally years. Node's backward compatibility is also great; there was one hiccup 5 or 6 years ago when a super popular native package was deprecated, but that's been about it.
I can take current LTS Node and run just about any package from the NPM repo written within the last 4 or 5 years and it will just work. Meanwhile plenty of Python packages somehow need specific point releases. What the unholy hell.
Node Version Manager does exist, and it can be set up to work per directory, which is super cool, but I haven't needed NVM in literally years.
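For what it's worth, the per-directory setup is just a version file in the project root; a rough sketch, with placeholder version numbers:

    # .nvmrc -- nvm use reads this from the current directory
    20

    # package.json -- optionally declare the supported range for consumers
    "engines": { "node": ">=18" }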
This is how we'll know ASI has arrived.
Have you considered offering a uvx command people can run to get going quickly?
Though I think you would still need to have the Python build dependencies installed for that to work.
If you restrict your dependencies to only those for which wheels are available, then uv should just be able to handle them for you.
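Assuming the project publishes a package with a console script to PyPI (the name below is hypothetical), getting people going could be a one-liner:

    # uvx pulls the package into an ephemeral environment and runs its entry point
    uvx example-llm-runner --help

As noted above, that only stays painless if every dependency ships prebuilt wheels; anything that compiles from source drags the build toolchain back in.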
I think it can install Python itself too, though I have had issues with that, especially with SSL certificate locations, which is one of Linux's other clusterfucks.
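For example, assuming a reasonably recent uv:

    uv python install 3.12    # fetches a standalone CPython build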
Just point people to uv/uvx.
The project is like 80% there by having a pyproject file that should work with uv and Poetry. There just aren't any package versions specified, the Python version constraint is incredibly lax, and no lock file is provided.
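Closing that last 20% is mostly tightening the pyproject file and committing a lock file; a sketch with illustrative names and versions:

    [project]
    name = "example-project"
    version = "0.1.0"
    requires-python = ">=3.10,<3.13"    # narrow, tested interpreter range
    dependencies = [
        "torch==2.3.1",                 # exact pins here are illustrative
        "transformers>=4.40,<5",
    ]

Running uv lock or poetry lock against that would then produce the missing lock file.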
In this context uv works perfectly fine with Poetry: if you publish a wheel from Poetry, uv can use it. You don't have to switch anything in your project to make it work.
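Concretely, something like this (paths are illustrative):

    poetry build                  # writes a wheel into dist/
    uv pip install dist/*.whl     # uv consumes the wheel like any other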
A tool that was only released, what, a year or two ago? It simply won't be present in most OS/distro repos; only modern or rolling-release distros will have it (maybe). It's funny when the recommended Python dependency managers are just as hard to install and use as the scripts themselves. Very Python.