System Requirements
  Works literally everywhere
Haha, on one of my machines my python version is too old, and the package/dependencies don't want to install.

On another machine the Python version is too new, and the package/dependencies don't want to install.

I opened a couple of PRs to fix this situation:

https://github.com/KittenML/KittenTTS/pull/21 https://github.com/KittenML/KittenTTS/pull/24 https://github.com/KittenML/KittenTTS/pull/25

If you have `uv` installed, you can try my merged ref that has all of these PRs (and #22, a fix for short generation being trimmed unnecessarily) with

    uvx --from git+https://github.com/akx/KittenTTS.git@pr-21-22-24-25 kittentts --output output.wav --text "This high quality TTS model works without a GPU"

Thanks for the quick intro to uv – it looks like Docker layers for Python.

I found the TTS a bit slow so I piped the output into ffplay with 1.2x speedup to make it sound a bit better

    uvx --from git+https://github.com/akx/KittenTTS.git@pr-21-22-24-25 kittentts --text "I serve 12 different beers at my restaurant for over 1000000 customers" --voice expr-voice-3-m --output - | ffplay -af "atempo=1.2" -f wav -

Ah, yeah, good catch – I added the model-native speed multiplier to the CLI too (`--speed=1.2` for instance).
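
With the same merged ref as above, that'd be something like:

    uvx --from git+https://github.com/akx/KittenTTS.git@pr-21-22-24-25 kittentts --text "I serve 12 different beers at my restaurant for over 1000000 customers" --voice expr-voice-3-m --speed=1.2 --output output.wav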

Install it with uvx; that should solve the Python issues.

https://docs.astral.sh/uv/guides/tools/

uv installation:

https://docs.astral.sh/uv/getting-started/installation/

Yeah some people have a problem and think "I'll use Python". Now they have like fifty problems.

It doesn't work on Fedora because the right version of g++ isn't available.

Not sure if they've fixed between then and now, but I just had it working locally on Fedora.

  > g++ --version
  g++ (GCC) 15.1.1 20250521 (Red Hat 15.1.1-2)
  Copyright (C) 2025 Free Software Foundation, Inc.

We are working to fix that. Thanks

"Fixing python packaging" is somewhat harder than AGI.

I was commiserating with my brother over how difficult it is to set up an environment to run one LLM or diffusion model, let alone multiple or a combination. It's 5% CUDA/ROCm difficulties and 95% Python difficulties. We have a theory that anyone working with generative AI has to tolerate output that is only 90% right, and is totally fine working with a language and environment that only 90% works.

Why is Python so bad at that? It's less kludgy than Bash scripts, but even those are easier to get working.

This is a generic problem.

JS/TS/npm is just as bad with probably more build tools/frameworks.

Rust is a mess.

Go, well.

Even perl was quite complicated.

Yeah, but it's easily solved, with directives, headers, or makefiles that specify which language standard to follow. Better yet, you can use different syntax with different language standards, so it's clear which one applies. If a compiler can automatically figure out whether I'm compiling C or C++, why can't a Python interpreter figure out whether I'm running version two or three of the same language?

> JS/TS/npm is just as bad with probably more build tools/frameworks.

This is flat-out wrong. NPM packages by default are local to a directory. And I haven't seen a package rely on a specific minor version of node in literally years. Node's back compat is also great; there was one hiccup 5 or 6 years ago where a super popular native package was deprecated, but that's been about it.

I can take current LTS node and run just about any package from the NPM repo written within the last 4 or 5 years and it will just work. Meanwhile plenty of python packages somehow need specific point releases. What the unholy hell.

Node version manager does exist, and it can be set up to work per directory, which is super cool, but I haven't needed NVM in literal years.
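
For reference, the per-directory bit is just a file that nvm reads when you don't pass a version – roughly (from memory):

    echo "20" > .nvmrc   # pin the Node major version for this directory
    nvm install          # with no argument, reads .nvmrc
    nvm use              # likewise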

This is how we'll know ASI has arrived.

Have you considered offering a uvx command to run to get people going quickly?

Though I think you would still need to have the Python build dependencies installed for that to work.

If you restrict your dependencies to only those for which wheels are available, then uv should just be able to handle them for you.

I think it can install Python itself too. Though I have had issues with that - especially with SSL certificate locations, which is one of Linux's other clusterfucks.
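
For what it's worth, the interpreter install is a one-liner these days, something like:

    uv python install 3.12   # fetches a managed CPython build
    uv python list           # shows which interpreters uv knows about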

Just point people to uv/uvx.

The project is like 80% there by having a pyproject file that should work with uv and poetry. There just aren't any package versions specified, the Python version constraint is incredibly lax, and no lock file is provided.
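
Closing that last 20% is a small diff – something like the following (hypothetical bounds, not the project's actual requirements):

    [project]
    requires-python = ">=3.9,<3.13"    # instead of leaving it wide open
    dependencies = [
        "onnxruntime>=1.17,<2",        # illustrative pins, not KittenTTS's real ones
        "soundfile>=0.12",
    ]

...plus a committed lock file (`uv lock` or `poetry lock`) so installs are reproducible.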

In this context uv works perfectly fine with Poetry: if you publish a wheel from Poetry, uv can use it. You don't have to switch anything in your project to make it work.
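
Concretely, that's roughly:

    poetry build -f wheel        # produces dist/*.whl
    uv pip install dist/*.whl    # uv consumes the wheel like any other package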

A tool that was only released, what, a year or two ago? It simply won't be present in most OS/distros. Only modern or rolling releases will have it (maybe). It's funny when the recommended Python dependency managers are just as hard to install and use as the scripts themselves. Very Python.

Python man

    man python
There you go.

  PYTHON(1)                   General Commands Manual                  PYTHON(1)
  
  NAME
       python - an object-oriented programming language

  SYNOPSIS
       python [ -c command | script | - ] [ arguments ]
  
  DESCRIPTION
       Python is the standard programming language.
Computer scientists love Python, not just because whitespace comes first ASCIIbetically, but because it's the standard. Everyone else loves Python because it's PYTHON!

Python is used not because it's good but because it's good enough just like Windows and plastics.

There are still people who use machine-wide Python installs instead of environments? Python dependency hell was already bad years ago, but today it's completely impractical to do it this way. Even on Raspberry Pis.

Debian pretty much "solved" this by making pip refuse to install packages if you are not in a venv.

It needed distro buy in and implementation, but this is from the Python side: https://peps.python.org/pep-0668/

IIRC that's actually a change in upstream pip.

Ditto OpenSUSE, at least on Tumbleweed
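
In practice that just means a bare system-wide `pip install` stops with an externally-managed-environment error, and you do the install inside a venv instead – roughly:

    python3 -m venv .venv
    . .venv/bin/activate
    pip install some-package    # whatever you were trying to install system-wide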

Well, with my python 3.13.5 not even that works!

Pretty impressive but this seems to be a staple of most AI/ML projects.

"Works on my machine" or "just use docker", although here the later doesn't even seem to be an option.

Using venv won't save you from having the wrong version of the actual Python interpreter installed.

Yep. Python stopped being Python a decade ago. Now there are just innumerable Pythons. Perl... on the other hand, you can still run any perl script from any time on any system perl interpreter and it works! Granted, perl is unpopular and not getting constant new features re: hardcore math/computation libs.

Anyway, I think I'll stick with Festival 1.96 for TTS. It's super fast even on my core2duo and I have exactly zero chance of getting this Python 3'ish script to run on any machine with an OS older than a handful of years.

It breaks my heart that Perl fell out of favor. Perl “6” didn’t help in the slightest.

Such an ignorant thing to say for something that requires 25MB RAM.

Not sure what the size has to do with anything.

I send you a 500kb Windows .exe file and claim it runs literally everywhere.

Would it be ignorant to say anything against it because of its size?

We all know "runs anywhere" in this context means compute-wise. It's dumb to blame the author for your dev setup issues.

I didn’t realize that that’s what it meant until you mentioned it.

It reminds me of the costs and benefits of RollerCoaster Tycoon being written in assembly language. Because it was so light on resources, it could run on any privately owned computer, or at least anything x86, which was pretty much everything at the time.

Now, RISC architectures are much more common, so instead of the rare 68K Apple/Amiga/etc computer that existed at the time, it's super common to want to run software on an ARM or occasionally RISC-V processor, so writing in x86 assembly language would require emulation, making for worse performance than a compiled language.

I had the "too new" problem.

This package is the epitome of dependency hell.

Seriously, stick with piper-tts.

Easy to install, 50MB gives you excellent results and 100MB gives you good results with hundreds of voices.
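
If I remember the piper CLI right (the flag names may have shifted between releases), getting sound out is about this much work:

    pip install piper-tts
    echo "This high quality TTS model works without a GPU" | piper --model en_US-lessac-medium --output_file out.wav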

You're getting a lot of comments along the lines of "Why don't you just ____," which only shows how Stockholmed the entire Python community is.

With no other language are you expected to maintain several entirely different versions of the language, each of which is a relatively large installation. Can you imagine if we all had five different llvms or gccs just to compile five different modern C projects?

I'm going to get downvoted to oblivion, but it doesn't change the reality that Python in 2025 is unnecessarily fragile.

That’s exactly what I have. The C++ codebases I work on build against a specific pinned version of LLVM with many warnings (as errors) enabled, and building with a different version entails a nonzero amount of effort. Ubuntu will happily install several versions of LLVM side by side, or compilation can be done in a Docker container with the correct compiler. Similarly, the TypeScript codebases I work with test against specific versions of node.js in CI and the engines field in package.json is specified. The different versions are managed via nvm. Python is the same via uv and pyproject.toml.

I don't doubt it, but I don't think that situation is accepted as the default in C/C++ development. For the most part, I expect OSS to compile with my own clang.

I agree with your point, but

> if we all had five different llvms or gccs

Oof, those are poor examples. Most compilers using LLVM other than clang do ship with their own LLVM patches, and cross-compiling with GCC does require installing a toolchain for each target.

Cross-compiling is a totally different subject… I'm trying to make an apples-to-apples comparison. If you compile a lot of OSS C projects for the host architecture, you typically do not need multiple LLVMs or GCCs. Usually, the makefile detects various things about the platform and compiler and then fails with an inscrutable error. But that is a separate issue! haha

> Can you imagine if we all had five different llvms or gccs just to compile five different modern C projects?

Yes, because all I have to do is look at the real world.

The system Python is for system applications that are known to work together. If you need a Python install for something else, there's venv or conda, and then pip install stuff.

You're supposed to use a venv for everything but the Python scripts distributed with your OS.