> Why does "good" C have to be zero alloc?

GP didn't say "zero-alloc", but "minimal alloc".

> Why should "nice" javaesque make little sense in C?

There's little to no indirection in idiomatic C compared with idiomatic Java.

Of course, in both languages you can write unidiomatically, but that is a great way to ensure that bugs get in and never get out.

In C, direct memory control is the top feature, which means you can assume anyone who uses your code is going to want to control memory throughout the process. That means not allocating from wherever and returning blobs of memory, which in turn means designing different APIs, which is part of the reason why learning C well takes so long.

I started writing sort of a style guide to C a while ago, which attempts to transfer ideas like this one more by example:

https://github.com/codr7/hacktical-c

Echoing my sibling comment - thanks for sharing this.

Thanks for sharing this work.

  > Of course, in both languages you can write unidiomatically, but that is a great way to ensure that bugs get in and never get out.
Why does "unidiomatic" have to imply "buggy" code? You're basically saying an unidiomatic approach is doomed to introduce bugs and will never reduce them.

It sounds weird. If I write Python code with minimal side effects like in Haskell, wouldn't it at least reduce the possibility of side-effect bugs even though it wasn't "Pythonic"?

AFAIK, nothing in the language standard mentions anything about "idiomatic" or "this is the only correct way to use X". The definition of "idiomatic X" is not as clear-cut and well-defined as you might think.

I agree there's a risk with an unidiomatic approach. Irresponsibly applying "cool new things" is a good way to destroy "readability" while gaining almost nothing.

Anyway, my point is that there's no single definition of "good" that covers everything, and "idiomatic" is just whatever convention a particular community is used to.

There's nothing wrong with applying an "unidiomatic" mindset, like awareness of stack/heap allocation, CPU cache lines, SIMD, static/dynamic dispatch, etc., in languages like Java, Python, or whatever.

Nor is there anything wrong with borrowing ideas like (Haskell) functors, hierarchical namespaces, visibility modifiers, borrow checking, dynamic dispatch, etc. in C.

Whether it's "good" or not is left as an exercise for the reader.

> Why does "unidiomatic" have to imply "buggy" code?

Because when you stray from idioms, you're heading down unfamiliar paths. Every language supports some idioms better than others. Pounding a square peg into a round hole can work, but it's unlikely to work well.

> You're basically saying an unidiomatic approach is doomed to introduce bugs and will never reduce them.

Well, yes. Who's going to reduce them? Where are you planning to find people who are used to code written in an unusual manner?

By definition alone, code is written for humans to read. If you're writing it in a way that's difficult for humans to read, then of course the bug level can only go up and not down.

> It sounds weird. If I write Python code with minimal side effects like in Haskell, wouldn't it at least reduce the possibility of side-effect bugs even though it wasn't "Pythonic"?

"Pythonic" does not mean the same thing as "Idiomatic code in Python".