I think the OO hatred comes from how academia and certain enterprise organisations in our industry picked it up and taught it like a religion, molding an entire generation of developers who wrote some really horrible code because they were taught that abstractions were always correct. That obviously wasn't the case, so outside those institutions the world slowly realized that abstractions were in many ways worse for cyclomatic complexity than what came before. Maybe not in a perfect world where nobody writes shitty code on a Thursday afternoon after a long day of horrible meetings, in a long week of having a baby cry every night.
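
To make the complexity point concrete, here's a contrived Java sketch (all the names are made up) of the style that era encouraged: an interface, an abstract base class, and a factory wrapped around what is ultimately a one-line calculation.

```java
// Contrived example of "abstractions are always correct" dogma:
// a one-line calculation buried under an interface, an abstract
// base class, and a factory.
interface TaxCalculator {
    double calculate(double amount);
}

abstract class AbstractTaxCalculator implements TaxCalculator {
    protected abstract double rate();

    @Override
    public double calculate(double amount) {
        return amount * rate();
    }
}

class StandardTaxCalculator extends AbstractTaxCalculator {
    @Override
    protected double rate() { return 0.25; }
}

class TaxCalculatorFactory {
    public static TaxCalculator create() {
        return new StandardTaxCalculator();
    }
}

public class Demo {
    public static void main(String[] args) {
        // Four types and two levels of indirection later...
        System.out.println(TaxCalculatorFactory.create().calculate(100.0));

        // ...versus the direct version, which says the same thing:
        System.out.println(100.0 * 0.25);
    }
}
```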

As with everything, there isn't a golden rule to follow. Sometimes OO makes sense, sometimes it doesn't. I rarely use it, or abstractions in general, but there are some things where it's just the right fit.

Much like Agile, or Hungarian notation. When a general principle becomes a religion it ceases to be a good general principle.

> I think the OO hatred comes from how academia and certain enterprise organisations in our industry picked it up and taught it like a religion.

This, this, this. So much this.

Back when I was in uni, Sun had donated basically an entire lab of those computer terminals that you used to sign in to with a smart card (I forget the name). In exchange, the uni agreed to teach all classes related to programming in Java, and to have the professors get certified in Java (never mind the fact that nobody ever used that laboratory, because the lab techs had no idea how to work with those terminals).

As a result of this, every class, from algorithms to software architecture, felt like Java cult indoctrination. One of the professors actually said C was dead because Java was clearly superior.

> One of the professors actually said C was dead because Java was clearly superior.

At our uni (around 1998/99) all the professors said that, except the Haskell teacher, who called Java a mistake (but C as well).

Turns out everyone was completely wrong except for that one guy working in Haskell.

Tale as old as time.

Java was probably close to 50% of the job market at some point in the 2000s, and C significantly dried up, with C++ taking its place. So I'm afraid everyone was right, actually.

To be honest, I'm convinced the reason so many people dislike Java is that they've only ever had to use it in a professional context. It's not really a hobbyist language.

Just for the record, I don't think C ever dried up in the embedded space. And the embedded space is waaaay bigger than most people realise, because almost all of it is proprietary, so very little "leaks" onto the public interwebs.

Believe it or not, there is plenty of Java and C++ in the embedded space. It's far from being a C fortress.

Probably the Sun Ray computer.

https://en.wikipedia.org/wiki/Sun_Ray

This was it!

And now you know how Nvidia CUDA got so popular.