> the language most of the team know the best
I fully agree. The challenge is that some will want to use the latest languages and technologies because they want to learn them (personal development, meaning: the next job). Sometimes the "new thing" can be confined to (non-critical) testing and utilities. But having many languages and technologies just increases friction, complicates things, and prevents refactoring. Even mixing scripts with regular languages is a problem; calling one language from another is similar, and the same goes for unnecessary remote APIs. Fewer technologies are often better, even if they are not the best fit for every feature (e.g. using PostgreSQL for things like full-text search, event processing, etc.).
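To make the PostgreSQL point concrete, here's a rough sketch of what "just use Postgres for full-text search" looks like, instead of pulling in a separate search engine. The `docs` table, its columns, and the connection string are made up for illustration:

```python
# Sketch: PostgreSQL built-in full-text search via psycopg2.
# Assumes a hypothetical table docs(id, title, body) and a local database "app".
import psycopg2

conn = psycopg2.connect("dbname=app")
cur = conn.cursor()

# One-time setup: a GIN index over the tsvector keeps searches fast.
cur.execute("""
    CREATE INDEX IF NOT EXISTS docs_fts_idx
    ON docs USING GIN (to_tsvector('english', body));
""")
conn.commit()

# Query: rank documents matching a web-style search string.
query = "external dependencies"
cur.execute("""
    SELECT id, title,
           ts_rank(to_tsvector('english', body),
                   websearch_to_tsquery('english', %s)) AS rank
    FROM docs
    WHERE to_tsvector('english', body) @@ websearch_to_tsquery('english', %s)
    ORDER BY rank DESC
    LIMIT 10;
""", (query, query))
print(cur.fetchall())
```

Not as powerful as a dedicated search engine, but it's one less service to deploy, back up, and keep in sync with the database.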
This is a bit related to external dependencies vs. building it yourself (AKA reinventing the wheel). Quite often the external library causes more issues in the long term than building it yourself (assuming you _can_ build a competent implementation).
> This is a bit related to external dependencies vs. building it yourself (AKA reinventing the wheel). Quite often the external library causes more issues in the long term than building it yourself (assuming you _can_ build a competent implementation).
I feel like this happens mostly because simpler is better, and most of these dependencies don't follow a good "UNIX" philosophy of modularity, generality, etc., something you'd notice the standard libraries try to achieve.
Most of these third-party dependencies start as one very specific feature, then keep adding use cases until they become bloated from supporting many users with slightly different needs.
Yep, true in my experience as well. And in the age of LLMs it's not so difficult to ask one to extract just this or that piece of functionality into another package, but with a different API. So these days it's even easier to roll your own stuff; it's not the huge time sink it sometimes was before.