I feel like 1 and 2 are only applicable in cases of novelty.

The thing is, if you build enough of the same kinds of systems in the same kinds of domains, you can kinda tell where you should optimize ahead of time.

Most of us tend to build the same kinds of systems and usually spend a career, or a good chunk of one, in a given domain. I feel like you can't really be considered staff/principal if you can't already tell ahead of time, just from experience and intuition, where the perf bottleneck will be.

I feel like every time I have expected an area to be the major bottleneck, it has been. Sometimes some areas perform worse than I expected, usually something that hasn't been coded well, but generally it's pretty easy to spot the computationally heavy or remote-call-heavy areas well before you program them.

I have several times done performance tests before starting a project to confirm it can be made fast enough to be viable; the entire approach can often shift depending on how quickly something can be done.
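A feasibility spike like that can be as small as timing the critical operation before committing to a design. A minimal sketch of the idea (the operation, iteration count, and budget here are hypothetical, not from the thread):

```python
import time

def benchmark(fn, iterations=1000):
    """Average wall-clock time per call, in seconds."""
    start = time.perf_counter()
    for _ in range(iterations):
        fn()
    return (time.perf_counter() - start) / iterations

# Hypothetical critical operation: parsing one record from text.
def parse_record():
    fields = "id=42;name=widget;qty=7".split(";")
    return dict(f.split("=") for f in fields)

per_call = benchmark(parse_record)
budget = 0.001  # e.g. must stay under 1 ms per record to be viable
print(f"{per_call * 1e6:.2f} us/call, viable: {per_call < budget}")
```

If the number comes in orders of magnitude over budget, you find out before the architecture is locked in, which is exactly when changing the approach is still cheap.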

It really depends on your requirements. C10k requires different design than a web server that sees a few requests per second at most, but the web might never have been invented if the focus was always on that level of optimization.

The number 1 issue I've experienced with poor programmers is a belief that they're special snowflakes who can anticipate the future.

It's the same thing with programmers who believe in BDUF or disbelieve YAGNI - they design architectures for anticipated futures which do not materialize instead of evolving the architecture retrospectively in line with the future which did materialize.

I think it's a natural human foible. Gambling, for instance, probably wouldn't exist if humans' gut instincts about their ability to predict the future defaulted to something realistic.

This is why, no matter how many brilliant programmers scream YAGNI, don't do BDUF, and don't prematurely optimize, there will always be some comment saying the equivalent of "akshually sometimes you should...", remembering that one time when they metaphorically rolled a double six and anticipated the necessary architecture correctly when it wasn't even necessary to do so.

These programmers are all hopped up on a different kind of roulette these days...

Sure, don't build your system to keep audit trails until after you have questions to answer so that you know what needs to go in those audit trails.

Don't insist on file-based data ingestion being a wrapper around a JSON-RPC API just because most similar things are moving in that direction; what matters is whether someone has specifically asked for that for this particular system yet.


Not all decisions can be usefully revisited later. Sometimes you really do need to go "what if..." and make sure none of the possibilities will bite too hard. Leaving the pizza cave occasionally and making sure you (have contacts who) have some idea about the direction of the industry you're writing stuff for can help.

    > Sure, don't build your system to keep audit trails until after you have questions to answer so that you know what needs to go in those audit trails...what matters is whether someone has specifically asked for that for this particular system yet.
I spent ~15 years in life sciences.

You're going to build an audit trail, no matter what. There's no validated system in LS that does not have an audit trail.
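For readers who haven't built one: the shape of an audit trail is well known in advance, which is the point being made. A minimal sketch, assuming a simple append-only in-memory log (all names here are illustrative, not from any real LS system):

```python
import time
from dataclasses import dataclass

@dataclass
class AuditEntry:
    """One immutable record: who changed what, when, and how."""
    actor: str        # user or system account making the change
    action: str       # e.g. "update", "delete", "sign"
    entity: str       # identifier of the record being changed
    old_value: str    # value before the change
    new_value: str    # value after the change
    timestamp: float  # when the change occurred

class AuditTrail:
    """Append-only log; entries are never edited or removed."""
    def __init__(self):
        self._entries = []

    def record(self, actor, action, entity, old_value, new_value):
        self._entries.append(AuditEntry(actor, action, entity,
                                        old_value, new_value, time.time()))

    def history(self, entity):
        """All recorded changes for one entity, oldest first."""
        return [e for e in self._entries if e.entity == entity]

trail = AuditTrail()
trail.record("alice", "update", "sample-42", "pending", "approved")
```

The who/what/when/before/after fields are the same in essentially every validated system; only the storage backend and retention rules vary. Building it up front is applying a known requirement, not speculating.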

It's just like e-commerce; you're going to have a cart and a checkout page. There's no point in calling that a premature optimization. Every e-commerce website has more or less the same set of flows with simply different configuration/parameters/providers.

Going "what if?" and then validating a customer requirement that exists NOW is NOT the same thing as trying to pre-empt a customer's requirement which might exist in the future.

Audit trails are commonly neglected because somebody didn't ask the right questions, not because somebody didn't try to anticipate the future.

Aye. The number one way to make software amenable to future requirements is to keep it simple so that it's easy to change in future. Adding complexity for anticipated changes works against being able to support the unanticipated ones.

> you can kinda tell where you should optimize ahead of time

Rules are "kinda" made to be broken. Be free.

I've been sticking to these rules (and will keep sticking to them) for as long as I've been programming (the last 30 years).

IMHO, you can feel that a bottleneck is likely to occur, but you definitely can't tell where, when, or how it will actually happen.

ROFL, I wish Pike had known what he was talking about. /s ;)

Rob Pike and I (and probably most of us) work(ed) on different kinds of things.

Notice my use of the word "Novelty".

I get hired because I'm very good at building specific kinds of systems so I tend to build many variants of the same kinds of systems. They are generally not that different and the ways in which the applications perform are similar.

I do not generally write new algorithms, operating systems, nor programming languages.

I don't think it's that hard to understand the nuance between Pike's advice and what we "mortals" do in our day-to-day work to earn a living.

Rob Pike wrote Unix and Golang, but sure, you’re built different.

Rob Pike is responsible for many cool things, but Unix isn't one of them. Go is a wonderful hybrid (with its own faults) of the schools of Thompson and Wirth, with a huge amount of Pike.

If you'd said Plan 9 and UTF-8 I'd agree with you.

Rob Pike definitely wrote large chunks of Unix while at Bell Labs. It's wrong to say he wrote all of it like the GP did, but it is also wrong to diminish his contributions.

Unless you meant to imply that UNIX isn't cool.

I did not say he wrote all of it. “Write” can include co-authorship.

A lot of people are learning some history today, beautiful to see.

I think that if you meant co-authorship you could have made that clearer. A 'contributed to' would have saved some unique ids.

> Rob Pike wrote Unix

Unix was created by Ken Thompson and Dennis Ritchie at Bell Labs (AT&T) in 1969. Thompson wrote the initial version, and Ritchie later contributed significantly, including developing the C programming language, which Unix was subsequently rewritten in.

Pike didn’t create Unix initially, but was a contributor to it. He, with a team, unquestionably wrote it.

> but was a contributor to it. He, with a team, unquestionably wrote it.

contribute < wrote.

His credits are huge, but I think saying he wrote Unix is misattribution.

Credits include: Plan 9 (successor to Unix), the Unix window system, UTF-8 (maybe his most universally impactful contribution), articulating the Unix philosophy, strings/grep/other tools, regular expressions, and C-successor work that ultimately led him to Go.

Are you under the impression he was, like, a hands-off project manager or something? His involvement was in writing it. Not singlehandedly, but certainly as part of a team. He unquestionably wrote it. He did not envision it like he did the other projects you mention, but the original credit was only for the writing of it.

To say "Rob Pike wrote Unix" is completely inaccurate. He joined after v7, in 1980.

Nobody seems to be questioning that he was involved in Unix. Given that he didn't write it, what did he do for the project? Quality assurance? Support? Marketing? Court jester?

Do you think Rob Pike ever decided that maybe what was done before isn't good enough? Stop putting artificial limits on your own competency.