At my former job at a FAANG, I did the math on allocating developers 16GB vs 64GB machines based on actual job tasks, estimated how much thumb-twiddling wait time the upgrade would save, and multiplied that out by the cost of the developer's time. The cost-benefit analysis showed a reasonable ROI that was realized in weeks for senior dev salaries (months for juniors).
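
For a sense of how that math works, here is a minimal sketch of the break-even calculation. Every number in it (upgrade cost, minutes saved, dev cost) is an assumption for illustration, not a figure from the original analysis:

    # Break-even sketch; all numbers are illustrative assumptions.
    RAM_UPGRADE_COST = 400        # extra cost of 64GB over 16GB, USD (assumed)
    MINUTES_SAVED_PER_DAY = 15    # less swapping, faster builds (assumed)
    WORK_DAYS_PER_MONTH = 21
    DEV_COST_PER_HOUR = 150       # fully loaded senior dev cost, USD (assumed)

    saved_per_month = (MINUTES_SAVED_PER_DAY / 60) * WORK_DAYS_PER_MONTH * DEV_COST_PER_HOUR
    breakeven_weeks = RAM_UPGRADE_COST / (saved_per_month / 4.33)
    print(f"~${saved_per_month:.0f}/month saved; upgrade pays for itself in ~{breakeven_weeks:.1f} weeks")

With these assumed numbers the upgrade pays for itself in roughly two weeks, which lines up with the "weeks for senior dev salaries" figure above.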

Based on this, I strongly believe that if you're providing hardware for software engineers, it rarely if ever makes sense to buy anything but the top-spec MacBook Pro available, and to upgrade every 2-3 years. I can't comment on non-desktop / non-Mac scenarios or other job families. YMMV.

No doubt the math checks out, but I wonder if developer productivity can be quantified that easily. I believe there's a lot of research pointing to people having a somewhat fixed amount of cognitive capacity available per day, and that aligns well with my personal experience. A lot of times, waiting for the computer to finish feels like a micro-break that saves up energy for my next deep thought process.

Your brain tends to do better if you can stay focused on your task for consecutive, though not indefinite, periods of time. This varies from person to person, and depends on how long a build/run/test takes. But the challenge for many, myself included, is that the 'break' often becomes a context switch, a potential loss of momentum, and, worse, may open me up to a distraction rather than a productive use of my time.

For me, personally, a better break is one I define on my calendar, which helps me defragment my brain for a short period of time before re-engaging.

I recommend investigating the concept of 'deep work' and drawing your own conclusions.

>"A lot of times, waiting for the computer to finish feels like a micro-break that saves up energy for my next deep thought process."

As an ISV I buy my own hardware, so I do care about expenses. I can attest that, to me, waiting for the computer to finish feels like a big irritant that can spoil my programming flow. I take my breaks whenever I feel like it and don't need a computer to help me. So I pay for top-notch desktops (within reason, of course).

There’s also the time-to-market and bureaucracy cost. I took over a place where there was a team of people devoted to making sure you had exactly the PC you needed.

Configuring devices more generously often lets you get some extra life out of them for people who don’t care about performance. If the beancounters make the choice, you’ll buy last year's hardware at a discount and get jammed up when there’s a Windows or application update. Saving money costs money because of the faster refresh cycle.

My standard for sizing this in huge orgs is to count how many distinct applications launch per day. If it’s greater than 5-7, go big. If it’s less, cost-optimize with a cheaper config or get the function onto RDS.
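
If you already have some endpoint-management export of app launches handy, the tally itself is only a few lines; a rough sketch below, where the CSV filename and the app_name column are hypothetical:

    import csv

    def distinct_apps(csv_path: str) -> int:
        """Count distinct application names launched in one day's log."""
        apps = set()
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                apps.add(row["app_name"])   # hypothetical column name
        return len(apps)

    n = distinct_apps("launches_2024-01-15.csv")   # hypothetical export file
    print(f"{n} distinct apps ->", "go big" if n > 7 else "cost optimize / RDS")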

Also worth factoring in that top-spec hardware will have a longer usable life, especially for non-power users.

This is true, but I find my train of thought slips away if I have to wait more than a handful of seconds, let alone two minutes.

Tying this back to your point, those limited hours of focus time come in blocks, in my experience, and focus time is not easily "entered", either.

One person's micro-breaks are another person's disruption of flow state.

Simple estimates work surprisingly well for a lot of things because a lot of the 'unquantifiable' complexity being ignored behaves like noise. When you have dozens of factors pulling in different directions—some developers multitask better, some lose flow more easily, some codebases are more memory-hungry, and so on—it all tends to just average out, and the result is reasonably accurate. Accurate enough that it's useful data to make a decision with, at least.
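
A toy simulation of that intuition, purely illustrative and assuming the ignored factors are independent and roughly symmetric around zero:

    import random

    def residual_bias(n_factors: int, trials: int = 10_000) -> float:
        """Average leftover per-factor error when many ±10% effects pull in random directions."""
        total = 0.0
        for _ in range(trials):
            net = sum(random.uniform(-0.10, 0.10) for _ in range(n_factors))
            total += abs(net) / n_factors
        return total / trials

    for n in (1, 10, 50):
        print(f"{n:>2} factors -> ~{residual_bias(n):.1%} average residual error")

The more independent factors there are, the smaller the leftover bias per factor, which is why the simple estimate stays decision-grade.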

That sounds reasonable, but there are also factors pulling in the opposite direction, for example Wirth's Law [1], which suggests that devs with powerful computers create inefficient software.

1. https://en.wikipedia.org/wiki/Wirth%27s_law

For me the issue is that at work, with 16GB of RAM, I'm basically always running into swap and having things grind to a halt. My personal workstation has 64GB, and the only time I experience issues is when something's leaking memory.

Well, it depends what kind of time periods you're talking about. I've seen one in the past that was 60 minutes vs. 20 minutes (for a full clean compile, but often that is where you find yourself). That is far more than a micro-break; that is a big chunk of time wasted.
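
Quick arithmetic on that gap, assuming (purely for illustration) two clean compiles a day:

    slow, fast = 60, 20                 # minutes per full clean compile
    builds_per_day = 2                  # assumed
    lost_per_day = (slow - fast) * builds_per_day
    print(f"{lost_per_day} min/day, ~{lost_per_day * 5 / 60:.1f} hours/week waiting")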

You’re not waiting for the end of a thing though. You might hope you are, but the truth is there’s always one little thing you still have to take care of. So until the last build is green and the PR is filed, you’re being held hostage by the train of thought that’s tied to this unit of work. Thinking too much about the next one just ends up adding time to this one.

You’re a grownup. You should know when to take a break, and that should mean getting away from the keyboard, not just frittering away time waiting for a slow task to complete.

The hours I sometimes spend waiting on a build are time that won't come back later. Sometimes I've done other tasks in the meantime, but I can only keep track of so much, so often it isn't worth it.

A faster machine can get me to productive work faster.

Most of my friends at FAANG all do their work on servers remotely. Remote edit, remote build. The builds happen in giant networked cloud builders, 100s to 1000s per build. Giving them a faster local machine would do almost nothing because they don't do anything local.

...and this is a great setup.

On the laptop you need:
- low weight so you can easily take it with you to work elsewhere
- excellent screen/GPU
- multiple large connected screens
- plenty of memory
- great keyboard/pointer device

Also: great chair

Frankly, what would be really great is an Apple Vision Pro fully customised as a workstation.

When I worked at a FAANG, most developers could get a remote virtual machine for their development needs. They could pick the machine type and size. It was one of the first things you'd learn how to do in your emb^H^H^H onboarding :)

So it wasn't uncommon to see people with a measly old 13" MacBook Pro doing the hard work on a 64-CPU/256GB remote machine. Laptops were essentially machines used for reading/writing emails, writing documents and doing meetings. The IDEs had proprietary extensions to work with the remote machines and the custom tooling.

Ah so the coding was done locally but run remotely?

I nearly went insane when I was forced to code using Citrix.

> Ah so the coding was done locally but run remotely?

Both, depending on the case and how much you were inclined to fiddle with your setup. And on what kind of software you were writing (most software had a lot of Linux-specific code, so running that on a MacBook was not really an option).

A lot of colleagues were using either IntelliJ or VS Code with proprietary extensions.

A lot of my work revolved around writing scripts and automating stuff, so IntelliJ was absolute overkill for me, not to mention that the custom proprietary extensions created more issues than they solved ("I just need to change five lines in a script, for christ's sake, I don't need 20GB of stuff to do that")... So I ended up investing some time in improving my GNU Emacs skills and reading the GNU Screen documentation, and did all of my work in Emacs running in Screen for a few years.

It was very cool to almost never have to actually "stop working". Even if you had to reboot your laptop, your work session was still there uninterrupted. Most updates were applied automatically without needing a full system reboot. And I could still add my own systemd units to the OS to start the things I needed.

Also, building on that, I later integrated stuff like treemacs and eglot (along with the language servers for specific languages), and frankly I did not miss much from the usual IDEs.

> I nearly went insane when I was forced to code using Citrix.

Yeah I can see that.

In my case I was doing most of my work in a Screen session, so I was using the shell for "actual work" (engineering) and the work MacBook for everything else (email, meetings, web browsing, etc.).

I think that the ergonomics of GNU Emacs are largely the same whether you're using a GUI program locally or remotely, or a shell session (again, locally or remotely), so for me the user experience was largely unchanged.

Had I had to do my coding in some GUI IDE over a remote desktop session, I would probably have gone insane as well.

It sounds more like doing embedded development with a TFTP boot to an NFS-mounted root filesystem.

More than that, in the FAANG jobs I've had, you could not even check code out onto your laptop. It had to live on the dev desktop or virtual machine and be edited remotely.

> it rarely if ever makes sense to buy anything but the top spec Macbook Pro available

God I wish my employers would stop buying me MacBook Pros and let me work on a proper Linux desktop. I'm sick of shitty thermally throttled slow-ass phone chips on serious work machines.

Just Friday I was dealing with a request from purchasing asking if a laptop with an ultra-low-power 15W-TDP CPU and an iGPU with "8GB DDR4 graphics memory (shared)" was a suitable replacement for one with a 75W CPU (but also a Core i9) and an NVIDIA RTX 4000 mobile 130W GPU in one of our lead engineer's CAD workstations.

No, those are not the same. There's a reason one's the size of a pizza box and costs $5k and the other's the size of an iPad and costs $700.

And yes, I much prefer to build tower workstations with proper thermals and full-sized GPUs as the main machine at their desk, but sometimes they need a device they can take with them.

Curious perspective. Apple silicon is both performant and very power efficient. Of course there are applications where even a top spec MacBook would be unsuitable, but I imagine that would be a very small percentage of folks needing that kind of power.

Sadly, the choice is usually between Mac and Windows, not a Linux desktop. In that case, I'd much prefer a Unix-like operating system like macOS.

To be clear, I am not a “fanboy” and Apple continues to make plenty of missteps. Not all criticisms against Apple are well founded though.

You very clearly have no experience with powerful desktop machines. A 9950X will absolutely demolish an M3 or M4 MacBook Pro in any possible test, especially multicore testing. And I don't care how "performant" or "efficient" you think it is; those M-series chips will be thermally throttled like anything else packaged into a laptop.

Oh, and the vastly superior desktop rig will also come out cheaper, even with a quality monitor and keyboard.

That’s my bad for not clarifying that I am talking solely about the laptop form factor here. It’s a given that laptops are not comparable in performance to desktops. In terms of laptop hardware, Apple Silicon performs quite well.

Nice assumptions though.

It’s not just my opinion that Apple silicon is pretty performant and efficient for the form factor; you can look up the stats yourself if you care to. Yet it seems you may be one of those people who are hostile towards Apple for less well-founded reasons. It’s not a product for everyone, and that’s OK.

I have a 7950X desktop and an M3 Max; they are very far apart in performance for development, although I'll give Apple credit for good single-core performance that shows in some contexts.

I have a decent rig I built (5900X, 7900 XT); of course it blows my M1 MacBook out of the water.

You seem like a reasonable person who can admit there are some nice things about Apple Silicon even though it doesn't meet everyone's needs.

Wish my employers had done the same calculation.

Gave developers 16GB RAM and 512MB storage. Spent way too much time worrying about available disk space and needlessly redownloading docker images off the web.

But at least they saved money on hardware expenses!

You mean 512GB storage?

I always bought a really large monitor for work with my own cash. When most devs had 19" or 20" monitors, I got a 30" for $1500.

Best money ever spent. Lasted years and years.

For CPUs, I wonder how the economics work out when you get into, say, 32- or 64-core Threadrippers. I think it still might be worth it.
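
Running the same kind of break-even math with bigger numbers (every figure below is an assumption for illustration, not a measured saving):

    EXTRA_COST = 4000             # 64-core Threadripper box vs a mid-range desktop (assumed)
    MINUTES_SAVED_PER_DAY = 30    # heavily parallel builds/tests (assumed)
    DEV_COST_PER_HOUR = 150       # fully loaded dev cost (assumed)
    WORK_DAYS_PER_YEAR = 230

    yearly_saving = (MINUTES_SAVED_PER_DAY / 60) * WORK_DAYS_PER_YEAR * DEV_COST_PER_HOUR
    print(f"~${yearly_saving:,.0f}/year saved vs ${EXTRA_COST:,} extra "
          f"-> break-even in ~{EXTRA_COST / yearly_saving * 12:.1f} months")

Even with a much larger upfront cost, the break-even lands within a few months if the parallel hardware actually saves that much waiting time per day.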

My personal experience using virtual desktops vs a MacBook aligns with your analysis. This despite the desktop virtual machines having better network connections. A VM with 16 GB of memory and 8 VCPUs can't compete with an M1 Max laptop.

FAANG manages the machines. Setting aside the ethics of this level of monitoring, I'd be curious to validate this by soft-limiting OS memory usage and tracking metrics like number of PRs and time someone is actively on the keyboard.
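
The soft-limit half of that experiment is straightforward on Linux with cgroup v2; here is a rough sketch (the cgroup name, the 16 GiB figure, and the workload command are all assumptions, and it needs root or a delegated subtree with the memory controller enabled):

    import os, subprocess

    CGROUP = "/sys/fs/cgroup/ram-experiment"       # hypothetical cgroup name
    os.makedirs(CGROUP, exist_ok=True)

    # memory.high throttles (rather than kills) the workload above the limit,
    # which roughly simulates "pretend this machine only has 16 GB".
    with open(os.path.join(CGROUP, "memory.high"), "w") as f:
        f.write(str(16 * 1024**3))

    # Put this process into the cgroup so the launched workload inherits the limit.
    with open(os.path.join(CGROUP, "cgroup.procs"), "w") as f:
        f.write(str(os.getpid()))

    subprocess.run(["make", "-j8"], check=False)   # placeholder for the real workload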

To put a massive spanner in this, companies are going to be rolling out seemingly mandatory AI usage, which has huge compute requirements... which are often fulfilled remotely. And it has varying, possibly negative, effects on productivity.

I think those working on user-facing apps could do well having a slow computer or phone, just so they can get a sense of what the actual user experience is like.

No doubt you mean well. In some cases it’s obvious: a low-memory machine can’t handle some Docker setup, etc.

In reality, you can’t even predict time to project completion accurately. Rarely is a fast computer a “time saver”.

Either it’s a binary "can this run that?" or a work-environment thing: "will the dev get frustrated knowing he has to wait an extra 10 minutes a day when a measly $1k would make this go away?"