> The new law, "Claude's Law", dictates that processing speed will increase by a factor of 10 every year.
Is the limit of current silicon mainly the cleverness of the design, or is it physics? While there will be scope for better designs, I'm not sure it's at the level of a factor of 10 every year for 10 years.
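For a sense of scale, here's the arithmetic that claim implies, compounded out (the baseline of 1 unit is my assumption, purely for illustration):

```python
# Back-of-envelope: what "a factor of 10 every year for 10 years" compounds to.
speedup_per_year = 10
years = 10
total = speedup_per_year ** years  # cumulative multiplier on the baseline
print(f"Cumulative speedup after {years} years: {total:.0e}x")  # 1e+10x
```

Ten orders of magnitude in a decade is far beyond anything silicon process improvements have ever delivered, which is why I'm skeptical.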
But perhaps moving fully to 3D chip designs might give a significant boost, if the cooling problem can be solved.
Ultimately, computers are just the universe set up in a specific way, such that when the universe rolls forward it does a useful computation.
As for the best way to build such a machine - I don't think it's inevitable that it's biological. Biology has great energy efficiency, high levels of connectivity, analog inputs and outputs, temporal dynamics, and the ability to mix global and local signalling, etc. However, the actual maximum rate of neuron firing is relatively low.
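To put rough numbers on that last point (these figures are my own order-of-magnitude assumptions, not from the parent comment): a neuron's absolute refractory period of about 1 ms caps firing at roughly 1 kHz, while a modern CPU clock runs at a few GHz.

```python
# Order-of-magnitude comparison of biological vs silicon "tick" rates.
neuron_max_hz = 1_000          # ~1 kHz, limited by the refractory period (~1 ms)
cpu_clock_hz = 4_000_000_000   # ~4 GHz, a typical modern desktop CPU
ratio = cpu_clock_hz / neuron_max_hz
print(f"A silicon clock is roughly {ratio:.0e}x faster per tick")  # ~4e+06x
```

Of course raw tick rate isn't the whole story - the brain makes up a lot of ground through massive parallelism - but it shows why biology isn't obviously the winning substrate for speed.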
You've also not mentioned quantum computing.
I would suggest that on some level we're still well inside "cleverness of the design" territory, but that hides a lot of complexity and nuance - for example, we really haven't started to take advantage of the third dimension in any meaningful way. A perfectly optimal CPU would probably look more like a sphere than a flat square chip.
To my limited understanding we have started with layering techniques, but it's still pretty early days. The reason we don't do more of that is both the difficulty of design and the difficulty of manufacturing.