Biological computers are inevitable. We are the most compelling proof of concept of this that we have. Our entire civilization may be a prototype of one already.
From my perspective, I believe things will happen in the following order:
1. AI will eventually take over all silicon chip design. Human designs pale in comparison. Moore's Law, which currently suggests that humans are reaching the practical limits of their own silicon chip design skills, will give way to a new law. The new law, "Claude's Law," dictates that processing speed will increase by a factor of 10 every year. And for a decade or so, it does. There is no reason to ever fabricate a human-designed chip again; to do so would be an irresponsible waste of fabrication resources.
2. AI will reach the practical limits of silicon processing capability ten years after humans design their last commercial chip. Chip performance gains begin to slow, and it looks like the end of per-unit performance increases for silicon-based computing is approaching.
3. AI pivots to biological computers. Next-generation computers emerge, made from DNA and living tissue. Although the shape of a computer server remains mostly unchanged, a next-generation biological computer is basically just "a really big brain in a jar."
4. Biological robots?
> The new law, "Claude's Law" dictates that processing speed will increase by a factor of 10 every year.
Is the limit of current silicon mainly the cleverness of the design, or is it physics? While there will be scope for better designs, I'm not sure it's at the level of a factor of 10 every year for 10 years.
But perhaps moving fully into 3D chip designs might give a significant boost, if the cooling problem can be solved.
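For scale, it helps to compare the compounded growth rates being debated. A quick sketch (the "Claude's Law" rate here is the hypothetical 10x/year from the parent comment, not an established figure, and Moore's Law is approximated as 2x every two years):

```python
# Compare compounded speedups over a decade.
# "Claude's Law" (10x/year) is the hypothetical rate from the comment above;
# Moore's Law is approximated as a doubling every two years.

def compound(factor_per_period: float, periods: float) -> float:
    """Total speedup after compounding a per-period factor."""
    return factor_per_period ** periods

years = 10
moore = compound(2, years / 2)    # ~2x every 2 years -> 2^5
claude = compound(10, years)      # hypothetical 10x every year -> 10^10

print(f"Moore's Law over {years} years:   ~{moore:.0f}x")    # ~32x
print(f"Claude's Law over {years} years: ~{claude:.0e}x")    # ~1e10x
```

That gap, roughly nine orders of magnitude, is why the claim seems hard to square with physical limits rather than design cleverness.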
Ultimately computers are just the universe set up in a specific way such that when the universe rolls forward it does a useful computation.
In terms of the best way to build such a machine, I don't think it's inevitable that it's biological. Biology has great energy efficiency, high levels of connectivity, analog inputs and outputs, temporal dynamics, and the ability to mix global and local signalling, etc. However, the actual maximum rate of neuron firing is relatively low.
You've also not mentioned quantum computing.
I would suggest that on some level we're still well inside "cleverness of the design" territory, but that hides a lot of complexity and nuance. For example, we really haven't started to take advantage of the third dimension by any reasonable means; a perfectly optimal CPU would probably look more like a sphere than a flat square chip.
To my limited understanding, we have started with layering techniques, but it's still pretty early days.
The reason we don't do that more is the difficulty of both design and manufacturing.
I doubt that undirected statistical systems perform better than expert systems. The latter are already in use; nobody designs new chips exclusively with pen and paper. Also, the current limits of computation are more due to atomic sizes and signal speeds, less due to humans not working fast enough.
Fantasy land type mindset.
What is so fantasy land about a DNA computer?
https://en.wikipedia.org/wiki/DNA_computing
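The canonical DNA-computing demo is Adleman's 1994 experiment, which solved a small directed Hamiltonian path instance with massively parallel strand ligation. Here is a conventional brute-force sketch of the same problem, just to show what the molecules were computing; the small graph below is illustrative, not Adleman's exact instance:

```python
from itertools import permutations

# Brute-force directed Hamiltonian path: find a path from `start` to `end`
# that visits every node exactly once. Adleman's DNA experiment explored
# all candidate paths in parallel via strand hybridization; this sketch
# enumerates them sequentially instead.

def hamiltonian_path(n_nodes, edges, start, end):
    """Return a Hamiltonian path as a list of nodes, or None."""
    edge_set = set(edges)
    middle = [v for v in range(n_nodes) if v not in (start, end)]
    for perm in permutations(middle):
        path = [start, *perm, end]
        # Keep the path only if every consecutive hop is a real edge.
        if all((a, b) in edge_set for a, b in zip(path, path[1:])):
            return path
    return None

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (1, 3)]
print(hamiltonian_path(5, edges, 0, 4))  # -> [0, 1, 2, 3, 4]
```

The appeal of the wet-lab version is that the exponential candidate space is searched by chemistry in parallel; the catch, which Adleman himself noted, is that the amount of DNA required also grows exponentially with problem size.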
Bro got cooked
Low quality ad-hominem comments that don't add value are frowned upon here. This isn't Reddit.
Stop cooking me