> What's the machine code of an AGI gonna look like?

Right now the guess is that it will be mostly a bunch of multiplications and additions.

> It makes one illegal instruction and crashes?

And our heart quivers just slightly the wrong way and we die. Or a tiny blood clot plugs a vessel in our brain and we die. Do you feel that our fragility is a good reason why meat cannot be intelligent?

> I jest but really think about the metal.

Ok. I'm thinking about the metal. What should this thinking illuminate?

> The inside of modern computers is tightly controlled with no room for anything unpredictable.

Let's assume we can't make AGI because we need randomness and unpredictability in our computers. We can very easily add unpredictability. The simple and stupid solution is to add some sensor (like a camera CCD) and stare at the measurement noise. You don't even need a lens on that CCD. You can cap it so it sees "all black", and then what it measures is basically the thermal noise of the sensor. Voilà, your computer now has unpredictability. People who actually make semiconductors can probably come up with even simpler and easier ways to integrate unpredictability right on the same chip we compute with.
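For illustration, here's a minimal sketch of that idea in Python. Since there's no actual capped CCD here, the sensor read is simulated with a pseudo-random stand-in; the shape of the technique is the point: take noisy raw readings, then whiten them into uniform random bytes by hashing.

```python
import hashlib
import random  # stand-in for real sensor noise; a real device would read hardware here


def read_dark_frame(n_pixels=1024):
    """Simulate reading a capped CCD: each pixel is a small noisy
    dark-current value. Replace this with an actual sensor read."""
    return bytes(int(random.gauss(8, 2)) % 256 for _ in range(n_pixels))


def random_bytes():
    """Whiten the raw, biased sensor noise into 32 uniform-looking
    bytes by hashing the whole dark frame."""
    return hashlib.sha256(read_dark_frame()).digest()


print(random_bytes().hex())
```

The hashing step matters because raw sensor noise is biased and correlated; a cryptographic hash condenses it into output that is uniform as long as the frame contains enough real entropy.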

You still haven't really argued why you think "unpredictableness" is the missing component, of course. Besides the fact that it just feels right to you.

Mmmm well my meatsuit can't easily make my own heart quiver the wrong way and kill me. Computers can treat data as code and code as data pretty easily; it's core to several languages (like Lisp). As such, making illegal instructions or violating the straitjacket of the system such an "intelligence" would operate in is likely. If you could make an intelligent process, what would it think of an operating system kernel (the thing you have to ask for everything: I/O, memory, etc.)? Does the "intelligent" process fear for itself when it's about to get descheduled? What is the bit pattern for fear? Can you imagine an intelligent process in such a place, as a static representation of data in RAM? To write something down it calls out to a library, and maybe the CPU switches to a brk system call to map more virtual memory? It all sounds frankly ridiculous. I think AGI proponents fundamentally misunderstand how a computer works, are engaging in magical thinking, and are taking the market for a ride.
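To make the code-as-data point concrete, here is a toy illustration (in Python rather than Lisp, purely for familiarity): a program held as an ordinary string, executed, and then inspected again as plain data.

```python
import dis

# A program held as ordinary data -- just a string sitting in memory...
source = "lambda x: x * 2"

# ...turned into runnable code...
double = eval(source)
print(double(21))  # 42

# ...and that code inspected again as data: its bytecode is just bytes.
dis.dis(double)
```

Lisp makes this symmetry first-class with quoted forms and macros, but even here the boundary between "code" and "data" is only a matter of how the bytes are used.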

I think it's less about the randomness and more about the fact that all of a computer's functionality is defined up front: in software, in training, in hardware. Sure, you can add randomness and pick between two paths randomly, but a computer couldn't spontaneously pick to go down a path that wasn't defined for it.

> Mmmm well my meatsuit can't easily make my own heart quiver the wrong way and kill me.

It very much can. Jump scares and deep grief are known to cause heart attacks; it's called stress cardiomyopathy. Or your meatsuit can do that indirectly by ingesting the wrong chemicals.

> If you could make an intelligent process, what would it think of an operating system kernel

Idk. What do you think of your hypothalamus? It can make you unconscious at any time. It in fact makes you unconscious about once a day. Do you fear it? What if one day it won’t wake you up? Or what if it jacks up your internal body temperature and cooks you alive from the inside? It can do that!

Now you might say you don’t worry about that, because through your long life your hypothalamus proved to be reliable. It predictably does what it needs to do, to keep you alive. And you would be right. Your higher cognitive functions have a good working relationship with your lower level processes.

Similarly, for an AGI to be intelligent it needs to have a good working relationship with the hardware it is running on. That means that if the kernel is temperamental and, idk, keeps descheduling the higher-level AGI process, then the AGI will malfunction and not appear that intelligent. Same as if you met Albert Einstein while he was chemically put to sleep. He wouldn't appear intelligent at all! At best he would just be drooling there.

> Can you imagine an intelligent process in such a place, as a static representation of data in RAM?

Yes. You can’t? This is not really a convincing argument.

> It all sounds frankly ridiculous.

I think what you are doing is looking at implementation details and feeling a disconnect between that and the possibility of intelligence. Do you feel the same ridiculousness about a meatblob doing things and appearing intelligent?

> a computer couldn't spontaneously pick to go down a path that wasn't defined for it.

Can you?