Aircraft engineers studied birds to make airplanes.

If AI engineers don't study brains, they will probably never build an intelligent AI.

At least simulate the brain of an ant or a small lizard; that shouldn't be hard to do.

Maybe run more experiments with primates or other animals, teaching them new tasks.

I don't understand why cognitive science isn't consulted when building AI. It seems obvious, yet all I see is people treating the brain like a computer running an algorithm.

Launching a bird-shaped wooden plank will never lead to flight.

It feels like we forgot that science is about understanding things. AI engineers don't analyse trained neural networks; they remain black boxes. What's the point?

Maybe scientists are just bad at science.

There are so many questions to ask.

What current-generation LLMs do is like being trained on a dataset of human dances, while users somehow expect more than replication of the dances already seen. The model is supposed to infer the internal brain state of a human just from watching the dances, yet if it ever produced a dance that isn't in the dataset, it would be penalized. And still people expect it to be intelligent: on the assumption that humans are just dance-move predictors and that intelligence is equivalent to dance-move prediction, the model should now do the very thing it was explicitly trained not to do, i.e. come up with new dances.
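To make the "punished for new dances" point concrete: next-token training minimizes cross-entropy against the moves actually observed in the dataset. A minimal sketch (toy vocabulary and all names are illustrative, not any real library's API) shows that a continuation the dataset never contains receives a higher loss, so training pushes its probability toward zero:

```python
import math

def cross_entropy(pred_probs, target):
    # Standard next-token loss: -log(probability assigned to the target).
    return -math.log(pred_probs[target])

# Toy "dance move" vocabulary with the model's predicted distribution.
pred = {"spin": 0.7, "hop": 0.2, "new_move": 0.1}

loss_seen = cross_entropy(pred, "spin")       # move present in the dataset
loss_novel = cross_entropy(pred, "new_move")  # hypothetical unseen move

# The unseen move incurs a much larger loss, so gradient descent
# shifts probability mass away from it and toward observed moves.
print(round(loss_seen, 3), round(loss_novel, 3))
```

Since probabilities sum to one, every bit of mass the model keeps on novel moves is mass taken from observed ones, which is exactly the penalty the analogy describes.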