Model weights are code; for a dive into that, see [0], which shows how to encode Boolean logic using NAND gates in an MLP.
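To make that concrete, here's a minimal sketch of the construction from [0]: a single perceptron with weights -2, -2 and bias 3 computes NAND of two binary inputs, and since NAND is functionally complete, any Boolean circuit can be built by wiring such units together. (Exact weight values follow the linked chapter; the code itself is just an illustration.)

    import numpy as np

    def nand_perceptron(x1, x2):
        # weights -2, -2 and bias 3: fires (outputs 1) unless both inputs are 1
        w = np.array([-2, -2])
        b = 3
        return int(np.dot(w, np.array([x1, x2])) + b > 0)

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "->", nand_perceptron(x1, x2))
    # prints: 0 0 -> 1, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0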

The expressivity is there; the only question is how to encode useful functions into those weights, especially when we don’t know how to write those functions by hand.

[0] http://neuralnetworksanddeeplearning.com/chap1.html