Oh wow, this looks like a 3D render of a perceptron from when I started reading about neural networks. I guess neural networks are essentially built on that idea? Inputs > weight function to adjust the final output toward desired values?
The layers themselves are basically perceptrons, not really any different from a generalized linear model.
The ‘secret sauce’ in a deep network is the hidden layer with a non-linear activation function. Without it, the whole stack of layers collapses to a single linear model.
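A quick sketch of that collapse, using made-up numpy weight matrices (the shapes and seed here are arbitrary, just for illustration): two linear layers back to back are exactly one linear layer with the combined weight matrix, while putting a ReLU between them breaks that equivalence.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # hypothetical first-layer weights
W2 = rng.normal(size=(2, 4))  # hypothetical second-layer weights
x = rng.normal(size=3)

# Two stacked linear layers...
two_layers = W2 @ (W1 @ x)
# ...equal one linear layer whose weights are the product W2 @ W1.
one_layer = (W2 @ W1) @ x
print(np.allclose(two_layers, one_layer))  # True

# With a non-linearity (ReLU) in between, no single weight
# matrix reproduces the mapping in general.
relu = lambda z: np.maximum(z, 0)
nonlinear = W2 @ relu(W1 @ x)
```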
A neural network is basically a multilayer perceptron
https://en.wikipedia.org/wiki/Multilayer_perceptron
Yes, vanilla neural networks are just lots of perceptrons
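To make "lots of perceptrons" concrete, here's a minimal forward pass through a toy 3-4-2 network (layer sizes, weights, and the ReLU choice are all illustrative assumptions): each layer is just a perceptron-style weighted sum plus bias, followed by an activation.

```python
import numpy as np

def layer(x, W, b, activation):
    # perceptron-like unit: weighted sum plus bias, then activation
    return activation(W @ x + b)

relu = lambda z: np.maximum(z, 0)
rng = np.random.default_rng(1)

# hypothetical 3-input, 4-hidden, 2-output network
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)

x = np.array([1.0, -2.0, 0.5])
hidden = layer(x, W1, b1, relu)              # hidden layer, non-linear
output = layer(hidden, W2, b2, lambda z: z)  # linear output layer
print(output.shape)  # (2,)
```

Training would then adjust W1, b1, W2, b2 to push the output toward desired values, which is the "weight function" idea from the top comment.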