300 years is a stretch. But Legendre described linear regression about 220 years ago (1805), and from a very high-level perspective, modern neural networks are mostly just stacks of linear-regression-style layers with non-linearities sandwiched between them. I'm obviously oversimplifying a lot, but that's the gist of it.
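
To make the "stack of linear regressions" point concrete, here's a toy sketch (my own illustration, not any particular library's API): each layer is the same affine form as linear regression, `y = Wx + b`, and the only extra ingredient is a nonlinearity between layers.

```python
# Minimal two-layer "neural net" in plain Python (weights are made up for
# illustration). Each affine() call has exactly the shape of linear regression.

def affine(W, b, x):
    # y_j = sum_i W[j][i] * x[i] + b[j]  -- the linear regression form
    return [sum(w * xi for w, xi in zip(row, x)) + bj
            for row, bj in zip(W, b)]

def relu(v):
    # The nonlinearity sandwiched between the linear layers
    return [max(0.0, u) for u in v]

def tiny_mlp(x):
    # Hidden layer: 3 units over 2 inputs; output layer: 1 unit
    W1 = [[0.5, -0.2], [0.1, 0.8], [-0.3, 0.4]]
    b1 = [0.0, 0.1, -0.1]
    W2 = [[1.0, -1.0, 0.5]]
    b2 = [0.2]
    return affine(W2, b2, relu(affine(W1, b1, x)))

print(tiny_mlp([1.0, 2.0]))  # one scalar output
```

Drop the `relu` and the whole thing collapses into a single linear map, i.e. back to plain linear regression; the nonlinearity is what lets depth buy you anything.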