PyTorch was partly inspired by the Python Autograd library (circa 2015 [1]), to the point where its autodiff [2] system was named "autograd" [3]. JAX is the direct successor of the Autograd library, and several of the Autograd developers work on JAX to this day. For that matter, PyTorch author Adam Paszke is now on the JAX team and seems to work on JAX and Dex these days.
[1] https://pypi.org/project/autograd/#history
[2] https://www.cs.toronto.edu/~rgrosse/courses/csc421_2019/read...
[3] https://web.archive.org/web/20170422051747/http://pytorch.or...
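For anyone unfamiliar with what these "autograd" systems actually do: the core idea is reverse-mode automatic differentiation over a recorded computation graph. Here's a toy scalar sketch of that idea in plain Python (illustrative only — not the real implementation of Autograd, PyTorch, or JAX):

```python
# Toy reverse-mode autodiff: each Value records how it was computed,
# then backward() applies the chain rule from outputs to inputs.

class Value:
    """A scalar that remembers its parents and local derivatives."""
    def __init__(self, data, parents=(), grad_fns=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents    # Values this one was computed from
        self._grad_fns = grad_fns  # maps upstream grad -> grad w.r.t. each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other),
                     (lambda g: g, lambda g: g))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (lambda g: g * other.data, lambda g: g * self.data))

    __radd__ = __add__
    __rmul__ = __mul__

    def backward(self):
        # Topologically order the graph, then run the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for parent, grad_fn in zip(v._parents, v._grad_fns):
                parent.grad += grad_fn(v.grad)

x = Value(2.0)
y = x * x + 3 * x   # y = x^2 + 3x
y.backward()
print(x.grad)       # dy/dx = 2x + 3 = 7.0
```

The HIPS Autograd library wraps this same idea behind a functional `grad(f)` API, while PyTorch exposes it through tensors with `requires_grad=True` and `.backward()`.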
Yes, PyTorch borrowed from Autograd, Chainer, etc.
...but PyTorch felt friendlier and more Pythonic, and it came with a comprehensive library of prebuilt components for deep learning in `torch.nn`.
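To illustrate the `torch.nn` point: deep learning models are composed from prebuilt layers as ordinary Python objects (the layer choices and sizes below are arbitrary, just a minimal sketch):

```python
import torch
from torch import nn

# Compose prebuilt torch.nn layers into a small feed-forward model.
# Sizes (10 -> 32 -> 1) are arbitrary, for illustration only.
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)

# A batch of 4 random 10-dimensional inputs.
out = model(torch.randn(4, 10))
print(out.shape)  # torch.Size([4, 1])
```

This object-oriented, eager style (plus autograd handling the backward pass automatically) is a big part of why PyTorch felt more Pythonic than the graph-building frameworks of the time.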
See https://news.ycombinator.com/item?id=45848768