I think this is a step in the right direction, but I really dislike the Pythonification of everything. After using IsaacSim/IsaacLab for work, I'm convinced that Python is not the right tool for the job.

Developers inevitably write slow, error-riddled code when dealing with Python, and working with the type annotations can be a pain.

Happy there's something to replace PhysX for robotics, and I do really like MuJoCo's API, but I really wish we could get some good C/C++ APIs.

Apart from the language, NVIDIA doesn't seem to be great at software. IsaacSim and IsaacLab have so many bugs, are incredibly slow, and are hard to debug. My team spends so many hours finding bugs in IsaacSim; it's just a pain. It's on version 5.0 and still feels like beta software.

Also, IsaacSim's reliance on USD to hold the scene structure and update prims makes it so hard to program for. USD isn't really performant when you try to generate a large number of scenes, and the USD interface stops working completely when simulation starts in IsaacLab. I hope Newton goes a different route and relies less on USD. IMO USD should just be used as an interchange format, rather than as how you actually represent the scene and its properties internally. I much prefer that approach, which Unreal Engine seems to take.

Lastly, my god, the names in this field are terrible. USD (Googling becomes a pain sometimes), Newton (already the name of another engine), Warp (literally the name of a GPU thread group in the architecture AND of a way to write Python GPU kernels, wtf).

Agreed on most, and naming is terrible. Note that at run-time Python is out of the loop, since Newton Physics records a CUDA graph and executes it, so performance is not impacted (aside from startup JIT time for modified kernels). I'd prefer C/C++ as well, and although you can call Warp-compiled kernels from C++ (without Python; see my https://github.com/erwincoumans/warp_cpp project), it would be better to have native C/C++ support without requiring a Python interpreter. It just happens that almost all deep learning/RL for robotics uses Python.
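To make the CUDA graph point concrete, here is a minimal sketch using Warp's public capture API (the kernel, array size, and step counts are made up for illustration):

    import warp as wp

    wp.init()

    @wp.kernel
    def step(x: wp.array(dtype=float)):
        i = wp.tid()
        x[i] = x[i] + 1.0

    x = wp.zeros(1024, dtype=float, device="cuda")

    # First launch pays the one-time JIT/module-load cost mentioned above
    wp.launch(step, dim=1024, inputs=[x], device="cuda")

    # Record: launches between capture_begin/capture_end are captured, not run
    wp.capture_begin()
    try:
        for _ in range(10):  # e.g. 10 substeps per sim step
            wp.launch(step, dim=1024, inputs=[x], device="cuda")
    finally:
        graph = wp.capture_end()

    # Replay: one call per sim step; the recorded launches are re-issued
    # by the CUDA driver with no Python in between
    for _ in range(100):
        wp.capture_launch(graph)

    wp.synchronize()

Replay cost is a single capture_launch call per step no matter how many kernel launches were recorded, which is why the Python interpreter overhead disappears at run-time.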

Hopefully I didn't come across as too negative; my entire team and I are really excited for Newton. Hope to get some time this week to try things out!

It's just a bad programming language. Guido has bad taste, and the language creates lazy, boring affordances that don't scale beyond a single file without paying for it somewhere.

Note that I'm not saying it's not useful (Python is what C was to Unix, if it were invented now).

my take:

The language is A-OK.

It's way better than JS, and at the time it was a breath of fresh air. It was NodeJS before JS, and it was much better than Node will be (I scratched out that "never").

I do recall Guido's claims about developer cognition, stating that (quoting from memory) a person can only hold up to 50k lines of code in their head, and that Python can pack way more program into that budget than C or Java, but that's just stupid.

But similarly to JS it's heavily misused.

"But similarly to JS it's heavily misused."

Anything that's easy to use will also have many unskilled people using it.

Python has always been an incredibly useful scripting and glue language, and as such it is pretty much perfect (or rather: Python 2.x was). The problems only started to appear when people tried using it as an actual programming language for writing large code bases entirely in Python. I.e. even a great tool can be the wrong tool for a job.

Define large?

For Python projects, I would define anything above 20kloc as 'large'.

My problem is that every machine learning ecosystem has decided that Python is the main API, so if you want to call your model from another programming language you either have to build everything yourself (see llama.cpp/ggml), use something extremely bare-bones like IREE, or, worst of all, access Python over IPC.
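For anyone who hasn't had the pleasure, the "Python over IPC" option usually ends up looking something like this minimal sketch (the model, port, and wire format are all made up for illustration):

    import json
    import socketserver

    def run_model(prompt: str) -> str:
        # Stand-in for a real model call; reverses the string for demo purposes
        return prompt[::-1]

    class Handler(socketserver.StreamRequestHandler):
        def handle(self):
            for line in self.rfile:  # one JSON request per line
                request = json.loads(line)
                reply = {"output": run_model(request["input"])}
                self.wfile.write((json.dumps(reply) + "\n").encode())

    if __name__ == "__main__":
        # Every non-Python caller now needs socket plumbing, serialization,
        # and lifecycle management for this process just to run inference.
        with socketserver.TCPServer(("127.0.0.1", 9999), Handler) as srv:
            srv.serve_forever()

And then each calling language has to reimplement the client half of that, which is exactly the kind of incidental complexity a native C API would avoid.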