I've learned calculus a few times in and out of school. I learned it best from a home-schooling-focused textbook. The author published a paper describing his notation [0]; see section 3, which is very readable.

The paper explains:

> Most calculus students glaze over the notation for higher derivatives, and few, if any, books bother to give any reasons behind what the notation means. It's important to go back and consider why the notation is what it is, and what the pieces are supposed to represent.

> In modern calculus, the derivative is always taken with respect to some variable. However, this is not strictly required, as the differential operation can be used in a context-free manner. The processes of taking a differential and solving for a derivative (i.e., some ratio of differentials) can be separated out into logically separate operations.

> In such an operation, instead of doing d/dx (taking the derivative with respect to the variable x), one would separate out performing the differential and dividing by dx as separate steps. Originally, in the Leibnizian conception of the differential, one did not even bother solving for derivatives, as they made little sense from the original geometric construction of them.

> For a simple example, the differential of x^3 can be found using a basic differential operator such that d(x^3) = 3x^2 dx. The derivative is simply the differential divided by dx. This would yield d(x^3)/dx = 3x^2.
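To make that last step concrete (my own sketch, not something from the paper), here is roughly how the "take a differential, then divide by dx" workflow looks in sympy if you cheat and treat dx as an ordinary symbol. Sympy itself only exposes derivatives through diff, so the differential step here is built from the chain rule:

```python
# Sketch: mimic the two-step "differential first, then divide by dx" process.
# Assumption: dx is just a plain symbol standing in for the differential of x.
import sympy as sp

x = sp.Symbol('x')
dx = sp.Symbol('dx')

expr = x**3

# Step 1: take the differential. By the chain rule, d(f(x)) = f'(x) * dx.
differential = sp.diff(expr, x) * dx
print(differential)   # 3*dx*x**2

# Step 2: solve for the derivative by dividing through by dx.
derivative = differential / dx
print(derivative)     # 3*x**2
```

The division by dx is almost a no-op; all of the work happens when you take the differential, which is the paper's point about the two operations being logically separate.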

Is this significantly different from what is normally taught in schools? Using the notation described here is the first time calculus has felt like a tool rather than a formality to me, and it's quite different from the way I was thinking while taking calculus in high school. I'm not really sure how unusual this notation is, though.

[0]: https://arxiv.org/pdf/1801.09553.pdf

Thanks for that reference. Very useful.