There are a lot more ways to look at and understand these mysterious beasts called matrices. They seem to represent a more fundamental, primordial truth; I'm not sure what it is. The determinant of a matrix indicates the area or volume spanned by its component vectors. The complex matrices used in the Fourier transform are beautiful. Quantum mechanics and AI seem to be built on matrices. There is hardly any area of mathematics that doesn't use matrices as tools. What exactly is a matrix? Just a grid of numbers? I don't think so.
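As a quick numeric check of the determinant-as-area claim, here is a minimal numpy sketch (the two column vectors are just made-up examples):

```python
import numpy as np

# Two column vectors spanning a parallelogram in the plane.
a = np.array([3.0, 1.0])
b = np.array([1.0, 2.0])

M = np.column_stack([a, b])  # matrix whose columns are a and b

# The determinant is the signed area of the parallelogram spanned by a and b.
print(np.linalg.det(M))          # 3*2 - 1*1 = 5.0

# Cross-check with the 2D cross product, which gives the same signed area.
print(a[0] * b[1] - a[1] * b[0])  # 5.0
```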
The fundamental truth is that matrices represent linear transformations, and all of linear algebra is developed in terms of linear transformations rather than grids of numbers. It all becomes much clearer when you let go of the tabular representation and study the original intentions that motivated the operations you perform on matrices.
My appreciation for the subject grew considerably after working through "Linear Algebra Done Right" by Axler: https://linear.axler.net
Spatial transformations? Take a look at the complex matrices in Fourier transforms, with nth roots of unity as their elements. The values are cyclic and do not represent points in an n-dimensional space of Euclidean coordinates.
Yes; I wrote "linear transformation" on purpose, so as not to remain constrained to spatial or geometric interpretations.
The (discrete) Fourier transform is also a linear transformation, which is why the initial effort of thinking abstractly in terms of vector spaces and transformations between them pays lots of dividends when it's time to understand more advanced topics such as the DFT, which is "just" a change of basis.
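To make that concrete, here is a minimal numpy sketch (the size n=4 and input vector are arbitrary) that builds the DFT matrix from powers of an nth root of unity and checks that it agrees with the library FFT:

```python
import numpy as np

n = 4
# Primitive nth root of unity; the DFT matrix entries are its powers.
omega = np.exp(-2j * np.pi / n)
F = omega ** np.outer(np.arange(n), np.arange(n))  # F[j, k] = omega^(j*k)

x = np.array([1.0, 2.0, 3.0, 4.0])

# The DFT of x is just the matrix-vector product F @ x:
# a linear transformation, i.e. a change of basis.
print(np.allclose(F @ x, np.fft.fft(x)))  # True
```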
>[Matrices] seem to represent a more fundamental primordial truth.
No, matrices (or more specifically matrix multiplication) are a useful result picked out of a huge search space defined as "all the ways to combine piles of numbers with arithmetic operators". The utility of the discovery is determined by humans looking for compact ways to represent ideas (abstraction). One of the most interesting anecdotes in the history of linear algebra is how Hamilton finally "discovered" a way to multiply quaternions: "...he was out walking along the Royal Canal in Dublin with his wife when the solution in the form of the equation i^2 = j^2 = k^2 = ijk = −1 occurred to him; Hamilton then carved this equation using his penknife into the side of the nearby Broom Bridge" [0]
The "primordial truth" is found in the selection criteria of the human minds performing the search.
[0] https://en.wikipedia.org/wiki/William_Rowan_Hamilton
A matrix is just a grid of numbers.
A lot of areas use grids of numbers, and matrix theory actually incorporates every area that uses grids of numbers, along with every rule in those areas.
Matrix multiplication, the simplest difficult thing in matrix theory, is an example of this IMO. It looks really weird in the context of grids of numbers, its properties seem incidental, and the proofs are complicated. But matrix multiplication is really simple and natural in the context of linear transformations between vector spaces: it is just composition of maps.
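A small numpy sketch of that point (the particular maps, a rotation and a scaling, are just illustrative choices): composing two linear maps as functions and multiplying their matrices give the same result, which is exactly why the multiplication rule looks the way it does.

```python
import numpy as np

theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)],   # rotate by 30 degrees
              [np.sin(theta),  np.cos(theta)]])
S = np.diag([2.0, 0.5])                          # scale x by 2, y by 0.5

v = np.array([1.0, 1.0])

# Applying S, then R, as functions...
composed = R @ (S @ v)
# ...is the same as applying the single matrix R @ S.
print(np.allclose(composed, (R @ S) @ v))  # True
```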
This is the most important part.
"...linear transformations between vector spaces."
When you understand what that implies you can start reasoning about it visually.
The 3 simplest transformations (the kind you can find in Blender or any other 3D program, or even partly in 2D programs) are listed below; a short code sketch follows the list.
Translation (moving something left, right, up, down, in, out; strictly speaking this is affine rather than linear, but it becomes a matrix multiply in homogeneous coordinates, which is how 3D programs handle it).
Rotation (turn something 2 degrees, 90 degrees, 180 degrees, 360 degrees back to the same heading)
Scaling (make something larger, smaller, etc)
(And a few more that don't help right now.)
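Here is the promised sketch (numpy, with arbitrary example values). Rotation and scaling are plain 2x2 matrices; translation needs the homogeneous-coordinate trick mentioned above to become a matrix multiply:

```python
import numpy as np

theta = np.deg2rad(90)
rotate = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
scale = np.diag([2.0, 2.0])          # uniform scale by 2

p = np.array([1.0, 0.0])
print(rotate @ p)   # ~[0, 1]: turned 90 degrees
print(scale @ p)    # [2, 0]: made larger

# Translation is affine, not linear, so it gets one extra coordinate:
# a 3x3 matrix acting on (x, y, 1).
translate = np.array([[1.0, 0.0, 3.0],   # move 3 right
                      [0.0, 1.0, 4.0],   # move 4 up
                      [0.0, 0.0, 1.0]])
print(translate @ np.array([1.0, 0.0, 1.0]))  # [4, 4, 1]
```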
The first two can be visualized simply in 2D: just take a paper/book/etc. and move it left-right or up-down, or rotate it. The book in its original position and rotation, compared to the new position and rotation, can be described as a vector space transformation. Why?
Because you can look at it in two ways: either the book moved from your vantage point, or you follow the book, looking at it the same way, and the world around the book moved.
In both cases, something moved from one space (point of reference) to another "space".
The thing that defines the space is a set of "basis vectors": basically, they say what is "up", what is "left" and what is "in" as we move from one space to another.
Think of it like this: you have a card on a piece of paper. Draw a line/axis along the bottom edge as the X axis, then draw the Y axis upwards along the left side. In this X,Y space (the "from space") you count the X and Y steps to various feature points of the card.
Now draw the "to space" as another X axis and another Y axis (it could be rotated, scaled, or just moved), take the counts in steps, and plot them inside the "to space", measured in the same units as in the from space.
Once the feature points are replicated in the "to space" you should have the same image as before, just within the new space.
This is the essence of a so-called linear (equal number of steps) transformation (moved somewhere else), and also exactly what multiplying a set of vectors by a matrix achieves. (Simplified, in this context: the matrix really is mostly a collection of the above-mentioned basis vectors, one per column, that define the X, Y, etc. of the movement.)
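A minimal numpy sketch of that paper exercise (the basis vectors and feature points are made up): the "to space" is described by a matrix whose columns are its basis vectors, and mapping each feature point is just a matrix multiply.

```python
import numpy as np

# Feature points of the card, counted in the "from space" (x, y steps).
points = np.array([[0.0, 0.0],
                   [2.0, 0.0],
                   [2.0, 1.0],
                   [0.0, 1.0]]).T   # one point per column

# The "to space": axes rotated 45 degrees and scaled by sqrt(2),
# giving basis vectors (1, 1) and (-1, 1). They become the matrix columns.
B = np.array([[1.0, -1.0],
              [1.0,  1.0]])

# Replaying the same step counts along the new basis vectors:
print(B @ points)
```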
The set of all matrices of a fixed size is a vector space, because matrix addition and scalar multiplication are well-defined and satisfy all the vector space axioms.
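A couple of those axioms, spot-checked numerically (just a sketch; random matrices stand in for arbitrary ones):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.standard_normal((2, 3, 3))   # two arbitrary 3x3 matrices
c = 2.5

# Commutativity of addition and distributivity of scalar multiplication,
# two of the vector space axioms, hold entrywise:
print(np.allclose(A + B, B + A))              # True
print(np.allclose(c * (A + B), c*A + c*B))    # True
```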
But be careful of the map–territory relation.
If you can find a model that is a vector space, which you can extend to an inner product space and then extend to a Hilbert space, nice things happen.
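For matrices specifically, one such extension is the Frobenius inner product (a sketch; real-valued matrices for simplicity):

```python
import numpy as np

def frobenius_inner(A, B):
    # <A, B> = sum of entrywise products = trace(A^T B);
    # this makes the n x n matrices an inner product space.
    return np.trace(A.T @ B)

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(frobenius_inner(A, A))           # 30.0, the squared Frobenius norm
print(np.linalg.norm(A, 'fro') ** 2)   # 30.0, matches
```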
Really the amazing part is finding a map (model) that works within the superpowers of algorithms, which often depends on finding many-to-one reductions.
Get stuck with a hay-in-a-haystack problem and math as we know it now can be intractable.
Vector spaces are nice, and you can map them to abstract algebra, categories, or toposes and see why.
I encourage you to dig into the above.
Take linear algebra
OK, now what?
A matrix is just a list of where a linear map sends each basis element (the nth column of a matrix is the output vector for the nth input basis vector). Lots of things are linear (e.g. scaling, rotating, differentiating, integrating, projecting, and any weighted sums of these). Lots of other things are approximately linear locally: the derivative, if it exists, is the best linear approximation, i.e. the best matrix approximating a more general function near a point. And knowing the linear behavior near a fixed point can tell you a lot about even nonlinear systems.
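To see a non-geometric example, here is differentiation on cubic polynomials written as a matrix (a sketch; coefficients are taken in the basis 1, x, x^2, x^3): each column is where the map sends one basis element.

```python
import numpy as np

# d/dx on polynomials of degree <= 3, in the basis {1, x, x^2, x^3}.
# Column k holds the coefficients of d/dx (x^k):
# d/dx 1 = 0, d/dx x = 1, d/dx x^2 = 2x, d/dx x^3 = 3x^2.
D = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0, 0.0]])

p = np.array([5.0, 0.0, 1.0, 2.0])   # 5 + x^2 + 2x^3
print(D @ p)                          # [0, 2, 6, 0], i.e. 2x + 6x^2
```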
Yes, I think of them as saying "and this is what the coordinates in our coordinate system [basis] shall mean from now on". Systems of nonlinear equations, on the other hand, are some kind of sea monsters.