This pairs really well with the little book https://github.com/little-book-of/linear-algebra, I have recently updated it with more content and clearer explanations, so if you're diving into this topic, you might find it helpful.
pairs well? Isn't it the same book?
Based on the URL correlation and content, it sure appears to be the same book.
By pairing, I mean that you can read the book alongside the notebook. Sometimes the notebook doesn't explain the concepts and only shows some Python code.
Thank you for creating and open sourcing this. I have a question if I may.
If a person has the goal of getting into a field like computer vision or machine learning, would they be able to build useful things right away if they completed this book?
Definitely! If you scroll down a bit, you will see in Chapter 10 that I've included some fun applications, things like 2D/3D geometry, linear regression, recommender systems, and even a quick intro to PageRank. I wanted to show how these ideas connect to real-world problems, so it's not just theory. Hope you find it interesting.
Much appreciated! I'll give this a go and see how far I get.
Great work! I'll take a look over the weekend.
Here is the Google Colab link for you https://colab.research.google.com/github/little-book-of/line...
>the core ideas of linear algebra
I think it is actually delusional to say that a book which does not properly define what a vector is, contains the "core ideas" of linear algebra.
Linear algebra is so much more than lines in R^n. It is a powerful theory because it is abstract.
if you know how to handle real or complex coordinate vectors and matrices, you're only one isomorphism away (aka a choice of basis) from dealing with an "abstract" vector space (except if you want to talk about finite fields or infinite dimensions). it seems like a really good starting point for many learners' backgrounds…
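a minimal NumPy sketch of the "one isomorphism away" point (the basis and numbers here are just illustrative): the same abstract vector has different coordinates in different bases, and the basis matrix is the isomorphism between them.

```python
import numpy as np

# Two bases for the same 2D space: the standard basis and a skewed one.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # columns are the new basis vectors

# An "abstract" vector, written in coordinates with respect to B.
coords_in_B = np.array([2.0, 3.0])

# The isomorphism (choice of basis) maps B-coordinates to standard coordinates.
v_standard = B @ coords_in_B  # -> array([5., 3.])

# Going back is just inverting the basis matrix.
recovered = np.linalg.solve(B, v_standard)
assert np.allclose(recovered, coords_in_B)
```

same vector, two coordinate descriptions; everything you prove about the coordinates transfers across the isomorphism.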
>if you know how to handle real or complex coordinate vectors and matrices, you're only one isomorphism away (aka a choice of basis) from dealing with an "abstract" vector space
No, you aren't. How would you explain that matrices are both linear transformations and vectors? How would you explain what a dual space is? How would you understand the properties of the Fourier transform, which is a mapping between functions (themselves vectors) and is itself also a vector?
as i said, if you want infinite-dimensional things like Fourier transforms acting on function spaces, you may benefit from additional abstraction. but even those people will benefit from having learned R^n first.
i'm as much with Bourbaki as the next guy. but that's not really how most engineers learn in practice.
as for treating linear maps between finite-dimensional spaces as vectors, that's quite straightforward to do in coordinate terms.
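a quick NumPy sketch of what I mean (the matrices are arbitrary examples): a linear map on R^2 is a 2x2 matrix, and flattening it to R^4 is itself a linear isomorphism, so matrices really are vectors in coordinate terms.

```python
import numpy as np

# A linear map on R^2 is a 2x2 matrix; as a vector it's just its 4 entries.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Flattening respects addition and scalar multiplication,
# i.e. it is a linear isomorphism from 2x2 matrices onto R^4.
assert np.allclose((A + B).flatten(), A.flatten() + B.flatten())
assert np.allclose((2.5 * A).flatten(), 2.5 * A.flatten())

# The usual dot product on R^4 becomes the Frobenius inner product on matrices.
assert np.isclose(A.flatten() @ B.flatten(), np.trace(A.T @ B))
```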
again i refer you to the classics like Golub and Van Loan, which have been reprinted many times and have educated generations.
even Golub and Van Loan basically start there.
>A vector is an ordered collection of scalars, arranged either in a row or a column
Seems like a valid definition of a vector to me. What should be added or removed, in your view?
Offer a better alternative, then? E-books are handy.