>the core ideas of linear algebra

I think it is actually delusional to say that a book contains the "core ideas" of linear algebra when it does not even properly define what a vector is.

Linear algebra is so much more than lines in R^n. It is a powerful theory because it is abstract.

if you know how to handle real or complex coordinate vectors and matrices, you’re only one isomorphism away (aka a choice of basis) from dealing with an “abstract” vector space (except if you want to talk about finite fields or infinite dimensions). it seems like a really good starting point for many learners’ backgrounds…
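here’s a rough numpy sketch of the kind of thing i mean (the example and helper names are mine, purely illustrative): take polynomials of degree < 3 with the monomial basis {1, x, x^2}. once that basis is fixed, the “abstract” polynomial operations become ordinary operations on coordinate columns in R^3, and a linear map like d/dx becomes a 3x3 matrix.

  import numpy as np

  # represent a polynomial by its coefficients {power: coefficient}
  # and read off coordinates in the (chosen) monomial basis {1, x, x^2}
  def coords(poly):
      return np.array([poly.get(k, 0.0) for k in range(3)])

  p = {0: 1.0, 1: 2.0}   # 1 + 2x
  q = {1: -1.0, 2: 3.0}  # -x + 3x^2

  # adding/scaling the "abstract" polynomials is exactly adding/scaling
  # their coordinate vectors in R^3
  p_plus_2q = {k: p.get(k, 0.0) + 2.0 * q.get(k, 0.0) for k in range(3)}
  assert np.allclose(coords(p_plus_2q), coords(p) + 2.0 * coords(q))

  # differentiation d/dx is linear; in this basis it is just a 3x3 matrix
  D = np.array([[0.0, 1.0, 0.0],
                [0.0, 0.0, 2.0],
                [0.0, 0.0, 0.0]])
  # derivative of 1 + 2x is 2, i.e. coordinates [2, 0, 0]
  assert np.allclose(D @ coords(p), np.array([2.0, 0.0, 0.0]))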

>if you know how to handle real or complex coordinate vectors and matrices, you’re only one isomorphism away (aka a choice of basis) from dealing with an “abstract” vector space

No, you aren't. How would you explain that matrices are both linear transformations and vectors? How would you explain what a dual space is? How would you understand the properties of the Fourier transform, which is a mapping between functions (themselves vectors) and is itself also a vector?

as i said, if you want infinite dimensional things like Fourier transforms acting on function spaces, you may benefit from additional abstraction. but even those people will benefit from having learned R^n first.
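and to be concrete about the finite-dimensional analogue: on C^n the discrete Fourier transform is literally an n x n matrix acting on coordinate vectors, so the R^n/C^n toolkit already gets you computing with it. a rough numpy sketch, using the same unnormalized convention as np.fft.fft (the size n=8 is an arbitrary choice of mine):

  import numpy as np

  n = 8
  k = np.arange(n)
  # the DFT on C^n is just an n x n matrix of roots of unity
  F = np.exp(-2j * np.pi * np.outer(k, k) / n)

  x = np.random.rand(n)
  # multiplying by F agrees with numpy's fft (same unnormalized convention)
  assert np.allclose(F @ x, np.fft.fft(x))

  # and the transform is itself a "vector": it can be scaled and added to
  # other n x n linear maps, e.g. F + 2*np.eye(n) is another such map
  G = F + 2 * np.eye(n)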

i’m as much with bourbaki as the next guy. but that’s not really how most engineers learn in practice.

as for treating linear maps between finite dimensional spaces as vectors, that's quite straightforward to do in coordinate terms.
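something like this throwaway numpy sketch, say (the particular M, N, X are arbitrary numbers i picked; the vec/Kronecker step is the standard identity, just written for numpy's row-major flattening): 2x2 matrices form a 4-dimensional vector space once you flatten them, and a map on matrices like X -> M X N is itself just a 4x4 matrix acting on the flattened coordinates.

  import numpy as np

  # 2x2 real matrices form a 4-dimensional vector space: flatten ("vec") them
  A = np.array([[1.0, 2.0],
                [3.0, 4.0]])
  B = np.array([[0.0, -1.0],
                [1.0,  0.0]])

  # linear combinations of matrices are linear combinations of their flattenings
  assert np.allclose((3 * A - 2 * B).ravel(), 3 * A.ravel() - 2 * B.ravel())

  # a linear map on matrices, e.g. X -> M @ X @ N, is itself a 4x4 matrix
  # acting on the flattened X: with row-major ravel,
  # ravel(M @ X @ N) == kron(M, N.T) @ ravel(X)
  M = np.array([[2.0, 0.0],
                [1.0, 1.0]])
  N = np.array([[1.0, 1.0],
                [0.0, 3.0]])
  X = np.array([[5.0, 6.0],
                [7.0, 8.0]])

  big = np.kron(M, N.T)
  assert np.allclose(big @ X.ravel(), (M @ X @ N).ravel())

so "matrices are both linear maps and vectors" falls straight out of the coordinate picture, no extra machinery needed.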

again i refer you to the classics like Golub and van Loan that have been reprinted many, many times and educated generations.

even Golub and van Loan basically start there.

>A vector is an ordered collection of scalars, arranged either in a row or a column

Seems like a valid definition of a vector to me. What should be added or removed, in your view?

Offer a better alternative, then? E-books are handy.