GraphMath

First steps

Vectors as coordinates, matrices as instructions and the first structural ideas

How do coordinates, matrices and equations fit into one picture?

This chapter introduces the basic objects of linear algebra and connects them immediately to structure. Vectors are coordinates, matrices are instructions, and matrix-vector multiplication becomes transformation. The same viewpoint then leads to span, independence, orthogonality, the transpose, and the first encounter with Ax = b.

Key ideas

The chapter starts with notation, but the deeper point is that the same few structures reappear everywhere.

  • A vector is a list of coordinates and a matrix tells how each coordinate direction moves
  • Matrix-vector multiplication is built from scaled columns, so the columns of a matrix already encode its action
  • Linearity means addition and scaling are preserved, which is why grids become skewed but still structured
  • Span and independence describe whether vectors create new directions or repeat old ones
  • Dot product and orthogonality measure directional overlap and prepare the way for projection
  • Transpose turns rows into columns and begins the bridge between a matrix's domain, its codomain and linear equations

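The column view of matrix-vector multiplication can be checked directly. This is a minimal sketch with an arbitrary example matrix and vector, not anything specific to the chapter:

```python
import numpy as np

# Arbitrary 2x2 matrix and vector, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
x = np.array([4.0, 5.0])

# Standard matrix-vector product.
direct = A @ x

# The same product rebuilt as scaled columns: x[0]*col0 + x[1]*col1.
from_columns = x[0] * A[:, 0] + x[1] * A[:, 1]

print(direct)        # [13. 15.]
print(from_columns)  # [13. 15.]
```

Both computations give the same vector, which is the sense in which the columns of a matrix already encode its action.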
This chapter is a map of the ideas that later chapters develop in a more specialized way.
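Two more ideas from the list above, the dot product and the transpose, can also be previewed numerically. The vectors and matrix here are arbitrary examples; the second vector is built to be orthogonal to the first:

```python
import numpy as np

# Example vectors; v is constructed to be orthogonal to u.
u = np.array([3.0, 4.0])
v = np.array([4.0, -3.0])

# Dot product measures directional overlap; zero means orthogonal.
print(np.dot(u, v))  # 0.0

# Transpose turns rows into columns: row i of A.T is column i of A.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(np.array_equal(A.T[0], A[:, 0]))  # True
```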

Why does matrix-vector multiplication sit at the center of the chapter?

Because it connects coordinates, columns, transformations and equations in one definition. Once you see a matrix as acting on vectors by combining its columns, many later ideas become variations of the same structure.
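Because the product is built from scaled columns, linearity follows directly: the matrix preserves addition and scaling. A quick numerical check, with arbitrary example values:

```python
import numpy as np

# Arbitrary matrix, vectors, and scalar, chosen only for illustration.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([1.0, -1.0])
y = np.array([0.5, 2.0])
c = 3.0

# Linearity: A(c*x + y) equals c*(A x) + A y.
lhs = A @ (c * x + y)
rhs = c * (A @ x) + A @ y

print(np.allclose(lhs, rhs))  # True
```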

Why do span and independence appear so early?

Because they answer the first structural question about a set of vectors: do these vectors create new directions, or only repeat old ones? That question later controls basis, rank, solvability and the geometry of matrix action.
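One way to ask that question numerically is through the rank of the matrix whose columns are the given vectors. In this sketch the third vector is deliberately built as a sum of the first two, so the set is dependent:

```python
import numpy as np

# Example vectors: v3 = v1 + v2 by construction, so no new direction.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2

# Stack the vectors as columns; the rank counts genuinely new directions.
V = np.column_stack([v1, v2, v3])
rank = np.linalg.matrix_rank(V)

print(rank)                 # 2: only two independent directions
print(rank == V.shape[1])   # False: the set is dependent
```

A rank equal to the number of columns would mean every vector contributes a new direction; a smaller rank means at least one vector repeats old ones.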
