Linear algebra provides the language to describe geometric change through arrays of numbers. The definitions below use standard notation so you can immediately match them to the formulas you see in lectures, papers, and code.

Formal Definitions

  • Matrix: A matrix $A \in \mathbb{R}^{m \times n}$ is a rectangular array $A = [a_{ij}]$ with $m$ rows and $n$ columns. It represents a linear map $A : \mathbb{R}^n \to \mathbb{R}^m$ via matrix-vector multiplication that applies the transformation to column vectors.
  • Vector: A vector $v \in \mathbb{R}^n$ is written as a column \(v = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}.\) Vectors belong to a vector space where addition $u+v$ and scalar multiplication $c v$ are defined component-wise.
  • Linear Transformation: A function $T : \mathbb{R}^n \to \mathbb{R}^m$ is linear if it respects addition and scalar multiplication: $T(u + v) = T(u) + T(v)$ and $T(c v) = c T(v)$. After choosing bases, such a transformation is realized as $T(v) = A v$ for some matrix $A$.
  • Eigenvectors and Eigenvalues: For a square matrix $A \in \mathbb{R}^{n \times n}$, an eigenvector $v \neq 0$ satisfies $A v = \lambda v$, where $\lambda \in \mathbb{R}$ (or $\mathbb{C}$) is the eigenvalue. Eigenvectors point in invariant directions for $A$, and the eigenvalue records how the vector scales.
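The eigenvalue relation $A v = \lambda v$ can be checked numerically. The sketch below uses a hypothetical diagonal matrix, chosen because its invariant directions are the coordinate axes, making the result easy to verify by eye:

```python
import numpy as np

# A diagonal matrix scales each axis independently, so its
# eigenvectors are the standard basis vectors. The matrix itself
# is an illustrative choice, not taken from the text.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A v = lambda v for each eigenpair returned by eig.
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]       # i-th eigenvector (a column)
    lam = eigenvalues[i]         # matching eigenvalue
    assert np.allclose(A @ v, lam * v)
```

Note that `np.linalg.eig` returns eigenvectors as the *columns* of its second output, so each invariant direction is read off column-wise.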

Operations on Matrices and Vectors

  • Vector addition: Given $u, v \in \mathbb{R}^n$, their sum is \(u + v = \begin{bmatrix} u_1 + v_1 \\ \vdots \\ u_n + v_n \end{bmatrix}.\)

  • Scalar multiplication: Scaling a vector by $c \in \mathbb{R}$ gives \(c v = \begin{bmatrix} c v_1 \\ \vdots \\ c v_n \end{bmatrix},\) which stretches or shrinks $v$ along the same line through the origin; a negative $c$ reverses its direction, and $c = 0$ collapses it to the zero vector.

  • Matrix-vector multiplication: For $A \in \mathbb{R}^{m \times n}$ and $v \in \mathbb{R}^n$, \(A v = \begin{bmatrix} a_{1,1} & \cdots & a_{1,n} \\ \vdots & \ddots & \vdots \\ a_{m,1} & \cdots & a_{m,n} \end{bmatrix} \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix}.\) Each entry of the resulting vector is the dot product of a row of $A$ with $v$.

  • Matrix addition: If $A, B \in \mathbb{R}^{m \times n}$, then \(A + B = [a_{ij} + b_{ij}],\) which corresponds to the sum of the associated linear maps.

  • Matrix multiplication: When $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$, \((AB)_{ik} = \sum_{j=1}^{n} a_{ij} b_{jk}.\) The product $AB$ describes the composite map that first sends a vector through $B$, then through $A$.

  • Transpose: The transpose of $A$ is $A^\top$ with entries \((A^\top)_{ij} = a_{ji}\). Transposition turns rows into columns and is essential when forming quadratic forms $v^\top A v$.

  • Dot product: Given $u, v \in \mathbb{R}^n$, \(u \cdot v = \sum_{i=1}^{n} u_i v_i.\) The dot product produces a scalar that measures alignment and defines the squared norm $\|v\|^2 = v \cdot v$.
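The operations above can be exercised on small concrete instances. The numbers below are illustrative choices, not values from the text; each assertion restates one of the formulas:

```python
import numpy as np

# Two vectors in R^2 and a scalar (illustrative values).
u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
c = 2.0

# Vector addition and scalar multiplication are component-wise.
assert np.array_equal(u + v, np.array([4.0, 6.0]))
assert np.array_equal(c * v, np.array([6.0, 8.0]))

A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Matrix-vector multiplication: each output entry is a row of A
# dotted with v.
assert np.array_equal(A @ v, np.array([3.0, 8.0]))

# Matrix multiplication composes maps: (AB)v = A(Bv).
assert np.allclose((A @ B) @ v, A @ (B @ v))

# Transpose swaps rows and columns; a diagonal matrix is symmetric.
assert np.array_equal(A.T, A)

# Dot product and squared norm.
assert u @ v == 1.0 * 3.0 + 2.0 * 4.0   # 11.0
assert v @ v == 25.0                    # ||v||^2
```

The composition identity `(A @ B) @ v == A @ (B @ v)` is exactly the statement that the product $AB$ applies $B$ first and then $A$.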

The formulas above give a compact, symbolic view of how matrices and vectors interact. Once you see these expressions, it becomes much easier to trace the flow of data through systems of equations, transformations in computer graphics, or algorithms in data science.