Matrices

Definition

A matrix is a rectangular array of numbers, symbols, or expressions arranged in rows and columns. Formally, an $m \times n$ matrix $A$ has $m$ rows and $n$ columns:

$$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$$

Augmented Matrices as Systems of Linear Equations

Matrix Representation

A system of linear equations $Ax = b$ can be represented by the augmented matrix $[A \mid b]$, formed by appending the column vector $b$ of constants to the coefficient matrix $A$. Elementary row operations on $[A \mid b]$ correspond exactly to operations on the equations of the system.
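
As a small sketch (the example system and values are mine, not from the source), here is a system of two equations written as an augmented matrix using plain Python lists of lists:

```python
# Represent the system
#   x + 2y = 5
#   3x + 4y = 6
# as the augmented matrix [A | b].
A = [[1, 2],
     [3, 4]]
b = [5, 6]

# Append each constant b_i to the corresponding row of A.
augmented = [row + [bi] for row, bi in zip(A, b)]
print(augmented)  # [[1, 2, 5], [3, 4, 6]]
```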

Types of Matrices

  1. Square Matrix: A matrix with the same number of rows and columns ($m = n$)
  2. Identity Matrix: A square matrix with 1's on the main diagonal and 0's elsewhere
  3. Augmented Matrix: A matrix used to represent a system of linear equations
  4. Zero Matrix: A matrix with all elements equal to zero
  5. Diagonal Matrix: A square matrix whose off-diagonal elements are all zero
  6. Triangular Matrix:
    • Upper triangular: all elements below the main diagonal are zero
    • Lower triangular: all elements above the main diagonal are zero
  7. Symmetric Matrix: A square matrix equal to its transpose ($A = A^T$)
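
Two of the special matrices above are easy to illustrate in code. The helper names below are my own, not from the source:

```python
def identity(n):
    """n×n matrix with 1's on the main diagonal and 0's elsewhere."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def is_symmetric(A):
    """True if A equals its transpose, i.e. A[i][j] == A[j][i] for all i, j."""
    n = len(A)
    return all(A[i][j] == A[j][i] for i in range(n) for j in range(n))

print(identity(3))                        # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(is_symmetric([[1, 2], [2, 1]]))     # True
```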

Matrix Operations

Addition and Subtraction

For matrices $A$ and $B$ of the same dimensions:

$$(A \pm B)_{ij} = A_{ij} \pm B_{ij}$$
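
A minimal sketch of entry-wise addition (the helper name is mine, not from the source):

```python
def mat_add(A, B):
    """Entry-wise sum of two matrices of the same dimensions."""
    return [[a + b for a, b in zip(rowA, rowB)] for rowA, rowB in zip(A, B)]

print(mat_add([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[6, 8], [10, 12]]
```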

Scalar Multiplication

For a scalar $c$ and matrix $A$:

$$(cA)_{ij} = c \cdot A_{ij}$$
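
The same idea in code, scaling every entry by $c$ (helper name is mine):

```python
def scalar_mul(c, A):
    """Multiply every entry of A by the scalar c."""
    return [[c * a for a in row] for row in A]

print(scalar_mul(2, [[1, 2], [3, 4]]))  # [[2, 4], [6, 8]]
```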

Matrix Multiplication

For an $m \times n$ matrix $A$ and an $n \times p$ matrix $B$:

$$(AB)_{ij} = \sum_{k=1}^{n} A_{ik} \cdot B_{kj}$$
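
The summation above translates directly into a triple loop (a sketch with a helper name of my own; real libraries use far faster algorithms):

```python
def mat_mul(A, B):
    """Product of an m×n matrix A and an n×p matrix B."""
    m, n, p = len(A), len(B), len(B[0])
    # (AB)_{ij} = sum over k of A_{ik} * B_{kj}
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

print(mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```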

Transpose

The transpose of an $m \times n$ matrix $A$ is the $n \times m$ matrix $A^T$ where:

$$(A^T)_{ij} = A_{ji}$$
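
Swapping the row and column indices gives a one-line implementation (helper name is mine):

```python
def transpose(A):
    """The n×m transpose of an m×n matrix: (A^T)[i][j] == A[j][i]."""
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

print(transpose([[1, 2, 3], [4, 5, 6]]))  # [[1, 4], [2, 5], [3, 6]]
```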

Properties and Applications

Determinant

For a square matrix, the determinant is a scalar value that provides information about the matrix's invertibility and the volume scaling factor of the linear transformation it represents.
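
One way to compute a determinant is cofactor expansion along the first row. This is a sketch for illustration only (it runs in $O(n!)$ time; practical code uses LU decomposition instead), and the helper name is mine:

```python
def det(A):
    """Determinant of a square matrix by cofactor expansion along row 0."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))  # -2
```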

Inverse

A square matrix $A$ has an inverse $A^{-1}$ if $AA^{-1} = A^{-1}A = I$, where $I$ is the identity matrix. Such a matrix is called invertible (or non-singular); a square matrix is invertible if and only if its determinant is non-zero.
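
For the $2 \times 2$ case there is a closed-form inverse, $\frac{1}{ad - bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$, sketched here (helper name is mine):

```python
def inverse_2x2(A):
    """Inverse of a 2×2 matrix [[a, b], [c, d]]; requires ad - bc != 0."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return [[d / det, -b / det],
            [-c / det, a / det]]

print(inverse_2x2([[4, 7], [2, 6]]))  # [[0.6, -0.7], [-0.2, 0.4]]
```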

Rank

The rank of a matrix is the dimension of the vector space generated by its columns (or rows).
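
The rank can be computed by Gaussian elimination: reduce the matrix and count the non-zero pivot rows. A sketch under my own naming, with an ad hoc floating-point tolerance:

```python
def rank(A, tol=1e-9):
    """Rank of a matrix via forward Gaussian elimination."""
    M = [[float(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0  # next pivot row
    for c in range(cols):
        # Find a usable pivot in column c at or below row r.
        pivot = next((i for i in range(r, rows) if abs(M[i][c]) > tol), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # Eliminate column c from all rows below the pivot.
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= f * M[r][j]
        r += 1
    return r

print(rank([[1, 2], [2, 4]]))  # 1 (second row is a multiple of the first)
```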

Eigenvalues and Eigenvectors

For a square matrix $A$, a non-zero vector $v$ is an eigenvector with eigenvalue $\lambda$ if $Av = \lambda v$.
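
The defining relation is easy to check for concrete values (the matrix, vector, and eigenvalue below are my own example, not from the source):

```python
A = [[2, 1],
     [1, 2]]
v = [1, 1]   # an eigenvector of A
lam = 3      # its eigenvalue

# Compute Av and compare it with lam * v: both should be [3, 3].
Av = [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]
print(Av)                     # [3, 3]
print([lam * x for x in v])   # [3, 3]
```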

Applications

  1. Systems of Linear Equations: Matrices provide a compact way to represent and solve systems using methods such as Gaussian elimination and reduction to reduced row echelon form.

  2. Linear Transformations: Matrices represent linear transformations between vector spaces.

  3. Computer Graphics: Transformation matrices are used for rotation, scaling, and translation in 2D and 3D graphics.

  4. Data Science: Matrices are fundamental in techniques like Principal Component Analysis (PCA) and Linear Regression.

  5. Quantum Mechanics: Matrices represent observables and transformations in quantum systems.
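
The first application above, solving $Ax = b$, can be sketched as Gaussian elimination on the augmented matrix followed by back-substitution. This is a minimal illustration in my own naming, not production code (partial pivoting only, no singularity checks):

```python
def solve(A, b):
    """Solve Ax = b for square A by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Form the augmented matrix [A | b].
    M = [[float(x) for x in row] + [float(bi)] for row, bi in zip(A, b)]
    for c in range(n):
        # Swap in the row with the largest entry in column c (partial pivoting).
        p = max(range(c, n), key=lambda i: abs(M[i][c]))
        M[c], M[p] = M[p], M[c]
        # Eliminate column c from the rows below.
        for i in range(c + 1, n):
            f = M[i][c] / M[c][c]
            for j in range(c, n + 1):
                M[i][j] -= f * M[c][j]
    # Back-substitution from the last row upward.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

print([round(v, 10) for v in solve([[1, 2], [3, 4]], [5, 6])])  # ≈ [-4.0, 4.5]
```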

Computational Methods

Various algorithms exist for matrix operations, including:

  • LU Decomposition
  • QR Factorization
  • Singular Value Decomposition (SVD)
  • Eigenvalue Decomposition
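
As one example from the list, LU decomposition factors $A$ into a lower triangular $L$ and an upper triangular $U$ with $A = LU$. Below is a Doolittle-style sketch in my own naming; it omits pivoting, so it assumes the leading principal submatrices are non-singular:

```python
def lu(A):
    """Doolittle LU decomposition (no pivoting): returns L, U with A = L @ U."""
    n = len(A)
    A = [[float(x) for x in row] for row in A]
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        L[i][i] = 1.0  # unit diagonal on L
        for j in range(i, n):       # fill row i of U
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):   # fill column i of L
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U

L, U = lu([[4, 3], [6, 3]])
print(L)  # [[1.0, 0.0], [1.5, 1.0]]
print(U)  # [[4.0, 3.0], [0.0, -1.5]]
```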