Linear Transformations

Definition

A linear transformation $T: V \rightarrow W$ between vector spaces $V$ and $W$ is a function that preserves vector addition and scalar multiplication:

  1. $T(u + v) = T(u) + T(v)$ for all $u, v \in V$ (preserves addition)
  2. $T(c \cdot v) = c \cdot T(v)$ for all $v \in V$ and scalars $c$ (preserves scalar multiplication)

Examples

Rotation in $\mathbb{R}^2$

A rotation by angle $\theta$ in the plane is a linear transformation:

$$T\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}$$
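The rotation matrix above can be checked numerically. A minimal sketch with numpy (the function name `rotation` is illustrative, not from the original):

```python
import numpy as np

def rotation(theta):
    """Return the 2x2 matrix that rotates the plane by angle theta."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Rotating (1, 0) by 90 degrees sends it to (0, 1).
R = rotation(np.pi / 2)
v = np.array([1.0, 0.0])
print(np.round(R @ v, 10))   # [0. 1.]

# Linearity check: R(u + w) = R(u) + R(w).
u, w = np.array([1.0, 2.0]), np.array([-3.0, 0.5])
assert np.allclose(R @ (u + w), R @ u + R @ w)
```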

Differentiation

The differentiation operator $D: P_n \rightarrow P_{n-1}$ defined by $D(p(x)) = p'(x)$ is a linear transformation on the space of polynomials of degree at most $n$.
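Linearity of differentiation is easy to see concretely if a polynomial is stored as its coefficient list. A small sketch (the representation and the name `diff_poly` are illustrative assumptions):

```python
def diff_poly(coeffs):
    """Differentiate p(x) = a0 + a1*x + ... + an*x^n, given as the
    coefficient list [a0, a1, ..., an]; returns the coefficients of p'."""
    return [k * coeffs[k] for k in range(1, len(coeffs))]

# p(x) = 3 + 2x + 5x^2  ->  p'(x) = 2 + 10x
print(diff_poly([3, 2, 5]))   # [2, 10]

# Linearity: D(p + q) = D(p) + D(q), checked coefficient-wise.
p, q = [1, 4, 0, 2], [0, 1, 3, 5]
sum_pq = [a + b for a, b in zip(p, q)]
assert diff_poly(sum_pq) == [a + b for a, b in zip(diff_poly(p), diff_poly(q))]
```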

Projection

The projection of a vector onto a subspace is a linear transformation.

Matrix Representation

Every linear transformation $T: V \rightarrow W$ between finite-dimensional vector spaces can be represented by a matrix with respect to chosen bases.

If $\{v_1, v_2, \ldots, v_n\}$ is a basis for $V$ and $\{w_1, w_2, \ldots, w_m\}$ is a basis for $W$, then:

  1. Compute $T(v_j)$ for each basis vector $v_j$
  2. Express each $T(v_j)$ as a linear combination of the basis vectors of $W$:
$$T(v_j) = a_{1j}w_1 + a_{2j}w_2 + \cdots + a_{mj}w_m$$
  3. The matrix representation of $T$ is the $m \times n$ matrix:
$$[T] = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$$
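The steps above can be sketched in code: column $j$ of $[T]$ holds the coordinates of $T(v_j)$ in the basis of $W$. This sketch assumes $W$ is coordinatized as $\mathbb{R}^m$ with basis vectors given as numpy arrays; the names `matrix_of`, `T`, `basis_V`, `basis_W` are illustrative:

```python
import numpy as np

def matrix_of(T, basis_V, basis_W):
    """Build [T]: column j is the coordinate vector of T(v_j) in basis_W,
    obtained by solving B_W @ x = T(v_j) where B_W has the basis as columns."""
    B_W = np.column_stack(basis_W)
    cols = [np.linalg.solve(B_W, T(v)) for v in basis_V]
    return np.column_stack(cols)

# Example: T(x, y) = (x + y, y) with the standard basis on both sides.
T = lambda v: np.array([v[0] + v[1], v[1]])
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
M = matrix_of(T, [e1, e2], [e1, e2])
print(M)   # columns are T(e1) = (1, 0) and T(e2) = (1, 1)
```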

Properties

Kernel and Image

  • The kernel (or null space) of $T$, denoted $\ker(T)$ or $\text{Null}(T)$, is the set of all vectors in $V$ that map to the zero vector in $W$:
$$\ker(T) = \{v \in V : T(v) = 0\}$$
  • The image (or range) of $T$, denoted $\text{Im}(T)$ or $\text{Range}(T)$, is the set of all vectors in $W$ that are the image of some vector in $V$:
$$\text{Im}(T) = \{T(v) : v \in V\}$$

Rank and Nullity

  • The rank of $T$ is the dimension of its image: $\text{rank}(T) = \dim(\text{Im}(T))$
  • The nullity of $T$ is the dimension of its kernel: $\text{nullity}(T) = \dim(\ker(T))$

Rank-Nullity Theorem

For a linear transformation $T: V \rightarrow W$ where $V$ is finite-dimensional:

$$\dim(V) = \text{rank}(T) + \text{nullity}(T)$$

This fundamental theorem connects the dimension of the domain, the dimension of the image, and the dimension of the kernel.
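The theorem can be verified numerically for a matrix map by computing the rank and the kernel dimension independently. A sketch using numpy (the example matrix and the tolerance are illustrative choices; the kernel dimension is read off from the near-zero singular values):

```python
import numpy as np

# A rank-1 map from R^3 to R^2 (the second row is twice the first).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

rank = np.linalg.matrix_rank(A)

# Kernel dimension, computed directly: right-singular vectors whose
# singular value is (near) zero span the null space of A.
_, s, _ = np.linalg.svd(A)
s = np.concatenate([s, np.zeros(A.shape[1] - len(s))])  # implicit zeros
nullity = int(np.sum(s < 1e-10))

print(rank, nullity)                  # 1 2
assert rank + nullity == A.shape[1]   # dim of the domain R^3
```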

Composition of Linear Transformations

If $S: U \rightarrow V$ and $T: V \rightarrow W$ are linear transformations, then their composition $T \circ S: U \rightarrow W$ defined by $(T \circ S)(u) = T(S(u))$ is also a linear transformation.

In terms of matrix representations, if $[S]$ is the matrix of $S$ and $[T]$ is the matrix of $T$, then the matrix of $T \circ S$ is the product $[T][S]$.
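A quick numerical illustration of this correspondence, with two illustrative maps on $\mathbb{R}^2$:

```python
import numpy as np

# S scales by 2; T rotates by 90 degrees (both on R^2).
S = np.array([[2.0, 0.0],
              [0.0, 2.0]])
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])

u = np.array([1.0, 3.0])

# Applying S first and then T agrees with multiplying by the product [T][S].
assert np.allclose(T @ (S @ u), (T @ S) @ u)
print(T @ S)   # the matrix of the composition (rotate after scale)
```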

Invertible Linear Transformations

A linear transformation $T: V \rightarrow W$ is invertible if there exists a linear transformation $T^{-1}: W \rightarrow V$ such that $T^{-1} \circ T = I_V$ and $T \circ T^{-1} = I_W$, where $I_V$ and $I_W$ are the identity transformations on $V$ and $W$ respectively.

A linear transformation is invertible if and only if:

  1. It is injective (one-to-one): $\ker(T) = \{0\}$
  2. It is surjective (onto): $\text{Im}(T) = W$

For finite-dimensional spaces of the same dimension, injectivity and surjectivity are equivalent, so verifying either one alone suffices for invertibility.
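For a square matrix this means full rank alone certifies invertibility. A brief sketch (the shear matrix is an illustrative example):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])     # a shear of R^2, which is invertible
n = A.shape[0]

# Square case: full rank <=> trivial kernel <=> image is all of R^n.
assert np.linalg.matrix_rank(A) == n

A_inv = np.linalg.inv(A)
assert np.allclose(A_inv @ A, np.eye(n))   # T^{-1} composed with T is I
assert np.allclose(A @ A_inv, np.eye(n))   # T composed with T^{-1} is I
print(A_inv)   # the inverse shear, with entries 1, -1, 0, 1
```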

Applications

Linear transformations are fundamental in:

  • Computer graphics (rotations, scaling, projections)
  • Quantum mechanics (operators on state spaces)
  • Signal processing (Fourier transforms)
  • Machine learning (linear models, dimensionality reduction)
  • Differential equations (solving systems of linear ODEs)