Linear Dependence

Definition

A set of vectors $\{v_1, v_2, \dots, v_n\}$ from a Vector Space $\mathbb{V}$ is linearly dependent if there exist scalars $c_1, c_2, \dots, c_n$, not all zero, such that:

$$c_1v_1 + c_2v_2 + \dots + c_nv_n = \mathbf{0}$$

If no such scalars exist (or equivalently, if $c_1v_1 + c_2v_2 + \dots + c_nv_n = \mathbf{0}$ implies $c_1 = c_2 = \dots = c_n = 0$), then the vectors are linearly independent.
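For vectors in $\mathbb{R}^n$, this definition can be checked numerically: the vectors are independent exactly when the matrix they form has full row rank. A minimal sketch using NumPy (the function name is ours, not standard):

```python
import numpy as np

def is_linearly_independent(vectors):
    """Check independence of vectors in R^n via matrix rank.

    The vectors (given as rows) are linearly independent exactly when
    the matrix they form has rank equal to the number of vectors.
    """
    A = np.array(vectors, dtype=float)
    return bool(np.linalg.matrix_rank(A) == len(vectors))

print(is_linearly_independent([[1, 0], [0, 1]]))  # True
print(is_linearly_independent([[1, 0], [2, 0]]))  # False
```

Note that this is a floating-point test: for vectors with very small or nearly-parallel components, the rank computation depends on NumPy's internal tolerance.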

Visual Examples

*(Figure: linearly dependent vectors example)*

*(Figure: linearly independent vectors example)*

Examples

In $\mathbb{R}^2$

Consider the vectors:

  • $v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$
  • $v_2 = \begin{pmatrix} 2 \\ 0 \end{pmatrix}$

These are linearly dependent because $2v_1 - v_2 = \mathbf{0}$.
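The dependence relation can be verified directly: the non-trivial combination with coefficients $2$ and $-1$ evaluates to the zero vector.

```python
import numpy as np

v1 = np.array([1, 0])
v2 = np.array([2, 0])

# A non-trivial combination (coefficients 2 and -1) hits the zero vector,
# which is exactly the definition of linear dependence.
combo = 2 * v1 - v2
print(combo)  # [0 0]
```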

In Polynomial Space

The polynomials $p_1(x) = x^2 + x$ and $p_2(x) = x^2 + x - 1$ are linearly independent: for two nonzero vectors, dependence would mean one is a scalar multiple of the other, and no scalar multiple of $p_1$ equals $p_2$ (any multiple of $p_1$ has constant term $0$, while $p_2$ has constant term $-1$).
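One way to check this concretely is to represent each polynomial by its coefficient vector and reuse the rank criterion from $\mathbb{R}^n$; the basis ordering $(x^2, x, 1)$ here is our choice for illustration:

```python
import numpy as np

# Coefficient vectors in the basis (x^2, x, 1):
p1 = np.array([1, 1, 0])    # x^2 + x
p2 = np.array([1, 1, -1])   # x^2 + x - 1

# The two polynomials are independent iff their coefficient matrix has rank 2.
rank = int(np.linalg.matrix_rank(np.vstack([p1, p2])))
print(rank)  # 2, so the polynomials are independent
```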

Properties

  1. Any set of vectors containing $\mathbf{0}$ is linearly dependent
  2. Any set containing a vector that's a Linear Combination of others is dependent
  3. A single non-zero vector is always linearly independent
  4. Adding a vector to a linearly independent set might create dependence
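Property 4 can be illustrated numerically: any three vectors in $\mathbb{R}^2$ must be dependent, so extending an independent pair by one more vector creates dependence. A small sketch (the helper name is ours):

```python
import numpy as np

def rank_of(rows):
    """Rank of the matrix whose rows are the given vectors."""
    return int(np.linalg.matrix_rank(np.array(rows, dtype=float)))

base = [[1, 0], [0, 1]]      # independent: rank equals the set size (2)
extended = base + [[1, 1]]   # three vectors in R^2 cannot stay independent
print(rank_of(base))         # 2
print(rank_of(extended))     # still 2, which is < 3, so the set is dependent
```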

Why Do We Care?

Linear independence helps us:

  1. Find Bases for vector spaces
  2. Determine minimal generating sets
  3. Calculate Dimensions of spaces
  4. Solve systems of equations efficiently

Think of linear independence as a way to identify when vectors are truly "different" from each other - when none can be created from combinations of the others.

Exercise

Prove that if $\{v_1, v_2, \dots, v_n\}$ is linearly dependent, then one of the vectors can be written as a linear combination of the others.
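One possible outline (a hint, not the only proof): by dependence there is a non-trivial relation $c_1v_1 + \dots + c_nv_n = \mathbf{0}$ with some $c_k \neq 0$, and dividing through by $c_k$ isolates $v_k$:

$$v_k = -\frac{1}{c_k}\sum_{i \neq k} c_i v_i = \sum_{i \neq k}\left(-\frac{c_i}{c_k}\right) v_i$$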