
Inner Product Spaces

Up to this point, our vector spaces have been “bare.” We can add vectors and scale them, but we have no notion of how long a vector is, or what the angle between two vectors might be. In bare linear algebra, there is no “perpendicular.”

Inner product spaces equip vector spaces with this geometric structure. This allows us to define limits, identify the “closest” approximation to a signal, and decompose data into independent components.

Geometry from Algebra

An inner product is a function that takes two vectors and returns a scalar. For $\mathbb{R}^n$, the standard inner product is the dot product $u \cdot v = \sum_i u_i v_i$. To generalize this to any vector space $V$ over $\mathbb{R}$ or $\mathbb{C}$, we define an inner product $\langle u, v \rangle$ as a map satisfying three axioms (checked numerically in the sketch after the list):

  1. Conjugate Symmetry: $\langle u, v \rangle = \overline{\langle v, u \rangle}$.
  2. Linearity in the first argument: $\langle \alpha u + v, w \rangle = \alpha \langle u, w \rangle + \langle v, w \rangle$.
  3. Positive Definiteness: $\langle v, v \rangle \geq 0$, and $\langle v, v \rangle = 0 \iff v = \mathbf{0}$.
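These axioms are easy to sanity-check numerically for the standard inner product on $\mathbb{C}^n$. A minimal sketch, assuming NumPy; random samples illustrate the axioms rather than prove them:

```python
import numpy as np

def inner(a, b):
    # Standard inner product on C^n, linear in the first argument:
    # <a, b> = sum_i a_i * conj(b_i)
    return np.sum(a * np.conj(b))

rng = np.random.default_rng(0)
u, v, w = (rng.standard_normal(4) + 1j * rng.standard_normal(4) for _ in range(3))
alpha = 2.0 - 3.0j

assert np.isclose(inner(u, v), np.conj(inner(v, u)))            # conjugate symmetry
assert np.isclose(inner(alpha * u + v, w),
                  alpha * inner(u, w) + inner(v, w))            # linearity
assert inner(v, v).real > 0 and abs(inner(v, v).imag) < 1e-12   # positive definiteness
print("all three axioms hold on these samples")
```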

Defining Length and Distance

The norm (length) of a vector is defined as $\|v\| = \sqrt{\langle v, v \rangle}$; for the dot product on $\mathbb{R}^n$ this recovers the familiar Pythagorean length. The distance between two points is then simply $d(u, v) = \|u - v\|$.
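Both definitions are one-liners once an inner product is fixed. A quick sketch with the dot product on $\mathbb{R}^n$ (NumPy assumed):

```python
import numpy as np

def norm(v):
    return np.sqrt(np.dot(v, v))   # ||v|| = sqrt(<v, v>)

def dist(u, v):
    return norm(u - v)             # d(u, v) = ||u - v||

u, v = np.array([3.0, 4.0]), np.array([0.0, 0.0])
print(norm(u), dist(u, v))         # 5.0 5.0 -- the 3-4-5 right triangle
```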

The Cauchy-Schwarz Inequality

One of the most powerful results in mathematics is the Cauchy-Schwarz inequality:

$$|\langle u, v \rangle| \leq \|u\| \, \|v\|.$$

In a real space, this ensures that the ratio $\frac{\langle u, v \rangle}{\|u\| \, \|v\|}$ always lies between $-1$ and $1$, so the "angle" $\theta$ defined by $\cos \theta = \frac{\langle u, v \rangle}{\|u\| \, \|v\|}$ is well defined.
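The inequality is easy to probe numerically. A sketch (NumPy assumed) that also recovers a familiar angle from the cosine formula:

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    u, v = rng.standard_normal(5), rng.standard_normal(5)
    # |<u, v>| <= ||u|| ||v||, with a tiny tolerance for floating point
    assert abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v) + 1e-12

u, v = np.array([1.0, 0.0]), np.array([1.0, 1.0])
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
print(np.degrees(np.arccos(cos_theta)))   # ~45.0 degrees
```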

Orthogonality: The Power of Perpendiculars

Two vectors are orthogonal if $\langle u, v \rangle = 0$. In terms of data, orthogonal vectors are completely "unrelated" or "uncorrelated" in the geometry defined by that inner product.

A set of vectors is orthonormal if they are all orthogonal to each other and all have length 1. Working with an orthonormal basis $\{e_1, \dots, e_n\}$ is trivial compared to a general basis, because the coefficients of any vector $v$ are just the inner products:

$$v = \langle v, e_1 \rangle e_1 + \dots + \langle v, e_n \rangle e_n.$$
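That coordinate formula means no linear system ever needs solving. A sketch in $\mathbb{R}^2$ with the standard basis rotated by 45 degrees (NumPy assumed):

```python
import numpy as np

# An orthonormal basis of R^2: the standard basis rotated by 45 degrees.
e1 = np.array([1.0, 1.0]) / np.sqrt(2)
e2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([2.0, 3.0])
c1, c2 = v @ e1, v @ e2                    # c_i = <v, e_i>
print(np.allclose(c1 * e1 + c2 * e2, v))   # True: v = <v,e1>e1 + <v,e2>e2
```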

Gram-Schmidt: Generating Order from Chaos

The Gram-Schmidt process is a recipe for turning any basis into an orthonormal one. It keeps the first vector (scaled to unit length), then takes the second and subtracts the part that "leaks" into the first, normalizes the result, and so on, as implemented in the sketch below.

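A minimal sketch of the classical process, assuming NumPy and a linearly independent input list:

```python
import numpy as np

def gram_schmidt(basis):
    """Classical Gram-Schmidt: orthonormalize a linearly independent list."""
    ortho = []
    for v in basis:
        for e in ortho:
            v = v - (v @ e) * e              # subtract the part that "leaks" into e
        ortho.append(v / np.linalg.norm(v))  # normalize to unit length
    return ortho

basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
E = gram_schmidt(basis)
gram = np.array([[a @ b for b in E] for a in E])
print(np.allclose(gram, np.eye(3)))          # True: the result is orthonormal
```

In floating-point arithmetic the classical variant can slowly lose orthogonality; the modified variant, which re-projects each remaining vector as soon as a new $e$ is produced, is numerically safer (see the exercises).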

Best Approximations and Projections

The most significant application of inner products is the Orthogonal Projection. Given a subspace $W$ and a vector $v$ not in $W$, the "closest" vector in $W$ to $v$ is the projection $P_W(v)$.

This is the engine behind Least Squares Regression. If a system $Ax = b$ has no solution, we look for the $x$ that minimizes the error $\|Ax - b\|$. This happens exactly when $Ax$ is the projection of $b$ onto the column space of $A$.
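A sketch (NumPy assumed): fit a line to four points that no single line passes through by solving the normal equations $A^{\mathsf T} A x = A^{\mathsf T} b$, then check that the leftover error is orthogonal to the column space:

```python
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([0.1, 0.9, 2.2, 2.8])
A = np.column_stack([np.ones_like(t), t])   # columns span W = col(A)

x = np.linalg.solve(A.T @ A, A.T @ b)       # normal equations: A^T A x = A^T b
proj = A @ x                                # P_W(b), the closest point in W to b

print(x)                                    # fitted intercept and slope
print(A.T @ (b - proj))                     # ~[0, 0]: error is orthogonal to W
```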

Fun Application: Fourier Series

Calculus students often see Fourier series as a specialized topic. In reality, it is just linear algebra on $L^2$, an infinite-dimensional inner product space of functions with $\langle f, g \rangle = \int f(x) g(x)\,dx$. On $[-\pi, \pi]$, the functions $\sin(nx)$ and $\cos(nx)$ form an orthogonal basis. Calculating Fourier coefficients is exactly the same as calculating coordinates in $\mathbb{R}^n$ using dot products!
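A sketch (NumPy assumed): approximate the function inner product $\langle f, g \rangle = \int_{-\pi}^{\pi} f(x) g(x)\,dx$ by a Riemann sum over samples, and read off Fourier sine coefficients exactly as coordinates:

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 20001)
dx = x[1] - x[0]

def inner(f_vals, g_vals):
    return np.sum(f_vals * g_vals) * dx   # <f, g> ~ Riemann sum of f*g

f = np.sign(np.sin(x))                    # a square wave on [-pi, pi]
for n in (1, 2, 3):
    s = np.sin(n * x)
    b_n = inner(f, s) / inner(s, s)       # coefficient = <f, s> / <s, s>
    print(n, round(b_n, 4))               # ~4/(n*pi) for odd n, ~0 for even n
```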

Exercises

If two vectors $u$ and $v$ are orthogonal, what is $\|u + v\|^2$?

Why might the classical Gram-Schmidt process lose orthogonality in finite-precision (floating-point) computation?

What is the definition of a unitary matrix $U$?