Inner Product Spaces
Up to this point, our vector spaces have been “bare.” We can add vectors and scale them, but we have no notion of how long a vector is, or what the angle between two vectors might be. In bare linear algebra, there is no “perpendicular.”
Inner product spaces equip vector spaces with this geometric structure. This allows us to define limits, identify the “closest” approximation to a signal, and decompose data into independent components.
Geometry from Algebra
An inner product is a function that takes two vectors and returns a scalar. For $\mathbb{R}^n$, the standard inner product is the dot product $\langle u, v \rangle = u_1 v_1 + \cdots + u_n v_n$. To generalize this to any vector space $V$ over $\mathbb{R}$ or $\mathbb{C}$, we define an inner product as a map $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{F}$ satisfying:
- Conjugate Symmetry: $\langle u, v \rangle = \overline{\langle v, u \rangle}$.
- Linearity (in the first argument): $\langle \alpha u + \beta v, w \rangle = \alpha \langle u, w \rangle + \beta \langle v, w \rangle$.
- Positive Definiteness: $\langle v, v \rangle \geq 0$, and $\langle v, v \rangle = 0$ if and only if $v = 0$.
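These axioms are easy to probe numerically. The sketch below (assuming NumPy is available) checks all three properties for the standard inner product on $\mathbb{C}^n$, $\langle u, v \rangle = \sum_i u_i \overline{v_i}$; the random vectors and scalars are illustrative choices, not part of any canonical test.

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(u, v):
    # Standard inner product on C^n: linear in the first argument,
    # conjugate-linear in the second.
    return np.sum(u * np.conj(v))

# Random complex test vectors and scalars (arbitrary examples).
u = rng.standard_normal(4) + 1j * rng.standard_normal(4)
v = rng.standard_normal(4) + 1j * rng.standard_normal(4)
w = rng.standard_normal(4) + 1j * rng.standard_normal(4)
a, b = 2.0 + 1.0j, -0.5 + 3.0j

# Conjugate symmetry: <u, v> = conj(<v, u>)
print(np.isclose(inner(u, v), np.conj(inner(v, u))))

# Linearity in the first argument: <a u + b v, w> = a <u, w> + b <v, w>
print(np.isclose(inner(a * u + b * v, w), a * inner(u, w) + b * inner(v, w)))

# Positive definiteness: <v, v> is real and positive; zero only for v = 0
print(inner(v, v).real > 0, np.isclose(inner(np.zeros(4), np.zeros(4)), 0))
```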
Defining Length and Distance
The norm (length) of a vector is defined as $\|v\| = \sqrt{\langle v, v \rangle}$; in $\mathbb{R}^n$ this recovers the Pythagorean theorem. The distance between two points is then simply $d(u, v) = \|u - v\|$.
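For instance, the familiar 3-4-5 right triangle falls out immediately. A quick check (assuming NumPy; the vectors are just examples):

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([0.0, 0.0])

# Norm from the inner product: ||u|| = sqrt(<u, u>)
norm_u = np.sqrt(np.dot(u, u))
print(norm_u)                                # 5.0, the 3-4-5 triangle
print(np.isclose(norm_u, np.linalg.norm(u)))  # matches the built-in norm

# Distance as the norm of the difference: d(u, v) = ||u - v||
print(np.linalg.norm(u - v))                 # 5.0
```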
The Cauchy-Schwarz Inequality
One of the most powerful results in mathematics is the Cauchy-Schwarz inequality: $|\langle u, v \rangle| \leq \|u\| \, \|v\|$. This ensures that the cosine of the “angle” defined by $\cos \theta = \frac{\langle u, v \rangle}{\|u\| \, \|v\|}$ always lies between $-1$ and $1$ for real spaces, so the angle is well defined.
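A brief numerical illustration (assuming NumPy; the random vectors are arbitrary): the inequality holds, so the arccosine of the normalized inner product is always defined.

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

lhs = abs(np.dot(u, v))
rhs = np.linalg.norm(u) * np.linalg.norm(v)
print(lhs <= rhs)   # Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||

# The cosine is therefore a valid input to arccos (clip guards roundoff).
cos_theta = np.dot(u, v) / rhs
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
print(np.degrees(theta))   # angle between u and v, in degrees
```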
Orthogonality: The Power of Perpendiculars
Two vectors $u$ and $v$ are orthogonal if $\langle u, v \rangle = 0$. In terms of data, orthogonal vectors are completely “unrelated” or “uncorrelated” in the geometry defined by that inner product.
A set of vectors $\{e_1, \dots, e_n\}$ is orthonormal if they are all orthogonal to each other and all have length 1. Working with an orthonormal basis is trivial compared to a general basis because the coefficients of any vector are just the inner products: $v = \sum_{i=1}^{n} \langle v, e_i \rangle e_i$.
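A small sketch in $\mathbb{R}^2$ makes this concrete (assuming NumPy; the rotated basis is just an example): the coordinates come straight from dot products, with no linear system to solve.

```python
import numpy as np

# An orthonormal basis of R^2: the standard basis rotated by 45 degrees.
e1 = np.array([1.0, 1.0]) / np.sqrt(2)
e2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([2.0, 3.0])

# Coordinates are just inner products: c_i = <v, e_i>.
c1, c2 = np.dot(v, e1), np.dot(v, e2)

# Reconstruction: v = c1*e1 + c2*e2.
print(np.allclose(v, c1 * e1 + c2 * e2))   # True
```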
Gram-Schmidt: Generating Order from Chaos
The Gram-Schmidt process is a recipe for turning any basis into an orthonormal one. It works by normalizing the first vector, then taking the second and subtracting the part that “leaks” into the first, normalizing what remains, and so on, as in the sketch below.
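Here is a minimal implementation of the classical version (assuming NumPy; the three-vector basis of $\mathbb{R}^3$ is an arbitrary example):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # Subtract the component of v that "leaks" into each earlier direction.
        for e in basis:
            v = v - np.dot(v, e) * e
        norm = np.linalg.norm(v)
        if norm < 1e-12:
            raise ValueError("vectors are linearly dependent")
        basis.append(v / norm)   # normalize the remainder
    return np.array(basis)

# Example: a skewed basis of R^3 becomes orthonormal.
Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])
print(np.allclose(Q @ Q.T, np.eye(3)))   # rows are orthonormal
```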
Best Approximations and Projections
The most significant application of inner products is the Orthogonal Projection. Given a subspace $W$ and a vector $v$ not in $W$, the “closest” vector in $W$ to $v$ is the projection $\operatorname{proj}_W(v) = \sum_i \langle v, e_i \rangle e_i$, where $\{e_i\}$ is an orthonormal basis of $W$.
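Concretely, when $W$ is spanned by the columns of a matrix $A$, the projection can be computed by solving a small linear system; a minimal sketch (assuming NumPy; the matrix and vector are arbitrary examples):

```python
import numpy as np

# Subspace W = span of two vectors in R^3 (the columns of A).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
v = np.array([1.0, 2.0, 4.0])

# Projection via the normal equations: proj = A (A^T A)^{-1} A^T v.
proj = A @ np.linalg.solve(A.T @ A, A.T @ v)

# The residual v - proj is orthogonal to every column of A.
print(np.allclose(A.T @ (v - proj), 0))   # True
print(np.linalg.norm(v - proj))           # distance from v to W
```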
This is the engine behind Least Squares Regression. If we have a system $Ax = b$ that has no solution, we look for the $\hat{x}$ that minimizes the error $\|A\hat{x} - b\|$. This happens exactly when $A\hat{x}$ is the projection of $b$ onto the column space of $A$.
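For example, fitting a line to four data points is an overdetermined system. A sketch (assuming NumPy; the data points are made up for illustration) uses `np.linalg.lstsq` to find the minimizer and confirms the residual is orthogonal to the column space:

```python
import numpy as np

# Overdetermined system: fit a line y = c0 + c1*x to four noisy points.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.8])
A = np.column_stack([np.ones_like(x), x])   # design matrix; A c = y has no exact solution

# lstsq minimizes ||A c - y||, i.e., it projects y onto col(A).
c, residual, rank, _ = np.linalg.lstsq(A, y, rcond=None)
print(c)   # intercept and slope of the best-fit line

# The fitted values A @ c are the orthogonal projection of y onto col(A):
print(np.allclose(A.T @ (y - A @ c), 0))   # residual is orthogonal to col(A)
```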
Fun Application: Fourier Series
Calculus students often see Fourier series as a specialized topic. In reality, it is just linear algebra on an infinite-dimensional inner product space of functions, with $\langle f, g \rangle = \int_{-\pi}^{\pi} f(x) g(x) \, dx$. The functions $\sin(nx)$ and $\cos(nx)$ form an orthogonal basis. Calculating Fourier coefficients is exactly the same as calculating coordinates in $\mathbb{R}^n$ using dot products!
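To see this numerically, one can discretize the integral and compute coefficients exactly as dot products. A sketch (assuming NumPy; the target $f(x) = x$ and grid size are arbitrary choices) compares the numeric coefficients with the known closed form $b_n = 2(-1)^{n+1}/n$:

```python
import numpy as np

# Discretize [-pi, pi] and approximate the inner product
# <f, g> = integral of f(x) g(x) dx by a Riemann sum.
x = np.linspace(-np.pi, np.pi, 200001)
dx = x[1] - x[0]

def inner(f_vals, g_vals):
    return np.sum(f_vals * g_vals) * dx

# Basis functions are orthogonal: <sin(x), sin(2x)> is (numerically) zero.
print(inner(np.sin(x), np.sin(2 * x)))

# Fourier coefficients of f(x) = x are coordinates in this basis:
# b_n = <f, sin(nx)> / <sin(nx), sin(nx)>, exactly like dot products in R^n.
f_vals = x
for n in (1, 2, 3):
    b_n = inner(f_vals, np.sin(n * x)) / inner(np.sin(n * x), np.sin(n * x))
    print(n, b_n, 2 * (-1) ** (n + 1) / n)   # numeric vs. closed form
```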