There’s an interesting little identity that holds for norms (translation-invariant metrics on vector spaces over $\mathbb{R}$ or $\mathbb{C}$) that come from inner products. Even more interestingly, it actually characterizes such norms.
Geometrically, if we have a parallelogram whose two sides from the same point are given by the vectors $u$ and $v$, then we can construct the two diagonals $u+v$ and $u-v$. It then turns out that the sum of the squares on all four sides is equal to the sum of the squares on the diagonals. We write this formally by saying

\[
\|u+v\|^2 + \|u-v\|^2 = 2\|u\|^2 + 2\|v\|^2
\]
where we’ve used the fact that opposite sides of a parallelogram have the same length. Verifying this identity is straightforward, using the definition of the norm-squared:

\[
\begin{aligned}
\|u+v\|^2 + \|u-v\|^2
&= \langle u+v, u+v\rangle + \langle u-v, u-v\rangle\\
&= \|u\|^2 + \langle u,v\rangle + \langle v,u\rangle + \|v\|^2
 + \|u\|^2 - \langle u,v\rangle - \langle v,u\rangle + \|v\|^2\\
&= 2\|u\|^2 + 2\|v\|^2
\end{aligned}
\]
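As a quick sanity check of the law, here is a minimal numerical sketch (my own illustration, not from the post) using the Euclidean norm on $\mathbb{R}^3$, which comes from the standard dot product:

```python
import numpy as np

# Numerically check the parallelogram law for the Euclidean norm,
# which is induced by the standard dot product on R^3.
rng = np.random.default_rng(0)
u, v = rng.normal(size=3), rng.normal(size=3)

lhs = np.linalg.norm(u + v) ** 2 + np.linalg.norm(u - v) ** 2
rhs = 2 * np.linalg.norm(u) ** 2 + 2 * np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))  # True (up to floating-point error)
```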
On the other hand, what if we have a norm that satisfies this parallelogram law? Then we can use the polarization identities to define a unique inner product:

\[
\langle u, v\rangle = \frac{1}{4}\left(\|u+v\|^2 - \|u-v\|^2\right) + \frac{i}{4}\left(\|u+iv\|^2 - \|u-iv\|^2\right)
\]

where we ignore the second term when working over real vector spaces.
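To see the real polarization identity in action, here is a small sketch (an illustration of mine, not from the post) that recovers the standard dot product on $\mathbb{R}^3$ from the Euclidean norm alone:

```python
import numpy as np

# Real polarization identity: <u, v> = (||u+v||^2 - ||u-v||^2) / 4.
# For the Euclidean norm this should reproduce the ordinary dot product.
def polarize(norm, u, v):
    return (norm(u + v) ** 2 - norm(u - v) ** 2) / 4

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 0.5])

print(polarize(np.linalg.norm, u, v))  # 3.5
print(u @ v)                           # 3.5
```

The complex version works the same way, with the extra imaginary term supplying the imaginary part of the inner product.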
However, if we have a norm that does not satisfy the parallelogram law and try to use it in these formulas, then the resulting form must fail to be an inner product. Indeed, if it were an inner product, the norm it induces would be our original norm, and that norm would then satisfy the parallelogram law, which it doesn’t.
Now, I haven’t given any examples of norms on vector spaces which don’t satisfy the parallelogram law, but they show up all the time in functional analysis. For now I just want to point out that such things do, in fact, exist.
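For the curious, here is one concrete illustration (my own choice of example, not the post’s): the $\ell^1$ norm $\|x\|_1 = \sum_i |x_i|$ on $\mathbb{R}^2$ fails the parallelogram law already for the two standard basis vectors.

```python
import numpy as np

# The l^1 norm on R^2: ||x||_1 = |x_1| + |x_2|.
def l1(x):
    return np.abs(x).sum()

u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

lhs = l1(u + v) ** 2 + l1(u - v) ** 2   # 2^2 + 2^2 = 8
rhs = 2 * l1(u) ** 2 + 2 * l1(v) ** 2   # 2*1 + 2*1 = 4
print(lhs, rhs)  # 8.0 4.0 -- the two sides disagree
```

So the $\ell^1$ norm cannot come from any inner product.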