
Proof of the real spectral theorem

Orthogonal bases and Gram-Schmidt

Proposition: (The Gram-Schmidt process) Let $V$ be a real inner product space of dimension $n$, and let $v_1,\ldots,v_k$ be a linearly independent set in $V$. Then there is a set $w_1,\ldots,w_k$ of vectors such that

  • $\langle w_i, w_j\rangle = 0$ if $i\neq j$.
  • the span of $w_1,\ldots,w_l$ is the same as the span of $v_1,\ldots,v_l$ for $l\le k$.

Proof: Let $w_1 = v_1$ and $w_2 = v_2 - \frac{\langle v_2, w_1\rangle}{\|w_1\|^2} w_1$. Then the span of $w_1$ and $w_2$ is the same as that of $v_1$ and $v_2$, and $\langle w_2, w_1\rangle = 0$ by construction. Now suppose we have constructed $w_1,\ldots,w_l$ with the desired property. Set

$$w_{l+1} = v_{l+1} - \sum_{i=1}^{l} \frac{\langle v_{l+1}, w_i\rangle}{\|w_i\|^2}\, w_i.$$

Then $\langle w_{l+1}, w_i\rangle = 0$ for $i = 1,\ldots,l$, and the span property is preserved.
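The recursion in the proof translates directly into code. Here is a minimal NumPy sketch (the function name `gram_schmidt` and the sample vectors are illustrative, not from the text):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a linearly independent list of vectors: each new w
    is v minus its projections onto the previously built w_i."""
    ws = []
    for v in vectors:
        w = v.astype(float)
        for u in ws:
            w = w - (v @ u) / (u @ u) * u   # subtract <v, w_i>/||w_i||^2 * w_i
        ws.append(w)
    return ws

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
ws = gram_schmidt(vs)   # pairwise orthogonal, same partial spans as vs
```

The inner loop subtracts the projection of $v$ onto each previously built $w_i$, so the output vectors are pairwise orthogonal while each partial span agrees with that of the inputs.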

Orthogonal complements

If $W$ is a subspace of $V$, define $W^{\perp} = \{v \in V : \langle v, w\rangle = 0 \text{ for all } w \in W\}$.

Proposition: $W^{\perp}$ is a subspace of $V$. Furthermore:

  • $\dim W + \dim W^{\perp} = \dim V$ (so $W \cap W^{\perp} = 0$).
  • $(W^{\perp})^{\perp} = W$.
  • if $U \subseteq W$, then $W^{\perp} \subseteq U^{\perp}$.

Proof: Suppose $\dim W = k$ and $\dim V = n$. Use Gram-Schmidt to construct an orthogonal basis $v_1,\ldots,v_n$ for $V$ whose first $k$ elements are an orthogonal basis for $W$. A vector

$$v = \sum_{i=1}^{n} a_i v_i$$

is in $W^{\perp}$ if and only if $a_i = 0$ for $i = 1,\ldots,k$.
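The dimension count can be checked concretely: with NumPy, an orthonormal basis of $W^{\perp}$ can be read off from the SVD of $B^{T}$, where the columns of $B$ span $W$ (the matrix $B$ below is an illustrative example, not from the text):

```python
import numpy as np

# W spanned by the columns of B inside R^4 (an illustrative basis, k = 2)
B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])
n = B.shape[0]

# W-perp is the null space of B^T; read an orthonormal basis off the SVD
_, s, Vt = np.linalg.svd(B.T)
rank = int(np.sum(s > 1e-12))
W_perp = Vt[rank:].T          # columns: orthonormal basis of W-perp

print(rank + W_perp.shape[1] == n)     # dim W + dim W-perp = dim V -> True
print(np.allclose(B.T @ W_perp, 0))    # each column is orthogonal to W -> True
```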

Proposition: Suppose that $A$ is a self-adjoint operator and $AW \subseteq W$. Then $AW^{\perp} \subseteq W^{\perp}$.

Proof: Suppose $z \in W^{\perp}$ and $w \in W$. Then

$$\langle Az, w\rangle = \langle z, Aw\rangle = 0$$

since $Aw \in W$.

Proof of the (real) spectral theorem

We have a self-adjoint map $A : V \to V$. Pick a basis for $V$ and use Gram-Schmidt to construct an orthonormal basis (an orthogonal basis whose elements all have norm $1$).

The $Q$-matrix (Gram matrix) for this basis is the identity, and so the inner product is just the dot product.

If $[A]$ is the matrix representation of $A$ in this basis, then the matrix representation of the adjoint $A^{*}$ is the transpose of $[A]$. So since $A$ is self-adjoint, $[A]$ is symmetric.

We know that a symmetric matrix has a real eigenvalue $\lambda_1$ with eigenvector $v_1$.

Let $V_1$ be the orthogonal complement of the one-dimensional space $W$ spanned by $v_1$. Since $v_1$ is an eigenvector, $AW \subseteq W$. Therefore $AV_1 \subseteq V_1$. Furthermore, if $x, y \in V_1$, then

$$\langle Ax, y\rangle = \langle x, Ay\rangle$$

so $A$ is self-adjoint as a linear map from $V_1$ to itself. Thus we can continue by induction to construct an orthogonal basis of eigenvectors for $A$.
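The inductive argument can be mirrored in code: find one eigenvector, pass to its orthogonal complement, and recurse. This is only a sketch of the proof's structure (in practice `np.linalg.eigh` produces the whole orthonormal eigenbasis in one call):

```python
import numpy as np

def orthonormal_eigenbasis(A):
    """Mirror the inductive proof: take one eigenvector v1, restrict the
    (still self-adjoint) map to v1-perp, and recurse."""
    n = A.shape[0]
    if n == 0:
        return np.zeros((0, 0))
    w, V = np.linalg.eigh(A)          # symmetric => real eigenvalues
    v1 = V[:, 0]                      # one unit eigenvector
    # orthonormal basis of v1-perp via QR of [v1 | I]
    Q, _ = np.linalg.qr(np.column_stack([v1, np.eye(n)]))
    C = Q[:, 1:]                      # n x (n-1): columns span v1-perp
    A1 = C.T @ A @ C                  # A restricted to v1-perp, still symmetric
    return np.column_stack([v1, C @ orthonormal_eigenbasis(A1)])

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                     # a random self-adjoint matrix
B = orthonormal_eigenbasis(A)         # columns: orthonormal eigenvectors of A
```

Because $v_1$ is an eigenvector, $A$ preserves $v_1^{\perp}$, so the restricted matrix $C^{T}AC$ is again symmetric and the recursion goes through exactly as in the proof.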

Orthogonal matrices

Let $Q \in M_n(\mathbb{R})$ be a symmetric matrix. As such it is a self-adjoint map from $\mathbb{R}^n$ to itself with respect to the usual dot product. Therefore there is a basis $v_1,\ldots,v_n$ of $\mathbb{R}^n$ consisting of eigenvectors of $Q$, orthonormal with respect to the dot product, with eigenvalues $\lambda_1,\ldots,\lambda_n$.

Let $P$ be the matrix whose columns are the vectors $v_i$ written in the standard basis of $\mathbb{R}^n$. Since the $v_i$ are an orthonormal basis, the matrix $P$ satisfies $P^{T}P = I$.

At the same time,

$$QP = P\Lambda$$

where $\Lambda$ is the diagonal matrix with entries $\lambda_i$. Since the $v_i$ are linearly independent, the matrix $P$ is invertible (in fact $P^{-1} = P^{T}$) and $Q$ is diagonalizable:

$$P^{-1}QP = \Lambda$$
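These identities are easy to verify numerically; `np.linalg.eigh` returns exactly such a $P$ and $\Lambda$ (the symmetric matrix below is an arbitrary example):

```python
import numpy as np

Q = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # an arbitrary symmetric example
lam, P = np.linalg.eigh(Q)           # columns of P: orthonormal eigenvectors
Lam = np.diag(lam)

print(np.allclose(P.T @ P, np.eye(2)))   # P^T P = I, so P^{-1} = P^T
print(np.allclose(Q @ P, P @ Lam))       # Q P = P Lambda
print(np.allclose(P.T @ Q @ P, Lam))     # P^{-1} Q P = Lambda
```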

The bilinear map $\langle v, w\rangle$ defined by

$$\langle v, w\rangle = v^{T}Qw$$

is an inner product provided that $\langle v, v\rangle \ge 0$ with equality only when $v = 0$. If we write $v \neq 0$ in terms of the orthogonal basis $v_1,\ldots,v_n$:

$$v = \sum_{i=1}^{n} a_i v_i$$

then we get

$$\langle v, v\rangle = \Big(\sum_{i} a_i v_i^{T}\Big)Q\Big(\sum_{j} a_j v_j\Big) = \sum_{i,j} a_i a_j \lambda_j v_i^{T}v_j = \sum_{i} a_i^{2}\lambda_i v_i^{T}v_i$$

which will be positive provided that all $\lambda_i > 0$.
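So checking that $v^{T}Qw$ is an inner product amounts to checking that $Q$ is symmetric with all eigenvalues positive. A small sketch (the helper name `is_inner_product` is illustrative):

```python
import numpy as np

def is_inner_product(Q, tol=1e-12):
    """v^T Q w defines an inner product iff Q is symmetric and all its
    eigenvalues are strictly positive (positive definite)."""
    if not np.allclose(Q, Q.T):
        return False
    return bool(np.linalg.eigvalsh(Q).min() > tol)

print(is_inner_product(np.array([[3.0, 1.0], [1.0, 3.0]])))  # True
print(is_inner_product(np.array([[1.0, 2.0], [2.0, 1.0]])))  # False: eigenvalue -1
```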

Example

Let Q be the symmetric matrix

$$Q = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}$$

Its eigenvalues are $2$ and $4$, with eigenvectors

$$\begin{bmatrix} \frac{\sqrt{2}}{2} \\ -\frac{\sqrt{2}}{2} \end{bmatrix}$$

and

$$\begin{bmatrix} \frac{\sqrt{2}}{2} \\ \frac{\sqrt{2}}{2} \end{bmatrix}.$$

The squared norm of a vector in the inner product given by $Q$ is

$$\|(x,y)\|^{2} = 3x^{2} + 2xy + 3y^{2}.$$

The level curves of this are ellipses, and the eigenvectors point in the directions of the major and minor axes of the ellipse.

(Figure: level curves of $3x^{2}+2xy+3y^{2}$, with the eigenvector directions along the axes of the ellipses.)

These ellipses are the family

$$2\left(\frac{x-y}{\sqrt{2}}\right)^{2} + 4\left(\frac{x+y}{\sqrt{2}}\right)^{2} = C.$$
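A quick numerical check of the example: the eigenvalues of $Q$ are indeed $2$ and $4$, and the quadratic form agrees with its expression $2u^{2}+4v^{2}$ in the rotated coordinates $u = (x-y)/\sqrt{2}$, $v = (x+y)/\sqrt{2}$:

```python
import numpy as np

Q = np.array([[3.0, 1.0],
              [1.0, 3.0]])
lam, P = np.linalg.eigh(Q)           # eigenvalues in ascending order
print(lam)                           # [2. 4.]

# the form 3x^2 + 2xy + 3y^2 in the rotated coordinates u = (x-y)/sqrt(2),
# v = (x+y)/sqrt(2) becomes 2u^2 + 4v^2
x, y = 0.7, -1.3
form = 3*x**2 + 2*x*y + 3*y**2
diag = 2*((x - y)/np.sqrt(2))**2 + 4*((x + y)/np.sqrt(2))**2
print(np.isclose(form, diag))        # True
```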