7. Galois Extensions

Galois Extensions

Automorphisms

Let \(K/F\) be an extension of fields and let \(\Aut(K/F)\) be the group of field automorphisms \(\sigma: K\to K\) which are the identity when restricted to \(F\).
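For a concrete instance of this definition, complex conjugation is an element of \(\Aut(\mathbb{C}/\mathbb{R})\). A quick runnable spot-check (the sample points below are arbitrary choices, with integer parts so the arithmetic is exact):

```python
# Spot-check that complex conjugation is a field map C -> C fixing R.
# The sample points are arbitrary (integer parts keep the arithmetic exact).

def conj(z: complex) -> complex:
    return z.conjugate()

samples = [1 + 2j, -3 + 1j, 2 - 5j]
for z in samples:
    for w in samples:
        assert conj(z + w) == conj(z) + conj(w)   # additive
        assert conj(z * w) == conj(z) * conj(w)   # multiplicative

for x in [0.0, 1.0, -3.5]:                        # identity on the base field R
    assert conj(complex(x)) == complex(x)

print("conjugation lies in Aut(C/R)")
```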

Some basics:

Galois Extensions

Proposition: Let \(E/F\) be the splitting field over \(F\) of some polynomial \(f(x)\in F[x]\). Then

\[ \vert \Aut(E/F)\vert\le [E:F]. \]

If \(f(x)\) is separable, then this is an equality.

Definition: \(E/F\) is called a Galois extension if \(\vert\Aut(E/F)\vert=[E:F]\). In this case the automorphism group is called the Galois group of the extension.

The proposition above says that separable splitting fields are Galois extensions.
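As a sanity check on the definition, here is a toy model of \(\mathbb{Q}(\sqrt2)/\mathbb{Q}\), with elements stored as pairs \((a,b)\leftrightarrow a+b\sqrt2\); the encoding and the sample elements are ad hoc choices for illustration:

```python
from fractions import Fraction as Q

# Elements of Q(sqrt2) stored as pairs (a, b) <-> a + b*sqrt(2).

def add(x, y):
    return (x[0] + y[0], x[1] + y[1])

def mul(x, y):
    a, b = x
    c, d = y
    return (a*c + 2*b*d, a*d + b*c)   # (a+b*r2)(c+d*r2), using r2*r2 = 2

def ident(x):
    return x

def conj(x):                          # the map a + b*sqrt2 -> a - b*sqrt2
    return (x[0], -x[1])

samples = [(Q(1), Q(2)), (Q(-3, 2), Q(5)), (Q(0), Q(1))]
for s in (ident, conj):
    for x in samples:
        for y in samples:
            assert s(add(x, y)) == add(s(x), s(y))
            assert s(mul(x, y)) == mul(s(x), s(y))

# Any automorphism fixing Q must send sqrt2 to a root of t^2 - 2, so these
# two maps are all of them: |Aut(E/F)| = 2 = [Q(sqrt2):Q].
print("|Aut(Q(sqrt2)/Q)| = 2 = [Q(sqrt2):Q]")
```

Since the automorphism count equals the degree, the extension is Galois.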

The Galois correspondence

Let \(E/F\) be a Galois extension with Galois group \(G\). Then there is a bijective (inclusion-reversing) correspondence between subgroups \(H\) of \(G\) and intermediate fields \(L\) with \(F\subseteq L\subseteq E\).

The correspondence is given by \(H\to E^{H}\) for \(H\subset G\) in one direction, and \(L\to \Aut(E/L)\subset G\) in the other direction.

Further:

Some examples
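One illustrative example (a sketch, with an ad hoc encoding): \(E=\mathbb{Q}(\sqrt2,\sqrt3)\) has degree \(4\) over \(\mathbb{Q}\), and it has exactly four automorphisms, determined by the signs attached to \(\sqrt2\) and \(\sqrt3\); so \(E/\mathbb{Q}\) is Galois with group \(\mathbb{Z}/2\times\mathbb{Z}/2\). Elements are stored as 4-tuples of rationals in the basis \(\{1,\sqrt2,\sqrt3,\sqrt6\}\):

```python
from fractions import Fraction as Q
from itertools import product

# Toy model of E = Q(sqrt2, sqrt3): (a,b,c,d) <-> a + b*sqrt2 + c*sqrt3 + d*sqrt6.

def mul(x, y):
    a1, b1, c1, d1 = x
    a2, b2, c2, d2 = y
    return (a1*a2 + 2*b1*b2 + 3*c1*c2 + 6*d1*d2,       # coefficient of 1
            a1*b2 + b1*a2 + 3*(c1*d2 + d1*c2),          # coefficient of sqrt2
            a1*c2 + c1*a2 + 2*(b1*d2 + d1*b2),          # coefficient of sqrt3
            a1*d2 + d1*a2 + b1*c2 + c1*b2)              # coefficient of sqrt6

def sigma(s, t):
    # sqrt2 -> s*sqrt2 and sqrt3 -> t*sqrt3 force sqrt6 -> s*t*sqrt6
    return lambda x: (x[0], s*x[1], t*x[2], s*t*x[3])

autos = [sigma(s, t) for s, t in product([1, -1], repeat=2)]
samples = [(Q(1), Q(2), Q(0), Q(1)),
           (Q(-1), Q(0), Q(3), Q(2)),
           (Q(1, 2), Q(1), Q(-1), Q(0))]
for f in autos:
    for x in samples:
        for y in samples:
            assert f(mul(x, y)) == mul(f(x), f(y))   # each map is multiplicative

print(len(autos), "automorphisms; [E:Q] = 4, so E/Q is Galois")
```

The subgroups of \(\mathbb{Z}/2\times\mathbb{Z}/2\) then correspond to the intermediate fields \(\mathbb{Q}\), \(\mathbb{Q}(\sqrt2)\), \(\mathbb{Q}(\sqrt3)\), \(\mathbb{Q}(\sqrt6)\), and \(E\).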

Overview of the proof

There are two “directions” we need to consider.

  1. Suppose \(E/F\) is a separable splitting field extension. Then \(\vert\Aut(E/F)\vert=[E:F]\).
  2. Suppose \(E\) is a field and \(G\) is a finite group of automorphisms of \(E\). Then the fixed field \(E^{G}\) satisfies \([E:E^{G}]=\vert G\vert\) and \(E/E^{G}\) is a separable splitting field.

Together these mean that if \(E/F\) is a separable splitting field and \(G=\Aut(E/F)\), then \(E^{G}=F\): indeed \(F\subseteq E^{G}\), and \([E:E^{G}]=\vert G\vert=[E:F]\).

This is the prototype of the Galois correspondence.

More on the proof - Step 1

The first assertion to consider is that, if \(E/F\) is a separable splitting field, then \(\vert\Aut(E/F)\vert=[E:F].\) This is a consequence of the theorem on extension of automorphisms. The proof is by induction on the degree. Clearly if \(E=F\) then \(\Aut(E/F)\) is trivial and \([E:F]=1.\) Now suppose we know the result for all separable splitting fields of degree less than \(n\), and suppose \(E/F\) has degree \(n.\) Choose an element \(\alpha\in E\) of degree greater than one over \(F\) and let \(f(x)\) be its minimal polynomial. Let \(\beta\) be any other root of \(f(x)\). Since \(E/F\) is a splitting field, \(\beta\in E\). Consider the diagram:

\[ \begin{xy} \xymatrix{ E \ar[r] & E \\ F(\alpha)\ar[r]\ar[u]&F(\beta)\ar[u] \\ F\ar[r]\ar[u] & F\ar[u] \\ } \end{xy} \]

Since \(F(\alpha)\) is isomorphic to \(F(\beta)\) over \(F\), the extension theorem says that there is an automorphism \(\sigma_{\beta}\) of \(E\) carrying \(\alpha\) to \(\beta\). Since \(f(x)\) is separable, it has \(d=[F(\alpha):F]\) distinct roots \(\beta\) in \(E\), so there are \(d\) such extensions \(\sigma_{\beta}\), one for each root \(\beta\) of the minimal polynomial of \(\alpha\) over \(F\).

Now \(E/F(\alpha)\) is still a separable splitting field, so our induction hypothesis says that there are \([E:F(\alpha)]\) automorphisms of \(E\) fixing \(F(\alpha)\). Take any automorphism \(\tau\) of \(E/F\). It carries \(\alpha\) to some root \(\beta\), so \(\sigma_{\beta}^{-1}\tau\) fixes \(\alpha\), and hence all of \(F(\alpha)\); therefore \(\tau=\sigma_{\beta}\phi\) where \(\phi\in\Aut(E/F(\alpha))\).

It’s not hard to show that this representation is unique, and so

\[ \vert\Aut(E/F)\vert=\vert\Aut(E/F(\alpha))\vert\,[F(\alpha):F] = [E:F(\alpha)][F(\alpha):F]=[E:F]. \]
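The counting in this step can be made concrete in the toy case \(E=\mathbb{Q}(\sqrt2,\sqrt3)\), \(F=\mathbb{Q}\), \(\alpha=\sqrt2\), where each automorphism is encoded (an ad hoc choice) by the pair of signs it applies to \(\sqrt2\) and \(\sqrt3\):

```python
from itertools import product

# Automorphisms of E = Q(sqrt2, sqrt3) encoded as sign pairs (s, t):
# sqrt2 -> s*sqrt2 and sqrt3 -> t*sqrt3.
G = list(product([1, -1], repeat=2))

# Aut(E/F(alpha)) with alpha = sqrt2: the maps fixing sqrt2, i.e. s = +1.
H = [g for g in G if g[0] == 1]

# The cosets sigma_beta * Aut(E/F(alpha)) are indexed by the image of alpha,
# i.e. by the sign s.
cosets = {}
for s, t in G:
    cosets.setdefault(s, []).append((s, t))

# |Aut(E/F)| = |Aut(E/F(alpha))| * [F(alpha):F]
assert len(G) == len(H) * len(cosets)
print(len(G), "=", len(H), "x", len(cosets))
```

Here the two cosets correspond to the two roots \(\pm\sqrt2\) of \(x^{2}-2\), exactly as in the decomposition \(\tau=\sigma_{\beta}\phi\).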

More on the proof - Step 2

Now we want to show that, if \(G\) is a group of automorphisms of a field \(E\), then \(E/E^{G}\) is a separable splitting field of degree \(\vert G\vert\). The key tool here is a result known as linear independence of characters.

Lemma: Let \(G\) be a group, let \(L\) be a field, and let \(\sigma_1,\ldots, \sigma_n\) be distinct homomorphisms \(G\to L^{\times}\). Then the \(\sigma_{i}\) are linearly independent over \(L\), meaning that if \(f=\sum_{i=1}^{n} a_{i}\sigma_{i}\) is the zero map for some collection of \(a_{i}\in L\), then all \(a_{i}\) are zero.

Proof: Suppose that the \(\sigma_{i}\) are dependent. Choose a linear relation of minimal length in which all the coefficients are nonzero (relabeling if necessary, say it involves \(\sigma_{1},\ldots,\sigma_{n}\)). Note that \(n\ge 2\): since each \(\sigma_{i}\) takes values in \(L^{\times}\), a relation of length one would force its coefficient to be zero. So we have

\[ f=\sum a_{i}\sigma_{i} = 0 \]

Since \(\sigma_{1}\neq\sigma_{n}\), we may choose \(h\in G\) such that \(\sigma_{1}(h)\) and \(\sigma_{n}(h)\) are different. Now \(f(g)=0\) for all \(g\in G\), and also \(f(hg)=0\) for all \(g\in G\), since \(hg\) ranges over the same set of elements. Therefore

\[ k=\sum a_{i}\sigma_{i}(h)\sigma_{i}=0. \]

Now \(k-\sigma_{n}(h)f\) is also identically zero. The coefficients of \(\sigma_{n}\) in \(k\) and in \(\sigma_{n}(h)f\) are both \(\sigma_{n}(h)a_{n}\), so they cancel. On the other hand, the coefficients of \(\sigma_{1}\) are \(a_{1}\sigma_{1}(h)\) and \(a_{1}\sigma_{n}(h)\), which are different; so \(k-\sigma_{n}(h)f\) is a nonzero relation among the \(\sigma_{i}\) of shorter length, contradicting minimality. Thus the \(\sigma_{i}\) are independent.

Notice that if \(L\) is a field, \(L^{\times}\) is a group, and we can restrict an automorphism of \(L\) to \(L^{\times}\) to obtain a character \(L^{\times}\to L^{\times}\). Therefore distinct automorphisms of \(L\) are linearly independent over \(L\).
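A quick numerical illustration of the lemma: the three characters of \(\mathbb{Z}/3\) into \(\mathbb{C}^{\times}\) are \(\chi_{k}(j)=\omega^{kj}\) with \(\omega=e^{2\pi i/3}\). Stacking their values gives a \(3\times 3\) matrix; its determinant is nonzero, so no nontrivial linear relation among the \(\chi_{k}\) exists:

```python
import cmath

w = cmath.exp(2j * cmath.pi / 3)              # primitive cube root of unity
M = [[w ** (k * j) for j in range(3)]         # row k lists the values of chi_k
     for k in range(3)]

def det3(m):
    # cofactor expansion along the first row
    return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))

assert abs(det3(M)) > 1e-9                    # the characters are independent
print("|det| =", round(abs(det3(M)), 3))
```

This matrix is a Vandermonde matrix in the cube roots of unity, which is why its determinant is nonzero.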

More on the proof - Step 3

Now we want to prove that \([E:E^{G}]=\vert G\vert\). Let’s use \(F=E^{G}\) to simplify the notation. Choose a basis \(\alpha_{1},\ldots, \alpha_{n}\) for \(E/F\) and let \(\sigma_{1},\ldots, \sigma_{m}\) be the elements of \(G\). Form the \(m\times n\) matrix

\[ S=\left(\begin{matrix} \sigma_{1}(\alpha_{1}) & \sigma_{1}(\alpha_{2}) & \cdots & \sigma_{1}(\alpha_{n})\\ \sigma_{2}(\alpha_{1}) & \sigma_{2}(\alpha_{2}) & \cdots & \sigma_{2}(\alpha_{n}) \\ \vdots & \vdots & \vdots &\vdots \\ \sigma_{m}(\alpha_{1}) & \sigma_{m}(\alpha_{2}) & \cdots & \sigma_{m}(\alpha_{n}) \\ \end{matrix}\right) \]

Let’s first look at the row rank of this matrix. Suppose that

\[ [\beta_1\cdots\beta_m]S = 0. \]

It follows that \(\sum_{i=1}^{m} \beta_{i}\sigma_{i}(\alpha_{j})=0\) for all \(\alpha_j\), and, since the \(\alpha_{j}\) span \(E/F\), we conclude \(\sum_{i=1}^{m}\beta_{i}\sigma_{i}(x)=0\) for all \(x\in E\). By linear independence this means that all \(\beta_{i}=0\) and so the row rank of \(S\) is \(m\).

Now let’s look at the column rank. For this, notice that if \(\sigma:E\to E\) is an automorphism, then \(\sigma(S)\) (obtained by applying \(\sigma\) to each entry of \(S\)) is obtained from \(S\) by rearranging the rows. In other words

\[ \sigma(S)=\Pi(\sigma)S \]

where \(\Pi(\sigma)\) is an \(m\times m\) permutation matrix. Now suppose \(\boldsymbol{\beta}=[\beta_{1},\ldots, \beta_{n}]\) satisfies

\[ S\boldsymbol{\beta}=S\left[\begin{matrix} \beta_{1} \\ \vdots \\ \beta_{n}\end{matrix}\right]=0. \]

Then, for any \(\sigma\in G\), we have

\[ \sigma(S\boldsymbol{\beta})=\sigma(S)\sigma(\boldsymbol{\beta})=\Pi(\sigma)S\boldsymbol{\beta}=0 \]

In other words, if \(\boldsymbol{\beta}\) is in the kernel of \(S\), so is \(\sigma(\boldsymbol{\beta})\).

Now suppose \(\boldsymbol{\beta}\) is nonzero and satisfies \(S\boldsymbol{\beta}=0\). For any \(x\in E\), the vector \(x\boldsymbol{\beta}\) also satisfies \(S(x\boldsymbol{\beta})=0\), and therefore so does \(\sigma_{i}(x\boldsymbol{\beta})\) for each \(i\). The sum \(\sum_{i=1}^{m}\sigma_{i}(x\boldsymbol{\beta})\) is a column vector whose entries are \(\sum_{i=1}^{m}\sigma_{i}(x\beta_{j})\); these entries are fixed by \(G\), since applying \(\sigma\in G\) to such a sum just permutes its terms. We can introduce functions \(Y_{j}:E\to E\) by setting

\[ Y_j(x)=\sum_{i=1}^{m}\sigma_{i}(x\beta_{j})=\sum_{i=1}^{m}\sigma_{i}(\beta_{j})\sigma_{i}(x). \]

Since \(\boldsymbol{\beta}\) is nonzero, some \(\beta_{j}\) is nonzero, and by linear independence of the \(\sigma_{i}\) the corresponding \(Y_{j}\) is not the zero map; so there is an \(x\in E\) such that

\[ \boldsymbol{Y}=\sum_{i=1}^{m}\sigma_{i}(x\boldsymbol{\beta})\not=0 \]

But \(\boldsymbol{Y}\) is in \(F^{n}\) and \(S\boldsymbol{Y}=0\). This means

\[ \sigma_{i}(\sum_{j=1}^{n} Y_{j}(x)\alpha_{j})=0 \]

for all \(i\); and since the \(Y_{j}(x)\in F\) and the \(\alpha_{j}\) are linearly independent over \(F\), we must have \(Y_{j}(x)=0\) for all \(j\). This contradiction means that there cannot be a nonzero \(\boldsymbol{\beta}\) with \(S\boldsymbol{\beta}=0\). We conclude that the column rank of \(S\) is \(n\).

Since the row and column ranks of a matrix are the same, we have \(n=m\); that is, \([E:E^{G}]=\vert G\vert\).
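In the toy case \(E=\mathbb{Q}(\sqrt2)\), \(G=\{\mathrm{id},\,\sqrt2\mapsto-\sqrt2\}\), with basis \(\{1,\sqrt2\}\), the matrix \(S\) is \(2\times 2\) and visibly of full rank (a numerical sketch):

```python
import math

# Basis {1, sqrt2} for E = Q(sqrt2) over F = E^G = Q; G = {id, conjugation}.
# Row i of S applies sigma_i to each basis element.
r2 = math.sqrt(2)
S = [[1.0,  r2],    # identity
     [1.0, -r2]]    # sqrt2 -> -sqrt2

det = S[0][0] * S[1][1] - S[0][1] * S[1][0]   # = -2*sqrt(2), nonzero
assert abs(det) > 1e-9                        # full rank: m = n = 2
print("det(S) =", det)
```

Here \(m=n=2\) confirms \([E:E^{G}]=\vert G\vert=2\) in this example.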

More on the proof - Step 4

We finally need to verify that \(E/E^{G}\) is a separable splitting field. First, let \(\alpha\in E\) be any element, with minimal polynomial \(q(x)\) over \(E^{G}\). Consider the orbit \(\lbrace \alpha_{1},\ldots, \alpha_{k}\rbrace\) of \(\alpha\) under the action of \(G\). The polynomial

\[ p(x)=\prod_{i=1}^{k}(x-\alpha_{i}) \]

is fixed by \(G\) so has coefficients in \(E^{G}\); it also has \(\alpha\) as a root so \(q(x)\) divides \(p(x)\). Therefore all roots of \(q(x)\) belong to \(E\). Since every polynomial with coefficients in \(E^{G}\) that has a root in \(E\) splits in \(E\), \(E\) is a splitting field over \(E^{G}\).
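For example, with \(G\) the four sign-flip automorphisms of \(\mathbb{Q}(\sqrt2,\sqrt3)\) and \(\alpha=\sqrt2+\sqrt3\), the orbit is \(\{\pm\sqrt2\pm\sqrt3\}\), and the orbit polynomial has integer (hence \(E^{G}\)-rational) coefficients. A numerical sketch:

```python
import math
from itertools import product

# Orbit of alpha = sqrt2 + sqrt3 under the four sign-flip automorphisms.
r2, r3 = math.sqrt(2), math.sqrt(3)
orbit = [s * r2 + t * r3 for s, t in product([1, -1], repeat=2)]

# Expand p(x) = prod (x - alpha_i); coefficients stored low-to-high.
poly = [1.0]                      # the constant polynomial 1
for r in orbit:                   # multiply the running product by (x - r)
    poly = [0.0] + poly           # this is x * poly
    for i in range(len(poly) - 1):
        poly[i] -= r * poly[i + 1]

coeffs = [round(c) for c in poly]
assert all(abs(c - n) < 1e-9 for c, n in zip(poly, coeffs))  # integral coefficients
print("orbit polynomial coefficients (low to high):", coeffs)
```

The result is \(x^{4}-10x^{2}+1\), whose coefficients indeed lie in \(E^{G}=\mathbb{Q}\).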

To show separability, let \(\beta_{1},\ldots, \beta_{n}\) be a basis for \(E/E^{G}\). Let \(p_{i}(x)\) be the minimal polynomial of \(\beta_{i}\) over \(E^{G}\). We’ve shown already that each \(p_{i}(x)\) splits completely in \(E\). Consider the product \(f(x)\) of all the \(p_{i}\) and let \(f_{1}(x)\) be its squarefree part (that is, the product of its distinct irreducible factors, each to the first power). Then \(f_{1}(x)\) is separable and has the \(\beta_{i}\) as roots, and therefore \(E\) is the splitting field of the separable polynomial \(f_{1}(x)\).
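In characteristic zero the squarefree part can be computed as \(f/\gcd(f,f')\). A sketch with hand-rolled exact polynomial arithmetic over \(\mathbb{Q}\) (the helper names are ad hoc), applied to \(f=(x^{2}-2)^{2}(x^{2}-3)\):

```python
from fractions import Fraction as Q

# Polynomials are lists of Fractions, coefficients low-to-high.

def deriv(p):
    return [Q(i) * c for i, c in enumerate(p)][1:]

def divmod_poly(a, b):
    a = a[:]
    q = [Q(0)] * max(len(a) - len(b) + 1, 1)
    while len(a) >= len(b) and any(a):
        shift = len(a) - len(b)
        factor = a[-1] / b[-1]
        q[shift] = factor
        for i, c in enumerate(b):
            a[shift + i] -= factor * c
        while a and a[-1] == 0:   # trim the cancelled leading terms
            a.pop()
    return q, a

def gcd_poly(a, b):
    while b and any(b):
        a, b = b, divmod_poly(a, b)[1]
    return [c / a[-1] for c in a]           # normalize to be monic

# f = (x^2 - 2)^2 (x^2 - 3) = x^6 - 7x^4 + 16x^2 - 12:
f = [Q(-12), Q(0), Q(16), Q(0), Q(-7), Q(0), Q(1)]
g = gcd_poly(f, deriv(f))                   # the repeated-factor part, x^2 - 2
f1, rem = divmod_poly(f, g)                 # squarefree part f / gcd(f, f')
assert not any(rem)
print("squarefree part (low to high):", f1)
```

The output is \((x^{2}-2)(x^{2}-3)=x^{4}-5x^{2}+6\), which is separable and has the same roots as \(f\).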

Definition: If \(\alpha\in E\), the elements \(\sigma(\alpha)\), with \(\sigma\in G\), are called the conjugates of \(\alpha\) (or the Galois conjugates).

The full proof of the correspondence

See the proof in Dummit and Foote, which basically combines our numerical result that \([E:E^{G}]=\vert G\vert\), the fact that \(E/E^{G}\) is a separable splitting field, and our existence theorem that, if \(E/F\) is a separable splitting field, then \(\vert\Aut(E/F)\vert =[E:F]\), to obtain the correspondence.