Roots of Unity

The \(n\)th roots of unity are the solutions to \(x^n = 1\). For example,

  • When \(n = 2\), the solutions are \(-1, 1\).

  • When \(n = 3\), the solutions are \((-1 \pm i \sqrt{3})/2, 1\).

  • When \(n = 4\), the solutions are \(-1, 1, i, -i\).

The easy way

The \(n\)th roots of unity are given transcendentally by

\[\cos(2\pi k/n) + i \sin(2\pi k/n), \quad k = 0, 1, ..., n-1\]
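For example, a few lines of Python enumerate the roots this way:

```python
import cmath

def roots_of_unity(n):
    """Return the n-th roots of unity: cos(2*pi*k/n) + i*sin(2*pi*k/n)."""
    return [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]

# For n = 4 we recover 1, i, -1, -i (up to floating-point error).
for z in roots_of_unity(4):
    assert abs(z**4 - 1) < 1e-9
```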

Can we solve with radicals instead? That is, can we write solutions just using addition, subtraction, multiplication, division, and root extraction?

Yes: \(\sqrt[n]{1}\).

This feels like cheating, so we sharpen the question. Firstly, we stipulate that we want all solutions of \(x^n = 1\), and secondly, we clarify that \(\sqrt[n]{a}\) only has one value, and all we know about this value is that when raised to the \(n\)th power, we get \(a\). Our method should work no matter which root it is.

In other words, we act as if we’re writing a program using a library with an \(n\)th root extraction function whose only guarantee is that the \(n\)th power of the returned value is \(a\).

Thus our method must find all solutions even if \(\sqrt[n]{1}\) evaluates to \(1\). This precludes our trivial solution.
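In code, such an adversarial library function might look like the following sketch (the function name and implementation are ours, not part of any real library):

```python
import cmath
import random

def nth_root(a, n):
    """Return some n-th root of a; no promise about which one.
    The only guarantee: nth_root(a, n) ** n == a (up to rounding)."""
    r = abs(a) ** (1.0 / n)
    theta = cmath.phase(a) / n
    k = random.randrange(n)  # adversarial choice among the n valid roots
    return r * cmath.exp(1j * (theta + 2 * cmath.pi * k / n))

# nth_root(1, 5) may return any 5th root of unity, possibly just 1,
# so "take the n-th root of 1" does not reliably produce all solutions.
z = nth_root(1, 5)
assert abs(z**5 - 1) < 1e-9
```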

The hard way

The \(n\)th roots of unity are a cyclic group of order \(n\), and a root is primitive if it generates the group. Thus if we find a primitive root, we find them all.

For coprime \(p, q\), the product of a primitive \(p\)th root of unity and a primitive \(q\)th root of unity is a primitive \(pq\)th root of unity. For a prime \(p\), taking the \(p^{k-1}\)th root of a primitive \(p\)th root of unity results in a primitive \(p^k\)th root of unity.

Thus we need only study the \(p\)th roots of unity for prime \(p\).
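Both reductions are easy to sanity-check numerically. The sketch below (helper names are ours) builds a primitive 15th root from primitive 3rd and 5th roots, and a primitive 9th root from a cube root of a primitive 3rd root:

```python
import cmath

def primitive_root(n):
    """A primitive n-th root of unity, computed numerically."""
    return cmath.exp(2j * cmath.pi / n)

def order(z, bound=100):
    """Smallest k >= 1 with z**k == 1, up to floating-point error."""
    for k in range(1, bound + 1):
        if abs(z**k - 1) < 1e-9:
            return k
    raise ValueError("order not found")

# Coprime case: primitive 3rd root * primitive 5th root has order 15.
assert order(primitive_root(3) * primitive_root(5)) == 15

# Prime-power case: a cube root of a primitive 3rd root of unity
# is a primitive 9th root (any of the three cube roots works).
w = primitive_root(3) ** (1.0 / 3)  # one particular cube root
assert order(w) == 9
```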

The small cases \(p = 2,3,5,7\) are tedious exercises: for \(p = 7\), we exploit the fact that the polynomial is palindromic, then solve a cubic.
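For instance, for \(p = 7\), dividing \(x^6 + ... + 1 = 0\) by \(x^3\) and substituting \(y = x + 1/x\) yields the cubic \(y^3 + y^2 - 2y - 1 = 0\) (our derivation), which a quick numerical check confirms:

```python
import cmath

# A primitive 7th root of unity satisfies x^6 + x^5 + ... + 1 = 0.
x = cmath.exp(2j * cmath.pi / 7)
assert abs(sum(x**k for k in range(7))) < 1e-9

# The palindromic substitution y = x + 1/x reduces it to a cubic:
# x^3 + 1/x^3 = y^3 - 3y and x^2 + 1/x^2 = y^2 - 2.
y = x + 1 / x
assert abs(y**3 + y**2 - 2*y - 1) < 1e-9
```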

The prime \(p = 11\) is the smallest case that requires a different approach. The following method is due to Vandermonde, and it generalizes to all primes above 11. (Compare with Gauss' method.)

We wish to solve the equation

\[x^{10} + x^9 + ... + 1 = 0\]

Let \(\beta\) be a primitive 10th root of unity (which we compute by multiplying a primitive square root of unity and a primitive 5th root of unity).

Pick a generator of \(\mathbb{Z}^\times_{11}\) (also known as a primitive root of 11 in number theory), such as 2. Let \(\alpha\) be a primitive 11th root of unity and consider the sequence:

\[\alpha , \alpha^2, \alpha^4, \alpha^8, \alpha^5, \alpha^{10}, \alpha^9, \alpha^7, \alpha^3, \alpha^6\]
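The exponents are just successive powers of the generator 2 modulo 11, which visit every nonzero residue exactly once:

```python
# Successive powers of the generator 2 modulo 11 give the exponent
# sequence for alpha, hitting each nonzero residue exactly once.
exponents = [pow(2, k, 11) for k in range(10)]
assert exponents == [1, 2, 4, 8, 5, 10, 9, 7, 3, 6]
assert sorted(exponents) == list(range(1, 11))
```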

Then the Lagrange resolvent is

\[t = \alpha + \beta \alpha^2 + ... + \beta^9 \alpha^6\]

We show that \(t^{10}\) is known, that is, it can be written in terms of \(\beta\) alone, without any appearances of \(\alpha\). Collect the terms of \(t^{10}\) according to the sequence above:

\[t^{10} = \rho_0(\beta) + \rho_1(\beta)\alpha + \rho_2(\beta)\alpha^2 + \rho_3(\beta)\alpha^4 + ... + \rho_{10}(\beta)\alpha^6\]

where the \(\rho_i\)'s are rational functions with known coefficients and \(\rho_i\) multiplies the \(i\)th term of the sequence. Now if we replace \(\alpha\) by \(\alpha^2\), then \(t\) becomes \(\beta^{-1}t\), hence \(t^{10}\) is unchanged. The substitution shifts each term of the sequence to the next, wrapping around at the end, which means

\[\rho_0 + \rho_1\alpha^2 + \rho_2\alpha^4 + ... + \rho_{10}\alpha = \rho_0 + \rho_1\alpha + \rho_2\alpha^2 + ... + \rho_{10}\alpha^6\]

where the \(\beta\)'s have been omitted for clarity. Thus

\[0 = (\rho_1 - \rho_{10})\alpha + (\rho_2 - \rho_1)\alpha^2 +...+ (\rho_{10} - \rho_9)\alpha^6\]

But since \(\alpha ,..., \alpha^{10}\) are linearly independent (otherwise \(x^{10} +... + 1 = 0\) would be reducible), we must have \(\rho_1 - \rho_{10} = 0, ...\), that is, \(\rho_1 = \rho_2 = ... = \rho_{10} = \rho\) for some \(\rho\). Then

\[t^{10} = \rho_0 + \rho(\alpha + ... + \alpha^{10}) = \rho_0 - \rho\]

which is independent of \(\alpha\) and thus known.

For \(i = 1,...,10\) let \(t_i\) be equal to \(t\) where every \(\beta\) has been replaced by \(\beta^i\). A similar argument to the one used above shows that \(t_i^{10}\) is known for all \(i\), thus:

\[\alpha = \frac{t_1 + ... + t_{10}}{10} = \frac{\sqrt[10]{t_1^{10}} + ... + \sqrt[10]{t_{10}^{10}}}{10}\]
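These claims are easy to check numerically. The sketch below (with specific choices of \(\alpha\) and \(\beta\)) verifies both that \(t^{10}\) is unchanged under \(\alpha \mapsto \alpha^2\) and that averaging the \(t_i\) recovers \(\alpha\):

```python
import cmath

alpha = cmath.exp(2j * cmath.pi / 11)   # a primitive 11th root of unity
beta = cmath.exp(2j * cmath.pi / 10)    # a primitive 10th root of unity
exponents = [pow(2, k, 11) for k in range(10)]

def resolvent(a, b):
    """Lagrange resolvent: sum of b^k * a^(2^k mod 11)."""
    return sum(b**k * a**e for k, e in enumerate(exponents))

# t^10 is unchanged when alpha is replaced by alpha^2 ...
t = resolvent(alpha, beta)
assert abs(t**10 - resolvent(alpha**2, beta)**10) < 1e-6

# ... and averaging t_1, ..., t_10 recovers alpha.
avg = sum(resolvent(alpha, beta**i) for i in range(1, 11)) / 10
assert abs(avg - alpha) < 1e-9
```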

But there’s a catch. Consider \(5 = 2 + 3 = \sqrt[4]{2^4} + \sqrt[4]{3^4}\). This only holds if the correct fourth roots are taken. For example, according to our definition of a radical, it’s perfectly valid for the right-hand side to evaluate to \(2i - 3i\).

Similarly, our equation for \(\alpha\) only holds if the correct 10th roots are taken 10 times in a row. Recall our method should work no matter which root we get, so really we must add a step: we multiply each of the ten 10th roots by each of the 10th roots of unity, then search the resulting combinations for valid solutions for \(\alpha\) by trial and error.

A better method is to prove that \(u_i = t_i t_1^{10-i}\) is known, which allows us to determine each \(t_i\) from \(t_1\) without taking more roots. This is trivially true for \(u_{10} = t_{10}\), which is known (and in fact \(t_{10} = -1\)). Then if \(t' = \sqrt[10]{t^{10}}\), we have:

\[\alpha = \frac{t' + u_2/t'^8 + u_3/t'^7 + ... + u_{10}}{10}\]

At first glance, we might think we’d have to multiply \(t'\) by each 10th root of unity to recover \(t\), but it turns out \(\alpha\) is a primitive 11th root of unity so long as \(t'^{10} = t^{10}\). Roughly speaking, replacing \(t\) with \(t\) multiplied by some 10th root of unity is like starting the Lagrange resolvent at a different item in the sequence.
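A self-contained numerical sketch of this recipe (\(\beta\), \(\alpha\), and the resolvent as defined in the text) confirms that every choice of 10th root yields a primitive 11th root of unity:

```python
import cmath

alpha = cmath.exp(2j * cmath.pi / 11)   # a primitive 11th root of unity
beta = cmath.exp(2j * cmath.pi / 10)    # a primitive 10th root of unity
exponents = [pow(2, k, 11) for k in range(10)]

def resolvent(a, b):
    """Lagrange resolvent: sum of b^k * a^(2^k mod 11)."""
    return sum(b**k * a**e for k, e in enumerate(exponents))

# t[i] is t with beta replaced by beta^i; index 0 is unused padding.
t = [None] + [resolvent(alpha, beta**i) for i in range(1, 11)]
u = {i: t[i] * t[1]**(10 - i) for i in range(2, 11)}
assert abs(t[10] - (-1)) < 1e-9   # indeed t_10 = -1

# Any 10th root t' of t^10 yields some primitive 11th root of unity.
for j in range(10):
    tp = beta**j * t[1]           # each possible value of the radical
    a = (tp + sum(u[i] / tp**(10 - i) for i in range(2, 11))) / 10
    assert abs(a**11 - 1) < 1e-6 and abs(a - 1) > 0.1  # primitive
```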

Real radical

Once upon a time, I assumed \(\sqrt[n]{a}\) was only defined for real \(a\), and furthermore, it would evaluate to a purely imaginary root for negative \(a\) and a real root otherwise. This fit with how I imagined classic mathematics. Back then, complex numbers were a little naughty: they might be necessary, but ideally only as stepping stones to reach respectable real solutions of an equation.

With this simple definition of a radical, for non-negative inputs, we could use a crude method such as bisection to get arbitrarily close to the root. For negative inputs, we could simply solve the problem for the absolute value of the input, then multiply by \(i\).
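A sketch of the bisection idea for the non-negative case:

```python
def real_nth_root(a, n, tol=1e-12):
    """Approximate the real n-th root of a >= 0 by bisection."""
    lo, hi = 0.0, max(1.0, a)   # the root lies in [0, max(1, a)]
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid**n < a:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

assert abs(real_nth_root(32, 5) - 2) < 1e-9
```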

However, with this interpretation, we can no longer solve \(x^n = 1\) with radicals. Consider \(n = 25\). Following the above procedure, we first find a primitive 5th root of unity, such as:

\[\alpha = \frac{-1-\sqrt{5}}{4} - i \sqrt{\frac{5 - \sqrt{5}}{8}}\]

We next take a 5th root of \(\alpha\) to obtain a primitive 25th root of unity. However, a radical that only works on reals means we cannot simply write \(\sqrt[5]{\alpha}\). Rather, we must find reals \(a, b\) such that:

\[(a + bi)^5 = \alpha\]

Comparing the real parts of either side:

\[a^5 - 10 a^3 b^2 + 5 a b^4 = \frac{-1-\sqrt{5}}{4}\]

We have \(a^2 + b^2 = 1\) since \(a + bi\) is a root of unity, thus:

\[16 a^5 - 20 a^3 + 5 a + \frac{1+\sqrt{5}}{4} = 0\]

Writing \(a = \cos\theta\) turns the left-hand side into \(\cos 5\theta - \cos 144°\), so the five roots are \(\cos(2\pi k/25)\) for \(k = 2, 3, 8, 13, 18\): all real. Moreover, this quintic is irreducible over \(\mathbb{Q}[\sqrt{5}]\), since each of these roots generates a degree-5 extension of that field. But by the generalized casus irreducibilis, a root of an irreducible polynomial over a real field, all of whose roots are real, can be expressed in real radicals only if the degree is a power of 2. Our quintic has degree 5, hence it is impossible to solve with real radicals.
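A numerical sanity check of this setup, using the standard identity \(\cos 5\theta = 16\cos^5\theta - 20\cos^3\theta + 5\cos\theta\):

```python
import cmath
import math

# alpha = cos(-144 deg) + i sin(-144 deg), the primitive 5th root above.
alpha = cmath.exp(-2j * cmath.pi * 2 / 5)
assert abs(alpha.real - (-1 - math.sqrt(5)) / 4) < 1e-12

# A 5th root of alpha lies on the unit circle: a + bi with a^2 + b^2 = 1.
z = cmath.exp(-2j * cmath.pi * 2 / 25)   # one primitive 25th root of unity
a, b = z.real, z.imag
assert abs((a + b*1j)**5 - alpha) < 1e-12
assert abs(a**2 + b**2 - 1) < 1e-12

# With a = cos(theta), the real part of (a+bi)^5 is cos(5*theta)
# = 16a^5 - 20a^3 + 5a, so a satisfies a quintic with known coefficients.
assert abs(16*a**5 - 20*a**3 + 5*a + (1 + math.sqrt(5)) / 4) < 1e-12
```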

Therefore, it seems we must abandon our simplistic definition of radicals and resign ourselves to our first definition. This raises questions: what did early mathematicians think of computing \(n\)th roots of complex numbers? Would del Ferro, Tartaglia, and Cardano have considered it an acceptable operation? How would they find approximations? Newton’s method only burst onto the scene later on, and it took even longer to figure out how to avoid pitfalls when complex numbers are involved.

Ben Lynn 💡