# Modular Exponentiation

Suppose we are asked to compute \(3^5\) modulo \(7\). We could calculate \(3^5 = 243\) and then reduce \(243\) mod \(7\), but a better way is to observe \(3^4 = (3^2)^2\). Since \(3^2 = 9 = 2\) we have \(3^4 = 2^2 = 4\), and lastly \(3^5 = 3^4 \cdot 3 = 4 \cdot 3 = 12 = 5 \pmod{7}\).

The second way is better because the numbers involved are smaller.

This trick, known as repeated squaring, allows us to compute \(a^k\) mod \(n\) using only \(O(\log k)\) modular multiplications. (We can use the same trick when exponentiating integers, but then the multiplications are not modular multiplications, and each multiplication takes at least twice as long as the previous one.)
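As a sketch of the idea (the function name and structure are my own, not from the text), repeated squaring processes the bits of the exponent, squaring at each step and multiplying in the base whenever a bit is set:

```python
def pow_mod(a, k, n):
    """Compute a**k mod n by repeated squaring.

    Only O(log k) modular multiplications are needed, and every
    intermediate value stays smaller than n**2.
    """
    result = 1
    a %= n
    while k > 0:
        if k & 1:                    # current low bit of k is set:
            result = result * a % n  # multiply it into the answer
        a = a * a % n                # square, then reduce mod n
        k >>= 1                      # move to the next bit of k
    return result

print(pow_mod(3, 5, 7))  # the worked example: 5
```

Python's built-in three-argument `pow(3, 5, 7)` does the same thing; the sketch just makes the squaring explicit.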

## The Discrete Log Problem

Let us examine the behaviour of the successive powers of \(3\) modulo \(7\):

\[3^1 = 3,\quad 3^2 = 2,\quad 3^3 = 6,\quad 3^4 = 4,\quad 3^5 = 5,\quad 3^6 = 1.\]

Note we compute each power by multiplying the previous answer by \(3\) then reducing modulo \(7\). Beyond this, the sequence repeats itself (why?):

\[3^7 = 3,\quad 3^8 = 2,\quad 3^9 = 6,\]

and so on.
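These successive powers can be generated mechanically, multiplying the previous answer by \(3\) and reducing modulo \(7\) each time (a small sketch of my own):

```python
p, g = 7, 3
powers = []
power = 1
for i in range(1, 13):
    power = power * g % p    # multiply the previous answer by 3, reduce mod 7
    powers.append(power)
print(powers)                # the sequence 3, 2, 6, 4, 5, 1 repeats
```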

At a glance, the sequence \(3, 2, 6, 4, 5, 1\) seems to have no order or structure whatsoever. In fact, although there are things we can say about this sequence (for example, members three elements apart add up to 7), it turns out that so little is known about the behaviour of this sequence that the following problem is difficult to solve efficiently:

(The discrete log problem) Let \(p\) be a prime, and \(g, h\) be two elements of \(\mathbb{Z}_p^*\). Suppose it is known that \(g^x = h \pmod{p}\). Then what is \(x\)?

**Example**: One instance of the discrete log problem:
find \(x\) so that \(3^x = 6 \pmod{7}\). (Answer: \(x = 3\). Strictly speaking,
any \(x = 3 \pmod{6}\) will work.)

Recall that when we first encountered modular inversion, we argued we could try every element in turn to find an inverse, but that this was too slow to be used in practice. The same is true for discrete logs: we could try every possible power until we find it, but this is impractical.

Euclid’s algorithm gave us a fast way to compute inverses. However, no fast algorithm for finding discrete logs is known. The best discrete log algorithms known are faster than trying every element, but are not polynomial time.
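For tiny moduli the impractical brute-force search is easy to write down (a sketch, with names of my own choosing):

```python
def discrete_log(g, h, p):
    """Find the smallest x >= 1 with g**x = h (mod p) by trying every power.

    Takes up to p - 1 multiplications: exponential in the bit length
    of p, so fine for p = 7 but hopeless at cryptographic sizes.
    """
    power = g % p
    for x in range(1, p):
        if power == h % p:
            return x
        power = power * g % p
    return None  # h is not a power of g

print(discrete_log(3, 6, 7))  # the example above: x = 3
```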

## Nonunits

Why don’t we bother studying the behaviour of nonunits under exponentiation?

First consider when \(n = p^k\) for some prime \(p\). Then \(a\in\mathbb{Z}_n\) is a nonunit exactly when \(\gcd(a, n) \gt 1\), which in this case means \(a = d p\) for some \(d\).

We have \(a^k = d^k p^k = 0\), thus in at most \(k\) steps we hit zero, which is uninteresting (at least for our purposes).
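To see this concretely (a small check of my own): take \(n = 27 = 3^3\) and the nonunit \(a = 6 = 2\cdot 3\). Each successive power gains another factor of \(3\), so zero arrives within \(k = 3\) steps:

```python
n, k = 27, 3    # n = 3**3
a = 6           # nonunit: a = 2 * 3, so gcd(a, n) = 3 > 1
powers = [a**i % n for i in range(1, k + 1)]
print(powers)   # 6, then 36 = 9 (mod 27), then 216 = 0 (mod 27)
```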

In general, write \(n=p_1^{k_1}\cdots p_m^{k_m}\). By the Chinese Remainder Theorem we have

\[\mathbb{Z}_n \cong \mathbb{Z}_{p_1^{k_1}} \times \cdots \times \mathbb{Z}_{p_m^{k_m}}.\]
Thus an element \(a\in\mathbb{Z}_n\) corresponds to some element \((a_1,...,a_m)\) on the right-hand side, and \(a\) is a nonunit exactly when at least one of the \(a_i\) is a multiple of \(p_i\). From the above, the \(i\)th component reaches zero in at most \(k_i\) steps, so for some \(k\), each \(a_i^k\) is zero or a unit; hence we can restrict our study to units.
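A small illustration of my own: in \(\mathbb{Z}_{12}\cong\mathbb{Z}_4\times\mathbb{Z}_3\), the powers of the nonunit \(2\) quickly settle into components that are each zero or a unit: the \(\mathbb{Z}_4\)-component dies while the \(\mathbb{Z}_3\)-component remains a unit:

```python
n = 12    # n = 2**2 * 3; by CRT, Z_12 corresponds to Z_4 x Z_3
a = 2     # nonunit: a shares the factor 2 with n
pairs = []
power = 1
for i in range(1, 6):
    power = power * a % n
    pairs.append((power % 4, power % 3))  # the CRT components of a**i
print(pairs)  # after the first step, the Z_4 component is stuck at zero
```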

Note we have again followed an earlier suggestion: we handle the prime power case first and then generalize using the Chinese Remainder Theorem.

*blynn@cs.stanford.edu*