Discrete Problems

Cryptography requires problems that are hard in the average case, so we begin by recalling the basic notions of discrete probability.

For our purposes, a probability space is a finite set $\Omega = \{0,1\}^n$ together with a function $Pr : 2^\Omega \rightarrow [0,1]$ such that $Pr[\Omega] = 1$ and $Pr[F] = \sum_{x\in F} Pr[x]$ for all $F \subseteq \Omega$, where $Pr[x]$ abbreviates $Pr[\{x\}]$.

(See Wikipedia for the general definition of a probability space. Briefly, a probability space is a set $S$, a $\sigma$-algebra $X$ on $S$, and a measure $P$ on $X$ such that $P(S) = 1$.)
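For concreteness, here is a minimal Python sketch of the finite definition above, using the uniform distribution as an example; the helper names are illustrative, not from the notes.

```python
# A minimal sketch (illustrative names) of a finite probability space
# Omega = {0,1}^n with the uniform distribution, where Pr[F] = sum of Pr[x] over x in F.
from itertools import product

def uniform_space(n):
    """Return Omega = {0,1}^n as a dict mapping each point x to Pr[x] = 1/|Omega|."""
    omega = list(product((0, 1), repeat=n))
    return {x: 1 / len(omega) for x in omega}

def prob(space, event):
    """Pr[F]: sum the point probabilities over the event F, a subset of Omega."""
    return sum(p for x, p in space.items() if x in event)

space = uniform_space(3)
F = {x for x in space if x[0] == 1}   # the event "first bit is 1"
print(prob(space, F))                  # 0.5
```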

Type 1 Random Variables

A Type 1 RV is a function $X : \Omega \rightarrow S$, for some finite set $S$.

A predicate is a function $Q : S \rightarrow \{0,1\}$, i.e., it maps each element of $S$ to a boolean. Define $Pr[Q(X)] = \sum_{y \in \Omega,\, Q(X(y))=1}Pr[y]$.
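As a quick illustration (again with made-up helper names), $Pr[Q(X)]$ can be computed by summing over the sample points where the predicate holds:

```python
# Sketch: Pr[Q(X)] = sum of Pr[y] over the y in Omega with Q(X(y)) = 1.
from itertools import product

def prob_of_predicate(space, X, Q):
    """space maps points of Omega to probabilities; X is a type-1 RV; Q is a predicate on S."""
    return sum(p for y, p in space.items() if Q(X(y)))

# Example: X(y) = number of ones in a uniform 3-bit string, Q(s) = "s is even".
space = {y: 1 / 8 for y in product((0, 1), repeat=3)}
print(prob_of_predicate(space, X=sum, Q=lambda s: s % 2 == 0))   # 0.5
```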

Some more notation:

  • $X \leftarrow \Omega$ defines a uniform random variable on $\Omega$ (e.g., $X \leftarrow \{0,1\}^n$). Formally, $Pr[A] = |A|/|\Omega|$ for every $A \subseteq \Omega$ and $X$ is the identity function, so $Pr[X=a] = 1/|\Omega|$ for all $a \in \Omega$.

  • $A(x,r)$ denotes a randomized algorithm: $x$ is the input, and $r \in \{0,1\}^m = \Omega$ is the randomness.

  • $X \leftarrow A(y)$ denotes the random variable $X$ given by the output of algorithm $A$ on input $y$; since $A$ is randomized, its output on a fixed input is a probability distribution (over the uniform choice of the randomness $r$).

  • For a predicate $Q$, define $Pr[Q(x) \mid x \leftarrow A(y)]$ as the probability that $Q(x)$ holds for the output of $A$ on input $y$; this is the quantity estimated by sampling in the sketch after this list.
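The conditional notation is most easily read operationally: run $A(y)$ with fresh randomness and test $Q$ on the output. A hedged sketch, with a toy algorithm $A$, predicate $Q$, input, and trial count chosen only for illustration:

```python
# Sketch: estimating Pr[Q(x) | x <- A(y)] by repeated sampling.
import random

def A(y):
    """Toy randomized algorithm: output y plus one uniform random bit."""
    return y + random.randrange(2)

def Q(x):
    """Predicate: is x even?"""
    return x % 2 == 0

y, trials = 10, 100_000
estimate = sum(Q(A(y)) for _ in range(trials)) / trials
print(estimate)   # close to 0.5, since the added bit is uniform
```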

Example: Let $A(n)$ be a randomized algorithm that outputs an odd prime less than or equal to $n$. Then

\[Pr[(\frac{X}{p}) = 1 \mid p \leftarrow A(n), X \leftarrow \mathbb{F}_p^*] = 1/2, \]

where $(\frac{X}{p})$ denotes the Legendre symbol; the probability is $1/2$ because, for every odd prime $p$, exactly half of the elements of $\mathbb{F}_p^*$ are quadratic residues.
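This can be checked directly with Euler's criterion, $(\frac{X}{p}) \equiv X^{(p-1)/2} \pmod p$; the sketch below fixes one arbitrary odd prime in place of a sample from $A(n)$:

```python
# Sketch: for a fixed odd prime p (standing in for p <- A(n)), exactly half of
# the elements of F_p^* are quadratic residues, so Pr[(X/p) = 1] = 1/2.
p = 1009   # an arbitrary odd prime chosen for illustration

def legendre(x, p):
    """Euler's criterion: x^((p-1)/2) mod p is 1 for residues and p-1 otherwise."""
    return pow(x, (p - 1) // 2, p)

residues = sum(1 for x in range(1, p) if legendre(x, p) == 1)
print(residues / (p - 1))   # exactly 0.5
```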

Type 2 Random Variables

A Type 2 RV is a function $X : \Omega \rightarrow \mathbb{R}$.

We shall always use a uniform distribution on $\Omega$, which means that for all $r \in \mathbb{R}$,

\[Pr[X=r] = |\{y \in \Omega : X(y) = r\}|/|\Omega|.\]
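For example, the distribution of the Hamming weight of a uniform 3-bit string can be tabulated directly from this formula (the choice of $X$ is arbitrary):

```python
# Sketch: Pr[X = r] = |{y : X(y) = r}| / |Omega| for X(y) = number of ones in y,
# with y uniform over {0,1}^3.
from itertools import product
from collections import Counter

omega = list(product((0, 1), repeat=3))
counts = Counter(sum(y) for y in omega)
dist = {r: c / len(omega) for r, c in counts.items()}
print(dist)   # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```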

Define

\[ E[X] = \sum_{y \in \Omega}Pr[y]X(y) \]

and

\[ \sigma^2(X) = E[X^2] - (E[X])^2 = E[(X - E[X])^2].\]
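Continuing the Hamming-weight example, both quantities can be computed directly from these definitions:

```python
# Sketch: E[X] and sigma^2(X) computed from the definitions for
# X(y) = number of ones in y, with y uniform over {0,1}^3.
from itertools import product

omega = list(product((0, 1), repeat=3))
Pr = 1 / len(omega)
X = sum                                     # X(y) = Hamming weight of y

EX  = sum(Pr * X(y) for y in omega)         # E[X]   = 1.5
EX2 = sum(Pr * X(y) ** 2 for y in omega)    # E[X^2] = 3.0
print(EX, EX2 - EX ** 2)                    # 1.5 0.75  (variance = n/4 for fair bits)
```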

$X, Y$ are independent if for all $a,b$ we have

\[Pr[X=a \wedge Y=b] = Pr[X=a]Pr[Y=b].\]
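A brute-force check of this definition for the two bits of a uniform string in $\{0,1\}^2$ (an arbitrary example):

```python
# Sketch: the two coordinates of a uniform string in {0,1}^2 are independent;
# we verify Pr[X=a and Y=b] = Pr[X=a] * Pr[Y=b] by enumeration.
from itertools import product

omega = list(product((0, 1), repeat=2))
Pr = 1 / len(omega)
X = lambda y: y[0]
Y = lambda y: y[1]

for a, b in product((0, 1), repeat=2):
    joint    = sum(Pr for y in omega if X(y) == a and Y(y) == b)
    marginal = sum(Pr for y in omega if X(y) == a) * sum(Pr for y in omega if Y(y) == b)
    assert joint == marginal
print("independent")
```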

Large Deviation Bounds

Markov’s Inequality:

\[Pr[X > \lambda] \le E[|X|]/\lambda \quad \text{for all } \lambda > 0.\]
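As a sanity check, the Markov bound can be compared against the exact tail probability of the Hamming weight of a uniform 8-bit string (the choices of $n$ and $\lambda$ are arbitrary):

```python
# Sketch: exact Pr[X >= lambda] versus the Markov bound E[|X|]/lambda, for
# X(y) = number of ones in y with y uniform over {0,1}^8.
from itertools import product

omega = list(product((0, 1), repeat=8))
Pr = 1 / len(omega)
X = sum                                   # E[X] = 4

E_absX = sum(Pr * abs(X(y)) for y in omega)
lam = 6
tail = sum(Pr for y in omega if X(y) >= lam)
print(tail, E_absX / lam)                 # 0.1445...  <=  0.6666...
```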

Chebyshev’s Inequality:

\[Pr[|X-E[X]|\ge\lambda\sigma(X)]\le 1/\lambda^2.\]
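The same example shows that Chebyshev is much tighter than Markov once the variance is known:

```python
# Sketch: exact Pr[|X - E[X]| >= lambda*sigma(X)] versus the Chebyshev bound
# 1/lambda^2, for the Hamming weight of a uniform 8-bit string (sigma = sqrt(2)).
from itertools import product
from math import sqrt

omega = list(product((0, 1), repeat=8))
Pr = 1 / len(omega)
X = sum

EX = sum(Pr * X(y) for y in omega)                        # 4.0
sigma = sqrt(sum(Pr * (X(y) - EX) ** 2 for y in omega))   # sqrt(2)
lam = 2
tail = sum(Pr for y in omega if abs(X(y) - EX) >= lam * sigma)
print(tail, 1 / lam ** 2)                                 # 0.0703...  <=  0.25
```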

Example: Suppose $X_1, ..., X_n$ are pairwise independent random variables such that $Pr[X_i = 1] = p$ and $Pr[X_i = 0] = 1-p$. Let $X = \sum_{i=1}^{n} X_i$. Then $E[X] = np$ and, since pairwise independence already makes the covariances vanish, $\sigma^2(X) = np(1-p)$; Chebyshev's inequality therefore gives $Pr[|X - np| \ge \lambda\sqrt{np(1-p)}] \le 1/\lambda^2$.
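A concrete pairwise independent family, the standard XOR construction (not described in these notes, used here only to check the claim), confirms that the variance of the sum is the same as in the fully independent case:

```python
# Sketch: the XOR construction turns k independent uniform bits into 2^k - 1
# pairwise independent (but not mutually independent) uniform bits; the variance
# of their sum is still n*p*(1-p) = n/4, which is all that Chebyshev needs.
from itertools import product

k = 4
subsets = [S for S in product((0, 1), repeat=k) if any(S)]   # nonempty subsets of {1..k}
omega = list(product((0, 1), repeat=k))                       # the k truly random bits
Pr = 1 / len(omega)

def X(y):
    """Sum of the 2^k - 1 derived bits: bit_S(y) = parity of the bits of y selected by S."""
    return sum(sum(s * b for s, b in zip(S, y)) % 2 for S in subsets)

n = len(subsets)                                              # n = 15, p = 1/2
EX  = sum(Pr * X(y) for y in omega)
var = sum(Pr * (X(y) - EX) ** 2 for y in omega)
print(EX, var, n / 4)                                         # 7.5 3.75 3.75
```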

Chernoff Bound: For $i=1,...,n$, let $x_i = 0$ with probability $p$ and $x_i = 1$ otherwise. Suppose the $x_i$ are independent (a stronger assumption than pairwise independence), and define $X = \sum_{i=1}^{n}x_i.$ Then

\[Pr[|X - E[X]| \ge \sqrt{n}\lambda]\le 2e^{-2\lambda^2}.\]
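A quick empirical comparison for fair coins (with arbitrary choices of $n$, $\lambda$, and trial count) shows how much slack the bound leaves:

```python
# Sketch: empirical Pr[|X - E[X]| >= sqrt(n)*lambda] for n independent flips
# with Pr[x_i = 1] = 1 - p, compared to the Chernoff bound 2*exp(-2*lambda^2).
import random
from math import exp, sqrt

n, p, lam, trials = 400, 0.5, 1.5, 20_000
EX = n * (1 - p)                                  # E[X] = n(1-p)
hits = 0
for _ in range(trials):
    X = sum(1 for _ in range(n) if random.random() > p)   # x_i = 1 with prob 1-p
    if abs(X - EX) >= sqrt(n) * lam:
        hits += 1
print(hits / trials, 2 * exp(-2 * lam ** 2))      # roughly 0.003 versus the bound 0.0222
```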