Cramér's theorem (large deviations)

Cramér's theorem is a fundamental result in the theory of large deviations, a subdiscipline of probability theory. It determines the rate function of the sample means of a sequence of iid random variables. A weak version of this result was first proved by Harald Cramér in 1938.

Statement

The logarithmic moment generating function (which is the cumulant-generating function) of a random variable [math]\displaystyle{ X_1 }[/math] is defined as:

[math]\displaystyle{ \Lambda(t)=\log \operatorname E [\exp(tX_1)]. }[/math]

Let [math]\displaystyle{ X_1, X_2, \dots }[/math] be a sequence of iid real random variables with finite logarithmic moment generating function, i.e. [math]\displaystyle{ \Lambda(t) \lt \infty }[/math] for all [math]\displaystyle{ t \in \mathbb R }[/math].
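
For example (a standard illustrative case, added here for concreteness), if [math]\displaystyle{ X_1 }[/math] is standard normally distributed, then [math]\displaystyle{ \operatorname E[\exp(tX_1)] = \exp(t^2/2) }[/math], so that [math]\displaystyle{ \Lambda(t) = \tfrac{t^2}{2} \lt \infty }[/math] for every [math]\displaystyle{ t \in \mathbb R }[/math], and the finiteness assumption holds.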

Then the Legendre transform of [math]\displaystyle{ \Lambda }[/math]:

[math]\displaystyle{ \Lambda^*(x):= \sup_{t \in \mathbb R} \left(tx-\Lambda(t) \right) }[/math]

satisfies

[math]\displaystyle{ \lim_{n \to \infty} \frac 1n \log \left(P\left(\sum_{i=1}^n X_i \geq nx \right)\right) = -\Lambda^*(x) }[/math]

for all [math]\displaystyle{ x \gt \operatorname E[X_1]. }[/math]
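
Continuing the standard normal example, the supremum in the Legendre transform is attained at [math]\displaystyle{ t = x }[/math], giving

[math]\displaystyle{ \Lambda^*(x) = \sup_{t \in \mathbb R} \left(tx - \tfrac{t^2}{2}\right) = \frac{x^2}{2}, }[/math]

so the theorem yields [math]\displaystyle{ \lim_{n \to \infty} \frac 1n \log P\left(\sum_{i=1}^n X_i \geq nx \right) = -\frac{x^2}{2} }[/math] for [math]\displaystyle{ x \gt 0 }[/math]. This agrees with the exact distribution: here [math]\displaystyle{ \sum_{i=1}^n X_i \sim \mathcal N(0,n) }[/math], whose tail [math]\displaystyle{ P(\mathcal N(0,n) \geq nx) }[/math] decays like [math]\displaystyle{ e^{-nx^2/2} }[/math] up to a polynomial factor.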

In the terminology of the theory of large deviations, the result can be reformulated as follows:

If [math]\displaystyle{ X_1, X_2, \dots }[/math] is a sequence of iid real random variables as above, then the distributions [math]\displaystyle{ \left(\mathcal L ( \tfrac 1n \sum_{i=1}^n X_i) \right)_{n \in \mathbb N} }[/math] satisfy a large deviation principle with rate function [math]\displaystyle{ \Lambda^* }[/math].
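
The convergence can also be checked numerically in cases where the tail probability is computable exactly. The sketch below is an added illustration, not part of the original article: it assumes iid Bernoulli variables with [math]\displaystyle{ p = \tfrac12 }[/math], for which [math]\displaystyle{ \Lambda^*(x) = x\log\tfrac{x}{p} + (1-x)\log\tfrac{1-x}{1-p} }[/math], and compares [math]\displaystyle{ \tfrac 1n \log P\left(\sum_{i=1}^n X_i \geq nx\right) }[/math] with [math]\displaystyle{ -\Lambda^*(x) }[/math] using SciPy's exact binomial tail rather than simulation.

```python
import numpy as np
from scipy.stats import binom

# Illustrative check of Cramér's theorem for iid Bernoulli(1/2) variables.
# The binomial tail P(S_n >= n*x) is computed exactly, so no simulation
# is needed; Bernoulli(1/2) is an assumed example, not from the article.

p = 0.5   # success probability; E[X_1] = p = 0.5
x = 0.8   # threshold, chosen so that x > E[X_1]

# Legendre transform of Lambda for Bernoulli(p): the relative entropy
# Lambda*(x) = x log(x/p) + (1-x) log((1-x)/(1-p)).
rate = x * np.log(x / p) + (1 - x) * np.log((1 - x) / (1 - p))

for n in (10, 100, 1000, 10000):
    k = int(np.ceil(n * x))        # smallest integer k with k >= n*x
    tail = binom.sf(k - 1, n, p)   # P(S_n >= k), i.e. P(S_n >= n*x)
    print(f"n={n:6d}   (1/n) log P = {np.log(tail) / n:+.5f}   "
          f"-Lambda*(x) = {-rate:+.5f}")
```

As [math]\displaystyle{ n }[/math] increases, the printed values approach [math]\displaystyle{ -\Lambda^*(0.8) \approx -0.1927 }[/math], illustrating the exponential decay rate predicted by the theorem.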
