Here’s a bunch of exponential family distributions. Recall:
\begin{equation} p\qty(x;\eta) = b\qty(x) \exp \qty(\eta^{T}T\qty(x) - a\qty(\eta)) \end{equation}
The normal, Bernoulli, Poisson, binomial, negative binomial, geometric, chi-squared, and exponential distributions are all in the exponential family.
Normal distribution
\(\mu\) is the mean, \(\sigma^{2}\) the variance. In \(d\) dimensions, with mean vector \(\mu\) and covariance matrix \(\Sigma\):
\begin{equation} p\qty(x;\mu, \Sigma) = \frac{1}{\qty(2\pi)^{\frac{d}{2}} \det\qty(\Sigma)^{\frac{1}{2}}} \exp \qty(-\frac{1}{2} \qty(x-\mu)^{T}\Sigma^{-1}\qty(x-\mu)) \end{equation}
In one dimension:
\begin{equation} p\qty(x; \mu, \sigma) = \frac{1}{\sqrt{2\pi\sigma^{2}}} \exp \qty(-\frac{\qty(x-\mu)^{2}}{2 \sigma^{2}}) \end{equation}
\begin{equation} \mathbb{E}[x] = \mu \end{equation}
\begin{equation} \text{Var}\qty [x] = \sigma^{2} \end{equation}
This is an exponential family distribution. For \(\sigma^{2} = 1\):
\begin{equation} p\qty(x;\mu) = \frac{1}{\sqrt{2\pi}} \exp \qty(-\frac{1}{2} x^{2}) \exp \qty(\mu x - \frac{1}{2} \mu^{2}) \end{equation}
- \(\eta = \mu\)
- \(T\qty(x) = x\)
- \(a\qty(\eta) = \frac{\mu^{2}}{2} = \frac{\eta^{2}}{2}\)
- \(b\qty(x) = \qty(\frac{1}{\sqrt{2\pi}}) \exp \qty(-\frac{x^{2}}{2})\)
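To see the factorization above, expand the square in the exponent:
\begin{align} \frac{1}{\sqrt{2\pi}} \exp \qty(-\frac{\qty(x-\mu)^{2}}{2}) &= \frac{1}{\sqrt{2\pi}} \exp \qty(-\frac{x^{2}}{2} + \mu x - \frac{\mu^{2}}{2}) = \frac{1}{\sqrt{2\pi}} \exp \qty(-\frac{x^{2}}{2}) \exp \qty(\mu x - \frac{\mu^{2}}{2}) \end{align}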
Bernoulli distribution
Success with probability \(p\), failure with probability \(1-p\).
\begin{equation} p\qty(x; p) = p^{x} \qty(1-p)^{1-x} \end{equation}
\begin{equation} \mathbb{E}[x] = p \end{equation}
\begin{equation} \text{Var}\qty [x] = p\qty(1-p) \end{equation}
This is an exponential family distribution.
\begin{equation} p\qty(x; p) = \exp \qty(\qty(\log \qty(\frac{p}{1-p})) x + \log\qty(1-p)) \end{equation}
- \(\eta = \log \qty(\frac{p}{1-p})\)
- \(T\qty(x) = x\)
- \(a\qty(\eta) = \log \qty(1+e^{\eta})\)
- \(b\qty(x) = 1\)
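The rearrangement into this form is just algebra on the exponent:
\begin{align} p^{x} \qty(1-p)^{1-x} &= \exp \qty(x \log\qty(p) + \qty(1-x) \log\qty(1-p)) = \exp \qty(x \log \qty(\frac{p}{1-p}) + \log\qty(1-p)) \end{align}
Since \(\eta = \log \qty(\frac{p}{1-p})\) gives \(p = \frac{1}{1+e^{-\eta}}\), the log-partition term is \(-\log\qty(1-p) = \log\qty(1+e^{\eta})\).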
Poisson distribution
What is the probability of an event occurring \(x\) times in a unit of time, when the event happens at an average rate of \(\lambda\) per unit time?
\begin{equation} p\qty(x; \lambda) = e^{-\lambda} \frac{\lambda^{x}}{x!} \end{equation}
\begin{equation} \mathbb{E}[x] = \lambda \end{equation}
\begin{equation} \text{Var}[x] = \lambda \end{equation}
This is an exponential family distribution.
\begin{equation} p\qty(x; \lambda) = \frac{1}{x!} \exp \qty(\log\qty(\lambda) x - e^{\log \lambda }) \end{equation}
- \(\eta = \log \qty(\lambda)\)
- \(T\qty(x) = x\)
- \(a\qty(\eta) = \exp \qty(\eta)\)
- \(b\qty(x) = \frac{1}{x!}\)
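As a quick numerical sanity check of this decomposition (a minimal sketch, assuming SciPy is available; the helper name `exp_family_pmf` is just for illustration):

```python
# Compare the direct Poisson pmf against b(x) * exp(eta * T(x) - a(eta)).
import math

import numpy as np
from scipy import stats

def exp_family_pmf(x, eta, T, a, b):
    """Evaluate b(x) * exp(eta * T(x) - a(eta)) for a scalar natural parameter."""
    return b(x) * np.exp(eta * T(x) - a(eta))

lam = 3.0
eta = np.log(lam)  # eta = log(lambda)

for x in range(20):
    direct = stats.poisson.pmf(x, lam)            # e^{-lambda} lambda^x / x!
    factored = exp_family_pmf(
        x,
        eta,
        T=lambda x: x,                            # T(x) = x
        a=lambda eta: np.exp(eta),                # a(eta) = e^eta
        b=lambda x: 1.0 / math.factorial(x),      # b(x) = 1/x!
    )
    assert np.isclose(direct, factored)
```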
Binomial Distribution
What is the probability of getting exactly \(x\) heads in \(n\) coin flips, given that each flip comes up heads with probability \(p\)?
\begin{equation} p\qty(x; p,n) = \mqty(n \\ x) p^{x} \qty(1-p)^{n-x} \end{equation}
\begin{equation} \mathbb{E}[x] = np \end{equation}
\begin{equation} \text{Var}[x] = np\qty(1-p) \end{equation}
This is an exponential family distribution (for fixed \(n\)).
\begin{equation} p\qty(x; p,n) = \mqty(n \\ x) \exp \qty(x \log\qty(\frac{p}{1-p}) - n \log\qty(1+ \qty(\frac{p}{1-p}))) \end{equation}
- \(\eta = \log \qty(\frac{p}{1-p})\)
- \(T\qty(x) = x\)
- \(a\qty(\eta) = n \log \qty(1+\qty(\frac{p}{1-p})) = n \log \qty(1+e^{\eta})\)
- \(b\qty(x) = \mqty(n \\ x)\)
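The same manipulation as in the Bernoulli case gives the form above:
\begin{align} \mqty(n \\ x) p^{x} \qty(1-p)^{n-x} &= \mqty(n \\ x) \exp \qty(x \log\qty(p) + \qty(n-x)\log\qty(1-p)) = \mqty(n \\ x) \exp \qty(x \log\qty(\frac{p}{1-p}) + n \log\qty(1-p)) \end{align}
with \(n \log\qty(1-p) = -n \log\qty(1 + \frac{p}{1-p})\), which is \(-a\qty(\eta)\).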
Negative Binomial Distribution
What is the probability that it takes exactly \(x\) trials to reach the \(k\)-th success, given that each trial succeeds with probability \(p\)?
\begin{equation} p\qty(x; p,k) = \mqty(x-1 \\ k-1) \qty(1-p)^{x-k} p^{k} \end{equation}
\begin{equation} \mathbb{E}[x] = \frac{k}{p} \end{equation}
\begin{equation} \text{Var}[x] = \frac{k\qty(1-p)}{p^{2}} \end{equation}
This is an exponential family distribution as well (for fixed \(k\)).
\begin{equation} p\qty(x; p, k)= \mqty(x-1 \\ k-1) \exp \qty(\log\qty(1-p)x - k \log \qty(\frac{1-p}{p})) \end{equation}
- \(\eta = \log \qty(1-p)\)
- \(T\qty(x) = x\)
- \(a\qty(\eta) = k \log \qty(\frac{1-p}{p}) = k\qty(\eta - \log\qty(1-e^{\eta}))\)
- \(b\qty(x) = \mqty(x-1 \\ k-1)\)
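Again, rewrite the powers of \(p\) and \(1-p\) inside an exponential:
\begin{align} \mqty(x-1 \\ k-1) \qty(1-p)^{x-k} p^{k} &= \mqty(x-1 \\ k-1) \exp \qty(\qty(x-k)\log\qty(1-p) + k \log\qty(p)) = \mqty(x-1 \\ k-1) \exp \qty(\log\qty(1-p) x - k \log \qty(\frac{1-p}{p})) \end{align}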
Geometric Distribution
How many trials do you have to run to get the first success? This is the above with \(k=1\), which gives:
\begin{equation} p\qty(x; p) = \qty(1-p)^{x-1} p \end{equation}
Its exponential family form follows by setting \(k=1\) in the expressions above.
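Explicitly, with \(k=1\):
\begin{equation} p\qty(x; p) = \exp \qty(\log\qty(1-p) x - \log \qty(\frac{1-p}{p})) \end{equation}
- \(\eta = \log \qty(1-p)\)
- \(T\qty(x) = x\)
- \(a\qty(\eta) = \log \qty(\frac{1-p}{p}) = \eta - \log\qty(1-e^{\eta})\)
- \(b\qty(x) = 1\)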
