“If sample size is large and IID, the sampling distribution is normal. The larger \(N\) is, the more normal the resulting shape is.”

We can use the central limit theorem to approximate the distribution of a sum of IID random variables:

Let \(X_{1}, \dots, X_{n}\) be IID random variables with \(E[X_{j}] = \mu\) and \(Var(X_{j}) = \sigma^{2}\).

We have that:

\begin{equation} \sum_{i=1}^{n} X_{i} \sim N(n\mu, n \sigma^{2}), \text{as}\ n \to \infty \end{equation}

That is, if you sum enough IID copies of a random variable (normalizing appropriately), the distribution of the sum gets closer and closer to a normal distribution.
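A quick simulation sketch of this, using Uniform(0, 1) as the underlying variable (an assumed example, not from the notes above): here \(\mu = 1/2\) and \(\sigma^{2} = 1/12\), so by the CLT the sum of \(n\) draws should have mean near \(n/2\) and variance near \(n/12\).

```python
import random
import statistics

# Sum n IID Uniform(0, 1) draws; repeat many times and check that the
# sample mean and variance of the sums match n*mu and n*sigma^2.
# (n and trials are illustrative choices.)
random.seed(0)
n = 1000        # terms per sum
trials = 2000   # number of sums sampled

sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]

sample_mean = statistics.fmean(sums)   # CLT predicts n * 0.5 = 500
sample_var = statistics.variance(sums) # CLT predicts n / 12 ≈ 83.3

print(sample_mean)
print(sample_var)
```

A histogram of `sums` would also look bell-shaped, which is the distributional claim in the equation above.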

Notably, for the central limit theorem to hold, the variance has to be finite (i.e., the results vary within some finite spread \(\sigma\)). With a finite \(\sigma\), the sum above will eventually converge to the normal distribution. This is useful for the Random Walk Hypothesis.

**REMEMBER: if you are approximating a discrete distribution with the normal, you need a continuity correction!**
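A sketch of why the correction matters, using an assumed example (Binomial(100, 0.5), which the CLT approximates by \(N(50, 25)\)): to estimate \(P(X \le k)\) for a discrete \(X\), evaluate the normal CDF at \(k + 0.5\) rather than at \(k\).

```python
import math

# Normal approximation to Binomial(n, p), with and without the
# continuity correction. Example parameters are illustrative.
n, p = 100, 0.5
mu = n * p                           # 50
sigma = math.sqrt(n * p * (1 - p))   # 5

def normal_cdf(x, mu, sigma):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

k = 45  # estimate P(X <= 45)

# Exact binomial probability for comparison.
exact = sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

without_cc = normal_cdf(k, mu, sigma)        # ignores discreteness
with_cc = normal_cdf(k + 0.5, mu, sigma)     # continuity correction

print(f"exact:           {exact:.4f}")
print(f"no correction:   {without_cc:.4f}")
print(f"with correction: {with_cc:.4f}")
```

The corrected estimate lands much closer to the exact binomial probability, because evaluating at \(k + 0.5\) accounts for the whole probability mass sitting at the integer \(k\).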