Probabilistic Random Generator
A probabilistic random generator works by fooling a particular circuit into treating its output as something random.
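A standard way to make this precise (this formalization is my addition, not spelled out in the note): a generator \(G\) fools a circuit \(C\) if \(C\) cannot distinguish \(G\)'s output on a short random seed \(s\) from a truly random string \(r\), up to some small advantage \(\epsilon\):
\begin{equation} \left| P[C(G(s)) = 1] - P[C(r) = 1] \right| \leq \epsilon \end{equation}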
probability
The probability of an event is the proportion of times the event occurs in many repeated trials. It is “our belief that an event \(E\) occurs”.
“the probability of an outcome is a number between 0 and 1 which highlights how likely the outcome is to occur relative to other outcomes”
Frequentist Definition of Probability
That is, it is a number between \(0\) and \(1\), given by the limiting relative frequency of \(E\):
\begin{equation} P(E) = \lim_{n \to \infty} \frac{n(E)}{n} \end{equation}
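As a concrete illustration (the coin example is my addition): for a fair coin and the event \(E = \text{heads}\), the running proportion of heads settles toward one half as the number of flips grows:
\begin{equation} P(\text{heads}) = \lim_{n \to \infty} \frac{n(\text{heads})}{n} = \frac{1}{2} \end{equation}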
probability density function
A PDF is a function that maps a continuous random variable to the corresponding probability density, from which probabilities are obtained by integration:
\begin{equation} P(a < X < b) = \int_{x=a}^{b} f(X=x)\dd{x} \end{equation}
note: \(f\) is no longer in units of probability!!! it is a density: probability per unit of \(X\). That is, PDF values are DERIVATIVES of probabilities, so the units of \(f\) are \(\frac{prob}{unit\ X}\), and \(f(x)\) can be greater than \(1\).
We have two important properties:
- \(f(x) \geq 0\) for all \(x\)
- \(\int_{-\infty}^{\infty} f(x)\dd{x} = 1\)
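For a worked example (the uniform distribution here is my own illustration, not from the note): take \(X\) uniform on \([0, 0.5]\), so \(f(x) = 2\) on that interval. The density exceeds \(1\), yet every probability stays between \(0\) and \(1\) and the total integral is \(1\):
\begin{equation} P(0.1 < X < 0.3) = \int_{x=0.1}^{0.3} 2\dd{x} = 0.4, \qquad \int_{x=0}^{0.5} 2\dd{x} = 1 \end{equation}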
probability distribution
A probability distribution “assigns probability to outcomes”.
\(X\) follows distribution \(D\). \(X\) is a “\(D\) random variable”, where \(D\) is some distribution (e.g. normal, binomial).
syntax: \(X \sim D\).
Each distribution has three properties (see the example after this list):
- variables (what is being modeled)
- values (what values can they take on)
- parameters (how many degrees of freedom do we have)
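For instance (this concrete example is mine, not from the note), a normally distributed measurement could be written:
\begin{equation} X \sim \mathcal{N}(\mu, \sigma^{2}) \end{equation}
Here the variable is \(X\) (what is being modeled), its values range over the real numbers, and the parameters are \(\mu\) and \(\sigma\).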
Types of Distribution
discrete distribution
- described by PMF
continuous distribution
- described by PDF
parametrized distribution
We often represent a probability distribution using a set of parameters \(\theta_{j}\). For instance, a normal distribution is given by \(\mu\) and \(\sigma\), and a PMF is given by the probability mass for each outcome.
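As a sketch of what \(\theta\) looks like in each case (my notation, not the note's):
\begin{equation} \theta_{\mathcal{N}} = (\mu, \sigma), \qquad \theta_{\text{PMF}} = (p_{1}, \ldots, p_{K}),\ \text{with}\ \sum_{k=1}^{K} p_{k} = 1 \end{equation}
where \(K\) is the number of possible outcomes; because the masses must sum to \(1\), a PMF over \(K\) outcomes has \(K-1\) degrees of freedom.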
probability mass function
A PMF is a function that maps each possible outcome of a discrete random variable to the corresponding actual probability.
For random variable \(Y\), we have:
\begin{equation} f(k) = P(Y=k) \end{equation}
and \(f\) is the PMF: the mapping from a value the random variable can take on to the probability that the random variable takes on that value.
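For a worked example (the die is my addition): let \(Y\) be the result of rolling a fair six-sided die. Its PMF assigns equal mass to each outcome, and the masses sum to \(1\):
\begin{equation} f(k) = P(Y=k) = \frac{1}{6}\ \text{for}\ k \in \{1, \ldots, 6\}, \qquad \sum_{k=1}^{6} f(k) = 1 \end{equation}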
Shorthand
\begin{equation} P(Y=k) = p(y), \quad \text{where}\ y = k \end{equation}
The lowercase \(y\) (written smaller) represents a particular case of \(Y\), i.e. where \(Y=y\).
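So, continuing the die example above (my illustration), the shorthand reads:
\begin{equation} P(Y=3) = p(3) = \frac{1}{6} \end{equation}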