Linear Regression
Suppose you have a dataset of two features. You can write a predictor \(h\qty(x)\) as:
\begin{equation} h\qty(x) = \theta_{0} + \theta_{1} x_{1} + \theta_{2} x_{2} \end{equation}
This is a smidge unwieldy, because we have to keep tacking on \(\theta\) terms (with \(\theta_0\) as an exception) whenever we add new features. So, a trick: set \(x_0 = 1\). This yields, equivalently…
definition
\begin{equation} h\qty(x) = \sum_{j=0}^{m} \theta_{j} x_{j} = \theta^{T} x \end{equation}
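As a minimal sketch (illustrative numbers only, not from the course), here is the predictor in numpy, using the \(x_0 = 1\) trick to fold the intercept \(\theta_0\) into the dot product:

```python
import numpy as np

def h(theta, x):
    """Evaluate h(x) = theta^T x; x is assumed to already include x_0 = 1."""
    return float(theta @ x)

theta = np.array([1.0, 2.0, 3.0])    # (theta_0, theta_1, theta_2)
x_raw = np.array([4.0, 5.0])         # the two original features
x = np.concatenate(([1.0], x_raw))   # x = (1, x_1, x_2)
print(h(theta, x))                   # 1 + 2*4 + 3*5 = 24.0
```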
master theorem
A general formula for solving recurrence relations. It's a generalization of the "tree" method in example 1.
intuition
Every master theorem problem has a struggle between most of the work sitting at the "bottom" of the tree (the number of subproblems explodes, \(a > b^{d}\)) vs. most of the work sitting at the "top" of the tree (problems lower in the tree are smaller, \(a < b^{d}\)).
constituents
Consider:
- \(a\): number of subproblems
- \(b\): factor by which the input size shrinks
- \(d\): we need \(O\qty(n^{d})\) work to merge subproblems
Suppose \(a \geq 1\), \(b > 1\), and \(d\) are constants, and that \(T\qty(n) = aT\qty(\frac{n}{b}) + O\qty(n^{d})\). We then have:
\begin{equation} T\qty(n) = \begin{cases} O\qty(n^{d}), & d > \log_{b} a \\ O\qty(n^{d} \log n), & d = \log_{b} a \\ O\qty(n^{\log_{b} a}), & d < \log_{b} a \end{cases} \end{equation}
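As a sanity check, here's a small Python sketch (the function name `master_case` is hypothetical) that classifies a recurrence into one of the cases above by comparing \(d\) against the critical exponent \(\log_{b} a\):

```python
import math

# Classify the recurrence T(n) = a*T(n/b) + O(n^d) by master theorem case.
def master_case(a, b, d):
    crit = math.log(a, b)        # the critical exponent log_b(a)
    if math.isclose(d, crit):    # a = b^d: every level does equal work
        return f"O(n^{d:g} log n)"
    if d > crit:                 # a < b^d: the top (merge) level dominates
        return f"O(n^{d:g})"
    return f"O(n^{crit:g})"      # a > b^d: the leaves dominate

print(master_case(2, 2, 1))  # merge sort -> O(n^1 log n)
print(master_case(4, 2, 1))  # subproblems explode -> O(n^2)
print(master_case(1, 2, 1))  # top-heavy -> O(n^1)
```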
recurrence relation
Calculating the runtime of a recursive scheme.
requirements
- A recursive function \(T\qty(n)\) in terms of \(T\qty(k), k< n\)
- A base case \(T\qty(1)\)
additional information
motivation
Consider merge sort. Its running time has the shape:
\begin{equation} T\qty(n) = 2 T\qty(\frac{n}{2}) + O\qty(n) \end{equation}
Two recursive sub-sorts, plus the \(O\qty(n)\) merge operation. For the sake of clarity (so as not to mix Big-O notation into the recurrence relation itself), let's write \(O\qty(n) := 11n\):
\begin{equation} T\qty(n) = 2 T\qty(\frac{n}{2}) + 11n \end{equation}
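To make the two terms concrete, here is a sketch of merge sort (my illustration, not the course's reference implementation), with comments marking which line contributes which term of the recurrence:

```python
def merge_sort(xs):
    if len(xs) <= 1:                 # base case: T(1)
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])      # first  T(n/2)
    right = merge_sort(xs[mid:])     # second T(n/2)
    merged = []                      # the O(n) merge: one pass over both halves
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```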
SU-CS161 SEP302025
Key Sequence
Notation
New Concepts
Important Results / Claims
Questions
Interesting Factoids
[redirect] Linear Regression
example: house price prediction
1 dimension
We want to predict sales price from square feet above ground.
\begin{equation} h(x) = \theta_{0} + \theta_{1} x \end{equation}
This makes \(h : \mathbb{R} \to \mathbb{R}\), and \(\theta = \qty(\theta_{0}, \theta_{1})\) are what we call parameters or weights.
d dimensions
\begin{equation} h(x) = \theta_{0} + \sum_{j=1}^{d}\theta_{j}x_{j} \end{equation}
But this is clumsy, so if we come up with a special feature \(x_0 = 1\), we can just write it as the linear model it is:
\begin{equation} h(x) = \sum_{j=0}^{d} \theta_{j} x_{j} = \theta^{T} x \end{equation}
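As a quick sketch of how \(\theta\) might actually be found, here is a least-squares fit on synthetic data (the numbers and the use of `np.linalg.lstsq` are my own illustration, not from these notes):

```python
import numpy as np

rng = np.random.default_rng(0)
X_raw = rng.uniform(500, 3000, size=(100, 1))         # e.g. square footage
y = 50.0 + 0.1 * X_raw[:, 0] + rng.normal(0, 5, 100)  # "price" with noise

X = np.hstack([np.ones((len(X_raw), 1)), X_raw])      # prepend x_0 = 1
theta, *_ = np.linalg.lstsq(X, y, rcond=None)         # min ||X theta - y||^2
print(theta)  # approximately [50.0, 0.1] = (theta_0, theta_1)
```

With the \(x_0 = 1\) column in place, the intercept \(\theta_0\) falls out of the same solve as every other weight, which is the whole point of the trick.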
