Computer Systems Index
Notes on CS 107, C, MIPS, and computational systems.
Lectures
- SU-CS107 SEP272023
- SU-CS107 SEP292023
- SU-CS107 OCT022023
- SU-CS107 OCT032023
- SU-CS107 OCT042023
- SU-CS107 OCT062023
- SU-CS107 OCT092023
- SU-CS107 OCT112023
- SU-CS107 OCT132023
- SU-CS107 OCT182023
- SU-CS107 OCT202023
- SU-CS107 OCT232023
- SU-CS107 OCT252023
- SU-CS107 OCT272023
- SU-CS107 NOV102023
- SU-CS107 NOV132023
- SU-CS107 DEC012023
Worksheets
conceptual grammar
conceptual grammar is the proposed universal grammar that connects semantic primes. In theory, this grammar is universal across languages.
There are three main categories of conceptual grammars:
- Combinatorics (connecting one idea to another)
- Account of valencies? #what
- Propositional complementation (location: “something that happens in this place”)
ConDef Abstract
Current automated lexicography (term definition) techniques cannot include contextual or new-term information as part of their synthesis. We propose a novel data harvesting scheme leveraging lead paragraphs in Wikipedia to train automated context-aware lexicographical models. Furthermore, we present ConDef, a fine-tuned BART trained on the harvested data that defines vocabulary terms from a short context. ConDef is determined to be highly accurate in context-dependent lexicography as validated on ROUGE-1 and ROUGE-L measures on a 1000-item withheld test set, achieving scores of 46.40% and 43.26% respectively. Furthermore, we demonstrate that ConDef’s syntheses serve as good proxies for term definitions by achieving a ROUGE-1 measure of 27.79% directly against gold-standard WordNet definitions.
Accepted to the 2022 SAI Computing Conference, to be published in Springer Nature’s Lecture Notes in Networks and Systems.
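To illustrate the kind of inference ConDef performs, here is a minimal, hedged sketch of context-aware definition generation with a BART model through the Hugging Face transformers API. The generic facebook/bart-base checkpoint and the term-plus-context input format are stand-ins of my own; the abstract does not specify ConDef’s actual checkpoint or formatting.
```python
# Sketch only: a generic BART seq2seq model generating a definition from
# a term and a short context. The checkpoint and input format below are
# placeholders, not ConDef's published setup.
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

term = "cache line"
context = ("When the processor misses in the cache, it loads an entire "
           "cache line from memory rather than a single byte.")

# One plausible formatting: term and context joined by a separator token.
inputs = tokenizer(f"{term} </s> {context}", return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```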
conditional Gaussian model
Say you have one continuous variable \(X\) and one discrete variable \(Y\), and you want to express the probability of \(X\) conditioned on \(Y\) using a Gaussian model:
\begin{equation} p(x|y) = \begin{cases} \mathcal{N}(x \mid \mu_{1}, \sigma_{1}^{2}), y^{1} \\ \dots \\ \mathcal{N}(x \mid \mu_{n}, \sigma_{n}^{2}), y^{n} \end{cases} \end{equation}
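To make this concrete, here is a minimal sketch (my own illustration, not from the notes) of a conditional Gaussian model in Python: the discrete value of \(Y\) selects which mean and variance govern \(X\). The parameter values are hypothetical.
```python
# Conditional Gaussian model: Y picks which Gaussian governs X.
import math

# Hypothetical parameters: one (mean, variance) pair per value of Y.
params = {
    1: (0.0, 1.0),   # mu_1, sigma_1^2
    2: (5.0, 2.0),   # mu_2, sigma_2^2
    3: (-3.0, 0.5),  # mu_3, sigma_3^2
}

def p_x_given_y(x: float, y: int) -> float:
    """Evaluate p(x | y) = N(x | mu_y, sigma_y^2)."""
    mu, var = params[y]
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

print(p_x_given_y(0.5, 1))  # density of X = 0.5 under the Y = 1 component
```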
conditional plan
conditional plan is a technique for representing a POMDP policy. We can represent a conditional plan as a tree.
toy problem
crying baby POMDP problem:
- actions: feed, ignore
- reward: if hungry, negative reward
- state: two states, hungry or not hungry
- observation: noisy crying (she may be crying because she’s genuinely hungry, or crying just for kicks)
formulate a conditional plan
we can create a conditional plan by generating an exponentially growing tree that branches on the observations. This is a policy that tells you what to do given the sequence of observations you have received, with no knowledge of the underlying state.
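Below is a minimal sketch (my own illustration, not from the notes) of a conditional plan as a tree for the crying-baby problem: each node fixes an action, and each observation selects which subtree to follow next.
```python
# Conditional plan as a tree: act at the node, then branch on the observation.
from dataclasses import dataclass, field

@dataclass
class ConditionalPlan:
    action: str                                   # action taken at this node
    subplans: dict = field(default_factory=dict)  # observation -> ConditionalPlan

    def execute(self, observations):
        """Walk the tree: take the root action, then branch on each observation."""
        actions = [self.action]
        node = self
        for obs in observations:
            node = node.subplans.get(obs)
            if node is None:
                break
            actions.append(node.action)
        return actions

# A depth-2 plan: ignore first; feed only if we then observe crying.
plan = ConditionalPlan("ignore", {
    "crying": ConditionalPlan("feed"),
    "quiet":  ConditionalPlan("ignore"),
})

print(plan.execute(["crying"]))  # ['ignore', 'feed']
print(plan.execute(["quiet"]))   # ['ignore', 'ignore']
```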
