Computational Task
Last edited: August 8, 2025
Decision problems
- a decision problem is a function \(\Sigma^{*} \to \{\text{no}, \text{yes}\}\)
- we often identify the “yes” instances of a decision problem with a language \(L \subseteq \Sigma^{*}\)
“Given a boolean formula \(\varphi\), accept iff \(\varphi\) is satisfiable”
Function problems
Given an input \(w\), compute the value
\begin{equation} f(w), \quad f : \Sigma^{*} \to \Sigma^{*} \end{equation}
Note that, for each input, there is a unique answer.
“Given a formula \(\varphi\), output the lexicographically first satisfying assignment, or the number of satisfying assignments”
computer number system
bit
A computer is built out of binary gates:

So, applying a voltage at \(B\) allows current to pass between \(S\) and \(D\); the switch can be on or off, which stores one bit.
byte
A collection of \(8\) bits.
Computer memory is a large array of bytes. It is only BYTE ADDRESSABLE: you can’t address a bit in isolation.
bases
In general, each base uses digits \(0\) to \(\text{base} - 1\).
We prefix 0x to represent hexadecimal, and 0b to represent binary.
Computer Systems Index
Notes on CS 107, C, MIPS, and computational systems.
Lectures
- SU-CS107 SEP272023
- SU-CS107 SEP292023
- SU-CS107 OCT022023
- SU-CS107 OCT032023
- SU-CS107 OCT042023
- SU-CS107 OCT062023
- SU-CS107 OCT092023
- SU-CS107 OCT112023
- SU-CS107 OCT132023
- SU-CS107 OCT182023
- SU-CS107 OCT202023
- SU-CS107 OCT232023
- SU-CS107 OCT252023
- SU-CS107 OCT272023
- SU-CS107 NOV102023
- SU-CS107 NOV132023
- SU-CS107 DEC012023
Worksheets
conceptual grammar
conceptual grammar is a proposed universal grammar which connects semantic primes. In theory, this grammar is universal across languages.
There are three main categories of conceptual grammars:
- Combinatorics (connecting one idea to another)
- Account of valencies? #what
- Propositional complementation (location: “something that happens in this place”)
ConDef Abstract
Current automated lexicography (term definition) techniques cannot include contextual or new-term information as part of their synthesis. We propose a novel data harvesting scheme leveraging lead paragraphs in Wikipedia to train automated context-aware lexicographical models. Furthermore, we present ConDef, a fine-tuned BART trained on the harvested data that defines vocabulary terms from a short context. ConDef is determined to be highly accurate in context-dependent lexicography, as validated on ROUGE-1 and ROUGE-L measures on a 1000-item withheld test set, achieving scores of 46.40% and 43.26% respectively. Furthermore, we demonstrate that ConDef’s syntheses serve as good proxies for term definitions, achieving a ROUGE-1 measure of 27.79% directly against gold-standard WordNet definitions.
Accepted to the 2022 SAI Computing Conference, to be published in Springer Nature’s Lecture Notes on Networks and Systems.
