GDB
GDB is GNU's very own debugger.
- b main or b 72 (set breakpoint on the main function or on line 72)
- r args (run with args)
- p thingname or p 3+5 (print a variable or return value)
- p/t (print as binary), p/x (print as hex)
- info args, info locals (get args, locals)
- n, s, continue (next, step, continue)
#include <stdio.h>

int main(void) {
    int test = 0;
    short lsb = 0xff;
    test |= lsb;     /* set the low 8 bits of test */
    printf("%d\n", lsb);
    return 0;
}
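A rough sketch of a session on the snippet above, assuming it is saved as test.c (the file name is just for illustration; the exact number of steps depends on the compiler):
gcc -g test.c -o test
gdb ./test
(gdb) b main
(gdb) r
(gdb) n
(gdb) n
(gdb) p lsb
(gdb) n
(gdb) p/x test
(gdb) c
After the test |= lsb line has run, p/x test should show the low eight bits set.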
gear
Here's a list of gear that I like and stuff that I use.
clothing
t-shirt
MUJI Washed Heavy Weight Crew Neck
If you ever have the pleasure of buying these in China, you should. It's cheaper there. This is a white T-shirt, it's 100% cotton, it's heavy, you can wipe your glasses with it, why not?
shorts
Amazon Essential Inseam Drawstring
They are very basic shorts: nice quality, they don't seem to break, and they have a drawstring instead of an elastic band, which makes them easy to adjust.
General Inference
See inference.
In general, full joint probability distribution tables are very hard to work with because, for instance with \(n\) binary variables, the table requires \(2^{n}\) entries, which is a lot.
- how do you define very large models?
- how do you perform inference with very large models?
- what can we use from the data to inform the design process?
“If you can tell me a generative story, we can compress our joint probability distribution.” Get ready for… inference with causality with Bayesian networks.
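A quick worked sketch of that compression (the numbers here are my own illustration): a full joint table over \(n\) binary variables has \(2^{n}-1\) free entries, while a Bayesian network in which each variable has at most \(k\) parents needs at most \(n \cdot 2^{k}\) conditional probabilities. For \(n = 20\) and \(k = 3\), that is \(2^{20}-1 \approx 10^{6}\) entries versus at most \(20 \cdot 2^{3} = 160\).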
general relativity
Generalization
Compositionality
Getting the right contents.
- semantic capacity tested — knowing the propositional content
- operationalization — form => meaning mapping
- measure of success — generalizing to the right meaning representation for novel expressions (this is non-trivial; multiple compatible generalizations may be applicable depending on context)
task
Given that a model can map certain expressions to their meaning representations, can it also do this for new expressions?
results
lexical generalization (filling in new words/pairs) isn't too hard for neural networks
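As an illustration of what this means (a SCAN-style example of my own, not from the source): a model trained on pairs like "jump" => JUMP and "walk twice" => WALK WALK should map the novel expression "jump twice" to JUMP JUMP. Lexical generalization of this kind just slots a new word into a known template; structural generalization, where the composition itself is novel, tends to be the harder case.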