Talks
Posters
Takes
- although parsing may be dead for natural language, structure still helps parse scientific information (e.g. drugs, molecules, proteins)
- two ideas: 1) how to formalize the approach mathematically; 2) what can LMs do that humans can’t?
- information-rich statefulness + constraints for pruning the search space is the unlock for the ability to build on previous results, i.e. “critical thinking” (sketch after this list)
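A minimal sketch of the statefulness + constraint-pruning idea: each node in the search carries the results accumulated so far, and hard constraints prune expansions before they enter the frontier. All callables here (expand, violates_constraints, is_goal) are hypothetical placeholders, not from any talk.

```python
from collections import deque

def constrained_search(start_state, expand, violates_constraints, is_goal, max_nodes=10_000):
    """Breadth-first search where each state carries its accumulated results
    and constraint checks prune branches before they grow the frontier."""
    frontier = deque([start_state])
    seen = 0
    while frontier and seen < max_nodes:
        state = frontier.popleft()
        seen += 1
        if is_goal(state):
            return state
        for nxt in expand(state):            # nxt builds on everything in `state`
            if violates_constraints(nxt):    # constraints prune the space early
                continue
            frontier.append(nxt)
    return None
```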
Tasks to Do
- EMNLP2025 Fan: medium is not the message: I wonder if we can remove keyword-based signals from BM25 using this method (sketch after this list)
- EMNLP2025 Xu: tree of prompting: a bunch of multi-hop retrieval datasets to benchmark for RAG-DOLL
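One way to make the BM25 ablation concrete: a from-scratch scorer with flags to switch off the IDF and term-frequency signals independently. This is my own sketch of what “removing keyword-based signals” could mean, not the method from the Fan talk; the toy docs are made up.

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75, use_idf=True, use_tf=True):
    """Score each doc (list of tokens) against the query (list of tokens).
    use_idf / use_tf ablate the two keyword-based signals independently."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    df = Counter(t for d in docs for t in set(d))          # document frequencies
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query:
            idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1) if use_idf else 1.0
            if use_tf:
                num = tf[t] * (k1 + 1)
                den = tf[t] + k1 * (1 - b + b * len(d) / avgdl)
                sat = num / den                              # saturated term frequency
            else:
                sat = 1.0 if tf[t] > 0 else 0.0              # binary match only
            s += idf * sat
        scores.append(s)
    return scores

docs = [["aspirin", "inhibits", "cox"], ["caffeine", "binds", "adenosine", "receptors"]]
print(bm25_scores(["aspirin", "cox"], docs))
print(bm25_scores(["aspirin", "cox"], docs, use_tf=False))   # keyword-frequency signal ablated
```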
Tasks Can Do
- EMNLP2025 Keynote: Heng Ji: “protein LLM requires early exit to capture dynamical behavior”; what if we apply Mixture-of-Depths to a protein LM? (sketch below)
- EMNLP2025 Hutson: measuring informativeness of open-ended questions: formalize this as a ρ-POMDP, or use actual value-of-information measures with Bellman backups (sketch below)
- EMNLP2025 Karamanolakis: interactive machine teaching: use MCTS with UCB to pick the next set of constitutions to optimize for (sketch below)
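A rough PyTorch sketch of Mixture-of-Depths-style routing for the protein-LM idea: a learned router scores tokens, only a top-k fraction gets full compute in the block, the rest skip via the residual path. The dummy MLP block, dimensions, and capacity are placeholder assumptions, not anything from the keynote.

```python
import torch
import torch.nn as nn

class MoDLayer(nn.Module):
    """Mixture-of-Depths routing: only the top-capacity fraction of tokens
    is processed by `block`; the rest pass through on the residual path."""
    def __init__(self, block: nn.Module, d_model: int, capacity: float = 0.25):
        super().__init__()
        self.block = block
        self.router = nn.Linear(d_model, 1)
        self.capacity = capacity

    def forward(self, x):                        # x: (batch, seq, d_model)
        b, s, d = x.shape
        k = max(1, int(self.capacity * s))
        logits = self.router(x).squeeze(-1)      # (batch, seq) per-token routing scores
        topk = logits.topk(k, dim=1).indices     # tokens selected for full compute
        out = x.clone()
        for i in range(b):                       # simple loop; a real impl would batch this
            sel = topk[i]
            processed = self.block(x[i, sel])
            # scale by a gate derived from the router score so the router gets gradient
            gate = torch.sigmoid(logits[i, sel]).unsqueeze(-1)
            out[i, sel] = x[i, sel] + gate * processed
        return out

# usage: only ~25% of residues get full block compute per layer
block = nn.Sequential(nn.Linear(64, 256), nn.GELU(), nn.Linear(256, 64))
layer = MoDLayer(block, d_model=64, capacity=0.25)
print(layer(torch.randn(2, 10, 64)).shape)
```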
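For the informativeness-of-questions item, a tiny numpy sketch of one-step expected value of information: score a candidate question by how much the best-action value improves after a Bellman-style backup over its possible answers. The belief, answer likelihoods, and reward table are toy placeholders.

```python
import numpy as np

def expected_value_of_information(belief, obs_lik, reward):
    """belief: (S,) prior over hidden states
       obs_lik: (S, O) P(answer o | state s) for one candidate question
       reward: (S, A) payoff of taking action a in state s
       Returns the expected gain in best-action value from asking the question."""
    prior_value = (belief @ reward).max()                 # act now, without asking
    p_obs = belief @ obs_lik                              # (O,) marginal answer probabilities
    value_after = 0.0
    for o in range(obs_lik.shape[1]):
        if p_obs[o] == 0:
            continue
        posterior = belief * obs_lik[:, o] / p_obs[o]     # Bayes update on the answer
        value_after += p_obs[o] * (posterior @ reward).max()   # backup over answers
    return value_after - prior_value

belief = np.array([0.5, 0.5])                  # two hidden states
obs_lik = np.array([[0.9, 0.1], [0.2, 0.8]])   # an informative yes/no question
reward = np.array([[1.0, 0.0], [0.0, 1.0]])    # matching the state pays off
print(expected_value_of_information(belief, obs_lik, reward))   # ~0.35
```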
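And for picking the next constitution, just the bandit-style UCB1 selection rule (the same score MCTS/UCT uses at each node), not a full tree search; the constitution names and rewards below are made up.

```python
import math
import random

def ucb1_pick(stats, c=1.4):
    """stats: {arm: (pulls, total_reward)}. Pick the arm maximizing
    mean reward + exploration bonus; untried arms go first."""
    total_pulls = sum(n for n, _ in stats.values())
    best, best_score = None, float("-inf")
    for arm, (n, r) in stats.items():
        if n == 0:
            return arm                           # always try untried constitutions first
        score = r / n + c * math.sqrt(math.log(total_pulls) / n)
        if score > best_score:
            best, best_score = arm, score
    return best

# toy loop: the reward would come from evaluating the model under that constitution
stats = {"be-concise": (0, 0.0), "cite-sources": (0, 0.0), "ask-clarifying-qs": (0, 0.0)}
for _ in range(20):
    arm = ucb1_pick(stats)
    reward = random.random()                     # placeholder for an actual eval score
    n, r = stats[arm]
    stats[arm] = (n + 1, r + reward)
print(stats)
```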
