One-Liner
Novelty
Background
How do LLMs do graphs?
- predict text from graphs (serialize the graph into text, then autoregressive generation)
- align text with graph (late fusion of GNN and LLM representations)
- encode text with graph (attach LLM embeddings to a GNN, e.g. as a prompt)
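The first approach can be sketched minimally: flatten a graph's nodes and edges into plain text that an LLM can consume autoregressively. The function name, node labels, and sentence template below are illustrative assumptions, not from any specific paper.

```python
# Sketch of "predict text from graphs": serialize a graph as text
# so an autoregressive LLM can read it as part of a prompt.
# graph_to_text and the template are hypothetical, for illustration.

def graph_to_text(nodes, edges):
    """Flatten a (nodes, edges) graph into a plain-text description."""
    lines = [f"Nodes: {', '.join(nodes)}"]
    lines += [f"{u} is connected to {v}." for u, v in edges]
    return "\n".join(lines)

nodes = ["A", "B", "C"]
edges = [("A", "B"), ("B", "C")]
# Append a task question; the serialized graph becomes LLM context.
prompt = graph_to_text(nodes, edges) + "\nQuestion: is A two hops from C?"
print(prompt)
```

The serialization choice (edge list vs. adjacency list vs. natural-language sentences) is known to affect downstream LLM accuracy, which is part of why the other two approaches keep a GNN in the loop instead.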