Papers
* denotes equal contribution (first, second, or senior)
-
Multi-Head Low-Rank Attention
Songtao Liu, Hongwu Peng, Zhiwei Zhang, Zhengyu Chen, Yue Guo
International Conference on Learning Representations (ICLR), 2026
[arXiv] [Blog] [Code] [Weights&Data]
-
High-Layer Attention Pruning with Rescaling
Songtao Liu, Peng Liu
Transactions on Machine Learning Research (TMLR), 2026
[arXiv] [Code]
-
Preference Optimization for Molecule Synthesis with Conditional Residual Energy-based Models
Songtao Liu, Hanjun Dai, Peng Liu
International Conference on Machine Learning (ICML), 2024
Oral Presentation
[arXiv] [Code]
-
Graph Adversarial Diffusion Convolution
Songtao Liu, Jinghui Chen, Lu Lin, Marinka Zitnik, Dinghao Wu
International Conference on Machine Learning (ICML), 2024
[arXiv] [Code]
-
FusionRetro: Molecule Representation Fusion via In-Context Learning for Retrosynthetic Planning
Songtao Liu, Zhengkai Tu*, Minkai Xu*, Zuobai Zhang*, Lu Lin, Rex Ying, Jian Tang, Peilin Zhao, Dinghao Wu
International Conference on Machine Learning (ICML), 2023
[arXiv] [Code]
-
Local Augmentation for Graph Neural Networks
Songtao Liu, Rex Ying, Hanze Dong, Lanqing Li, Tingyang Xu, Yu Rong, Peilin Zhao, Junzhou Huang, Dinghao Wu
International Conference on Machine Learning (ICML), 2022
[arXiv] [Code]
Professional Services
Conference/Journal Reviewer: ICML 2022/2024/2025, NeurIPS 2022/2023/2024/2025, ICLR 2024/2025/2026, AISTATS 2024, TMLR 2024
Miscellaneous
Research is boring if it can't push boundaries. I first got into AI back in 2017 through an NLP course, and by 2018 I was already digging into pretrained language models like BERT and GPT-1; Transformers were second nature to me by then. But as an undergrad, I didn't have the research insight to see where the field was heading, and I missed the wave. Instead, I went into GNNs and security in 2019, then AI4Science in 2021. For one reason or another, none of these three fields really scales; that's my honest take after actually working in them. The communities, in my experience, also had their toxic side. I came back to LLMs in 2024, and it's been a breath of fresh air: people are easy to work with, and collaborations just click. But there's no getting that lost time back.