Songtao Liu
[Google Scholar] [Github]
Email: skl5761@psu.edu
Hi, this is Songtao. I’m currently a Computer Science Ph.D. candidate at Penn State. I received my B.S. in Computer Science and Data Science from Fudan University.
My research focuses on scalable and efficient discrete generative modeling, with an emphasis on developing algorithms that reduce inference cost and address diverse problems in domains such as language.
I am on the job market for 2025-2026!
* denotes equal contribution (first, second, or senior author)
Since several people have asked about the contributions to my papers, I have listed contribution statements under the selected publications below.
Multi-Head Low-Rank Attention
Songtao Liu, Hongwu Peng, Zhiwei Zhang, Zhengyu Chen, Yue Guo
[pdf]
I led all technical aspects of the paper—including the core ideas, coding, experiments, and writing. My collaborators contributed computing resources.
Evaluating Molecule Synthesizability via Retrosynthetic Planning and Reaction Prediction
Songtao Liu, Dandan Zhang, Zhengkai Tu, Hanjun Dai, Peng Liu
arXiv, 2024
[arXiv]
I led all technical aspects of the paper—including the core ideas, coding, experiments, and writing. DZ proposed using SciFinder to evaluate the synthetic routes, and we conducted the human evaluation together.
Preference Optimization for Molecule Synthesis with Conditional Residual Energy-based Models
Songtao Liu, Hanjun Dai, Yue Zhao, Peng Liu
International Conference on Machine Learning (ICML), 2024
Oral Presentation
[arXiv] [Code]
I led all technical aspects of the paper—including the core ideas, coding, experiments, and writing. HD proposed presenting the work through the lens of residual energy-based models. YZ supplied compute resources for rebuttal-period experiments.
FusionRetro: Molecule Representation Fusion via In-Context Learning for Retrosynthetic Planning
Songtao Liu, Zhengkai Tu*, Minkai Xu*, Zuobai Zhang*, Lu Lin, Rex Ying, Jian Tang, Peilin Zhao, Dinghao Wu
International Conference on Machine Learning (ICML), 2023
[arXiv] [Code]
I led all technical aspects of the paper—including the core ideas, coding, experiments, and writing. ZT assisted with writing the introduction and related work. MX suggested organizing the presentation of our method into three principled steps. ZZ helped implement one baseline.
High-Layer Attention Pruning with Rescaling
Songtao Liu, Peng Liu
arXiv, 2025
[arXiv]
Graph Adversarial Diffusion Convolution
Songtao Liu, Jinghui Chen, Tianfan Fu, Lu Lin, Marinka Zitnik, Dinghao Wu
International Conference on Machine Learning (ICML), 2024
[arXiv] [Code]
Local Augmentation for Graph Neural Networks
Songtao Liu, Rex Ying, Hanze Dong, Lanqing Li, Tingyang Xu, Yu Rong, Peilin Zhao, Junzhou Huang, Dinghao Wu
International Conference on Machine Learning (ICML), 2022
[arXiv] [Code]
Conference/Journal Reviewer: ICML 2022/2024/2025, NeurIPS 2022/2023/2024/2025, ICLR 2024/2025, AISTATS 2024, TMLR
DS 330: Visual Analytics, Penn State, Spring 2025
DS 320: Data Integration and Fusion, Penn State, Spring 2024
DS/CMPSC 410: Programming Models for Big Data, Penn State, Fall 2023
IST 815: Foundations of Information Security and Assurance, Penn State, Fall 2023