Haotian Sun

ML Ph.D. Student @ Georgia Tech


Education

Aug. 2023 — present
Ph.D. in Machine Learning
Georgia Institute of Technology, Atlanta, GA
Advisors: Dr. Bo Dai and Dr. Chao Zhang
Research focus: machine learning, foundation models
Aug. 2022 — May 2023
M.S. in Computational Science and Engineering
Georgia Institute of Technology, Atlanta, GA
Advisor: Dr. Chao Zhang
GPA: 4.00/4.00
Sept. 2015 — Aug. 2017
Engineer's Degree (Diplôme d'Ingénieur)
CentraleSupélec, France
GPA: 3.9/4.0
Sept. 2013 — Jun. 2020
Combined Bachelor's/Master's Degree in Electrical Engineering and Automation
Xi'an Jiaotong University (XJTU), China
Enrolled in the XJTU Honors Youth Program, a highly selective nationwide program that admits fewer than 120 students each year. GPA: 90.7/100

Research Experience

Spring 2024
HYDRA: Model Factorization Framework for Black-Box LLM Personalization,
Graduate Research Assistant, Georgia Tech
- Developed a model factorization framework that personalizes black-box LLMs by capturing user-specific patterns and shared knowledge, without access to the models' internal parameters;
- Designed a reranker and an adapter that prioritize relevant historical records and align outputs with user preferences;
- Delivered a 9.01% average improvement over state-of-the-art prompt-based methods across diverse personalization tasks.
Fall 2023
BBox-Adapter: Lightweight Adapting for Black-Box Large Language Models,
Graduate Research Assistant, Georgia Tech
- Proposed an effective adaptation approach for black-box LLMs that offers a transparent, privacy-conscious, and cost-effective way to customize commercial models through their APIs alone;
- Designed an online adaptation framework that iteratively samples from previous inferences and optimizes a lightweight backend adapter (up to 0.3B parameters);
- Achieved a 5.90% improvement over the base model at 31.30× lower training cost and 1.84× lower inference cost than the official SFT service.
Spring 2023
AdaPlanner: Adaptive Planning from Feedback with Language Models,
Graduate Research Assistant, Georgia Tech
- Proposed a closed-loop approach, AdaPlanner, that allows an LLM agent to refine its self-generated plan adaptively in response to environmental feedback;
- Developed a code-style LLM prompt structure that facilitates plan generation across a variety of tasks, environments, and agent capabilities;
- Proposed a skill discovery mechanism that leverages successful plans as few-shot exemplars, enabling the agent to plan and refine with fewer task demonstrations.
Spring 2023
ToolQA: A Dataset for LLM Question Answering with External Tools,
Graduate Research Assistant, Georgia Tech
- Introduced ToolQA, a new dataset designed to faithfully evaluate LLMs' ability to use external tools for question answering;
- Minimized the overlap between our benchmark data and LLMs' pre-training data, enabling a more precise evaluation of LLMs' tool-use reasoning abilities;
- Conducted an in-depth diagnosis of existing tool-use LLMs to highlight their strengths, weaknesses, and potential improvements.
Spring 2023
Two Birds with One Stone: Enhancing Uncertainty Quantification and Interpretability with Graph Functional Neural Process,
Graduate Research Assistant, Georgia Tech
Mentor: Prof. Chao Zhang
- Proposed a generalized uncertainty-aware and interpretable graph classification model that combines the graph functional neural process (FNP) with graph generative models to enhance calibration and interpretability;
- Reduced the expected calibration error by up to 4.8% and improved the rationale F1 score by up to 10.8% compared with the strongest baselines.
Fall 2022
Autoregressive Diffusion Model for Graph Generation,
Graduate Research Assistant, Georgia Tech
- Proposed a node-absorbing diffusion process that operates directly in the discrete graph space;
- Designed a diffusion ordering network that learns an optimal node-absorbing ordering from graph topology, and a denoising network that uses the reverse node order to reconstruct the graph efficiently;
- Achieved better generation performance than the previous state of the art while maintaining fast generation speed.

Publications

C7
EC-DIT: Scaling Diffusion Transformers with Adaptive Expert-Choice Routing
Haotian Sun, Tao Lei, Bowen Zhang, Yanghao Li, Haoshuo Huang, Ruoming Pang, Bo Dai, Nan Du
arXiv preprint.
C6
HYDRA: Model Factorization Framework for Black-Box LLM Personalization
Yuchen Zhuang, Haotian Sun, Yue Yu, Rushi Qiang, Qifan Wang, Chao Zhang, Bo Dai
Accepted at NeurIPS'24.
C5
BBox-Adapter: Lightweight Adapting for Black-Box Large Language Models
Haotian Sun*, Yuchen Zhuang*, Wei Wei, Chao Zhang, Bo Dai
Accepted at ICML'24 (Spotlight, top 3.5%).
C4
AdaPlanner: Adaptive Planning from Feedback with Language Models
Haotian Sun*, Yuchen Zhuang*, Lingkai Kong, Bo Dai, Chao Zhang
Accepted at NeurIPS'23.
C3
ToolQA: A Dataset for LLM Question Answering with External Tools
Yuchen Zhuang, Yue Yu, Kuan Wang, Haotian Sun, Chao Zhang
Accepted at NeurIPS'23 (Datasets and Benchmarks Track).
C2
Two Birds with One Stone: Enhancing Calibration and Interpretability with Graph Functional Neural Process
Lingkai Kong*, Haotian Sun*, Yuchen Zhuang, Haorui Wang, Wenhao Mu, Chao Zhang
Accepted at AISTATS'24.
C1
Autoregressive Diffusion Model for Graph Generation
Lingkai Kong, Jiaming Cui, Haotian Sun, Yuchen Zhuang, B. Aditya Prakash, Chao Zhang
Accepted at ICML'23.