Haotian Sun
ML Ph.D. Student @ Georgia Tech
Education
Aug. 2023 — present
Ph.D. in Machine Learning
Georgia Institute of Technology, Atlanta, GA
Advisors:
Dr. Bo Dai and Dr. Chao Zhang
Research focus: machine learning, foundation models
Aug. 2022 — May 2023
M.S. in Computational Science and Engineering
Georgia Institute of Technology, Atlanta, GA
Advisor:
Dr. Chao Zhang
GPA: 4.00/4.00
Sept. 2015 — Aug. 2017
Engineer's Degree (Diplôme d'Ingénieur)
CentraleSupélec, France
GPA: 3.9/4.0
Sept. 2013 — Jun. 2020
Combined Bachelor's/Master's Degree in Electrical Engineering and Automation
Xi'an Jiaotong University (XJTU), Xi'an, China
Enrolled in the XJTU Honors Youth Program, a highly selective nationwide program that admits fewer than 120 students each year.
GPA: 90.7/100
Research Experiences
Spring 2024
Graduate Research Assistant, Georgia Tech
- Developed a model factorization framework that personalizes black-box LLMs by capturing user-specific patterns and shared knowledge, without requiring access to the model's internal parameters;
- Designed a reranker and adapter that prioritize relevant historical records and align with user preferences;
- Delivered a 9.01% average improvement over SoTA prompt-based methods across diverse personalization tasks.
Fall 2023
Graduate Research Assistant, Georgia Tech
- Proposed an effective adaptation approach for black-box LLMs, offering a transparent, privacy-conscious, and cost-effective way to customize commercial black-box LLMs through APIs alone;
- Designed an online adaptation framework iteratively sampling from previous inferences and optimizing the backend lightweight adapter (up to 0.3B);
- Achieved a 5.90% improvement over the base model with 31.30x lower training cost and 1.84x lower inference cost than the official SFT service.
Spring 2023
Graduate Research Assistant, Georgia Tech
- Proposed a closed-loop approach, AdaPlanner, which allows the LLM agent to refine its self-generated plan adaptively in response to environmental feedback;
- Developed a code-style LLM prompt structure that facilitates plan generation across a variety of tasks, environments, and agent capabilities;
- Proposed a skill discovery mechanism that leverages successful plans as few-shot exemplars, enabling the agent to plan and refine with fewer task demonstrations.
Spring 2023
Graduate Research Assistant, Georgia Tech
- Introduced ToolQA, a new benchmark dataset designed to faithfully evaluate LLMs' ability to use external tools for question answering;
- Minimized the overlap between our benchmark data and LLMs' pre-training data, enabling a more precise evaluation of LLMs' tool-use reasoning abilities;
- Conducted an in-depth diagnosis of existing tool-use LLMs to highlight their strengths, weaknesses, and potential improvements.
Spring 2023
Graduate Research Assistant, Georgia Tech
Mentor:
Prof. Chao Zhang
- Proposed a generalized uncertainty-aware and interpretable graph classification model that combines graph functional neural process (FNP) and graph generative models to enhance calibration and interpretability;
- Improved the expected calibration error by up to 4.8% and rationale F1 score by up to 10.8% compared with the strongest baselines.
Fall 2022
Graduate Research Assistant, Georgia Tech
- Proposed a node-absorbing diffusion process that operates directly in the discrete graph space;
- Designed a diffusion ordering network that learns an optimal node absorbing ordering from graph topology and a denoising network that uses the reverse node order to reconstruct the graph efficiently;
- Achieved better generation performance than the previous state of the art while maintaining fast generation speed.
Publications
C7
Haotian Sun,
Tao Lei,
Bowen Zhang,
Yanghao Li,
Haoshuo Huang,
Ruoming Pang,
Bo Dai,
Nan Du
arXiv preprint.
C6
Yuchen Zhuang,
Haotian Sun,
Yue Yu,
Rushi Qiang,
Qifan Wang,
Chao Zhang,
Bo Dai
Accepted at NeurIPS'25.
C5
Haotian Sun*,
Yuchen Zhuang*,
Wei Wei,
Chao Zhang,
Bo Dai
Accepted at ICML'24 (Spotlight, top 3.5%).
C4
Haotian Sun*,
Yuchen Zhuang*,
Lingkai Kong,
Bo Dai,
Chao Zhang
Accepted at NeurIPS'23.
C3
Yuchen Zhuang,
Yue Yu,
Kuan Wang,
Haotian Sun,
Chao Zhang
Accepted at NeurIPS'23 (Datasets and Benchmarks Track).
C2
Lingkai Kong*,
Haotian Sun*,
Yuchen Zhuang,
Haorui Wang,
Wenhao Mu,
Chao Zhang
Accepted at AISTATS'24.
C1
Lingkai Kong,
Jiaming Cui,
Haotian Sun,
Yuchen Zhuang,
B Aditya Prakash,
Chao Zhang
Accepted at ICML'23.