Hongli Zhan | 詹弘立
honglizhan@utexas.edu

Office: RLP 4.108
Arrogance is a sign of ignorance.
I am a Ph.D. candidate in Computational Linguistics at The University of Texas at Austin, where I’m blessed to be advised by Professor Junyi Jessy Li.
The ambition of my Ph.D. research is to build emotionally intelligent AI systems in a broader social context (see my first-authored publications at EMNLP 2022, ACL 2023, EMNLP 2023 Findings, and COLM 2024). During my internships in industry, I have also worked on aligning language models (see my recent first-authored work at ICML 2025).
Casually, I go by Henry.
Education
Ph.D. in Computational Linguistics, 2021 – Present
The University of Texas at Austin
⁃ Advisor: Dr. Junyi Jessy Li
B.A. in English Linguistics, 2017 – 2021
Shanghai Jiao Tong University
⁃ Awards: Outstanding Undergraduate; Outstanding Undergraduate Thesis Award
Industry Experience
Research Scientist Intern, IBM Research, Summer 2025
IBM Thomas J. Watson Research Center, Yorktown Heights, NY
Research Scientist Intern, IBM Research, Summer 2024
IBM Thomas J. Watson Research Center, Yorktown Heights, NY
⁃ Hosted by the Responsible and Inclusive Technologies Research Group
⁃ Manager: Dr. Raya Horesh; Mentors: Dr. Muneeza Azmat & Dr. Mikhail Yurochkin
⁃ Work resulted in a first-authored patent and a first-authored paper at ICML 2025
News
[2021/08/25] I joined the Ph.D. program in Computational Linguistics at UT Austin!
[2021/06/26] I received my bachelor's degree as an Outstanding Graduate from SJTU!
Selected Publications
* denotes equal contributions
- ICML 2025. SPRI: Aligning Large Language Models with Context-Situated Principles. In Proceedings of the 42nd International Conference on Machine Learning, 2025. [26.9% acceptance rate (3,260 out of 12,107 submissions); work started and partially done during my internship at IBM Research]
- EMNLP 2023 Findings. Evaluating Subjective Cognitive Appraisals of Emotions from Large Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2023, December 2023. [45.4% acceptance rate (1,758 out of 3,868 submissions)]