Yunzhi

Yunzhi Yao

Ph.D. candidate at Zhejiang University

About Me

Hello! I’m Yunzhi Yao (姚云志), a final-year Ph.D. student (expected to graduate in June 2026) at the College of Computer Science and Technology, Zhejiang University. I am fortunate to be advised by Prof. Huajun Chen and Prof. Ningyu Zhang.

I earned my Bachelor’s degree in Software Engineering, along with a dual degree in Finance, from Shandong University in 2021.

By the way, I’m also interested in astrology. If you’d like to talk about research or life, please feel free to reach out!

Experience

Alibaba Tongyi Lab · Research Intern
April 2026 – Present
Ant Group, Ling Team · Research Intern
Worked on code pretraining for large language models
University of California, Los Angeles (UCLA) · Visiting Research Scholar
Advised by Prof. Nanyun Peng
Microsoft Research Asia (MSRA) · Research Intern
Advised by Shaohan Huang

Research Interest

My primary research interests lie in machine learning for natural language processing, with a particular focus on the knowledge mechanisms underpinning large language models (LLMs), and I am actively exploring topics in this area.

Selected Publications

Rethinking Knowledge Editing in Reasoning Era.

Yunzhi Yao, Jiaxin Qin, Ningyu Zhang, Haoming Xu, Yuqi Zhu, Zeping Yu, Mengru Wang, Yuqi Tang, Jia-Chen Gu, Shumin Deng, Nanyun Peng, Huajun Chen.
PDF

Reflection on Knowledge Editing: Charting the Next Steps.

Yunzhi Yao, Canyu Chen, Jia-Chen Gu, Shumin Deng, Manling Li, Nanyun Peng.
Blog

CaKE: Circuit-aware Editing Enables Generalizable Knowledge Learners.

Yunzhi Yao, Jizhan Fang, Jia-Chen Gu, Ningyu Zhang, Shumin Deng, Huajun Chen, Nanyun Peng.
EMNLP 2025, ACL 2025 KnowFM, Oral @ NENLP 2025
PDF Code

Knowledge Circuits in Pretrained Transformers.

Yunzhi Yao, Ningyu Zhang, Zekun Xi, Mengru Wang, Ziwen Xu, Shumin Deng, Huajun Chen.
In Proceedings of the 38th Conference on Neural Information Processing Systems (NeurIPS 2024)
PDF Code Video

Editing Large Language Models: Problems, Methods, and Opportunities.

Yunzhi Yao, Peng Wang, Bozhong Tian, Siyuan Cheng, Zhoubo Li, Shumin Deng, Huajun Chen, Ningyu Zhang.
In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023)
PDF Code Video

Knowledge Rumination for Pre-trained Language Models.

Yunzhi Yao, Peng Wang, Shengyu Mao, Chuanqi Tan, Fei Huang, Huajun Chen, Ningyu Zhang.
In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023)
PDF Code

Schema-aware Reference as Prompt Improves Data-efficient Knowledge Graph Construction.

Yunzhi Yao, Shengyu Mao, Ningyu Zhang, Xiang Chen, Shumin Deng, Xi Chen, Huajun Chen.
In Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2023)
PDF Code

Kformer: Knowledge Injection in Transformer Feed-forward Layers.

Yunzhi Yao, Shaohan Huang, Li Dong, Furu Wei, Huajun Chen, Ningyu Zhang.
In Proceedings of the 11th CCF International Conference on Natural Language Processing and Chinese Computing (NLPCC 2022)
PDF Code

Adapt-and-Distill: Developing Small, Fast and Effective Pretrained Language Models for Domains.

Yunzhi Yao, Shaohan Huang, Wenhui Wang, Li Dong, Furu Wei.
In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
PDF Code

Honors and Awards

Professional Services

Tutorials

Knowledge Editing for Large Language Models @ IJCAI 2024.

Ningyu Zhang, Jia-Chen Gu, Yunzhi Yao, Mengru Wang, Xiang Chen, Shumin Deng.
Link PDF

Knowledge Editing for Large Language Models @ COLING 2024.

Ningyu Zhang, Yunzhi Yao, Shumin Deng.
Link PDF

Knowledge Editing for Large Language Models @ AACL 2023.

Ningyu Zhang, Yunzhi Yao, Shumin Deng.
Link PDF