About Me
I am an LLM Algorithm Engineer at Z.AI. After completing my M.S. degree at Peking University, I joined the Recommendation and Search Laboratory of Huawei Noah's Ark Lab in 2022. In 2025, I joined Z.AI to work on Code LLMs. My primary focus is enhancing large language models' capabilities in code competitions and Retrieval-Augmented Generation (RAG), and I have also made significant contributions to LLM4Rec (Large Language Models for Recommendation) projects. I now focus on using LLMs to solve real-world coding problems.
My research interests span LLM for Recommendation, LLM for Code, RAG, and Recommender Systems. Currently, I'm dedicated to leveraging the power of Large Language Models to solve complex coding competitions and real-world coding problems.
🔥 News
- 2025.09: 🎉🎉 Two papers accepted by NeurIPS 2025.
- 2025.08: 🎉🎉 One paper accepted by EMNLP 2025.
- 2025.05: 🎉🎉 Four papers accepted by ACL 2025.
- 2025.04: 🎉🎉 One paper accepted by WWW 2025.
- 2025.01: 🎉🎉 One paper accepted by Coling 2025.
- 2024.11: 🎉🎉 One paper accepted by EMNLP 2024.
📝 Highlighted Publications
- Humanity’s Last Code Exam: Can Advanced LLMs Conquer Human’s Hardest Code Competition?
  Xiangyang Li, Xiaopeng Li, Kuicai Dong, Quanhu Zhang, Rongju Ruan, Xinyi Dai, Xiaoshuang Liu, Shengchun Xu, Yasheng Wang, Ruiming Tang
  EMNLP 2025
- CoIR: A Comprehensive Benchmark for Code Information Retrieval Models
  Xiangyang Li, Kuicai Dong, Yi Quan Lee, Wei Xia, Hao Zhang, Xinyi Dai, Yasheng Wang, Ruiming Tang
  ACL 2025
- CTRL: Connect Collaborative and Language Model for CTR Prediction
  Xiangyang Li, Bo Chen, Lu Hou, Ruiming Tang
  TORS
- A Unified Framework for Multi-Domain CTR Prediction via Large Language Models
  Zichuan Fu, Xiangyang Li, Chuhan Wu, Yichao Wang, Kuicai Dong, Xiangyu Zhao, Mengchen Zhao, Huifeng Guo, Ruiming Tang
  TOIS
- IntTower: The Next Generation of Two-Tower Model for Pre-Ranking System
  Xiangyang Li, Bo Chen, HuiFeng Guo, Jingjie Li, Chenxu Zhu, Xiang Long, Sujian Li, Yichao Wang, Wei Guo, Longxia Mao, Jinxing Liu, Zhenhua Dong, Ruiming Tang
  CIKM 2022 (Oral)
🎖 Honors and Awards
- 2019-2022 DLP-KDD Workshop Best Paper Award
- 2015-2019 National Inspirational Scholarship, Outstanding Student Award
- 2015-2019 First-Class Scholarship
🏆 Competitions & Projects
- 3rd Place in AAAI 2021 COVID-19 Fake News Detection Challenge (View Solution)
- Lead Contributor to the “Dive into Deep Learning” TensorFlow implementation (3.8k stars) (View Project)
📖 Education
- M.S., Peking University (2019.09 - 2022.06)
- B.S., Nanjing University of Posts and Telecommunications (2015.09 - 2019.06)
- Exchange Student at Nanjing University (Fall 2017)
💬 Invited Talks
- 2023 Presented research on CTRL at Tsinghua University
💻 Professional Experience
Huawei Noah’s Ark Lab (2022.08 - 2025.10)
LLM Algorithm Engineer
- Led development of LLM code competition capabilities
- Advanced RAG (Retrieval-Augmented Generation) technologies
- Contributed to the Pangu Large Language Model
- Previously worked on recommendation and search systems
Internships
Meituan (2021.06 - 2021.09)
Algorithm Intern
- Developed and trained a coarse-ranking model for Meituan’s food delivery channel
- Identified 20 key features and implemented an XGBoost model targeting CVR (conversion rate)
- Achieved 0.89 AUC in production with a 1.7% OPM improvement
- Successfully deployed the model to alleviate system stress
JingDong (2021.01 - 2021.04)
Algorithm Intern
- Applied CEM optimization algorithm to regulate traffic distribution in JingDong’s main search
- Balanced merchant traffic promotion while ensuring GMV growth
- Optimized online feedback mechanisms for search results