About Me
I am an LLM Algorithm Engineer at Huawei Noah’s Ark Lab, where I joined the Recommendation and Search Laboratory in 2022 after completing my M.S. degree at Peking University. My primary focus is on enhancing large language models’ capabilities in code competitions and Retrieval-Augmented Generation (RAG). I have also made significant contributions to LLM4Rec (Large Language Models for Recommendation) projects.
My research interests span LLM for Recommendation, LLM for Code, RAG, and Recommender Systems. Currently, I am dedicated to leveraging Large Language Models to solve complex coding challenges and to advance RAG technologies.
🔥 News
- 2025.05: 🎉🎉 Four papers accepted by ACL 2025.
- 2025.04: 🎉🎉 One paper accepted by WWW 2025.
- 2025.01: 🎉🎉 One paper accepted by Coling 2025.
- 2024.11: 🎉🎉 One paper accepted by EMNLP 2024.
📝 Highlighted Publications
- CoIR: A Comprehensive Benchmark for Code Information Retrieval Models
  Xiangyang Li, Kuicai Dong, Yi Quan Lee, Wei Xia, Hao Zhang, Xinyi Dai, Yasheng Wang, Ruiming Tang
  ACL 2025
- CTRL: Connect Collaborative and Language Model for CTR Prediction
  Xiangyang Li, Bo Chen, Lu Hou, Ruiming Tang
  TORS
- LLMTreeRec: Unleashing the Power of Large Language Models for Cold-Start Recommendations
  Wenlin Zhang, Chuhan Wu, Xiangyang Li, Yuhao Wang, Kuicai Dong, Yichao Wang, Xinyi Dai, Xiangyu Zhao, Huifeng Guo, Ruiming Tang
  Coling 2025 (ORAL)
- A Unified Framework for Multi-Domain CTR Prediction via Large Language Models
  Zichuan Fu, Xiangyang Li, Chuhan Wu, Yichao Wang, Kuicai Dong, Xiangyu Zhao, Mengchen Zhao, Huifeng Guo, Ruiming Tang
  TOIS
- IntTower: The Next Generation of Two-Tower Model for Pre-Ranking System
  Xiangyang Li, Bo Chen, HuiFeng Guo, Jingjie Li, Chenxu Zhu, Xiang Long, Sujian Li, Yichao Wang, Wei Guo, Longxia Mao, Jinxing Liu, Zhenhua Dong, Ruiming Tang
  CIKM 2022 (ORAL)
- Low Resource Style Transfer via Domain Adaptive Meta Learning
  Xiangyang Li, Xiang Long, Yu Xia, Sujian Li
  NAACL 2022 (ORAL)
🎖 Honors and Awards
- 2019-2022: DLP-KDD Workshop Best Paper Award
- 2015-2019: National Inspirational Scholarship, Outstanding Student Award
- 2015-2019: First-Class Scholarship
🏆 Competitions & Projects
- 3rd Place in the AAAI 2021 COVID-19 Fake News Detection Challenge (View Solution)
- Lead Contributor to the “Dive into Deep Learning” TensorFlow implementation with 3.8k stars (View Project)
📖 Education
- M.S., Peking University (2019.09 - 2022.06)
- B.S., Nanjing University of Posts and Telecommunications (2015.09 - 2019.06)
- Exchange Student at Nanjing University (Fall 2017)
💬 Invited Talks
- 2023: Presented research on CTRL at Tsinghua University
💻 Professional Experience
Huawei Noah’s Ark Lab (2022.01 - Present)
LLM Algorithm Engineer
- Leading the development of LLM capabilities for code competitions
- Advancing RAG (Retrieval-Augmented Generation) technologies
- Contributing to the Pangu Large Language Model
- Previously worked on recommendation and search systems
Internships
MeiTuan (2021.06 - 2021.09)
Algorithm Intern
- Developed and trained a coarse ranking model for MeiTuan’s food delivery channel
- Identified 20 key features and implemented an XGBoost model with CVR as the prediction target
- Achieved an AUC of 0.89 in production and a 1.7% improvement in OPM
- Deployed the model online, alleviating system load
JingDong (2021.01 - 2021.04)
Algorithm Intern
- Applied the CEM optimization algorithm to regulate traffic distribution in JingDong’s main search
- Balanced merchant traffic promotion while ensuring GMV growth
- Optimized online feedback mechanisms for search results