I recently joined the School of Artificial Intelligence at Beijing Normal University (BNU) as an Associate Professor. Prior to this, I worked at the Institute of Information Engineering, Chinese Academy of Sciences, from September 2020 to September 2024. In July 2020, I earned my Ph.D. from the same institute, where I was advised by Associate Professor Yong Liu and Professor Weiping Wang.

My research focuses on foundational machine learning theory, particularly the generalization theory of large-scale methods. Because foundational theory lags behind empirical algorithms in large-scale machine learning, I aim to uncover the underlying principles and narrow the gap between theory and practice, ultimately guiding the design of large-scale algorithms that balance computational efficiency and generalization performance. Specific interests include:

  • Fundamental Research on Large Language Models: Developing the foundational theory of large language models to explain distinctive capabilities such as scaling laws, in-context learning, and complex reasoning (see the scaling-law sketch after this list); improving model architectures for computational efficiency and performance; and studying the next generation of efficient language models with fewer parameters.

  • Generalization Theory of Deep Neural Networks: Exploring connections between neural networks and kernel methods (the spectral characterization behind this is sketched after this list), studying generalization in non-stationary spectral kernel networks, refining current neural network models, and using random matrix theory to understand phenomena in deep networks.

  • Optimal Generalization Guarantees for Large-Scale ML: Deriving optimal generalization guarantees under relaxed assumptions and improving large-scale algorithms such as federated learning, distributed learning, and random features (a random-features code sketch follows this list).
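
As a concrete illustration of the scaling-law phenomena mentioned in the first item above, a widely used empirical form (the parametric fit popularized by Hoffmann et al., 2022, quoted here as background rather than as a result of mine) models pretraining loss as a power law in the parameter count $N$ and the number of training tokens $D$:

$$
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}},
$$

where $E$ is the irreducible loss and $A$, $B$, $\alpha$, $\beta$ are fitted constants. Minimizing this form under a fixed compute budget $C \approx 6ND$ predicts how to trade model size against data, the kind of empirical regularity such theory seeks to explain.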
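
The kernel connection in the second item rests on Bochner's theorem, a classical result quoted here as background: a continuous shift-invariant kernel $k(x, x') = k(x - x')$ on $\mathbb{R}^d$ is positive definite if and only if it is the Fourier transform of a non-negative finite measure $\mu$,

$$
k(x - x') = \int_{\mathbb{R}^d} e^{i \omega^{\top} (x - x')} \, \mathrm{d}\mu(\omega).
$$

Spectral kernel learning parameterizes and learns $\mu$ (for example, as a Gaussian mixture); the non-stationary case drops shift invariance and, via Yaglom's theorem, places a measure over frequency pairs $(\omega, \omega')$.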
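
For the random-features direction in the third item, below is a minimal sketch of random Fourier features (Rahimi & Recht, 2007) approximating the Gaussian RBF kernel; the function name and toy data are illustrative assumptions, not code from my papers.

```python
import numpy as np

def random_fourier_features(X, num_features=500, lengthscale=1.0, seed=0):
    """Map X of shape (n, d) to Z of shape (n, num_features) such that
    Z @ Z.T approximates exp(-||x - y||^2 / (2 * lengthscale**2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # By Bochner's theorem, the RBF kernel's spectral density is Gaussian,
    # so draw frequencies omega ~ N(0, I / lengthscale^2) and random phases b.
    omega = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ omega + b)

# Toy check: feature inner products should approach the exact kernel matrix.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
Z = random_fourier_features(X, num_features=2000)
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K_exact = np.exp(-sq_dists / 2.0)
print("max abs error:", np.abs(K_approx - K_exact).max())
```

With $m$ random features the approximation error decays at the standard $O(1/\sqrt{m})$ rate, which is what makes generalization guarantees for such approximations tractable.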

Career

| Institution | Title | Time |
| --- | --- | --- |
| Beijing Normal University | Associate Professor | 2024.09 - present |
| Microsoft Research Asia (NLC Group) | Visiting Scholar | 2024.04 - 2024.06 |
| Institute of Information Engineering, CAS | Associate Research Fellow | 2023.10 - 2024.09 |
| Institute of Information Engineering, CAS | Postdoc Researcher | 2020.09 - 2023.10 |

Education

| Institution | Major | Degree | Time |
| --- | --- | --- | --- |
| University of Chinese Academy of Sciences (UCAS) | Cyber Security | Ph.D. | 2015.09 - 2020.06 |
| Northeastern University | Software Engineering (International Class) | Bachelor | 2011.09 - 2015.06 |

Selected Papers [Full List] [Google Scholar]

  • A Survey on Model Compression for Large Language Models. [pdf]
    Jian Li, Yong Liu, Weiping Wang.
    Transactions of the Association for Computational Linguistics (TACL), 2024, accepted. CCF-B Journal / JCR Q1.

  • Distilling Mathematical Reasoning Capabilities into Small Language Models. [pdf]
    Xunyu Zhu, Jian Li*, Yong Liu, Can Ma, Weiping Wang.
    Neural Networks, 2024. CCF-B Journal / JCR Q1.

  • Optimal Rates for Agnostic Distributed Learning. [pdf] [code]
    Jian Li, Yong Liu, Weiping Wang.
    IEEE Transactions on Information Theory (TIT), 2023. CCF-A Journal.

  • Optimal Convergence Rates for Distributed Nyström Approximation. [pdf] [code]
    Jian Li, Yong Liu, Weiping Wang.
    Journal of Machine Learning Research (JMLR), 2023. CCF-A Journal.

  • Convolutional Spectral Kernel Learning with Generalization Guarantees. [pdf] [code]
    Jian Li, Yong Liu, Weiping Wang.
    Artificial Intelligence (AI), 2022. CCF-A Journal.

  • Optimal Convergence Rates for Agnostic Nyström Kernel Learning. [pdf]
    Jian Li, Yong Liu, Weiping Wang.
    International Conference on Machine Learning (ICML), 2023. CCF-A Conference.

  • Multi-Class Learning: From Theory to Algorithm. [pdf] [poster] [slides] [3-minute video] [code]
    Jian Li, Yong Liu, Rong Yin, Hua Zhang, Lizhong Ding, Weiping Wang.
    Advances in Neural Information Processing Systems 31 (NeurIPS), 2018. CCF-A Conference.

  • Federated Learning for Non-IID Data: From Theory to Algorithm. [pdf] [presentation] [🏆 Best Student Paper Award] (1/92)
    Bojian Wei, Jian Li*, Yong Liu, Weiping Wang.
    Pacific Rim International Conference on Artificial Intelligence (PRICAI), 2021. CCF-C Conference.

Projects

  • National Key R&D Program of China (2022YFB3105302.2), 2022.12 - 2025.11, ¥1,200,000.
    Aggregation and Collaboration Techniques for Cross-Platform Heterogeneous Data.

  • National Natural Science Foundation of China (No. 62106257), 2022.01 - 2024.12, ¥300,000.
    Large Scale Structured Prediction with Automated Spectral Kernel Learning.

  • China Postdoctoral Science Foundation (Special Support, No. 2023T160680), 2023.07 - 2024.03, ¥180,000.
    Research on Deep Differentiable Gaussian Processes for Structured Prediction.

  • Special Research Assistant Project of CAS, 2020.09 - 2022.09, ¥800,000.
    Large-scale Few-shot Automated Machine Learning.

  • Talent Program Class A of Institute of Information Engineering, CAS, Tenure-track Professor, 2023.10 - 2026.09.

  • Talent Program Class B of Institute of Information Engineering, CAS, Tenure-track Young Professor, 2020.09 - 2023.10.

Patents

Granted

  • Hailun Lin, Yong Liu, Jian Li, Weiping Wang. A Large-Scale Ontology Merging Method Integrating Representation Learning and a Divide-and-Conquer Strategy. China, Grant No. CN110059194A. Granted: April 8, 2022.

Pending

  • Jian Li, Yong Liu, Liubin Wang, Yiguo Yang, Juhong Wang. Neural Network Architecture Search Method, Device, Computer Equipment, and Storage Medium. CN: 202011567991.3. App. Date: December 25, 2020.
  • Jian Li, Jiaoyang Li, Bojian Wei, Yong Liu, Weiping Wang. A Federated Learning Method and System Based on Attention Mechanism. CN: 202311073645.3. App. Date: August 24, 2023.
  • Jian Li, Jiaoyang Li, Zheng Lin, Yong Liu, Weiping Wang. A Vertical-Domain Large Model Method and System Based on Knowledge Distillation and Prompt Engineering. CN: 202311073641.5. App. Date: August 24, 2023.

Students

  • Ph.D. students
    • 🎓Yilin Kang (2020.09 - 2023.06), Differential Privacy.
      Publications: Computers & Security, CIKM, ICCS.
      Post-graduation: Researcher at Purple Mountain Laboratories.
    • Xunyu Zhu (2020.09 - present), Efficient LLM Inference.
      Publications: Neural Networks × 2, TACL, ICDM.
      In submission: TACL.
    • Boxuan Che (2022.09 - present), Efficient Graph Neural Networks.
  • Master students
    • 🎓Bojian Wei (2020.09 - 2022.06), Federated Learning on Heterogeneous Data.
      Publications: PRICAI 2021 (Best Student Paper Award), TNNLS, ECML-PKDD, IJCNN.
      Post-graduation: Management Trainee at Bank of China Head Office.
    • Xuning Zhang (2022.09 - present), Federated Learning.
      Excellent Bachelor’s Thesis, Wuhan University, 2023.
      In submission: AAAI 2025.

Honors and Awards

  • Microsoft Research Asia StarTrack Scholars Program, 2024
  • Talent Plan Class A of IIE, CAS, 2023.
  • PRICAI 2021 best student paper award, 2021.
  • Special Research Assistant of Chinese Academy of Sciences, 2020.
  • Talent Plan Class B of IIE, CAS, 2020.
  • AIDU Talents of Baidu Research, 2020.
  • Joint Ph.D. Program with Stanford University (Discontinued due to COVID-19), 2020.02 - 2021.02.
  • Outstanding Graduates of Beijing, 2020.
  • Outstanding Graduates of University of Chinese Academy of Sciences (UCAS), 2020.
  • Outstanding Graduates of Institute of Information Engineering, CAS, 2020.
  • National Scholarship for Doctoral Students, 2019.
  • ZhuLiYueHua Scholarship for Excellent Doctoral Student, 2019.
  • CAS Presidential Scholarship, 2019.
  • National Scholarship for Doctoral Students, 2018.

Academic Service

  • Guest Editor of the journal Mathematics.
  • Conference Program Committee Member: ICML, NeurIPS, ICLR, AAAI, IJCAI, ECAI, etc.
  • Journal Reviewer: TPAMI, JMLR, Pattern Recognition, etc.