About Me

I am currently a second-year Ph.D. student at the Tianjin Key Laboratory of Visual Computing and Intelligent Perception (VCIP), Nankai University, advised by Prof. Xiang Li and Prof. Jian Yang. I am also a research intern at Tiansuan Lab, Ant Group. My research focuses mainly on vision-language models and efficient model computation.

The code for my research work will be open-sourced, and I will also attach a detailed Chinese interpretation of each paper. Although these interpretations may be somewhat fragmented, I will do my best to present the insights and ideas behind the papers.

I also maintain a curated list of prompt learning methods for vision-language models. [Links]

The journey of scientific research is challenging, but I’m passionate about my work. If you’re interested in my research or encounter any research problems, please feel free to contact me via email (zhengli97 [at] {mail.nankai.edu.cn, qq.com}).

Publications


[CVPR 2024] PromptKD: Unsupervised Prompt Distillation for Vision-Language Models.
Zheng Li, Xiang Li, Xinyi Fu, Xin Zhang, Weiqiang Wang, Shuo Chen, Jian Yang.
[Paper][Code][Project Page][Chinese Interpretation]
PromptKD is a simple and effective prompt-driven unsupervised distillation framework for vision-language models that achieves state-of-the-art performance.


[PR 2024] Dual Teachers for Self-Knowledge Distillation.
Zheng Li, Xiang Li, Lingfeng Yang, Renjie Song, Jian Yang, Zhigeng Pan.
[Paper][PDF][Code (TBD)][Chinese Interpretation (Zhihu)]
DTSKD explores a new self-KD framework in which the student network receives self-supervision from dual teachers drawn from two dramatically different fields.


[AAAI 2023] Curriculum Temperature for Knowledge Distillation.
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Li, Jian Yang.
[Paper][Code][Project Page][Chinese Interpretation]
CTKD organizes the distillation task from easy to hard through a dynamic and learnable temperature. The temperature is learned during the student’s training process with a reversed gradient that aims to maximize the distillation loss in an adversarial manner.
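For readers curious about the mechanism, here is a minimal PyTorch-style sketch of the adversarial temperature idea (an illustrative simplification, not the released implementation; the module and function names below are my own for this sketch):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates the gradient in the backward pass."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output

class LearnableTemperature(nn.Module):
    """A global learnable temperature updated adversarially via gradient reversal."""
    def __init__(self, init_t=4.0):
        super().__init__()
        self.t = nn.Parameter(torch.tensor(init_t))

    def forward(self):
        # Reversing the gradient means a single optimizer step makes the
        # student *minimize* the KD loss while the temperature *maximizes* it.
        return GradReverse.apply(self.t)

def kd_loss(student_logits, teacher_logits, t):
    """Standard temperature-scaled KL-divergence distillation loss."""
    log_p_s = F.log_softmax(student_logits / t, dim=1)
    p_t = F.softmax(teacher_logits / t, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (t * t)

# Usage sketch (one training step):
#   temp = LearnableTemperature()
#   loss = kd_loss(student(x), teacher(x), temp())
#   loss.backward()   # student descends the loss; temperature ascends it
```

The easy-to-hard curriculum itself is realized by gradually scaling this adversarial objective over the course of training; that schedule is omitted here for brevity.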


[ICCV 2021] Online Knowledge Distillation for Efficient Pose Estimation.
Zheng Li, Jingwen Ye, Mingli Song, Ying Huang, Zhigeng Pan.
[Paper][Code][Project Page][Chinese Interpretation]
OKDHP is the first to distill pose structure knowledge in a one-stage online manner.


[ACCV 2020] Online Knowledge Distillation via Multi-branch Diversity Enhancement.
Zheng Li, Ying Huang, Defang Chen, Tianren Luo, Ning Cai, Zhigeng Pan.
[Paper]
OKDMDE is a simple and effective technique to enhance model diversity in online knowledge distillation.

Honors and Awards

  • 2022.03. Outstanding Graduate of Zhejiang Province.
  • 2021.10. National Scholarship by Ministry of Education of China.

Internships

  • 2023.08 - Present. Ant Group, Hangzhou. Research Intern.
  • 2021.10 - 2023.08. Megvii Research, Nanjing. Research Intern. Led by Renjie Song.
  • 2021.10 - 2022.05. PCALab, Nanjing University of Science and Technology. Visiting Student.

Review Services

  • 2022 - Present. AAAI, ECCV, CVPR, ICML, NeurIPS, ICLR, KBS, TNNLS…

Personal Hobbies

  • Photography 📸. I am a contracted photographer for 500px Gallery. Here are some photos I took while traveling and mountaineering.
  • Mountaineering 🗻. Summit: Haba Snow Mountain (5,396 m).
  • Trail Running 🏃‍♂️ (ITRA). TNF100 Ultra Trail Challenge Moganshan - 30km group. Finish: 5h28min (actual distance 33km).