Hello, I am Keyan Zhou (ε‘¨ζŸ―θ¨€), a second-year master's student at the Artificial Intelligence Research Institute of Soochow University, supervised by Prof. Juntao Li and Prof. Min Zhang.

Before that, I received my Bachelor's degree in Computer Science (2019-2023) from Soochow University.

At present, I am a Multi-modal LLM R&D Intern at ByteDance, focusing on enhancing the domain-specific reasoning capabilities of LVLMs.

πŸ€” My research interests center on knowledge in LLMs/LVLMs, particularly in long-context and long-generation settings. Specifically, I focus on the following aspects:

  • Knowledge Dynamics: Exploring mechanisms for LLMs/LVLMs to dynamically integrate internal knowledge with external information. Resolving knowledge conflicts, mitigating outdated information, and addressing safety risks to improve model trustworthiness.
  • Reliable Reasoning: Focusing on how LLMs/LVLMs can iteratively refine their reasoning by cross-verifying and self-correct knowledge from multi-sources to reduce hallucinations and improve reliability.

🀝 I'm looking for a PhD position starting Fall 2026. Please email me at jonaszhou01@gmail.com if there is a potential opportunity!

πŸ“ Publications

* denotes equal contribution.

ACL 2025

L-CiteEval: A Suite for Evaluating Fidelity of Long-context Models

Zecheng Tang*, Keyan Zhou*, Juntao Li, Baibei Ji, Jianye Hou, Min Zhang

  • This work proposes a long-context benchmark L-CiteEval, which evaluates the citation quality of LCMs and highlights the tendency of current open-source LCMs to rely on intrinsic knowledge rather than the provided context for generating responses.

EMNLP 2024

CMD: a framework for Context-aware Model self-Detoxification

Zecheng Tang*, Keyan Zhou*, Juntao Li, Yuyang Ding, Pinzheng Wang, Yan Bowen, Renjie Hua, Min Zhang

  • This work proposes a context-aware detoxification framework, balancing detoxification and generation quality.

ICLR 2025

Revealing and Mitigating Over-attention in Knowledge Editing

Pinzheng Wang, Zecheng Tang, Keyan Zhou, Juntao Li, Qiaoming Zhu, Min Zhang

  • This work reveals the over-attention issue in knowledge eiditing.

arXiv 2025

LOOM-Scope: a comprehensive and efficient LOng-cOntext Model evaluation framework

Zecheng Tang, Haitian Wang, Quantong Qiu, Baibei Ji, Ruoxi Sun, Keyan Zhou, Juntao Li, Min Zhang

  • This work standardizes long-context evaluation across 22 benchmarks, integrates inference acceleration techniques, and introduces a lightweight comprehensive long-context benchmark called LOOMBench.

πŸŽ– Honors and Awards

  • National Scholarship, Ministry of Education
  • Soochow University Outstanding Graduate
  • Huawei Scholarship
  • Mathematical Contest in Modeling(MCM) Finalist Winner

πŸ“– Education

  • 2023.09 - current, Master, Artificial Intelligence Research Institute, Soochow University, Suzhou.
  • 2019.09 - 2023.06, Bachelor, Institute of Computer Science and Technology, Soochow University, Suzhou.

πŸ’» Internships

  • 2025.06 - current, Multi-modal LLM R&D Intern, ByteDance, Shanghai, China.
  • 2025.03 - 2025.05, Long-Context LLM Research Intern, MiraclePlus, Shanghai, China.