I am currently a Visiting Research Student at the Big Data Institute (BDI) Lab, The Hong Kong University of Science and Technology (Guangzhou) (HKUST GZ), supervised by Prof. Yongqi Zhang. I am also a final-year undergraduate student at the Graph Data Intelligence (GDI) Lab, Beijing Institute of Technology (BIT), supervised by Prof. Rong-Hua Li. My research interests lie in Large Language Models and Data Mining.
Currently, I am working as a Research Intern at Tencent Yuanbao (CSIG) in the Multimodal Algorithm Group, where I focus on Multimodal RAG and agentic multimodal reasoning. I will begin my Ph.D. in Fall 2026. I am available only for internships in Shenzhen during my first Ph.D. year, and open to internships without location restrictions from the second year onward.
If you have internship opportunities related to Agentic (M)LLMs, please feel free to reach out to me via email!
Email: enjundu.cs@gmail.com
The Hong Kong University of Science and Technology
Visiting Student
Beijing Institute of Technology
Bachelor of Cyberspace Science and Technology
My current research interests are:
Large Language Models: LLM applications, (M)LLM reasoning, retrieval-augmented generation (RAG), reinforcement learning for LLMs
Data Mining: Data-Centric AI, Knowledge Graph Reasoning
Here are some of my research works.




Abstract: Knowledge graph reasoning in the fully-inductive setting, where both entities and relations at test time are unseen during training, remains an open challenge. We introduce GraphOracle, a novel framework that transforms each knowledge graph into a Relation-Dependency Graph (RDG) encoding directed precedence links between relations. A multi-head attention mechanism produces context-aware relation embeddings that guide inductive message passing over the original KG. Experiments on 60 benchmarks show improvements of up to 25% in fully-inductive and 28% in cross-domain scenarios.
Mar 25, 2025
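The Relation-Dependency Graph idea from the abstract above can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the precedence rule used here (relation r1 precedes r2 when some triple with r1 ends at the entity where a triple with r2 begins) and all entity/relation names are illustrative assumptions.

```python
from collections import defaultdict

def build_relation_dependency_graph(triples):
    """Sketch of an RDG: add a directed edge r1 -> r2 whenever a triple
    (h, r1, t) can be followed by a triple (t, r2, x), i.e. relation r1
    precedes r2 along some path in the KG. (Illustrative rule only.)"""
    relations_leaving = defaultdict(set)  # entity -> relations with that head
    for h, r, t in triples:
        relations_leaving[h].add(r)
    edges = set()
    for h, r1, t in triples:
        for r2 in relations_leaving.get(t, ()):
            edges.add((r1, r2))
    return edges

# Toy KG with hypothetical entities and relations.
triples = [
    ("alice", "works_at", "acme"),
    ("acme", "located_in", "paris"),
    ("paris", "capital_of", "france"),
]
print(sorted(build_relation_dependency_graph(triples)))
# [('located_in', 'capital_of'), ('works_at', 'located_in')]
```

Because the RDG is built purely from how relations compose, it carries over to test-time graphs whose entities and relations were never seen during training, which is what makes the fully-inductive setting tractable.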


I genuinely believe that meaningful progress in academia stems from open dialogue and thoughtful debate. If you have any questions about my research, or if you've previously contacted me through GitHub issues and haven't received a response, please feel free to reach out via email. I'm always happy to chat, collaborate, or offer assistance where I can.
Throughout my academic journey, I’ve been fortunate to receive support and inspiration from many generous people. I’m always eager to give back to the community and engage with others passionate about learning and discovery.
That said, I’m not interested in discussions centered around citation counts, publication metrics, or quantitative comparisons between individuals. If your outreach is primarily motivated by such metrics, I kindly ask that you refrain from contacting me. I’m most interested in conversations about meaningful problems, creative solutions, and insightful ideas.
Preferred Email:
Optional Email:
Please avoid contacting me at enjun_du@bit.edu.cn, as I will no longer have access to this address.