📢📢 News
- One paper accepted at EMNLP 2024 Findings: “Reconfidencing LLMs from the Grouping Loss Perspective”
- Check out our survey on the role of small models in the LLM era
- Our QR-Neuron is available: a package for extracting query-relevant neurons in LLMs
- Our new version of the YAGO knowledge base is available; the accompanying paper was accepted at SIGIR 2024
- Our PEARL, a lightweight and powerful embedding model for short texts, is available on 🤗 HuggingFace (see the usage sketch below)
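As a rough illustration, here is a minimal sketch of encoding short texts with an embedding model via the HuggingFace `transformers` library. The model ID `Lihu-Chen/pearl_small` and the mean-pooling step are assumptions for illustration only; please check the PEARL model card on HuggingFace for the exact identifier and recommended usage.

```python
# Minimal sketch: embed short texts with a HuggingFace model and compare them.
# The model ID below is an assumption; see the PEARL model card for the real one.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "Lihu-Chen/pearl_small"  # assumed identifier, for illustration

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

texts = ["new york city", "big apple"]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**batch)

# Mean-pool token embeddings over non-padding positions to get one vector per text.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

# Cosine similarity between the two short-text embeddings.
sim = torch.nn.functional.cosine_similarity(embeddings[0:1], embeddings[1:2])
print(sim.item())
```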
My name is Lihu Chen (陈立虎), and I am currently a Research Associate at Imperial College London.
Before that, I did a one-year postdoc at Inria Saclay. I obtained my PhD in the DIG team at Télécom Paris, a member of Institut Polytechnique de Paris, where I was co-supervised by Fabian Suchanek and Gaël Varoquaux.
My research primarily focuses on natural language processing (NLP) and large language models (LLMs). I am dedicated to developing efficient, reliable, and open-source models and tools, with a particular emphasis on information extraction and biomedical applications. Specifically, my research topics include:
- Large vs Small: Collaborative AI Modeling Approaches. In our survey, we systematically examine collaboration between LLMs and small models (SMs). My research builds on this foundation, focusing on leveraging the complementary strengths of both LLMs and SMs.
- Interpretable and Trustworthy Models. Interpretability aims to provide a human-understandable explanation of a model’s internal reasoning process, i.e., how the model works (transparency). I am interested in confidence estimation, the knowledge mechanisms of LLMs, and fact-checking.
- Efficient Knowledge-Augmented LLMs. LLMs may struggle with tasks that require domain-specific expertise or up-to-date information. We investigate how to cost-effectively retrieve external knowledge to augment their reasoning capabilities.
Misc
- My GitHub
- Human Languages: Mandarin Chinese (native); English (fluent); French (basic)
- Travel Photos
Contact
Email: [firstname].[lastname][AT]imperial.ac.uk