🏢 Renmin University of China

YuLan-Mini: An Open Data-efficient Language Model
4206 words · 20 mins
AI Generated · 🤗 Daily Papers · Natural Language Processing · Large Language Models · 🏢 Renmin University of China
YuLan-Mini: an open, data-efficient 2.42B-parameter LLM that achieves top-tier performance through innovative training techniques.
RetroLLM: Empowering Large Language Models to Retrieve Fine-grained Evidence within Generation
4628 words · 22 mins
AI Generated · 🤗 Daily Papers · Natural Language Processing · Question Answering · 🏢 Renmin University of China
RetroLLM unifies retrieval and generation within LLMs, boosting accuracy while cutting costs.
HtmlRAG: HTML is Better Than Plain Text for Modeling Retrieved Knowledge in RAG Systems
2200 words · 11 mins
AI Generated · 🤗 Daily Papers · Natural Language Processing · Question Answering · 🏢 Renmin University of China
HtmlRAG boosts RAG system accuracy by modeling retrieved knowledge as HTML rather than plain text, improving knowledge representation and mitigating LLM hallucination.