
NeoBERT: A Next-Generation BERT
2699 words · 13 mins
AI Generated · 🤗 Daily Papers · Natural Language Processing · Large Language Models · 🏢 Polytechnique Montréal
NeoBERT: a new encoder that strengthens bidirectional language understanding through modern architecture, data, and training choices, achieving state-of-the-art results with only 250M parameters.