Should We Still Pretrain Encoders with Masked Language Modeling? Paper • 2507.00994 • Published Jul 1 • 78
EuroBERT Encoding Model Collection Suite of models for improved integration into RAG (for information retrieval), designed for ease of use and practicality in industrial contexts • 4 items • Updated Apr 10
EuroBERT: Scaling Multilingual Encoders for European Languages Paper • 2503.05500 • Published Mar 7 • 81
Is Preference Alignment Always the Best Option to Enhance LLM-Based Translation? An Empirical Analysis Paper • 2409.20059 • Published Sep 30, 2024 • 17