ArGiMi
Research material on pre-training encoders, with an extensive comparison of the masked language modeling (MLM) and causal language modeling (CLM) paradigms.
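For readers new to the distinction, here is a minimal sketch (illustrative only, not code from this collection) of how the two pre-training objectives build their targets; real MLM recipes add refinements such as the 80/10/10 replacement rule:

```python
import torch

def mlm_targets(input_ids: torch.Tensor, mask_token_id: int, mask_prob: float = 0.15):
    """Masked language modeling: corrupt a random subset of tokens and train
    the model, with bidirectional attention, to recover only those tokens."""
    labels = input_ids.clone()
    mask = torch.rand(input_ids.shape) < mask_prob
    labels[~mask] = -100                # loss is computed on masked positions only
    corrupted = input_ids.clone()
    corrupted[mask] = mask_token_id     # replace selected tokens with the mask token
    return corrupted, labels

def clm_targets(input_ids: torch.Tensor):
    """Causal language modeling: each position predicts the next token while
    attending only to earlier positions; labels are the inputs shifted by one."""
    return input_ids[:, :-1], input_ids[:, 1:]
```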
Suite of models for improved integration into RAG pipelines (information retrieval), designed for ease of use and practicality in industrial contexts.
- EuroBERT/EuroBERT-210m
  Fill-Mask • 0.3B params • Updated • 8.77k downloads • 78 likes
- EuroBERT/EuroBERT-610m
  Fill-Mask • 0.8B params • Updated • 2.09k downloads • 32 likes
- EuroBERT/EuroBERT-2.1B
  Fill-Mask • 2B params • Updated • 379 downloads • 63 likes
- EuroBERT: Scaling Multilingual Encoders for European Languages
  Paper • arXiv:2503.05500 • Published • 80 upvotes
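A minimal usage sketch for the fill-mask checkpoints above, assuming a recent transformers release; the trust_remote_code flag is my assumption, based on the EuroBERT repositories shipping custom modeling code:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "EuroBERT/EuroBERT-210m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# Assumed necessary: EuroBERT appears to ship custom architecture code on the Hub.
model = AutoModelForMaskedLM.from_pretrained(model_id, trust_remote_code=True)

# Use the tokenizer's own mask token rather than hard-coding its string form.
text = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Decode the highest-scoring token at the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))
```

The same pattern applies to the 610m and 2.1B checkpoints by swapping model_id.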