German BERT base

Released in October 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.

Overview

Paper: here
Architecture: BERT base
Language: German
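As a plain BERT-base checkpoint, the model can be loaded for masked-language-model inference with the Hugging Face transformers library. A minimal sketch (the example sentence is illustrative only; assumes `transformers` and `torch` are installed):

```python
# Minimal sketch: load deepset/gbert-base and query it as a fill-mask model.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained("deepset/gbert-base")
model = AutoModelForMaskedLM.from_pretrained("deepset/gbert-base")

# The fill-mask pipeline returns the top predictions for the [MASK] token.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill("Die Hauptstadt von Deutschland ist [MASK]."):
    print(prediction["token_str"], prediction["score"])
```

For downstream tasks (classification, NER, QA), swap `AutoModelForMaskedLM` for the matching task head, e.g. `AutoModelForSequenceClassification`.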

Performance

GermEval18 Coarse: 78.17
GermEval18 Fine:   50.90
GermEval14:        87.98
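The GermEval numbers above come from fine-tuning the checkpoint on each task. A generic fine-tuning sketch for a two-class problem in the style of GermEval18 coarse (the sentences and labels below are placeholders, not GermEval data; a real run would iterate over a DataLoader with an optimizer):

```python
# Hypothetical sketch: one training step of fine-tuning gbert-base
# for binary sequence classification. Data here is placeholder text.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deepset/gbert-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "deepset/gbert-base", num_labels=2
)

batch = tokenizer(
    ["Das ist ein Beispielsatz.", "Noch ein Satz."],
    padding=True, truncation=True, return_tensors="pt",
)
labels = torch.tensor([0, 1])

# Passing labels makes the model return a cross-entropy loss alongside logits.
outputs = model(**batch, labels=labels)
outputs.loss.backward()  # backward pass for a single step
```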

See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator

Authors

Branden Chan: branden.chan [at] deepset.ai
Stefan Schweter: stefan [at] schweter.eu
Timo Möller: timo.moeller [at] deepset.ai

About us

deepset is the company behind the production-ready open-source AI framework Haystack.

Get in touch and join the Haystack community

For more info on Haystack, visit our GitHub repo and Documentation.

We also have a Discord community open to everyone!

Twitter | LinkedIn | Discord | GitHub Discussions | Website | YouTube

By the way: we're hiring!

Downloads last month: 27,427
Model size: 111M params (Safetensors, F32)