arnosimons committed
Commit d0100a5 · verified · 1 Parent(s): b5069db

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -7,7 +7,7 @@ pipeline_tag: fill-mask
 
 # Model Card for **Astro-HEP-BERT**
 
-**Astro-HEP-BERT** is a bidirectional transformer designed primarily to generate contextualized word embeddings for analyzing epistemic change in astrophysics and high-energy physics. Built upon Google's "bert-base-uncased," the model underwent additional training for three epochs using approximately 21.5 million paragraphs extracted from around 600,000 scholarly articles sourced from arXiv, all pertaining to astrophysics and/or high-energy physics (HEP). The sole training objective was masked language modeling.
+**Astro-HEP-BERT** is a bidirectional transformer designed primarily to generate contextualized word embeddings for analyzing epistemic change in astrophysics and high-energy physics (<a target="_blank" rel="noopener noreferrer" href="https://doi.org/10.3030/101044932">NEPI project</a> at TU Berlin). Built upon Google's "bert-base-uncased," the model underwent additional training for three epochs using approximately 21.5 million paragraphs extracted from around 600,000 scholarly articles sourced from arXiv, all pertaining to astrophysics and/or high-energy physics (HEP). The sole training objective was masked language modeling.
 
 For further insights into the model and the corpus, please refer to the Astro-HEP-BERT paper [link coming soon].
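For context, the masked language modeling objective named in the README can be sketched in plain Python. This is an illustrative simplification only: the `mask_tokens` helper, the word-level tokenization, and the 15% masking rate follow BERT's standard recipe, not anything this commit specifies (real BERT training also uses WordPiece subwords and sometimes replaces masked positions with random tokens).

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, rng=None):
    """Randomly replace a fraction of tokens with [MASK].

    Returns the corrupted sequence and a dict mapping each masked
    position to the original token the model must predict.
    """
    rng = rng or random.Random(0)
    corrupted, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            corrupted.append(MASK)
            targets[i] = tok  # ground-truth label for this position
        else:
            corrupted.append(tok)
    return corrupted, targets

# Toy astrophysics sentence, split naively on whitespace.
tokens = "dark matter makes up most of the mass of galaxies".split()
corrupted, targets = mask_tokens(tokens, rng=random.Random(1))
```

During pretraining, the model sees `corrupted` as input and is trained to recover every token in `targets` from bidirectional context; this is what lets the finished model emit contextualized embeddings for any word in a paragraph.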