Fill-Mask · Transformers · Safetensors · English · mdlm · custom_code
nielsr (HF Staff) committed e87dd49 · verified · 1 parent: 9e6829b

Add pipeline tag, link to project page, correct paper link

This PR adds `pipeline_tag: text-generation` to the model card metadata, links the project page, and corrects the paper link to point to the Hugging Face Papers page.

Files changed (1): README.md (+6 −7)
README.md CHANGED
@@ -1,12 +1,13 @@
 ---
-library_name: transformers
-license: apache-2.0
-language:
-- en
 datasets:
 - Skylion007/openwebtext
+language:
+- en
+library_name: transformers
+license: apache-2.0
 metrics:
 - perplexity
+pipeline_tag: text-generation
 ---
 
 ## Using MDLM
@@ -28,9 +29,7 @@ was trained using a forward diffusion process that generates inputs varying from
 reconstruct the original input from these varying levels of masking, outputting logits in the process.
 The training regimen comprised one million steps on the OpenWebText corpus, involving the processing of a total of `33 billion` tokens.
 
-For more details, please see our paper: [Simple and Effective Masked Diffusion Language Models](http://arxiv.org/abs/2406.07524).
-
-
+For more details, please see our paper: [Simple and Effective Masked Diffusion Language Models](https://huggingface.co/papers/2406.07524).
 
 ## Citation