obake2ai committed · verified
Commit 20d341a · Parent(s): bfcadf6

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -21,7 +21,7 @@ All data was obtained ethically and in compliance with the site's terms and cond
 No copyright texts are used in the training of this model without the permission.
 
 - GPT-J 6B was trained on [the Pile](https://pile.eleuther.ai), a large-scale curated dataset created by [EleutherAI](https://www.eleuther.ai).
-- Frankenstein; or, The Modern Prometheus, 1818 (Public domain)
+- Frankenstein; or, The Modern Prometheus, Mary Shelley, 1818 (Public domain)
 
 ## Training procedure
 This model was trained for 402 billion tokens over 383,500 steps on TPU v3-256 pod. It was trained as an autoregressive language model, using cross-entropy loss to maximize the likelihood of predicting the next token correctly.
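The training objective described in the diff (autoregressive next-token prediction with cross-entropy loss) can be sketched as follows. This is a minimal, illustrative toy in plain Python, not GPT-J's actual training code; the vocabulary size, probabilities, and token ids are invented for the example.

```python
import math

# Toy sketch of the autoregressive objective: at each step t the model
# outputs a probability distribution over the vocabulary, and the loss is
# the mean negative log-probability assigned to the true next token.
# (Hypothetical numbers; not taken from GPT-J.)

def cross_entropy_next_token(probs_per_step, targets):
    """Mean negative log-likelihood of the target token at each step."""
    nlls = [-math.log(step_probs[t]) for step_probs, t in zip(probs_per_step, targets)]
    return sum(nlls) / len(nlls)

# Two steps over a 3-token vocabulary; the true next tokens are ids 2 and 0.
probs = [[0.1, 0.2, 0.7], [0.8, 0.1, 0.1]]
targets = [2, 0]
loss = cross_entropy_next_token(probs, targets)
```

Minimizing this quantity over the 402 billion training tokens is what "maximize the likelihood of predicting the next token correctly" means in the README.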