---
license: apache-2.0
---

A self-trained GPT-2 Large model, with around 770M parameters.

The tokenizer is the one from https://huggingface.co/openai-community/gpt2.

The model is being trained on around 400B tokens; this checkpoint is from step 43k.
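For reference, the ~770M figure is consistent with the standard GPT-2 Large configuration (36 layers, d_model = 1280, a 50257-token vocabulary, 1024-position context, and tied input/output embeddings). This README does not state the configuration explicitly, so the numbers below are an assumption; a quick back-of-the-envelope count under it:

```python
# Back-of-the-envelope parameter count for GPT-2 Large.
# Assumes the standard configuration (not stated in this README).
n_layer, d_model, n_vocab, n_ctx = 36, 1280, 50257, 1024

# Per transformer block: fused QKV projection (d x 3d weight + 3d bias),
# attention output projection (d x d + d), MLP up- and down-projections
# (d x 4d + 4d, then 4d x d + d), and two LayerNorms (2d each).
per_block = (3 * d_model**2 + 3 * d_model) \
          + (d_model**2 + d_model) \
          + (4 * d_model**2 + 4 * d_model) \
          + (4 * d_model**2 + d_model) \
          + 2 * (2 * d_model)

# Token and position embeddings plus the final LayerNorm; the LM head is
# weight-tied to the token embedding, so it adds no extra parameters.
total = n_layer * per_block + n_vocab * d_model + n_ctx * d_model + 2 * d_model

print(f"{total:,}")  # roughly 774M, commonly rounded to "770M"
```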

Evaluation is currently in progress.

## License

This model is available under both the Apache 2.0 License and the MIT License; both must be followed.

## Discord Server

Join our Discord server here.