---
language:
- en
---
# FIDO-GPT: Generative AI behind "Fidonet Cybernetic Immortality" Project

[FIDONet](https://en.wikipedia.org/wiki/FidoNet) is a historic computer network based on nightly mail exchange between servers
over telephone lines, which was popular in the 1990s. In the [FIDONet Cybernetic Immortality Project](https://soshnikov.com/art/fidoci)
we aim to create exhibits that revive the now-almost-dead FIDONet by automatically writing correspondence in
FIDONet style using generative large language models.

This model is based on the [GPT2-large](https://huggingface.co/gpt2-large) model and was fine-tuned for 2 epochs on archives of the
[ExecPC BBS](https://en.wikipedia.org/wiki/ExecPC_BBS), obtained from [here](https://breakintochat.com/collections/messages/fidonet/index.html).
Training took around 9 hours on an NVIDIA A100 GPU in the Yandex DataSphere service.
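For reference, the fine-tuning step can be reproduced roughly as follows with the Hugging Face `Trainer` API. This is a minimal sketch, not the original training script: the dataset file name (`fidonet_messages.txt`), sequence length and batch size are assumptions, and only the base model (`gpt2-large`) and the number of epochs come from the description above.

```python
# Minimal fine-tuning sketch (assumptions: dataset file, max_length, batch size).
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base_model = 'gpt2-large'
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token          # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base_model)

# One message per line, e.g. "<s>Topic: COMPUTING\n..." (hypothetical file name)
ds = load_dataset('text', data_files={'train': 'fidonet_messages.txt'})

def tokenize(batch):
    return tokenizer(batch['text'], truncation=True, max_length=512)

tokenized = ds['train'].map(tokenize, batched=True, remove_columns=['text'])

args = TrainingArguments(
    output_dir='fido-gpt',
    num_train_epochs=2,                 # as stated above
    per_device_train_batch_size=4,      # assumption
    fp16=True,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```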
The following code can be used for text generation:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_name = 'estonto/fido-gpt'

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Use the GPU when available, otherwise fall back to CPU
device = 'cuda' if torch.cuda.is_available() else 'cpu'
pipe = pipeline(task='text-generation', model=model, tokenizer=tokenizer, device=device)

# The model emits newlines as literal "\n" sequences, so convert them back to real line breaks
result = pipe('<s>Topic: COMPUTING', do_sample=True, max_length=500)[0]['generated_text'].replace('\\n', '\n')
```
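Judging by the example above, prompts follow the pattern `<s>Topic: <TOPIC NAME>`, so other topics can presumably be substituted. A small usage sketch continuing from the snippet above (the topic names here are arbitrary examples):

```python
# Generate one message for each of a few (arbitrary) topics and print it,
# restoring real line breaks from the literal "\n" sequences in the output.
for topic in ["COMPUTING", "SCIENCE", "HUMOR"]:
    generated = pipe(f"<s>Topic: {topic}", do_sample=True, max_length=500)[0]["generated_text"]
    print(generated.replace("\\n", "\n"))
    print("=" * 60)
```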
Project idea and model training: [Dmitry Soshnikov](https://soshnikov.com)