Update README.md
README.md CHANGED
@@ -1,3 +1,22 @@
-This is an *unofficial* reupload of t5-learning-no-pretraining-cs-task in the `SafeTensors` format using `transformers` `4.40.1`. The goal of this reupload is to prevent older models that are still relevant baselines from becoming stale as a result of changes in HuggingFace. Additionally, I may include minor corrections, such as model max length configuration.
+---
+arxiv: 2102.02017
+language:
+- code
+---

+# Studying the Usage of Text-To-Text Transfer Transformer to Support Code-Related Tasks
+## Using Transfer Learning for Code-Related Tasks

+This is an *unofficial* reupload of `t5-learning-no-pretraining-cs-task`, based on the [author's repo](https://github.com/antonio-mastropaolo/TransferLearning4Code), in the `SafeTensors` format using `transformers` `4.40.1`. I manually converted the checkpoints using the `tf_2_pytorch_T5.py` script and converted the tokenizers with my own script. The goal of this reupload is to prevent older models that are still relevant baselines from becoming stale as a result of changes in HuggingFace. Additionally, I may include minor corrections, such as the model max length configuration.
+
+## Citation
+
+```bibtex
+@article{Mastropaolo2021StudyingTU,
+  title={Studying the Usage of Text-To-Text Transfer Transformer to Support Code-Related Tasks},
+  author={Antonio Mastropaolo and Simone Scalabrino and Nathan Cooper and David Nader-Palacio and Denys Poshyvanyk and Rocco Oliveto and Gabriele Bavota},
+  journal={2021 IEEE/ACM 43rd International Conference on Software Engineering (ICSE)},
+  year={2021},
+  pages={336-347}
+}
+```
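Since the point of the reupload is that these checkpoints keep loading cleanly with current `transformers`, here is a minimal loading sketch. The repo id below is a placeholder for wherever this reupload is hosted, and the example input is illustrative only; the actual input format depends on which code-related task the checkpoint was fine-tuned for.

```python
# Minimal sketch: load the reuploaded SafeTensors checkpoint with transformers.
# "your-username/t5-learning-no-pretraining-cs-task" is a placeholder repo id,
# not the actual location of this reupload.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-username/t5-learning-no-pretraining-cs-task"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)  # picks up model.safetensors when present

# Purely illustrative input; the real format depends on the fine-tuning task.
inputs = tokenizer("public void foo ( ) { return ; }", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```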