agentlans committed on
Commit 7da4ac3 · verified · 1 Parent(s): 836ac21

Upload 13 files

README.md CHANGED
@@ -1,3 +1,109 @@
- ---
- license: apache-2.0
- ---

---
language:
- en
license: apache-2.0
base_model: google/flan-t5-small
tags:
- generated_from_trainer
datasets:
- sentence-paraphrases
model-index:
- name: flan-t5-small-simplifier
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# flan-t5-small-simplifier

For paraphrasing and simplifying English text.

This model is a fine-tuned version of [google/flan-t5-small](https://huggingface.co/google/flan-t5-small) on the agentlans/sentence-paraphrases dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1518
- Num Input Tokens Seen: 32939232

## Intended uses & limitations

Works best on sentence-length texts (for longer passages, see the sentence-splitting sketch after the limitations list below).

```python
import torch
from transformers import pipeline

# Check if a GPU is available
device = 0 if torch.cuda.is_available() else -1

# Initialize the pipeline
model_name = "agentlans/flan-t5-small-simplifier"
flan_t5_pipeline = pipeline("text2text-generation", model=model_name, device=device)

# Example input
input_text = "While navigating the labyrinthine corridors of epistemological uncertainty, the precocious philosopher—whose seminal work on phenomenological interpretation had already garnered significant academic acclaim—paused momentarily to contemplate the intricate interplay between subjective perception and objective reality, ultimately recognizing that the boundaries of human understanding are perpetually fluid and dynamically reconstructed through continuous intellectual discourse and empirical investigation."

# Generate output
output = flan_t5_pipeline(input_text, max_length=1024)

# Print the result
print(output[0]["generated_text"])
# The precocious philosopher, who had already been a major academic acclaim for his seminal work on phenomenological interpretation, paused momentarily to contemplate the intricate interplay between subjective perception and objective reality, recognizing that the boundaries of human understanding are perpetually fluid and dynamically reconstructed through continuous intellectual discourse and empirical investigation.
```

Limitations:
- English only
- Doesn't handle mixed-language texts well (for example, English with Greek-letter words)
- Might not be able to simplify some texts

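Because the training pairs are sentence-level, longer passages are best simplified one sentence at a time. The snippet below is only a sketch of that workflow; it assumes NLTK's `punkt` sentence tokenizer as an extra dependency and builds the same text2text-generation pipeline shown above.

```python
import nltk
from transformers import pipeline

nltk.download("punkt", quiet=True)  # assumed dependency for sentence splitting

simplifier = pipeline("text2text-generation", model="agentlans/flan-t5-small-simplifier")

def simplify_passage(passage: str, max_length: int = 256) -> str:
    """Split a passage into sentences and simplify each one independently."""
    sentences = nltk.sent_tokenize(passage)
    simplified = [simplifier(s, max_length=max_length)[0]["generated_text"] for s in sentences]
    return " ".join(simplified)

print(simplify_passage(
    "The committee deliberated at considerable length. "
    "Nevertheless, a unanimous resolution proved elusive."
))
```
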
## Training and evaluation data

agentlans/sentence-paraphrases

This dataset is a curated collection of sentence-length paraphrases derived from two primary sources:

- humarin/chatgpt-paraphrases
- xwjzds/paraphrase_collections

### Dataset description

The dataset is structured to provide pairs of sentences from an original text and its paraphrase(s). For each entry:

- The "text" field contains the least readable paraphrase.
- The "paraphrase" field contains the most readable paraphrase.

Readability was assessed using the agentlans/deberta-v3-xsmall-zyda-2-readability model.

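As a quick check of the field layout described above, the pairs can be pulled with the `datasets` library. This is only a sketch; the split name and column layout are assumptions based on the description above.

```python
from datasets import load_dataset

# Assumed: the default configuration exposes a "train" split with
# "text" (least readable) and "paraphrase" (most readable) columns.
ds = load_dataset("agentlans/sentence-paraphrases", split="train")

example = ds[0]
print("text:      ", example["text"])        # harder-to-read sentence
print("paraphrase:", example["paraphrase"])  # easier-to-read rewrite
```
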
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they might be passed to the `Trainer` follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0

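The following is a minimal, illustrative sketch of wiring these settings into `Seq2SeqTrainingArguments` and `Seq2SeqTrainer`. The toy dataset and preprocessing are placeholders rather than the actual training pipeline, and the Adam betas/epsilon listed above match the Trainer defaults, so they are not set explicitly.

```python
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")

# Placeholder data using the same column names as the dataset described above.
raw = Dataset.from_dict({
    "text": ["Utilize the apparatus to ascertain the outcome."],
    "paraphrase": ["Use the tool to find the result."],
})

def preprocess(batch):
    # Tokenize the harder sentence as input and the easier one as the target.
    model_inputs = tokenizer(batch["text"], truncation=True, max_length=256)
    labels = tokenizer(text_target=batch["paraphrase"], truncation=True, max_length=256)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

# Mirrors the hyperparameters listed above; evaluation every 10000 steps
# matches the cadence in the results table below.
args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-small-simplifier",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2.0,
    eval_strategy="steps",
    eval_steps=10_000,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    eval_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```
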
### Training results

| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|:-------------:|:------:|:-----:|:---------------:|:-----------------:|
| 1.4423 | 0.2224 | 10000 | 1.2431 | 3655312 |
| 1.3884 | 0.4448 | 20000 | 1.2093 | 7331520 |
| 1.3782 | 0.6673 | 30000 | 1.1859 | 10990432 |
| 1.3595 | 0.8897 | 40000 | 1.1787 | 14653328 |
| 1.3059 | 1.1121 | 50000 | 1.1665 | 18326104 |
| 1.3298 | 1.3345 | 60000 | 1.1589 | 21991016 |
| 1.2994 | 1.5569 | 70000 | 1.1562 | 25656600 |
| 1.2952 | 1.7794 | 80000 | 1.1518 | 29314808 |

### Framework versions

- Transformers 4.43.3
- Pytorch 2.3.0+cu121
- Datasets 3.2.0
- Tokenizers 0.19.1

all_results.json ADDED
@@ -0,0 +1,16 @@
{
  "epoch": 2.0,
  "eval_loss": 1.1517876386642456,
  "eval_runtime": 2.8065,
  "eval_samples": 2500,
  "eval_samples_per_second": 890.786,
  "eval_steps_per_second": 111.526,
  "num_input_tokens_seen": 32939232,
  "total_flos": 1.1959161056722944e+16,
  "train_loss": 1.358397777055082,
  "train_runtime": 3698.2072,
  "train_samples": 359680,
  "train_samples_per_second": 194.516,
  "train_steps_per_second": 24.314,
  "train_tokens_per_second": 8905.5
}

config.json ADDED
@@ -0,0 +1,62 @@
{
  "_name_or_path": "google/flan-t5-small",
  "architectures": [
    "T5ForConditionalGeneration"
  ],
  "classifier_dropout": 0.0,
  "d_ff": 1024,
  "d_kv": 64,
  "d_model": 512,
  "decoder_start_token_id": 0,
  "dense_act_fn": "gelu_new",
  "dropout_rate": 0.1,
  "eos_token_id": 1,
  "feed_forward_proj": "gated-gelu",
  "initializer_factor": 1.0,
  "is_encoder_decoder": true,
  "is_gated_act": true,
  "layer_norm_epsilon": 1e-06,
  "model_type": "t5",
  "n_positions": 512,
  "num_decoder_layers": 8,
  "num_heads": 6,
  "num_layers": 8,
  "output_past": true,
  "pad_token_id": 0,
  "relative_attention_max_distance": 128,
  "relative_attention_num_buckets": 32,
  "task_specific_params": {
    "summarization": {
      "early_stopping": true,
      "length_penalty": 2.0,
      "max_length": 200,
      "min_length": 30,
      "no_repeat_ngram_size": 3,
      "num_beams": 4,
      "prefix": "summarize: "
    },
    "translation_en_to_de": {
      "early_stopping": true,
      "max_length": 300,
      "num_beams": 4,
      "prefix": "translate English to German: "
    },
    "translation_en_to_fr": {
      "early_stopping": true,
      "max_length": 300,
      "num_beams": 4,
      "prefix": "translate English to French: "
    },
    "translation_en_to_ro": {
      "early_stopping": true,
      "max_length": 300,
      "num_beams": 4,
      "prefix": "translate English to Romanian: "
    }
  },
  "tie_word_embeddings": false,
  "torch_dtype": "float32",
  "transformers_version": "4.43.3",
  "use_cache": true,
  "vocab_size": 32128
}

eval_results.json ADDED
@@ -0,0 +1,9 @@
{
  "epoch": 2.0,
  "eval_loss": 1.1517876386642456,
  "eval_runtime": 2.8065,
  "eval_samples": 2500,
  "eval_samples_per_second": 890.786,
  "eval_steps_per_second": 111.526,
  "num_input_tokens_seen": 32939232
}

generation_config.json ADDED
@@ -0,0 +1,7 @@
{
  "_from_model_config": true,
  "decoder_start_token_id": 0,
  "eos_token_id": 1,
  "pad_token_id": 0,
  "transformers_version": "4.43.3"
}

model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:60b88aa834b0082713d975c6bfc35e06f3285d250cc0d36872a769c3993903b7
size 307867048

special_tokens_map.json ADDED
@@ -0,0 +1,125 @@
{
  "additional_special_tokens": [
    "<extra_id_0>",
    "<extra_id_1>",
    "<extra_id_2>",
    "<extra_id_3>",
    "<extra_id_4>",
    "<extra_id_5>",
    "<extra_id_6>",
    "<extra_id_7>",
    "<extra_id_8>",
    "<extra_id_9>",
    "<extra_id_10>",
    "<extra_id_11>",
    "<extra_id_12>",
    "<extra_id_13>",
    "<extra_id_14>",
    "<extra_id_15>",
    "<extra_id_16>",
    "<extra_id_17>",
    "<extra_id_18>",
    "<extra_id_19>",
    "<extra_id_20>",
    "<extra_id_21>",
    "<extra_id_22>",
    "<extra_id_23>",
    "<extra_id_24>",
    "<extra_id_25>",
    "<extra_id_26>",
    "<extra_id_27>",
    "<extra_id_28>",
    "<extra_id_29>",
    "<extra_id_30>",
    "<extra_id_31>",
    "<extra_id_32>",
    "<extra_id_33>",
    "<extra_id_34>",
    "<extra_id_35>",
    "<extra_id_36>",
    "<extra_id_37>",
    "<extra_id_38>",
    "<extra_id_39>",
    "<extra_id_40>",
    "<extra_id_41>",
    "<extra_id_42>",
    "<extra_id_43>",
    "<extra_id_44>",
    "<extra_id_45>",
    "<extra_id_46>",
    "<extra_id_47>",
    "<extra_id_48>",
    "<extra_id_49>",
    "<extra_id_50>",
    "<extra_id_51>",
    "<extra_id_52>",
    "<extra_id_53>",
    "<extra_id_54>",
    "<extra_id_55>",
    "<extra_id_56>",
    "<extra_id_57>",
    "<extra_id_58>",
    "<extra_id_59>",
    "<extra_id_60>",
    "<extra_id_61>",
    "<extra_id_62>",
    "<extra_id_63>",
    "<extra_id_64>",
    "<extra_id_65>",
    "<extra_id_66>",
    "<extra_id_67>",
    "<extra_id_68>",
    "<extra_id_69>",
    "<extra_id_70>",
    "<extra_id_71>",
    "<extra_id_72>",
    "<extra_id_73>",
    "<extra_id_74>",
    "<extra_id_75>",
    "<extra_id_76>",
    "<extra_id_77>",
    "<extra_id_78>",
    "<extra_id_79>",
    "<extra_id_80>",
    "<extra_id_81>",
    "<extra_id_82>",
    "<extra_id_83>",
    "<extra_id_84>",
    "<extra_id_85>",
    "<extra_id_86>",
    "<extra_id_87>",
    "<extra_id_88>",
    "<extra_id_89>",
    "<extra_id_90>",
    "<extra_id_91>",
    "<extra_id_92>",
    "<extra_id_93>",
    "<extra_id_94>",
    "<extra_id_95>",
    "<extra_id_96>",
    "<extra_id_97>",
    "<extra_id_98>",
    "<extra_id_99>"
  ],
  "eos_token": {
    "content": "</s>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<pad>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}

spiece.model ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d60acb128cf7b7f2536e8f38a5b18a05535c9e14c7a355904270e15b0945ea86
size 791656

tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,938 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "<pad>",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "1": {
12
+ "content": "</s>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "2": {
20
+ "content": "<unk>",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "32000": {
28
+ "content": "<extra_id_99>",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "32001": {
36
+ "content": "<extra_id_98>",
37
+ "lstrip": false,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ },
43
+ "32002": {
44
+ "content": "<extra_id_97>",
45
+ "lstrip": false,
46
+ "normalized": false,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": true
50
+ },
51
+ "32003": {
52
+ "content": "<extra_id_96>",
53
+ "lstrip": false,
54
+ "normalized": false,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": true
58
+ },
59
+ "32004": {
60
+ "content": "<extra_id_95>",
61
+ "lstrip": false,
62
+ "normalized": false,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": true
66
+ },
67
+ "32005": {
68
+ "content": "<extra_id_94>",
69
+ "lstrip": false,
70
+ "normalized": false,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": true
74
+ },
75
+ "32006": {
76
+ "content": "<extra_id_93>",
77
+ "lstrip": false,
78
+ "normalized": false,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": true
82
+ },
83
+ "32007": {
84
+ "content": "<extra_id_92>",
85
+ "lstrip": false,
86
+ "normalized": false,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": true
90
+ },
91
+ "32008": {
92
+ "content": "<extra_id_91>",
93
+ "lstrip": false,
94
+ "normalized": false,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": true
98
+ },
99
+ "32009": {
100
+ "content": "<extra_id_90>",
101
+ "lstrip": false,
102
+ "normalized": false,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": true
106
+ },
107
+ "32010": {
108
+ "content": "<extra_id_89>",
109
+ "lstrip": false,
110
+ "normalized": false,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": true
114
+ },
115
+ "32011": {
116
+ "content": "<extra_id_88>",
117
+ "lstrip": false,
118
+ "normalized": false,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": true
122
+ },
123
+ "32012": {
124
+ "content": "<extra_id_87>",
125
+ "lstrip": false,
126
+ "normalized": false,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": true
130
+ },
131
+ "32013": {
132
+ "content": "<extra_id_86>",
133
+ "lstrip": false,
134
+ "normalized": false,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": true
138
+ },
139
+ "32014": {
140
+ "content": "<extra_id_85>",
141
+ "lstrip": false,
142
+ "normalized": false,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": true
146
+ },
147
+ "32015": {
148
+ "content": "<extra_id_84>",
149
+ "lstrip": false,
150
+ "normalized": false,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": true
154
+ },
155
+ "32016": {
156
+ "content": "<extra_id_83>",
157
+ "lstrip": false,
158
+ "normalized": false,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": true
162
+ },
163
+ "32017": {
164
+ "content": "<extra_id_82>",
165
+ "lstrip": false,
166
+ "normalized": false,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": true
170
+ },
171
+ "32018": {
172
+ "content": "<extra_id_81>",
173
+ "lstrip": false,
174
+ "normalized": false,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": true
178
+ },
179
+ "32019": {
180
+ "content": "<extra_id_80>",
181
+ "lstrip": false,
182
+ "normalized": false,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": true
186
+ },
187
+ "32020": {
188
+ "content": "<extra_id_79>",
189
+ "lstrip": false,
190
+ "normalized": false,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": true
194
+ },
195
+ "32021": {
196
+ "content": "<extra_id_78>",
197
+ "lstrip": false,
198
+ "normalized": false,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": true
202
+ },
203
+ "32022": {
204
+ "content": "<extra_id_77>",
205
+ "lstrip": false,
206
+ "normalized": false,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": true
210
+ },
211
+ "32023": {
212
+ "content": "<extra_id_76>",
213
+ "lstrip": false,
214
+ "normalized": false,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": true
218
+ },
219
+ "32024": {
220
+ "content": "<extra_id_75>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "32025": {
228
+ "content": "<extra_id_74>",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "32026": {
236
+ "content": "<extra_id_73>",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "32027": {
244
+ "content": "<extra_id_72>",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "32028": {
252
+ "content": "<extra_id_71>",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "32029": {
260
+ "content": "<extra_id_70>",
261
+ "lstrip": false,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "32030": {
268
+ "content": "<extra_id_69>",
269
+ "lstrip": false,
270
+ "normalized": false,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": true
274
+ },
275
+ "32031": {
276
+ "content": "<extra_id_68>",
277
+ "lstrip": false,
278
+ "normalized": false,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": true
282
+ },
283
+ "32032": {
284
+ "content": "<extra_id_67>",
285
+ "lstrip": false,
286
+ "normalized": false,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": true
290
+ },
291
+ "32033": {
292
+ "content": "<extra_id_66>",
293
+ "lstrip": false,
294
+ "normalized": false,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": true
298
+ },
299
+ "32034": {
300
+ "content": "<extra_id_65>",
301
+ "lstrip": false,
302
+ "normalized": false,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": true
306
+ },
307
+ "32035": {
308
+ "content": "<extra_id_64>",
309
+ "lstrip": false,
310
+ "normalized": false,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": true
314
+ },
315
+ "32036": {
316
+ "content": "<extra_id_63>",
317
+ "lstrip": false,
318
+ "normalized": false,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": true
322
+ },
323
+ "32037": {
324
+ "content": "<extra_id_62>",
325
+ "lstrip": false,
326
+ "normalized": false,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": true
330
+ },
331
+ "32038": {
332
+ "content": "<extra_id_61>",
333
+ "lstrip": false,
334
+ "normalized": false,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": true
338
+ },
339
+ "32039": {
340
+ "content": "<extra_id_60>",
341
+ "lstrip": false,
342
+ "normalized": false,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": true
346
+ },
347
+ "32040": {
348
+ "content": "<extra_id_59>",
349
+ "lstrip": false,
350
+ "normalized": false,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": true
354
+ },
355
+ "32041": {
356
+ "content": "<extra_id_58>",
357
+ "lstrip": false,
358
+ "normalized": false,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": true
362
+ },
363
+ "32042": {
364
+ "content": "<extra_id_57>",
365
+ "lstrip": false,
366
+ "normalized": false,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": true
370
+ },
371
+ "32043": {
372
+ "content": "<extra_id_56>",
373
+ "lstrip": false,
374
+ "normalized": false,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": true
378
+ },
379
+ "32044": {
380
+ "content": "<extra_id_55>",
381
+ "lstrip": false,
382
+ "normalized": false,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": true
386
+ },
387
+ "32045": {
388
+ "content": "<extra_id_54>",
389
+ "lstrip": false,
390
+ "normalized": false,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": true
394
+ },
395
+ "32046": {
396
+ "content": "<extra_id_53>",
397
+ "lstrip": false,
398
+ "normalized": false,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": true
402
+ },
403
+ "32047": {
404
+ "content": "<extra_id_52>",
405
+ "lstrip": false,
406
+ "normalized": false,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": true
410
+ },
411
+ "32048": {
412
+ "content": "<extra_id_51>",
413
+ "lstrip": false,
414
+ "normalized": false,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": true
418
+ },
419
+ "32049": {
420
+ "content": "<extra_id_50>",
421
+ "lstrip": false,
422
+ "normalized": false,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": true
426
+ },
427
+ "32050": {
428
+ "content": "<extra_id_49>",
429
+ "lstrip": false,
430
+ "normalized": false,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": true
434
+ },
435
+ "32051": {
436
+ "content": "<extra_id_48>",
437
+ "lstrip": false,
438
+ "normalized": false,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": true
442
+ },
443
+ "32052": {
444
+ "content": "<extra_id_47>",
445
+ "lstrip": false,
446
+ "normalized": false,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": true
450
+ },
451
+ "32053": {
452
+ "content": "<extra_id_46>",
453
+ "lstrip": false,
454
+ "normalized": false,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": true
458
+ },
459
+ "32054": {
460
+ "content": "<extra_id_45>",
461
+ "lstrip": false,
462
+ "normalized": false,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": true
466
+ },
467
+ "32055": {
468
+ "content": "<extra_id_44>",
469
+ "lstrip": false,
470
+ "normalized": false,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": true
474
+ },
475
+ "32056": {
476
+ "content": "<extra_id_43>",
477
+ "lstrip": false,
478
+ "normalized": false,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": true
482
+ },
483
+ "32057": {
484
+ "content": "<extra_id_42>",
485
+ "lstrip": false,
486
+ "normalized": false,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": true
490
+ },
491
+ "32058": {
492
+ "content": "<extra_id_41>",
493
+ "lstrip": false,
494
+ "normalized": false,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": true
498
+ },
499
+ "32059": {
500
+ "content": "<extra_id_40>",
501
+ "lstrip": false,
502
+ "normalized": false,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": true
506
+ },
507
+ "32060": {
508
+ "content": "<extra_id_39>",
509
+ "lstrip": false,
510
+ "normalized": false,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": true
514
+ },
515
+ "32061": {
516
+ "content": "<extra_id_38>",
517
+ "lstrip": false,
518
+ "normalized": false,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": true
522
+ },
523
+ "32062": {
524
+ "content": "<extra_id_37>",
525
+ "lstrip": false,
526
+ "normalized": false,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": true
530
+ },
531
+ "32063": {
532
+ "content": "<extra_id_36>",
533
+ "lstrip": false,
534
+ "normalized": false,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": true
538
+ },
539
+ "32064": {
540
+ "content": "<extra_id_35>",
541
+ "lstrip": false,
542
+ "normalized": false,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": true
546
+ },
547
+ "32065": {
548
+ "content": "<extra_id_34>",
549
+ "lstrip": false,
550
+ "normalized": false,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": true
554
+ },
555
+ "32066": {
556
+ "content": "<extra_id_33>",
557
+ "lstrip": false,
558
+ "normalized": false,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": true
562
+ },
563
+ "32067": {
564
+ "content": "<extra_id_32>",
565
+ "lstrip": false,
566
+ "normalized": false,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": true
570
+ },
571
+ "32068": {
572
+ "content": "<extra_id_31>",
573
+ "lstrip": false,
574
+ "normalized": false,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": true
578
+ },
579
+ "32069": {
580
+ "content": "<extra_id_30>",
581
+ "lstrip": false,
582
+ "normalized": false,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": true
586
+ },
587
+ "32070": {
588
+ "content": "<extra_id_29>",
589
+ "lstrip": false,
590
+ "normalized": false,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": true
594
+ },
595
+ "32071": {
596
+ "content": "<extra_id_28>",
597
+ "lstrip": false,
598
+ "normalized": false,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": true
602
+ },
603
+ "32072": {
604
+ "content": "<extra_id_27>",
605
+ "lstrip": false,
606
+ "normalized": false,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": true
610
+ },
611
+ "32073": {
612
+ "content": "<extra_id_26>",
613
+ "lstrip": false,
614
+ "normalized": false,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": true
618
+ },
619
+ "32074": {
620
+ "content": "<extra_id_25>",
621
+ "lstrip": false,
622
+ "normalized": false,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": true
626
+ },
627
+ "32075": {
628
+ "content": "<extra_id_24>",
629
+ "lstrip": false,
630
+ "normalized": false,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": true
634
+ },
635
+ "32076": {
636
+ "content": "<extra_id_23>",
637
+ "lstrip": false,
638
+ "normalized": false,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": true
642
+ },
643
+ "32077": {
644
+ "content": "<extra_id_22>",
645
+ "lstrip": false,
646
+ "normalized": false,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": true
650
+ },
651
+ "32078": {
652
+ "content": "<extra_id_21>",
653
+ "lstrip": false,
654
+ "normalized": false,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": true
658
+ },
659
+ "32079": {
660
+ "content": "<extra_id_20>",
661
+ "lstrip": false,
662
+ "normalized": false,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": true
666
+ },
667
+ "32080": {
668
+ "content": "<extra_id_19>",
669
+ "lstrip": false,
670
+ "normalized": false,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": true
674
+ },
675
+ "32081": {
676
+ "content": "<extra_id_18>",
677
+ "lstrip": false,
678
+ "normalized": false,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": true
682
+ },
683
+ "32082": {
684
+ "content": "<extra_id_17>",
685
+ "lstrip": false,
686
+ "normalized": false,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": true
690
+ },
691
+ "32083": {
692
+ "content": "<extra_id_16>",
693
+ "lstrip": false,
694
+ "normalized": false,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": true
698
+ },
699
+ "32084": {
700
+ "content": "<extra_id_15>",
701
+ "lstrip": false,
702
+ "normalized": false,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": true
706
+ },
707
+ "32085": {
708
+ "content": "<extra_id_14>",
709
+ "lstrip": false,
710
+ "normalized": false,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": true
714
+ },
715
+ "32086": {
716
+ "content": "<extra_id_13>",
717
+ "lstrip": false,
718
+ "normalized": false,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": true
722
+ },
723
+ "32087": {
724
+ "content": "<extra_id_12>",
725
+ "lstrip": false,
726
+ "normalized": false,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": true
730
+ },
731
+ "32088": {
732
+ "content": "<extra_id_11>",
733
+ "lstrip": false,
734
+ "normalized": false,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": true
738
+ },
739
+ "32089": {
740
+ "content": "<extra_id_10>",
741
+ "lstrip": false,
742
+ "normalized": false,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": true
746
+ },
747
+ "32090": {
748
+ "content": "<extra_id_9>",
749
+ "lstrip": false,
750
+ "normalized": false,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": true
754
+ },
755
+ "32091": {
756
+ "content": "<extra_id_8>",
757
+ "lstrip": false,
758
+ "normalized": false,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": true
762
+ },
763
+ "32092": {
764
+ "content": "<extra_id_7>",
765
+ "lstrip": false,
766
+ "normalized": false,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": true
770
+ },
771
+ "32093": {
772
+ "content": "<extra_id_6>",
773
+ "lstrip": false,
774
+ "normalized": false,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": true
778
+ },
779
+ "32094": {
780
+ "content": "<extra_id_5>",
781
+ "lstrip": false,
782
+ "normalized": false,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": true
786
+ },
787
+ "32095": {
788
+ "content": "<extra_id_4>",
789
+ "lstrip": false,
790
+ "normalized": false,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": true
794
+ },
795
+ "32096": {
796
+ "content": "<extra_id_3>",
797
+ "lstrip": false,
798
+ "normalized": false,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": true
802
+ },
803
+ "32097": {
804
+ "content": "<extra_id_2>",
805
+ "lstrip": false,
806
+ "normalized": false,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": true
810
+ },
811
+ "32098": {
812
+ "content": "<extra_id_1>",
813
+ "lstrip": false,
814
+ "normalized": false,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": true
818
+ },
819
+ "32099": {
820
+ "content": "<extra_id_0>",
821
+ "lstrip": false,
822
+ "normalized": false,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": true
826
+ }
827
+ },
828
+ "additional_special_tokens": [
829
+ "<extra_id_0>",
830
+ "<extra_id_1>",
831
+ "<extra_id_2>",
832
+ "<extra_id_3>",
833
+ "<extra_id_4>",
834
+ "<extra_id_5>",
835
+ "<extra_id_6>",
836
+ "<extra_id_7>",
837
+ "<extra_id_8>",
838
+ "<extra_id_9>",
839
+ "<extra_id_10>",
840
+ "<extra_id_11>",
841
+ "<extra_id_12>",
842
+ "<extra_id_13>",
843
+ "<extra_id_14>",
844
+ "<extra_id_15>",
845
+ "<extra_id_16>",
846
+ "<extra_id_17>",
847
+ "<extra_id_18>",
848
+ "<extra_id_19>",
849
+ "<extra_id_20>",
850
+ "<extra_id_21>",
851
+ "<extra_id_22>",
852
+ "<extra_id_23>",
853
+ "<extra_id_24>",
854
+ "<extra_id_25>",
855
+ "<extra_id_26>",
856
+ "<extra_id_27>",
857
+ "<extra_id_28>",
858
+ "<extra_id_29>",
859
+ "<extra_id_30>",
860
+ "<extra_id_31>",
861
+ "<extra_id_32>",
862
+ "<extra_id_33>",
863
+ "<extra_id_34>",
864
+ "<extra_id_35>",
865
+ "<extra_id_36>",
866
+ "<extra_id_37>",
867
+ "<extra_id_38>",
868
+ "<extra_id_39>",
869
+ "<extra_id_40>",
870
+ "<extra_id_41>",
871
+ "<extra_id_42>",
872
+ "<extra_id_43>",
873
+ "<extra_id_44>",
874
+ "<extra_id_45>",
875
+ "<extra_id_46>",
876
+ "<extra_id_47>",
877
+ "<extra_id_48>",
878
+ "<extra_id_49>",
879
+ "<extra_id_50>",
880
+ "<extra_id_51>",
881
+ "<extra_id_52>",
882
+ "<extra_id_53>",
883
+ "<extra_id_54>",
884
+ "<extra_id_55>",
885
+ "<extra_id_56>",
886
+ "<extra_id_57>",
887
+ "<extra_id_58>",
888
+ "<extra_id_59>",
889
+ "<extra_id_60>",
890
+ "<extra_id_61>",
891
+ "<extra_id_62>",
892
+ "<extra_id_63>",
893
+ "<extra_id_64>",
894
+ "<extra_id_65>",
895
+ "<extra_id_66>",
896
+ "<extra_id_67>",
897
+ "<extra_id_68>",
898
+ "<extra_id_69>",
899
+ "<extra_id_70>",
900
+ "<extra_id_71>",
901
+ "<extra_id_72>",
902
+ "<extra_id_73>",
903
+ "<extra_id_74>",
904
+ "<extra_id_75>",
905
+ "<extra_id_76>",
906
+ "<extra_id_77>",
907
+ "<extra_id_78>",
908
+ "<extra_id_79>",
909
+ "<extra_id_80>",
910
+ "<extra_id_81>",
911
+ "<extra_id_82>",
912
+ "<extra_id_83>",
913
+ "<extra_id_84>",
914
+ "<extra_id_85>",
915
+ "<extra_id_86>",
916
+ "<extra_id_87>",
917
+ "<extra_id_88>",
918
+ "<extra_id_89>",
919
+ "<extra_id_90>",
920
+ "<extra_id_91>",
921
+ "<extra_id_92>",
922
+ "<extra_id_93>",
923
+ "<extra_id_94>",
924
+ "<extra_id_95>",
925
+ "<extra_id_96>",
926
+ "<extra_id_97>",
927
+ "<extra_id_98>",
928
+ "<extra_id_99>"
929
+ ],
930
+ "clean_up_tokenization_spaces": true,
931
+ "eos_token": "</s>",
932
+ "extra_ids": 100,
933
+ "model_max_length": 512,
934
+ "pad_token": "<pad>",
935
+ "sp_model_kwargs": {},
936
+ "tokenizer_class": "T5Tokenizer",
937
+ "unk_token": "<unk>"
938
+ }
train_results.json ADDED
@@ -0,0 +1,11 @@
{
  "epoch": 2.0,
  "num_input_tokens_seen": 32939232,
  "total_flos": 1.1959161056722944e+16,
  "train_loss": 1.358397777055082,
  "train_runtime": 3698.2072,
  "train_samples": 359680,
  "train_samples_per_second": 194.516,
  "train_steps_per_second": 24.314,
  "train_tokens_per_second": 8905.5
}

trainer_state.json ADDED
@@ -0,0 +1,1548 @@
1
+ {
2
+ "best_metric": 1.1517876386642456,
3
+ "best_model_checkpoint": "/media/user/Expansion/flan-t5-small-simplifier/checkpoint-80000",
4
+ "epoch": 2.0,
5
+ "eval_steps": 10000,
6
+ "global_step": 89920,
7
+ "is_hyper_param_search": false,
8
+ "is_local_process_zero": true,
9
+ "is_world_process_zero": true,
10
+ "log_history": [
11
+ {
12
+ "epoch": 0.01112099644128114,
13
+ "grad_norm": 3.2175512313842773,
14
+ "learning_rate": 4.972197508896798e-05,
15
+ "loss": 1.6509,
16
+ "num_input_tokens_seen": 183488,
17
+ "step": 500
18
+ },
19
+ {
20
+ "epoch": 0.02224199288256228,
21
+ "grad_norm": 3.9029898643493652,
22
+ "learning_rate": 4.9443950177935946e-05,
23
+ "loss": 1.583,
24
+ "num_input_tokens_seen": 363744,
25
+ "step": 1000
26
+ },
27
+ {
28
+ "epoch": 0.03336298932384341,
29
+ "grad_norm": 3.4091885089874268,
30
+ "learning_rate": 4.9165925266903915e-05,
31
+ "loss": 1.5553,
32
+ "num_input_tokens_seen": 547584,
33
+ "step": 1500
34
+ },
35
+ {
36
+ "epoch": 0.04448398576512456,
37
+ "grad_norm": 2.5965495109558105,
38
+ "learning_rate": 4.888790035587189e-05,
39
+ "loss": 1.5444,
40
+ "num_input_tokens_seen": 732480,
41
+ "step": 2000
42
+ },
43
+ {
44
+ "epoch": 0.055604982206405695,
45
+ "grad_norm": 4.170108318328857,
46
+ "learning_rate": 4.860987544483986e-05,
47
+ "loss": 1.5137,
48
+ "num_input_tokens_seen": 916672,
49
+ "step": 2500
50
+ },
51
+ {
52
+ "epoch": 0.06672597864768683,
53
+ "grad_norm": 3.025068759918213,
54
+ "learning_rate": 4.8331850533807835e-05,
55
+ "loss": 1.5057,
56
+ "num_input_tokens_seen": 1095952,
57
+ "step": 3000
58
+ },
59
+ {
60
+ "epoch": 0.07784697508896797,
61
+ "grad_norm": 2.7047712802886963,
62
+ "learning_rate": 4.80538256227758e-05,
63
+ "loss": 1.5019,
64
+ "num_input_tokens_seen": 1276936,
65
+ "step": 3500
66
+ },
67
+ {
68
+ "epoch": 0.08896797153024912,
69
+ "grad_norm": 2.844285488128662,
70
+ "learning_rate": 4.777580071174377e-05,
71
+ "loss": 1.5074,
72
+ "num_input_tokens_seen": 1464080,
73
+ "step": 4000
74
+ },
75
+ {
76
+ "epoch": 0.10008896797153025,
77
+ "grad_norm": 3.055643081665039,
78
+ "learning_rate": 4.749777580071175e-05,
79
+ "loss": 1.4959,
80
+ "num_input_tokens_seen": 1647288,
81
+ "step": 4500
82
+ },
83
+ {
84
+ "epoch": 0.11120996441281139,
85
+ "grad_norm": 2.6245856285095215,
86
+ "learning_rate": 4.721975088967972e-05,
87
+ "loss": 1.4777,
88
+ "num_input_tokens_seen": 1836832,
89
+ "step": 5000
90
+ },
91
+ {
92
+ "epoch": 0.12233096085409252,
93
+ "grad_norm": 3.1758244037628174,
94
+ "learning_rate": 4.694172597864769e-05,
95
+ "loss": 1.4778,
96
+ "num_input_tokens_seen": 2020728,
97
+ "step": 5500
98
+ },
99
+ {
100
+ "epoch": 0.13345195729537365,
101
+ "grad_norm": 2.518728494644165,
102
+ "learning_rate": 4.666370106761566e-05,
103
+ "loss": 1.4758,
104
+ "num_input_tokens_seen": 2198968,
105
+ "step": 6000
106
+ },
107
+ {
108
+ "epoch": 0.1445729537366548,
109
+ "grad_norm": 3.8143937587738037,
110
+ "learning_rate": 4.638567615658363e-05,
111
+ "loss": 1.4846,
112
+ "num_input_tokens_seen": 2381968,
113
+ "step": 6500
114
+ },
115
+ {
116
+ "epoch": 0.15569395017793594,
117
+ "grad_norm": 2.766146421432495,
118
+ "learning_rate": 4.6107651245551604e-05,
119
+ "loss": 1.4468,
120
+ "num_input_tokens_seen": 2562176,
121
+ "step": 7000
122
+ },
123
+ {
124
+ "epoch": 0.16681494661921709,
125
+ "grad_norm": 3.891373634338379,
126
+ "learning_rate": 4.582962633451958e-05,
127
+ "loss": 1.4586,
128
+ "num_input_tokens_seen": 2744688,
129
+ "step": 7500
130
+ },
131
+ {
132
+ "epoch": 0.17793594306049823,
133
+ "grad_norm": 3.277316093444824,
134
+ "learning_rate": 4.555160142348754e-05,
135
+ "loss": 1.4606,
136
+ "num_input_tokens_seen": 2926000,
137
+ "step": 8000
138
+ },
139
+ {
140
+ "epoch": 0.18905693950177935,
141
+ "grad_norm": 3.242478132247925,
142
+ "learning_rate": 4.5273576512455517e-05,
143
+ "loss": 1.446,
144
+ "num_input_tokens_seen": 3108520,
145
+ "step": 8500
146
+ },
147
+ {
148
+ "epoch": 0.2001779359430605,
149
+ "grad_norm": 2.3061795234680176,
150
+ "learning_rate": 4.499555160142349e-05,
151
+ "loss": 1.4348,
152
+ "num_input_tokens_seen": 3289352,
153
+ "step": 9000
154
+ },
155
+ {
156
+ "epoch": 0.21129893238434164,
157
+ "grad_norm": 3.4106180667877197,
158
+ "learning_rate": 4.471752669039146e-05,
159
+ "loss": 1.4345,
160
+ "num_input_tokens_seen": 3473784,
161
+ "step": 9500
162
+ },
163
+ {
164
+ "epoch": 0.22241992882562278,
165
+ "grad_norm": 2.88779354095459,
166
+ "learning_rate": 4.443950177935943e-05,
167
+ "loss": 1.4423,
168
+ "num_input_tokens_seen": 3655312,
169
+ "step": 10000
170
+ },
171
+ {
172
+ "epoch": 0.22241992882562278,
173
+ "eval_loss": 1.2430843114852905,
174
+ "eval_runtime": 2.8218,
175
+ "eval_samples_per_second": 885.962,
176
+ "eval_steps_per_second": 110.922,
177
+ "num_input_tokens_seen": 3655312,
178
+ "step": 10000
179
+ },
180
+ {
181
+ "epoch": 0.23354092526690393,
182
+ "grad_norm": 2.829268217086792,
183
+ "learning_rate": 4.4161476868327405e-05,
184
+ "loss": 1.4321,
185
+ "num_input_tokens_seen": 3847328,
186
+ "step": 10500
187
+ },
188
+ {
189
+ "epoch": 0.24466192170818504,
190
+ "grad_norm": 3.807185173034668,
191
+ "learning_rate": 4.388345195729537e-05,
192
+ "loss": 1.4487,
193
+ "num_input_tokens_seen": 4025576,
194
+ "step": 11000
195
+ },
196
+ {
197
+ "epoch": 0.2557829181494662,
198
+ "grad_norm": 2.449007511138916,
199
+ "learning_rate": 4.360542704626335e-05,
200
+ "loss": 1.413,
201
+ "num_input_tokens_seen": 4213832,
202
+ "step": 11500
203
+ },
204
+ {
205
+ "epoch": 0.2669039145907473,
206
+ "grad_norm": 3.363555431365967,
207
+ "learning_rate": 4.3327402135231324e-05,
208
+ "loss": 1.4313,
209
+ "num_input_tokens_seen": 4395352,
210
+ "step": 12000
211
+ },
212
+ {
213
+ "epoch": 0.2780249110320285,
214
+ "grad_norm": 2.702747344970703,
215
+ "learning_rate": 4.3049377224199286e-05,
216
+ "loss": 1.4179,
217
+ "num_input_tokens_seen": 4582368,
218
+ "step": 12500
219
+ },
220
+ {
221
+ "epoch": 0.2891459074733096,
222
+ "grad_norm": 3.286044120788574,
223
+ "learning_rate": 4.277135231316726e-05,
224
+ "loss": 1.4137,
225
+ "num_input_tokens_seen": 4770640,
226
+ "step": 13000
227
+ },
228
+ {
229
+ "epoch": 0.30026690391459077,
230
+ "grad_norm": 2.6410391330718994,
231
+ "learning_rate": 4.249332740213524e-05,
232
+ "loss": 1.4303,
233
+ "num_input_tokens_seen": 4951280,
234
+ "step": 13500
235
+ },
236
+ {
237
+ "epoch": 0.3113879003558719,
238
+ "grad_norm": 3.239133358001709,
239
+ "learning_rate": 4.2215302491103205e-05,
240
+ "loss": 1.4185,
241
+ "num_input_tokens_seen": 5133552,
242
+ "step": 14000
243
+ },
244
+ {
245
+ "epoch": 0.322508896797153,
246
+ "grad_norm": 2.943094253540039,
247
+ "learning_rate": 4.1937277580071174e-05,
248
+ "loss": 1.4065,
249
+ "num_input_tokens_seen": 5319992,
250
+ "step": 14500
251
+ },
252
+ {
253
+ "epoch": 0.33362989323843417,
254
+ "grad_norm": 2.180136203765869,
255
+ "learning_rate": 4.165925266903915e-05,
256
+ "loss": 1.4096,
257
+ "num_input_tokens_seen": 5501592,
258
+ "step": 15000
259
+ },
260
+ {
261
+ "epoch": 0.3447508896797153,
262
+ "grad_norm": 2.4302403926849365,
263
+ "learning_rate": 4.138122775800712e-05,
264
+ "loss": 1.4203,
265
+ "num_input_tokens_seen": 5689312,
266
+ "step": 15500
267
+ },
268
+ {
269
+ "epoch": 0.35587188612099646,
270
+ "grad_norm": 2.850964069366455,
271
+ "learning_rate": 4.1103202846975093e-05,
272
+ "loss": 1.4291,
273
+ "num_input_tokens_seen": 5870384,
274
+ "step": 16000
275
+ },
276
+ {
277
+ "epoch": 0.3669928825622776,
278
+ "grad_norm": 1.9641114473342896,
279
+ "learning_rate": 4.082517793594306e-05,
280
+ "loss": 1.4054,
281
+ "num_input_tokens_seen": 6048624,
282
+ "step": 16500
283
+ },
284
+ {
285
+ "epoch": 0.3781138790035587,
286
+ "grad_norm": 2.287353754043579,
287
+ "learning_rate": 4.054715302491103e-05,
288
+ "loss": 1.4118,
289
+ "num_input_tokens_seen": 6229728,
290
+ "step": 17000
291
+ },
292
+ {
293
+ "epoch": 0.38923487544483987,
294
+ "grad_norm": 3.4425182342529297,
295
+ "learning_rate": 4.0269128113879006e-05,
296
+ "loss": 1.4061,
297
+ "num_input_tokens_seen": 6410968,
298
+ "step": 17500
299
+ },
300
+ {
301
+ "epoch": 0.400355871886121,
302
+ "grad_norm": 2.0604770183563232,
303
+ "learning_rate": 3.9991103202846975e-05,
304
+ "loss": 1.3871,
305
+ "num_input_tokens_seen": 6591312,
306
+ "step": 18000
307
+ },
308
+ {
309
+ "epoch": 0.41147686832740216,
310
+ "grad_norm": 2.671599864959717,
311
+ "learning_rate": 3.971307829181495e-05,
312
+ "loss": 1.42,
313
+ "num_input_tokens_seen": 6777912,
314
+ "step": 18500
315
+ },
316
+ {
317
+ "epoch": 0.4225978647686833,
318
+ "grad_norm": 2.176579475402832,
319
+ "learning_rate": 3.943505338078292e-05,
320
+ "loss": 1.417,
321
+ "num_input_tokens_seen": 6964240,
322
+ "step": 19000
323
+ },
324
+ {
325
+ "epoch": 0.4337188612099644,
326
+ "grad_norm": 3.2378785610198975,
327
+ "learning_rate": 3.915702846975089e-05,
328
+ "loss": 1.3992,
329
+ "num_input_tokens_seen": 7147288,
330
+ "step": 19500
331
+ },
332
+ {
333
+ "epoch": 0.44483985765124556,
334
+ "grad_norm": 3.0986084938049316,
335
+ "learning_rate": 3.887900355871886e-05,
336
+ "loss": 1.3884,
337
+ "num_input_tokens_seen": 7331520,
338
+ "step": 20000
339
+ },
340
+ {
341
+ "epoch": 0.44483985765124556,
342
+ "eval_loss": 1.2093411684036255,
343
+ "eval_runtime": 2.7984,
344
+ "eval_samples_per_second": 893.355,
345
+ "eval_steps_per_second": 111.848,
346
+ "num_input_tokens_seen": 7331520,
347
+ "step": 20000
348
+ },
349
+ {
350
+ "epoch": 0.4559608540925267,
351
+ "grad_norm": 3.4998202323913574,
352
+ "learning_rate": 3.860097864768684e-05,
353
+ "loss": 1.3915,
354
+ "num_input_tokens_seen": 7515512,
355
+ "step": 20500
356
+ },
357
+ {
358
+ "epoch": 0.46708185053380785,
359
+ "grad_norm": 3.0249533653259277,
360
+ "learning_rate": 3.832295373665481e-05,
361
+ "loss": 1.3967,
362
+ "num_input_tokens_seen": 7696752,
363
+ "step": 21000
364
+ },
365
+ {
366
+ "epoch": 0.47820284697508897,
367
+ "grad_norm": 6.8868513107299805,
368
+ "learning_rate": 3.8044928825622775e-05,
369
+ "loss": 1.4106,
370
+ "num_input_tokens_seen": 7878832,
371
+ "step": 21500
372
+ },
373
+ {
374
+ "epoch": 0.4893238434163701,
375
+ "grad_norm": 2.2134385108947754,
376
+ "learning_rate": 3.776690391459075e-05,
377
+ "loss": 1.3847,
378
+ "num_input_tokens_seen": 8059592,
379
+ "step": 22000
380
+ },
381
+ {
382
+ "epoch": 0.5004448398576512,
383
+ "grad_norm": 2.2698676586151123,
384
+ "learning_rate": 3.748887900355872e-05,
385
+ "loss": 1.3941,
386
+ "num_input_tokens_seen": 8245432,
387
+ "step": 22500
388
+ },
389
+ {
390
+ "epoch": 0.5115658362989324,
391
+ "grad_norm": 2.4593448638916016,
392
+ "learning_rate": 3.7210854092526695e-05,
393
+ "loss": 1.3716,
394
+ "num_input_tokens_seen": 8429400,
395
+ "step": 23000
396
+ },
397
+ {
398
+ "epoch": 0.5226868327402135,
399
+ "grad_norm": 2.5121207237243652,
400
+ "learning_rate": 3.6932829181494664e-05,
401
+ "loss": 1.3733,
402
+ "num_input_tokens_seen": 8618216,
403
+ "step": 23500
404
+ },
405
+ {
406
+ "epoch": 0.5338078291814946,
407
+ "grad_norm": 2.2493703365325928,
408
+ "learning_rate": 3.665480427046263e-05,
409
+ "loss": 1.3754,
410
+ "num_input_tokens_seen": 8801640,
411
+ "step": 24000
412
+ },
413
+ {
414
+ "epoch": 0.5449288256227758,
415
+ "grad_norm": 3.08921217918396,
416
+ "learning_rate": 3.637677935943061e-05,
417
+ "loss": 1.3694,
418
+ "num_input_tokens_seen": 8977800,
419
+ "step": 24500
420
+ },
421
+ {
422
+ "epoch": 0.556049822064057,
423
+ "grad_norm": 2.215222120285034,
424
+ "learning_rate": 3.609875444839858e-05,
425
+ "loss": 1.3674,
426
+ "num_input_tokens_seen": 9160904,
427
+ "step": 25000
428
+ },
429
+ {
430
+ "epoch": 0.5671708185053381,
431
+ "grad_norm": 2.553903818130493,
432
+ "learning_rate": 3.582072953736655e-05,
433
+ "loss": 1.3735,
434
+ "num_input_tokens_seen": 9348344,
435
+ "step": 25500
436
+ },
437
+ {
438
+ "epoch": 0.5782918149466192,
439
+ "grad_norm": 2.546022891998291,
440
+ "learning_rate": 3.554270462633452e-05,
441
+ "loss": 1.382,
442
+ "num_input_tokens_seen": 9532664,
443
+ "step": 26000
444
+ },
445
+ {
446
+ "epoch": 0.5894128113879004,
447
+ "grad_norm": 2.917534112930298,
448
+ "learning_rate": 3.5264679715302496e-05,
449
+ "loss": 1.3654,
450
+ "num_input_tokens_seen": 9717800,
451
+ "step": 26500
452
+ },
453
+ {
454
+ "epoch": 0.6005338078291815,
455
+ "grad_norm": 3.355299472808838,
456
+ "learning_rate": 3.4986654804270464e-05,
457
+ "loss": 1.3876,
458
+ "num_input_tokens_seen": 9902536,
459
+ "step": 27000
460
+ },
461
+ {
462
+ "epoch": 0.6116548042704626,
463
+ "grad_norm": 2.67924427986145,
464
+ "learning_rate": 3.470862989323844e-05,
465
+ "loss": 1.3575,
466
+ "num_input_tokens_seen": 10082216,
467
+ "step": 27500
468
+ },
469
+ {
470
+ "epoch": 0.6227758007117438,
471
+ "grad_norm": 3.040212392807007,
472
+ "learning_rate": 3.44306049822064e-05,
473
+ "loss": 1.3644,
474
+ "num_input_tokens_seen": 10263992,
475
+ "step": 28000
476
+ },
477
+ {
478
+ "epoch": 0.6338967971530249,
479
+ "grad_norm": 3.726254940032959,
480
+ "learning_rate": 3.415258007117438e-05,
481
+ "loss": 1.3616,
482
+ "num_input_tokens_seen": 10444128,
483
+ "step": 28500
484
+ },
485
+ {
486
+ "epoch": 0.645017793594306,
487
+ "grad_norm": 4.716592788696289,
488
+ "learning_rate": 3.387455516014235e-05,
489
+ "loss": 1.3662,
490
+ "num_input_tokens_seen": 10626928,
491
+ "step": 29000
492
+ },
493
+ {
494
+ "epoch": 0.6561387900355872,
495
+ "grad_norm": 2.9317166805267334,
496
+ "learning_rate": 3.359653024911032e-05,
497
+ "loss": 1.38,
498
+ "num_input_tokens_seen": 10808960,
499
+ "step": 29500
500
+ },
501
+ {
502
+ "epoch": 0.6672597864768683,
503
+ "grad_norm": 2.411684989929199,
504
+ "learning_rate": 3.331850533807829e-05,
505
+ "loss": 1.3782,
506
+ "num_input_tokens_seen": 10990432,
507
+ "step": 30000
508
+ },
509
+ {
510
+ "epoch": 0.6672597864768683,
511
+ "eval_loss": 1.185857892036438,
512
+ "eval_runtime": 2.9618,
513
+ "eval_samples_per_second": 844.079,
514
+ "eval_steps_per_second": 105.679,
515
+ "num_input_tokens_seen": 10990432,
516
+ "step": 30000
517
+ },
518
+ {
519
+ "epoch": 0.6783807829181495,
520
+ "grad_norm": 2.765089273452759,
521
+ "learning_rate": 3.3040480427046265e-05,
522
+ "loss": 1.3808,
523
+ "num_input_tokens_seen": 11172032,
524
+ "step": 30500
525
+ },
526
+ {
527
+ "epoch": 0.6895017793594306,
528
+ "grad_norm": 2.808806896209717,
529
+ "learning_rate": 3.2762455516014234e-05,
530
+ "loss": 1.3925,
531
+ "num_input_tokens_seen": 11356608,
532
+ "step": 31000
533
+ },
534
+ {
535
+ "epoch": 0.7006227758007118,
536
+ "grad_norm": 2.6583220958709717,
537
+ "learning_rate": 3.248443060498221e-05,
538
+ "loss": 1.3716,
539
+ "num_input_tokens_seen": 11538056,
540
+ "step": 31500
541
+ },
542
+ {
543
+ "epoch": 0.7117437722419929,
544
+ "grad_norm": 2.2725088596343994,
545
+ "learning_rate": 3.2206405693950184e-05,
546
+ "loss": 1.3689,
547
+ "num_input_tokens_seen": 11721264,
548
+ "step": 32000
549
+ },
550
+ {
551
+ "epoch": 0.722864768683274,
552
+ "grad_norm": 2.927656412124634,
553
+ "learning_rate": 3.1928380782918146e-05,
554
+ "loss": 1.3722,
555
+ "num_input_tokens_seen": 11896688,
556
+ "step": 32500
557
+ },
558
+ {
559
+ "epoch": 0.7339857651245552,
560
+ "grad_norm": 2.0601186752319336,
561
+ "learning_rate": 3.165035587188612e-05,
562
+ "loss": 1.3408,
563
+ "num_input_tokens_seen": 12084440,
564
+ "step": 33000
565
+ },
566
+ {
567
+ "epoch": 0.7451067615658363,
568
+ "grad_norm": 2.5769150257110596,
569
+ "learning_rate": 3.13723309608541e-05,
570
+ "loss": 1.3874,
571
+ "num_input_tokens_seen": 12264224,
572
+ "step": 33500
573
+ },
574
+ {
575
+ "epoch": 0.7562277580071174,
576
+ "grad_norm": 2.845653772354126,
577
+ "learning_rate": 3.1094306049822066e-05,
578
+ "loss": 1.3755,
579
+ "num_input_tokens_seen": 12446200,
580
+ "step": 34000
581
+ },
582
+ {
583
+ "epoch": 0.7673487544483986,
584
+ "grad_norm": 2.3848676681518555,
585
+ "learning_rate": 3.0816281138790034e-05,
586
+ "loss": 1.3463,
587
+ "num_input_tokens_seen": 12628992,
588
+ "step": 34500
589
+ },
590
+ {
591
+ "epoch": 0.7784697508896797,
592
+ "grad_norm": 3.2360849380493164,
593
+ "learning_rate": 3.053825622775801e-05,
594
+ "loss": 1.3678,
595
+ "num_input_tokens_seen": 12809808,
596
+ "step": 35000
597
+ },
598
+ {
599
+ "epoch": 0.7895907473309609,
600
+ "grad_norm": 2.3211023807525635,
601
+ "learning_rate": 3.026023131672598e-05,
602
+ "loss": 1.3732,
603
+ "num_input_tokens_seen": 12989912,
604
+ "step": 35500
605
+ },
606
+ {
607
+ "epoch": 0.800711743772242,
608
+ "grad_norm": 3.599958658218384,
609
+ "learning_rate": 2.9982206405693954e-05,
610
+ "loss": 1.3606,
611
+ "num_input_tokens_seen": 13170560,
612
+ "step": 36000
613
+ },
614
+ {
615
+ "epoch": 0.8118327402135231,
616
+ "grad_norm": 2.0861263275146484,
617
+ "learning_rate": 2.9704181494661926e-05,
618
+ "loss": 1.3475,
619
+ "num_input_tokens_seen": 13350424,
620
+ "step": 36500
621
+ },
622
+ {
623
+ "epoch": 0.8229537366548043,
624
+ "grad_norm": 2.043938159942627,
625
+ "learning_rate": 2.9426156583629895e-05,
626
+ "loss": 1.3753,
627
+ "num_input_tokens_seen": 13538640,
628
+ "step": 37000
629
+ },
630
+ {
631
+ "epoch": 0.8340747330960854,
632
+ "grad_norm": 2.4880995750427246,
633
+ "learning_rate": 2.9148131672597867e-05,
634
+ "loss": 1.3497,
635
+ "num_input_tokens_seen": 13730096,
636
+ "step": 37500
637
+ },
638
+ {
639
+ "epoch": 0.8451957295373665,
640
+ "grad_norm": 2.535860300064087,
641
+ "learning_rate": 2.8870106761565835e-05,
642
+ "loss": 1.3708,
643
+ "num_input_tokens_seen": 13909256,
644
+ "step": 38000
645
+ },
646
+ {
647
+ "epoch": 0.8563167259786477,
648
+ "grad_norm": 2.499455213546753,
649
+ "learning_rate": 2.8592081850533807e-05,
650
+ "loss": 1.3546,
651
+ "num_input_tokens_seen": 14101536,
652
+ "step": 38500
653
+ },
654
+ {
655
+ "epoch": 0.8674377224199288,
656
+ "grad_norm": 2.3696117401123047,
657
+ "learning_rate": 2.8314056939501783e-05,
658
+ "loss": 1.3593,
659
+ "num_input_tokens_seen": 14281392,
660
+ "step": 39000
661
+ },
662
+ {
663
+ "epoch": 0.87855871886121,
664
+ "grad_norm": 3.260430097579956,
665
+ "learning_rate": 2.803603202846975e-05,
666
+ "loss": 1.3506,
667
+ "num_input_tokens_seen": 14466320,
668
+ "step": 39500
669
+ },
670
+ {
671
+ "epoch": 0.8896797153024911,
672
+ "grad_norm": 4.11997652053833,
673
+ "learning_rate": 2.7758007117437723e-05,
674
+ "loss": 1.3595,
675
+ "num_input_tokens_seen": 14653328,
676
+ "step": 40000
677
+ },
678
+ {
679
+ "epoch": 0.8896797153024911,
680
+ "eval_loss": 1.1787019968032837,
681
+ "eval_runtime": 2.9698,
682
+ "eval_samples_per_second": 841.809,
683
+ "eval_steps_per_second": 105.394,
684
+ "num_input_tokens_seen": 14653328,
685
+ "step": 40000
686
+ },
687
+ {
688
+ "epoch": 0.9008007117437722,
689
+ "grad_norm": 1.932468056678772,
690
+ "learning_rate": 2.7479982206405695e-05,
691
+ "loss": 1.3559,
692
+ "num_input_tokens_seen": 14837624,
693
+ "step": 40500
694
+ },
695
+ {
696
+ "epoch": 0.9119217081850534,
697
+ "grad_norm": 2.6026477813720703,
698
+ "learning_rate": 2.7201957295373664e-05,
699
+ "loss": 1.3454,
700
+ "num_input_tokens_seen": 15020192,
701
+ "step": 41000
702
+ },
703
+ {
704
+ "epoch": 0.9230427046263345,
705
+ "grad_norm": 2.3455870151519775,
706
+ "learning_rate": 2.692393238434164e-05,
707
+ "loss": 1.3563,
708
+ "num_input_tokens_seen": 15204288,
709
+ "step": 41500
710
+ },
711
+ {
712
+ "epoch": 0.9341637010676157,
713
+ "grad_norm": 2.8757784366607666,
714
+ "learning_rate": 2.664590747330961e-05,
715
+ "loss": 1.3283,
716
+ "num_input_tokens_seen": 15393744,
717
+ "step": 42000
718
+ },
719
+ {
720
+ "epoch": 0.9452846975088968,
721
+ "grad_norm": 2.3972697257995605,
722
+ "learning_rate": 2.636788256227758e-05,
723
+ "loss": 1.3612,
724
+ "num_input_tokens_seen": 15569824,
725
+ "step": 42500
726
+ },
727
+ {
728
+ "epoch": 0.9564056939501779,
729
+ "grad_norm": 3.187290906906128,
730
+ "learning_rate": 2.6089857651245552e-05,
731
+ "loss": 1.3525,
732
+ "num_input_tokens_seen": 15753768,
733
+ "step": 43000
734
+ },
735
+ {
736
+ "epoch": 0.9675266903914591,
737
+ "grad_norm": 2.447659969329834,
738
+ "learning_rate": 2.5811832740213527e-05,
739
+ "loss": 1.3532,
740
+ "num_input_tokens_seen": 15934952,
741
+ "step": 43500
742
+ },
743
+ {
744
+ "epoch": 0.9786476868327402,
745
+ "grad_norm": 2.037935495376587,
746
+ "learning_rate": 2.5533807829181493e-05,
747
+ "loss": 1.3318,
748
+ "num_input_tokens_seen": 16117896,
749
+ "step": 44000
750
+ },
751
+ {
752
+ "epoch": 0.9897686832740213,
753
+ "grad_norm": 2.7559268474578857,
754
+ "learning_rate": 2.5255782918149468e-05,
755
+ "loss": 1.3325,
756
+ "num_input_tokens_seen": 16298928,
757
+ "step": 44500
758
+ },
759
+ {
760
+ "epoch": 1.0008896797153024,
761
+ "grad_norm": 2.2017595767974854,
762
+ "learning_rate": 2.4977758007117437e-05,
763
+ "loss": 1.3605,
764
+ "num_input_tokens_seen": 16481072,
765
+ "step": 45000
766
+ },
767
+ {
768
+ "epoch": 1.0120106761565837,
769
+ "grad_norm": 2.3097991943359375,
770
+ "learning_rate": 2.4699733096085412e-05,
771
+ "loss": 1.3404,
772
+ "num_input_tokens_seen": 16664336,
773
+ "step": 45500
774
+ },
775
+ {
776
+ "epoch": 1.0231316725978647,
777
+ "grad_norm": 2.6227993965148926,
778
+ "learning_rate": 2.4421708185053384e-05,
779
+ "loss": 1.3235,
780
+ "num_input_tokens_seen": 16845800,
781
+ "step": 46000
782
+ },
783
+ {
784
+ "epoch": 1.0342526690391458,
785
+ "grad_norm": 2.24474835395813,
786
+ "learning_rate": 2.4143683274021353e-05,
787
+ "loss": 1.3376,
788
+ "num_input_tokens_seen": 17029008,
789
+ "step": 46500
790
+ },
791
+ {
792
+ "epoch": 1.045373665480427,
793
+ "grad_norm": 2.7171192169189453,
794
+ "learning_rate": 2.3865658362989325e-05,
795
+ "loss": 1.3188,
796
+ "num_input_tokens_seen": 17215936,
797
+ "step": 47000
798
+ },
799
+ {
800
+ "epoch": 1.0564946619217082,
801
+ "grad_norm": 2.5323736667633057,
802
+ "learning_rate": 2.3587633451957297e-05,
803
+ "loss": 1.3369,
804
+ "num_input_tokens_seen": 17401232,
805
+ "step": 47500
806
+ },
807
+ {
808
+ "epoch": 1.0676156583629894,
809
+ "grad_norm": 2.267789363861084,
810
+ "learning_rate": 2.330960854092527e-05,
811
+ "loss": 1.3215,
812
+ "num_input_tokens_seen": 17583344,
813
+ "step": 48000
814
+ },
815
+ {
816
+ "epoch": 1.0787366548042705,
817
+ "grad_norm": 3.399862289428711,
818
+ "learning_rate": 2.3031583629893237e-05,
819
+ "loss": 1.3464,
820
+ "num_input_tokens_seen": 17771600,
821
+ "step": 48500
822
+ },
823
+ {
824
+ "epoch": 1.0898576512455516,
825
+ "grad_norm": 2.8749985694885254,
826
+ "learning_rate": 2.2753558718861213e-05,
827
+ "loss": 1.2928,
828
+ "num_input_tokens_seen": 17957200,
829
+ "step": 49000
830
+ },
831
+ {
832
+ "epoch": 1.1009786476868326,
833
+ "grad_norm": 2.6826517581939697,
834
+ "learning_rate": 2.247553380782918e-05,
835
+ "loss": 1.3191,
836
+ "num_input_tokens_seen": 18138560,
837
+ "step": 49500
838
+ },
839
+ {
840
+ "epoch": 1.112099644128114,
841
+ "grad_norm": 2.2963333129882812,
842
+ "learning_rate": 2.2197508896797153e-05,
843
+ "loss": 1.3059,
844
+ "num_input_tokens_seen": 18326104,
845
+ "step": 50000
846
+ },
847
+ {
848
+ "epoch": 1.112099644128114,
849
+ "eval_loss": 1.1665468215942383,
850
+ "eval_runtime": 3.0497,
851
+ "eval_samples_per_second": 819.748,
852
+ "eval_steps_per_second": 102.632,
853
+ "num_input_tokens_seen": 18326104,
854
+ "step": 50000
855
+ },
856
+ {
857
+ "epoch": 1.123220640569395,
858
+ "grad_norm": 2.2697386741638184,
859
+ "learning_rate": 2.1919483985765125e-05,
860
+ "loss": 1.321,
861
+ "num_input_tokens_seen": 18508560,
862
+ "step": 50500
863
+ },
864
+ {
865
+ "epoch": 1.1343416370106763,
866
+ "grad_norm": 2.436851739883423,
867
+ "learning_rate": 2.1641459074733097e-05,
868
+ "loss": 1.3528,
869
+ "num_input_tokens_seen": 18689264,
870
+ "step": 51000
871
+ },
872
+ {
873
+ "epoch": 1.1454626334519573,
874
+ "grad_norm": 2.297527313232422,
875
+ "learning_rate": 2.136343416370107e-05,
876
+ "loss": 1.2987,
877
+ "num_input_tokens_seen": 18874328,
878
+ "step": 51500
879
+ },
880
+ {
881
+ "epoch": 1.1565836298932384,
882
+ "grad_norm": 2.5088889598846436,
883
+ "learning_rate": 2.1085409252669038e-05,
884
+ "loss": 1.3139,
885
+ "num_input_tokens_seen": 19060472,
886
+ "step": 52000
887
+ },
888
+ {
889
+ "epoch": 1.1677046263345197,
890
+ "grad_norm": 2.067575454711914,
891
+ "learning_rate": 2.0807384341637014e-05,
892
+ "loss": 1.2961,
893
+ "num_input_tokens_seen": 19247416,
894
+ "step": 52500
895
+ },
896
+ {
897
+ "epoch": 1.1788256227758007,
898
+ "grad_norm": 2.467543363571167,
899
+ "learning_rate": 2.0529359430604982e-05,
900
+ "loss": 1.3279,
901
+ "num_input_tokens_seen": 19436888,
902
+ "step": 53000
903
+ },
904
+ {
905
+ "epoch": 1.1899466192170818,
906
+ "grad_norm": 3.4245800971984863,
907
+ "learning_rate": 2.0251334519572954e-05,
908
+ "loss": 1.3303,
909
+ "num_input_tokens_seen": 19616320,
910
+ "step": 53500
911
+ },
912
+ {
913
+ "epoch": 1.201067615658363,
914
+ "grad_norm": 2.759120464324951,
915
+ "learning_rate": 1.9973309608540926e-05,
916
+ "loss": 1.3201,
917
+ "num_input_tokens_seen": 19793576,
918
+ "step": 54000
919
+ },
920
+ {
921
+ "epoch": 1.2121886120996441,
922
+ "grad_norm": 2.7749531269073486,
923
+ "learning_rate": 1.9695284697508898e-05,
924
+ "loss": 1.3194,
925
+ "num_input_tokens_seen": 19980880,
926
+ "step": 54500
927
+ },
928
+ {
929
+ "epoch": 1.2233096085409252,
930
+ "grad_norm": 2.4467661380767822,
931
+ "learning_rate": 1.9417259786476867e-05,
932
+ "loss": 1.3293,
933
+ "num_input_tokens_seen": 20163048,
934
+ "step": 55000
935
+ },
936
+ {
937
+ "epoch": 1.2344306049822065,
938
+ "grad_norm": 3.4420840740203857,
939
+ "learning_rate": 1.9139234875444842e-05,
940
+ "loss": 1.3015,
941
+ "num_input_tokens_seen": 20340904,
942
+ "step": 55500
943
+ },
944
+ {
945
+ "epoch": 1.2455516014234875,
946
+ "grad_norm": 2.4761664867401123,
947
+ "learning_rate": 1.8861209964412814e-05,
948
+ "loss": 1.3292,
949
+ "num_input_tokens_seen": 20524328,
950
+ "step": 56000
951
+ },
952
+ {
953
+ "epoch": 1.2566725978647688,
954
+ "grad_norm": 3.0505285263061523,
955
+ "learning_rate": 1.8583185053380783e-05,
956
+ "loss": 1.317,
957
+ "num_input_tokens_seen": 20707808,
958
+ "step": 56500
959
+ },
960
+ {
961
+ "epoch": 1.2677935943060499,
962
+ "grad_norm": 2.361429214477539,
963
+ "learning_rate": 1.8305160142348755e-05,
964
+ "loss": 1.3176,
965
+ "num_input_tokens_seen": 20892744,
966
+ "step": 57000
967
+ },
968
+ {
969
+ "epoch": 1.278914590747331,
970
+ "grad_norm": 1.9151511192321777,
971
+ "learning_rate": 1.8027135231316727e-05,
972
+ "loss": 1.3267,
973
+ "num_input_tokens_seen": 21072216,
974
+ "step": 57500
975
+ },
976
+ {
977
+ "epoch": 1.290035587188612,
978
+ "grad_norm": 3.0513691902160645,
979
+ "learning_rate": 1.77491103202847e-05,
980
+ "loss": 1.2948,
981
+ "num_input_tokens_seen": 21256064,
982
+ "step": 58000
983
+ },
984
+ {
985
+ "epoch": 1.3011565836298933,
986
+ "grad_norm": 1.7151504755020142,
987
+ "learning_rate": 1.7471085409252668e-05,
988
+ "loss": 1.2967,
989
+ "num_input_tokens_seen": 21439648,
990
+ "step": 58500
991
+ },
992
+ {
993
+ "epoch": 1.3122775800711743,
994
+ "grad_norm": 2.3011133670806885,
995
+ "learning_rate": 1.7193060498220643e-05,
996
+ "loss": 1.3199,
997
+ "num_input_tokens_seen": 21621776,
998
+ "step": 59000
999
+ },
1000
+ {
1001
+ "epoch": 1.3233985765124556,
1002
+ "grad_norm": 2.1768672466278076,
1003
+ "learning_rate": 1.691503558718861e-05,
1004
+ "loss": 1.3233,
1005
+ "num_input_tokens_seen": 21810728,
1006
+ "step": 59500
1007
+ },
1008
+ {
1009
+ "epoch": 1.3345195729537367,
1010
+ "grad_norm": 3.249089241027832,
1011
+ "learning_rate": 1.6637010676156584e-05,
1012
+ "loss": 1.3298,
1013
+ "num_input_tokens_seen": 21991016,
1014
+ "step": 60000
1015
+ },
1016
+ {
1017
+ "epoch": 1.3345195729537367,
1018
+ "eval_loss": 1.1589475870132446,
1019
+ "eval_runtime": 3.0361,
1020
+ "eval_samples_per_second": 823.416,
1021
+ "eval_steps_per_second": 103.092,
1022
+ "num_input_tokens_seen": 21991016,
1023
+ "step": 60000
1024
+ },
1025
+ {
1026
+ "epoch": 1.3456405693950177,
1027
+ "grad_norm": 1.7282594442367554,
1028
+ "learning_rate": 1.6358985765124556e-05,
1029
+ "loss": 1.325,
1030
+ "num_input_tokens_seen": 22178168,
1031
+ "step": 60500
1032
+ },
1033
+ {
1034
+ "epoch": 1.3567615658362988,
1035
+ "grad_norm": 2.2659966945648193,
1036
+ "learning_rate": 1.6080960854092528e-05,
1037
+ "loss": 1.3321,
1038
+ "num_input_tokens_seen": 22359360,
1039
+ "step": 61000
1040
+ },
1041
+ {
1042
+ "epoch": 1.36788256227758,
1043
+ "grad_norm": 2.155791759490967,
1044
+ "learning_rate": 1.58029359430605e-05,
1045
+ "loss": 1.2995,
1046
+ "num_input_tokens_seen": 22543952,
1047
+ "step": 61500
1048
+ },
1049
+ {
1050
+ "epoch": 1.3790035587188612,
1051
+ "grad_norm": 2.902367353439331,
1052
+ "learning_rate": 1.5524911032028472e-05,
1053
+ "loss": 1.3061,
1054
+ "num_input_tokens_seen": 22728064,
1055
+ "step": 62000
1056
+ },
1057
+ {
1058
+ "epoch": 1.3901245551601424,
1059
+ "grad_norm": 2.168686866760254,
1060
+ "learning_rate": 1.5246886120996442e-05,
1061
+ "loss": 1.3231,
1062
+ "num_input_tokens_seen": 22913608,
1063
+ "step": 62500
1064
+ },
1065
+ {
1066
+ "epoch": 1.4012455516014235,
1067
+ "grad_norm": 2.682150363922119,
1068
+ "learning_rate": 1.4968861209964412e-05,
1069
+ "loss": 1.3091,
1070
+ "num_input_tokens_seen": 23101416,
1071
+ "step": 63000
1072
+ },
1073
+ {
1074
+ "epoch": 1.4123665480427046,
1075
+ "grad_norm": 2.82143497467041,
1076
+ "learning_rate": 1.4690836298932384e-05,
1077
+ "loss": 1.2922,
1078
+ "num_input_tokens_seen": 23281200,
1079
+ "step": 63500
1080
+ },
1081
+ {
1082
+ "epoch": 1.4234875444839858,
1083
+ "grad_norm": 2.1155049800872803,
1084
+ "learning_rate": 1.4412811387900358e-05,
1085
+ "loss": 1.3015,
1086
+ "num_input_tokens_seen": 23462552,
1087
+ "step": 64000
1088
+ },
1089
+ {
1090
+ "epoch": 1.434608540925267,
1091
+ "grad_norm": 2.6742069721221924,
1092
+ "learning_rate": 1.4134786476868328e-05,
1093
+ "loss": 1.3108,
1094
+ "num_input_tokens_seen": 23640424,
1095
+ "step": 64500
1096
+ },
1097
+ {
1098
+ "epoch": 1.445729537366548,
1099
+ "grad_norm": 2.575198173522949,
1100
+ "learning_rate": 1.3856761565836299e-05,
1101
+ "loss": 1.2987,
1102
+ "num_input_tokens_seen": 23825392,
1103
+ "step": 65000
1104
+ },
1105
+ {
1106
+ "epoch": 1.4568505338078293,
1107
+ "grad_norm": 3.2627415657043457,
1108
+ "learning_rate": 1.3578736654804272e-05,
1109
+ "loss": 1.3096,
1110
+ "num_input_tokens_seen": 24004872,
1111
+ "step": 65500
1112
+ },
1113
+ {
1114
+ "epoch": 1.4679715302491103,
1115
+ "grad_norm": 3.2727510929107666,
1116
+ "learning_rate": 1.3300711743772243e-05,
1117
+ "loss": 1.3108,
1118
+ "num_input_tokens_seen": 24186064,
1119
+ "step": 66000
1120
+ },
1121
+ {
1122
+ "epoch": 1.4790925266903914,
1123
+ "grad_norm": 2.187281608581543,
1124
+ "learning_rate": 1.3022686832740213e-05,
1125
+ "loss": 1.3239,
1126
+ "num_input_tokens_seen": 24367968,
1127
+ "step": 66500
1128
+ },
1129
+ {
1130
+ "epoch": 1.4902135231316727,
1131
+ "grad_norm": 3.5734856128692627,
1132
+ "learning_rate": 1.2744661921708187e-05,
1133
+ "loss": 1.3241,
1134
+ "num_input_tokens_seen": 24551184,
1135
+ "step": 67000
1136
+ },
1137
+ {
1138
+ "epoch": 1.5013345195729537,
1139
+ "grad_norm": 2.756578207015991,
1140
+ "learning_rate": 1.2466637010676157e-05,
1141
+ "loss": 1.307,
1142
+ "num_input_tokens_seen": 24730288,
1143
+ "step": 67500
1144
+ },
1145
+ {
1146
+ "epoch": 1.512455516014235,
1147
+ "grad_norm": 2.3752739429473877,
1148
+ "learning_rate": 1.2188612099644127e-05,
1149
+ "loss": 1.2963,
1150
+ "num_input_tokens_seen": 24914816,
1151
+ "step": 68000
1152
+ },
1153
+ {
1154
+ "epoch": 1.523576512455516,
1155
+ "grad_norm": 3.7108139991760254,
1156
+ "learning_rate": 1.1910587188612101e-05,
1157
+ "loss": 1.3126,
1158
+ "num_input_tokens_seen": 25100648,
1159
+ "step": 68500
1160
+ },
1161
+ {
1162
+ "epoch": 1.5346975088967971,
1163
+ "grad_norm": 2.5954089164733887,
1164
+ "learning_rate": 1.1632562277580072e-05,
1165
+ "loss": 1.3053,
1166
+ "num_input_tokens_seen": 25282312,
1167
+ "step": 69000
1168
+ },
1169
+ {
1170
+ "epoch": 1.5458185053380782,
1171
+ "grad_norm": 3.0645289421081543,
1172
+ "learning_rate": 1.1354537366548044e-05,
1173
+ "loss": 1.2976,
1174
+ "num_input_tokens_seen": 25465504,
1175
+ "step": 69500
1176
+ },
1177
+ {
1178
+ "epoch": 1.5569395017793595,
1179
+ "grad_norm": 2.3734166622161865,
1180
+ "learning_rate": 1.1076512455516016e-05,
1181
+ "loss": 1.2994,
1182
+ "num_input_tokens_seen": 25656600,
1183
+ "step": 70000
1184
+ },
1185
+ {
1186
+ "epoch": 1.5569395017793595,
1187
+ "eval_loss": 1.1562061309814453,
1188
+ "eval_runtime": 2.9007,
1189
+ "eval_samples_per_second": 861.851,
1190
+ "eval_steps_per_second": 107.904,
1191
+ "num_input_tokens_seen": 25656600,
1192
+ "step": 70000
1193
+ },
1194
+ {
1195
+ "epoch": 1.5680604982206405,
1196
+ "grad_norm": 2.8284573554992676,
1197
+ "learning_rate": 1.0798487544483986e-05,
1198
+ "loss": 1.3125,
1199
+ "num_input_tokens_seen": 25844216,
1200
+ "step": 70500
1201
+ },
1202
+ {
1203
+ "epoch": 1.5791814946619218,
1204
+ "grad_norm": 1.8697574138641357,
1205
+ "learning_rate": 1.0520462633451958e-05,
1206
+ "loss": 1.3247,
1207
+ "num_input_tokens_seen": 26026824,
1208
+ "step": 71000
1209
+ },
1210
+ {
1211
+ "epoch": 1.5903024911032029,
1212
+ "grad_norm": 2.4746646881103516,
1213
+ "learning_rate": 1.024243772241993e-05,
1214
+ "loss": 1.3295,
1215
+ "num_input_tokens_seen": 26211760,
1216
+ "step": 71500
1217
+ },
1218
+ {
1219
+ "epoch": 1.601423487544484,
1220
+ "grad_norm": 2.415778398513794,
1221
+ "learning_rate": 9.9644128113879e-06,
1222
+ "loss": 1.3191,
1223
+ "num_input_tokens_seen": 26393624,
1224
+ "step": 72000
1225
+ },
1226
+ {
1227
+ "epoch": 1.612544483985765,
1228
+ "grad_norm": 2.263882875442505,
1229
+ "learning_rate": 9.686387900355872e-06,
1230
+ "loss": 1.3136,
1231
+ "num_input_tokens_seen": 26576872,
1232
+ "step": 72500
1233
+ },
1234
+ {
1235
+ "epoch": 1.6236654804270463,
1236
+ "grad_norm": 2.436645269393921,
1237
+ "learning_rate": 9.408362989323843e-06,
1238
+ "loss": 1.286,
1239
+ "num_input_tokens_seen": 26763752,
1240
+ "step": 73000
1241
+ },
1242
+ {
1243
+ "epoch": 1.6347864768683276,
1244
+ "grad_norm": 2.739278793334961,
1245
+ "learning_rate": 9.130338078291816e-06,
1246
+ "loss": 1.2712,
1247
+ "num_input_tokens_seen": 26943856,
1248
+ "step": 73500
1249
+ },
1250
+ {
1251
+ "epoch": 1.6459074733096086,
1252
+ "grad_norm": 2.406345844268799,
1253
+ "learning_rate": 8.852313167259788e-06,
1254
+ "loss": 1.305,
1255
+ "num_input_tokens_seen": 27122192,
1256
+ "step": 74000
1257
+ },
1258
+ {
1259
+ "epoch": 1.6570284697508897,
1260
+ "grad_norm": 2.1659858226776123,
1261
+ "learning_rate": 8.574288256227759e-06,
1262
+ "loss": 1.312,
1263
+ "num_input_tokens_seen": 27305912,
1264
+ "step": 74500
1265
+ },
1266
+ {
1267
+ "epoch": 1.6681494661921707,
1268
+ "grad_norm": 2.7831106185913086,
1269
+ "learning_rate": 8.29626334519573e-06,
1270
+ "loss": 1.2997,
1271
+ "num_input_tokens_seen": 27485504,
1272
+ "step": 75000
1273
+ },
1274
+ {
1275
+ "epoch": 1.6792704626334518,
1276
+ "grad_norm": 2.9885916709899902,
1277
+ "learning_rate": 8.018238434163701e-06,
1278
+ "loss": 1.3187,
1279
+ "num_input_tokens_seen": 27665584,
1280
+ "step": 75500
1281
+ },
1282
+ {
1283
+ "epoch": 1.690391459074733,
1284
+ "grad_norm": 2.5562667846679688,
1285
+ "learning_rate": 7.740213523131673e-06,
1286
+ "loss": 1.3142,
1287
+ "num_input_tokens_seen": 27851448,
1288
+ "step": 76000
1289
+ },
1290
+ {
1291
+ "epoch": 1.7015124555160144,
1292
+ "grad_norm": 2.9897525310516357,
1293
+ "learning_rate": 7.462188612099645e-06,
1294
+ "loss": 1.3071,
1295
+ "num_input_tokens_seen": 28033824,
1296
+ "step": 76500
1297
+ },
1298
+ {
1299
+ "epoch": 1.7126334519572954,
1300
+ "grad_norm": 2.6760592460632324,
1301
+ "learning_rate": 7.184163701067615e-06,
1302
+ "loss": 1.3049,
1303
+ "num_input_tokens_seen": 28220000,
1304
+ "step": 77000
1305
+ },
1306
+ {
1307
+ "epoch": 1.7237544483985765,
1308
+ "grad_norm": 2.314532995223999,
1309
+ "learning_rate": 6.906138790035588e-06,
1310
+ "loss": 1.3124,
1311
+ "num_input_tokens_seen": 28403400,
1312
+ "step": 77500
1313
+ },
1314
+ {
1315
+ "epoch": 1.7348754448398576,
1316
+ "grad_norm": 2.2853899002075195,
1317
+ "learning_rate": 6.6281138790035586e-06,
1318
+ "loss": 1.3188,
1319
+ "num_input_tokens_seen": 28585168,
1320
+ "step": 78000
1321
+ },
1322
+ {
1323
+ "epoch": 1.7459964412811388,
1324
+ "grad_norm": 2.462369918823242,
1325
+ "learning_rate": 6.3500889679715306e-06,
1326
+ "loss": 1.3146,
1327
+ "num_input_tokens_seen": 28767960,
1328
+ "step": 78500
1329
+ },
1330
+ {
1331
+ "epoch": 1.75711743772242,
1332
+ "grad_norm": 2.626847505569458,
1333
+ "learning_rate": 6.072064056939502e-06,
1334
+ "loss": 1.3082,
1335
+ "num_input_tokens_seen": 28950344,
1336
+ "step": 79000
1337
+ },
1338
+ {
1339
+ "epoch": 1.7682384341637012,
1340
+ "grad_norm": 3.058187484741211,
1341
+ "learning_rate": 5.794039145907473e-06,
1342
+ "loss": 1.301,
1343
+ "num_input_tokens_seen": 29133248,
1344
+ "step": 79500
1345
+ },
1346
+ {
1347
+ "epoch": 1.7793594306049823,
1348
+ "grad_norm": 1.932924747467041,
1349
+ "learning_rate": 5.516014234875446e-06,
1350
+ "loss": 1.2952,
1351
+ "num_input_tokens_seen": 29314808,
1352
+ "step": 80000
1353
+ },
1354
+ {
1355
+ "epoch": 1.7793594306049823,
1356
+ "eval_loss": 1.1517876386642456,
1357
+ "eval_runtime": 3.0679,
1358
+ "eval_samples_per_second": 814.88,
1359
+ "eval_steps_per_second": 102.023,
1360
+ "num_input_tokens_seen": 29314808,
1361
+ "step": 80000
1362
+ },
1363
+ {
1364
+ "epoch": 1.7904804270462633,
1365
+ "grad_norm": 3.326586961746216,
1366
+ "learning_rate": 5.237989323843417e-06,
1367
+ "loss": 1.3161,
1368
+ "num_input_tokens_seen": 29504040,
1369
+ "step": 80500
1370
+ },
1371
+ {
1372
+ "epoch": 1.8016014234875444,
1373
+ "grad_norm": 3.139636754989624,
1374
+ "learning_rate": 4.959964412811388e-06,
1375
+ "loss": 1.3,
1376
+ "num_input_tokens_seen": 29686712,
1377
+ "step": 81000
1378
+ },
1379
+ {
1380
+ "epoch": 1.8127224199288257,
1381
+ "grad_norm": 2.5767743587493896,
1382
+ "learning_rate": 4.681939501779359e-06,
1383
+ "loss": 1.2934,
1384
+ "num_input_tokens_seen": 29868136,
1385
+ "step": 81500
1386
+ },
1387
+ {
1388
+ "epoch": 1.8238434163701067,
1389
+ "grad_norm": 2.1930339336395264,
1390
+ "learning_rate": 4.4039145907473305e-06,
1391
+ "loss": 1.3104,
1392
+ "num_input_tokens_seen": 30050016,
1393
+ "step": 82000
1394
+ },
1395
+ {
1396
+ "epoch": 1.834964412811388,
1397
+ "grad_norm": 2.9890389442443848,
1398
+ "learning_rate": 4.125889679715303e-06,
1399
+ "loss": 1.303,
1400
+ "num_input_tokens_seen": 30234304,
1401
+ "step": 82500
1402
+ },
1403
+ {
1404
+ "epoch": 1.846085409252669,
1405
+ "grad_norm": 2.9597034454345703,
1406
+ "learning_rate": 3.8478647686832745e-06,
1407
+ "loss": 1.3024,
1408
+ "num_input_tokens_seen": 30419248,
1409
+ "step": 83000
1410
+ },
1411
+ {
1412
+ "epoch": 1.8572064056939501,
1413
+ "grad_norm": 3.17301082611084,
1414
+ "learning_rate": 3.5698398576512457e-06,
1415
+ "loss": 1.2926,
1416
+ "num_input_tokens_seen": 30601776,
1417
+ "step": 83500
1418
+ },
1419
+ {
1420
+ "epoch": 1.8683274021352312,
1421
+ "grad_norm": 2.677340269088745,
1422
+ "learning_rate": 3.291814946619217e-06,
1423
+ "loss": 1.3184,
1424
+ "num_input_tokens_seen": 30783800,
1425
+ "step": 84000
1426
+ },
1427
+ {
1428
+ "epoch": 1.8794483985765125,
1429
+ "grad_norm": 3.5369062423706055,
1430
+ "learning_rate": 3.013790035587189e-06,
1431
+ "loss": 1.2858,
1432
+ "num_input_tokens_seen": 30968472,
1433
+ "step": 84500
1434
+ },
1435
+ {
1436
+ "epoch": 1.8905693950177938,
1437
+ "grad_norm": 2.870908737182617,
1438
+ "learning_rate": 2.73576512455516e-06,
1439
+ "loss": 1.3107,
1440
+ "num_input_tokens_seen": 31147072,
1441
+ "step": 85000
1442
+ },
1443
+ {
1444
+ "epoch": 1.9016903914590748,
1445
+ "grad_norm": 2.1105244159698486,
1446
+ "learning_rate": 2.457740213523132e-06,
1447
+ "loss": 1.2797,
1448
+ "num_input_tokens_seen": 31330536,
1449
+ "step": 85500
1450
+ },
1451
+ {
1452
+ "epoch": 1.9128113879003559,
1453
+ "grad_norm": 4.080565452575684,
1454
+ "learning_rate": 2.1797153024911032e-06,
1455
+ "loss": 1.2991,
1456
+ "num_input_tokens_seen": 31507752,
1457
+ "step": 86000
1458
+ },
1459
+ {
1460
+ "epoch": 1.923932384341637,
1461
+ "grad_norm": 3.036339044570923,
1462
+ "learning_rate": 1.901690391459075e-06,
1463
+ "loss": 1.2884,
1464
+ "num_input_tokens_seen": 31685504,
1465
+ "step": 86500
1466
+ },
1467
+ {
1468
+ "epoch": 1.935053380782918,
1469
+ "grad_norm": 2.3314597606658936,
1470
+ "learning_rate": 1.6236654804270462e-06,
1471
+ "loss": 1.2935,
1472
+ "num_input_tokens_seen": 31865648,
1473
+ "step": 87000
1474
+ },
1475
+ {
1476
+ "epoch": 1.9461743772241993,
1477
+ "grad_norm": 2.4721710681915283,
1478
+ "learning_rate": 1.3456405693950178e-06,
1479
+ "loss": 1.3124,
1480
+ "num_input_tokens_seen": 32046704,
1481
+ "step": 87500
1482
+ },
1483
+ {
1484
+ "epoch": 1.9572953736654806,
1485
+ "grad_norm": 2.4747238159179688,
1486
+ "learning_rate": 1.0676156583629894e-06,
1487
+ "loss": 1.2941,
1488
+ "num_input_tokens_seen": 32235800,
1489
+ "step": 88000
1490
+ },
1491
+ {
1492
+ "epoch": 1.9684163701067616,
1493
+ "grad_norm": 2.4421122074127197,
1494
+ "learning_rate": 7.89590747330961e-07,
1495
+ "loss": 1.3187,
1496
+ "num_input_tokens_seen": 32415672,
1497
+ "step": 88500
1498
+ },
1499
+ {
1500
+ "epoch": 1.9795373665480427,
1501
+ "grad_norm": 3.469534158706665,
1502
+ "learning_rate": 5.115658362989324e-07,
1503
+ "loss": 1.2897,
1504
+ "num_input_tokens_seen": 32597136,
1505
+ "step": 89000
1506
+ },
1507
+ {
1508
+ "epoch": 1.9906583629893237,
1509
+ "grad_norm": 3.1344103813171387,
1510
+ "learning_rate": 2.335409252669039e-07,
1511
+ "loss": 1.2902,
1512
+ "num_input_tokens_seen": 32782688,
1513
+ "step": 89500
1514
+ },
1515
+ {
1516
+ "epoch": 2.0,
1517
+ "num_input_tokens_seen": 32939232,
1518
+ "step": 89920,
1519
+ "total_flos": 1.1959161056722944e+16,
1520
+ "train_loss": 1.358397777055082,
1521
+ "train_runtime": 3698.2072,
1522
+ "train_samples_per_second": 194.516,
1523
+ "train_steps_per_second": 24.314,
1524
+ "train_tokens_per_second": 8905.5
1525
+ }
1526
+ ],
1527
+ "logging_steps": 500,
1528
+ "max_steps": 89920,
1529
+ "num_input_tokens_seen": 32939232,
1530
+ "num_train_epochs": 2,
1531
+ "save_steps": 10000,
1532
+ "stateful_callbacks": {
1533
+ "TrainerControl": {
1534
+ "args": {
1535
+ "should_epoch_stop": false,
1536
+ "should_evaluate": false,
1537
+ "should_log": false,
1538
+ "should_save": true,
1539
+ "should_training_stop": true
1540
+ },
1541
+ "attributes": {}
1542
+ }
1543
+ },
1544
+ "total_flos": 1.1959161056722944e+16,
1545
+ "train_batch_size": 8,
1546
+ "trial_name": null,
1547
+ "trial_params": null
1548
+ }
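
The entries above are the `log_history` records from the uploaded trainer state file (trainer_state.json): running loss, gradient norm, and learning rate logged every 500 steps (`logging_steps`), an evaluation record every 10000 steps, and a final run summary at epoch 2.0. A minimal sketch of how such a file can be inspected offline, assuming a local copy of trainer_state.json:

```
import json

# Assumption: trainer_state.json has been downloaded next to this script.
with open("trainer_state.json") as f:
    state = json.load(f)

# Training records carry "loss"; evaluation records carry "eval_loss".
train_log = [r for r in state["log_history"] if "loss" in r]
eval_log = [r for r in state["log_history"] if "eval_loss" in r]

print("max steps:", state["max_steps"])                  # 89920
print("last logged train loss:", train_log[-1]["loss"])  # 1.2902 at step 89500
print("best eval loss:", min(r["eval_loss"] for r in eval_log))
```
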
training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:9da9a5ab537442a209e5c83fc219cb8c862ef0e698c640ae397b39bdaf02ad6e
3
+ size 5368
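
training_args.bin is tracked with Git LFS, so the diff records only the three-line pointer (spec version, SHA-256 object id, and byte size) rather than the serialized training arguments themselves. A hedged sketch of how such a file is typically loaded back, assuming a trusted local copy and a recent torch/transformers setup:

```
import torch

# training_args.bin holds a pickled transformers.TrainingArguments object.
# Newer torch releases default to weights_only=True, which blocks pickled
# custom classes, so it is disabled here (only do this for trusted files).
args = torch.load("training_args.bin", weights_only=False)

print(args.num_train_epochs)              # should match "num_train_epochs": 2
print(args.per_device_train_batch_size)   # should match "train_batch_size": 8
```
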