Tien09 committed on
Commit 6ae030f · verified · 1 Parent(s): 0ca80b7

Add new SentenceTransformer model
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+     "word_embedding_dimension": 128,
+     "pooling_mode_cls_token": false,
+     "pooling_mode_mean_tokens": true,
+     "pooling_mode_max_tokens": false,
+     "pooling_mode_mean_sqrt_len_tokens": false,
+     "pooling_mode_weightedmean_tokens": false,
+     "pooling_mode_lasttoken": false,
+     "include_prompt": true
+ }
README.md ADDED
@@ -0,0 +1,472 @@
+ ---
+ language:
+ - en
+ license: apache-2.0
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:8959
+ - loss:CoSENTLoss
+ base_model: prajjwal1/bert-tiny
+ widget:
+ - source_sentence: 'When this card is Normal Summoned: You can Special Summon 1 "Crashbug
+     X" from your Deck. You must control a face-up "Crashbug Z" to activate and to
+     resolve this effect.'
+   sentences:
+   - You can remove from play 1 Tuner monster in your GY to Special Summon this card
+     from your hand.
+   - This spirit emerges from the mystic lamp and obeys the wishes of its summoner.
+   - 'When your opponent activates a monster effect, while you control a "Beetrooper"
+     monster: Negate the activation, and if you do, destroy it. During your End Phase,
+     if this card is in your GY and you control an Insect monster with 3000 or more
+     ATK: You can banish 1 Insect monster from your GY; Set this card. You can only
+     use 1 "Beetrooper Fly & Sting" effect per turn, and only once that turn.'
+ - source_sentence: Each time a Spell Card is activated, place 1 Spell Counter on this
+     card when that Spell Card resolves. This card's Level is increased by the number
+     of Spell Counters on this card. You can remove 3 Spell Counters from this card,
+     then target 1 Quick-Play Spell Card in your GY; Set that card to your Spell &
+     Trap Zone. You can only use this effect of "Magical Something" once per turn.
+   sentences:
+   - Activate only while "Umi" is on the field. As long as "Umi" remains face-up on
+     the field, you take no damage from attacking monsters. When "Umi" is removed from
+     the field, destroy this card.
+   - Shuffle 1 "Duoterion", 1 "Hydrogeddon", and 1 "Oxygeddon" from your hand and/or
+     GY into the Deck; Special Summon 1 "Water Dragon Cluster" from your hand or GY.
+     You can banish this card from your GY; add 1 "Water Dragon" or "Water Dragon Cluster"
+     from your Deck or GY to your hand.
+   - You can Special Summon this card (from your hand) by Tributing 1 WATER monster.
+ - source_sentence: You can only activate this card when there are "Don Zaloog", "Cliff
+     the Trap Remover", "Dark Scorpion - Chick the Yellow", "Dark Scorpion - Gorg the
+     Strong", and "Dark Scorpion - Meanae the Thorn" face-up on your side of the field.
+     During this turn, any of these 5 cards can attack your opponent's Life Points
+     directly. In that case, the Battle Damage inflicted to your opponent by each of
+     those cards becomes 400 points.
+   sentences:
+   - "During either turn, except the End Phase (Quick Effect): You can discard this\
+     \ card; apply this effect this turn. You can only use this effect of \"Ghost Sister\
+     \ & Spooky Dogwood\" once per turn.\r\n● Each time your opponent Special Summons\
+     \ an Effect Monster(s) during the Main Phase or Battle Phase, you gain LP equal\
+     \ to that monster's ATK. If you did not gain LP by this effect, your LP are halved\
+     \ during the End Phase."
+   - A WIND monster equipped with this card increases its ATK by 400 and decreases
+     its DEF by 200.
+   - During your Standby Phase, inflict 300 points of damage to your opponent's Life
+     Points for each monster on your opponent's side of the field.
+ - source_sentence: When this card is destroyed by battle and sent to the GY, send
+     1 Fish-type monster from your Deck to the GY. Then, you can Special Summon 1 "Nimble
+     Sunfish" from your Deck.
+   sentences:
+   - You can Special Summon this card (from your hand) to your Main Monster Zone, adjacent
+     to a "Scareclaw" monster you control or in its column. You can only Special Summon
+     "Scareclaw Belone" once per turn this way. If your "Scareclaw" monster in the
+     Extra Monster Zone attacks a Defense Position monster, inflict piercing battle
+     damage to your opponent.
+   - "2 monsters, including a Level/Rank/Link 2 monster\r\nCannot be used as Link Material\
+     \ the turn it is Link Summoned. Your opponent cannot target monsters this card\
+     \ points to with card effects. During the Main Phase (Quick Effect): You can target\
+     \ 1 Level 2 monster in your GY, or, if your opponent controls a monster, you can\
+     \ target 1 Rank/Link 2 monster instead; Special Summon it. You can only use this\
+     \ effect of \"Spright Elf\" once per turn."
+   - 'If this card is Normal/Special Summoned, or flipped face-up: You can target up
+     to 2 face-up monsters on the field; change them to face-down Defense Position,
+     and if you do, any opponent''s monsters that were flipped by this effect cannot
+     change their battle positions. If a monster on the field is flipped face-up, while
+     this monster is face-up on the field (except during the Damage Step): You can
+     target 1 card your opponent controls; destroy it. You can only use each effect
+     of "Jioh the Gravity Ninja" once per turn.'
+ - source_sentence: 'If you control 3 or more face-up "Six Samurai" monsters, you can
+     activate 1 of these effects: Destroy all face-up monsters your opponent controls.
+     Destroy all face-up Spell/Trap Cards your opponent controls. Destroy all Set Spell/Trap
+     Cards your opponent controls.'
+   sentences:
+   - Target 1 Link Monster you control and 1 monster your opponent controls; destroy
+     them, then draw 1 card. You can only activate 1 "Link Burst" per turn.
+   - 'Cannot be Normal Summoned/Set. Must first be Special Summoned (from your hand)
+     by Tributing 1 Level 1 "Flower Cardian" monster, except "Flower Cardian Pine with
+     Crane". If this card is Special Summoned: Draw 1 card, and if you do, show it,
+     then you can Special Summon it if it is a "Flower Cardian" monster. Otherwise,
+     send it to the GY. At the end of the Battle Phase, if this card battled: Draw
+     1 card.'
+   - While you have 2 or less cards in your hand, all face-up "Fabled" monsters you
+     control gain 400 ATK.
+ datasets:
+ - Tien09/pair_similarity
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ ---
+ 
+ # SentenceTransformer based on prajjwal1/bert-tiny
+ 
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [prajjwal1/bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) on the [pair_similarity](https://huggingface.co/datasets/Tien09/pair_similarity) dataset. It maps sentences & paragraphs to a 128-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+ 
+ ## Model Details
+ 
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ - **Base model:** [prajjwal1/bert-tiny](https://huggingface.co/prajjwal1/bert-tiny) <!-- at revision 6f75de8b60a9f8a2fdf7b69cbd86d9e64bcb3837 -->
+ - **Maximum Sequence Length:** 512 tokens
+ - **Output Dimensionality:** 128 dimensions
+ - **Similarity Function:** Cosine Similarity
+ - **Training Dataset:**
+     - [pair_similarity](https://huggingface.co/datasets/Tien09/pair_similarity)
+ - **Language:** en
+ - **License:** apache-2.0
+ 
+ ### Model Sources
+ 
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+ 
+ ### Full Model Architecture
+ 
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
+   (1): Pooling({'word_embedding_dimension': 128, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+ )
+ ```
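The `Pooling` module above produces the 128-dimensional sentence embedding by averaging token embeddings (`pooling_mode_mean_tokens: True`), skipping padding positions via the attention mask. A minimal NumPy sketch of that mean-pooling step, with hypothetical toy inputs rather than real model outputs:

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Mask-aware mean pooling: average only real (non-padding) tokens."""
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)                   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                   # avoid divide-by-zero
    return summed / counts

# Toy example: batch of 1, sequence of 3 tokens, 128 dims; the last token is padding
tokens = np.ones((1, 3, 128))
tokens[0, 1] = 3.0                 # second token differs from the first
mask = np.array([[1, 1, 0]])       # third position is padding and must be ignored
pooled = mean_pool(tokens, mask)
print(pooled.shape)   # (1, 128)
print(pooled[0, 0])   # 2.0 -> mean of 1.0 and 3.0; the padded token is excluded
```

This is only an illustration of the pooling arithmetic; the library's `sentence_transformers.models.Pooling` implements the same idea in PyTorch.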
+ 
+ ## Usage
+ 
+ ### Direct Usage (Sentence Transformers)
+ 
+ First install the Sentence Transformers library:
+ 
+ ```bash
+ pip install -U sentence-transformers
+ ```
+ 
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+ 
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("Tien09/tiny_bert_ft_sim_score")
+ # Run inference
+ sentences = [
+     'If you control 3 or more face-up "Six Samurai" monsters, you can activate 1 of these effects: Destroy all face-up monsters your opponent controls. Destroy all face-up Spell/Trap Cards your opponent controls. Destroy all Set Spell/Trap Cards your opponent controls.',
+     'Target 1 Link Monster you control and 1 monster your opponent controls; destroy them, then draw 1 card. You can only activate 1 "Link Burst" per turn.',
+     'While you have 2 or less cards in your hand, all face-up "Fabled" monsters you control gain 400 ATK.',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 128]
+ 
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities.shape)
+ # [3, 3]
+ ```
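Since the card declares `similarity_fn_name: cosine`, `model.similarity` amounts to an all-pairs cosine-similarity matrix over the embeddings. A library-free sketch with toy 2-dimensional vectors standing in for the real 128-dimensional embeddings:

```python
import numpy as np

def cosine_similarity_matrix(embeddings):
    """All-pairs cosine similarity: normalize rows to unit length, then dot products."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / np.clip(norms, 1e-12, None)
    return unit @ unit.T

# Toy stand-ins for model.encode(...) output
emb = np.array([
    [1.0, 0.0],   # a "query" card text
    [1.0, 1.0],   # a related effect
    [0.0, 1.0],   # an unrelated effect
])
sims = cosine_similarity_matrix(emb)
print(sims.shape)            # (3, 3)
print(round(sims[0, 1], 4))  # 0.7071
print(round(sims[0, 2], 4))  # 0.0
```

The diagonal is always 1.0 (each embedding compared with itself), and ranking a query against a corpus is just sorting one row of this matrix.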
+ 
+ <!--
+ ### Direct Usage (Transformers)
+ 
+ <details><summary>Click to see the direct usage in Transformers</summary>
+ 
+ </details>
+ -->
+ 
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+ 
+ You can finetune this model on your own dataset.
+ 
+ <details><summary>Click to expand</summary>
+ 
+ </details>
+ -->
+ 
+ <!--
+ ### Out-of-Scope Use
+ 
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+ 
+ <!--
+ ## Bias, Risks and Limitations
+ 
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+ 
+ <!--
+ ### Recommendations
+ 
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+ 
+ ## Training Details
+ 
+ ### Training Dataset
+ 
+ #### pair_similarity
+ 
+ * Dataset: [pair_similarity](https://huggingface.co/datasets/Tien09/pair_similarity) at [a933de4](https://huggingface.co/datasets/Tien09/pair_similarity/tree/a933de4485aee2deeb50b77b0b27e4654094d56f)
+ * Size: 8,959 training samples
+ * Columns: <code>effect_text</code>, <code>score</code>, and <code>effect_text2</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | effect_text | score | effect_text2 |
+   |:--------|:------------|:------|:-------------|
+   | type    | string      | float | string       |
+   | details | <ul><li>min: 6 tokens</li><li>mean: 72.39 tokens</li><li>max: 191 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.09</li><li>max: 1.0</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 72.1 tokens</li><li>max: 198 tokens</li></ul> |
+ * Samples:
+   | effect_text | score | effect_text2 |
+   |:------------|:------|:-------------|
+   | <code>Once per turn, if you Special Summon a DARK Synchro Monster(s) from the Extra Deck: You can target 1 of your "Blackwing" monsters, or "Black-Winged Dragon", with lower ATK that is banished or in your GY; Special Summon it. Once per turn, if a DARK monster(s) you control would be destroyed by battle or card effect, you can remove 1 Black Feather Counter from your field instead.</code> | <code>0.0</code> | <code>A Millennium item, it's rumored to block any strong attack.</code> |
+   | <code>Target 1 face-up monster your opponent controls; the ATK of all other monsters currently on the field become equal to that monster's ATK, until the end of this turn.</code> | <code>0.0</code> | <code>While you control a "Blue-Eyes" monster, you choose the attack targets for your opponent's attacks. You can only use each of the following effects of "Dictator of D." once per turn. You can send 1 "Blue-Eyes White Dragon" from your hand or Deck to the GY; Special Summon this card from your hand. You can discard 1 "Blue-Eyes White Dragon", or 1 card that mentions it, then target 1 "Blue-Eyes" monster in your GY; Special Summon it.</code> |
+   | <code>1 Tuner + 1+ non-Tuner monsters<br>If this card is Synchro Summoned using a Tuner Synchro Monster: You can target 1 Spell/Trap in your GY; add it to your hand. When your opponent activates a card or effect (Quick Effect): You can send 1 Spell/Trap from your hand or field to the GY; Special Summon 1 Level 7 or lower Tuner Synchro Monster from your Extra Deck, GY, or banishment. You can only use each effect of "Diabell, Queen of the White Forest" once per turn.</code> | <code>0.2</code> | <code>1 Aqua monster + 1 Level 10 WATER monster<br>Must first be either Fusion Summoned, or Special Summoned (from your Extra Deck) by Tributing 1 Level 10 Aqua monster with 0 ATK. This card can be treated as 3 Tributes for the Tribute Summon of a monster. Cannot be destroyed by battle. Your opponent cannot target monsters you control with card effects, except "Egyptian God Slime", also their monsters cannot target monsters for attacks, except "Egyptian God Slime".</code> |
+ * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "pairwise_cos_sim"
+   }
+   ```
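CoSENTLoss is a ranking objective over the batch: for every pair of training pairs where gold score *i* exceeds gold score *j*, the model's cosine similarity cos<sub>i</sub> should also exceed cos<sub>j</sub>, and violations are penalized as log(1 + Σ exp(scale · (cos<sub>j</sub> − cos<sub>i</sub>))), here with scale = 20. A toy NumPy sketch of that objective (an illustration, not the library's vectorized implementation):

```python
import numpy as np

def cosent_loss(cos_sims, gold_scores, scale=20.0):
    """CoSENT: log(1 + sum over (i, j) with gold_i > gold_j of exp(scale * (cos_j - cos_i)))."""
    cos_sims = np.asarray(cos_sims, dtype=float)
    gold = np.asarray(gold_scores, dtype=float)
    terms = []
    for i in range(len(gold)):
        for j in range(len(gold)):
            if gold[i] > gold[j]:  # pair i should rank above pair j
                terms.append(np.exp(scale * (cos_sims[j] - cos_sims[i])))
    return float(np.log1p(np.sum(terms)))

# Toy batch: predicted cosine similarities vs. gold similarity scores
cos = [0.9, 0.1, 0.4]
gold = [1.0, 0.0, 0.2]
print(cosent_loss(cos, gold))        # small: predicted ranking agrees with gold
print(cosent_loss(cos[::-1], gold))  # large: predicted ranking disagrees
```

Only the relative ordering of similarities matters, which is why the dataset's continuous 0–1 scores work without being calibrated to cosine values.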
+ 
+ ### Evaluation Dataset
+ 
+ #### pair_similarity
+ 
+ * Dataset: [pair_similarity](https://huggingface.co/datasets/Tien09/pair_similarity) at [a933de4](https://huggingface.co/datasets/Tien09/pair_similarity/tree/a933de4485aee2deeb50b77b0b27e4654094d56f)
+ * Size: 1,920 evaluation samples
+ * Columns: <code>effect_text</code>, <code>score</code>, and <code>effect_text2</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | effect_text | score | effect_text2 |
+   |:--------|:------------|:------|:-------------|
+   | type    | string      | float | string       |
+   | details | <ul><li>min: 10 tokens</li><li>mean: 72.56 tokens</li><li>max: 202 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.09</li><li>max: 1.0</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 71.33 tokens</li><li>max: 186 tokens</li></ul> |
+ * Samples:
+   | effect_text | score | effect_text2 |
+   |:------------|:------|:-------------|
+   | <code>A proud ruler of the jungle that some fear and others respect.</code> | <code>0.0</code> | <code>Cannot attack the turn it is Normal Summoned. Once per turn: You can target 1 face-up monster on the field; change this card to Defense Position, and if you do, that target loses 800 ATK until the end of this turn.</code> |
+   | <code>During your opponent's Main Phase or Battle Phase: You can Special Summon 1 non-Tuner monster from your hand, but it has its effects negated (if any), and if you do, immediately after this effect resolves, Synchro Summon 1 Machine-Type Synchro Monster using only that monster and this card (this is a Quick Effect). You can only use this effect of "Crystron Quan" once per turn.</code> | <code>0.0</code> | <code>You can Tribute this card while "Neo Space" is on the field to Special Summon 1 "Neo-Spacian Dark Panther" from your hand or Deck.</code> |
+   | <code>When your opponent Special Summons a monster(s): Destroy it, then you can banish 5 Zombie monsters from your GY, and if you do, Special Summon 1 Level 7 or higher Zombie monster from your hand or Deck.</code> | <code>0.25</code> | <code>You can target 1 Dragon monster you control; it gains ATK/DEF equal to the total Link Rating of the Link Monsters currently on the field x 100, until the end of the opponent's turn. You can only use this effect of "Guardragon Shield" once per turn. Once per turn, if exactly 1 Dragon monster you control would be destroyed by battle or card effect, you can send 1 Normal Monster from your hand or Deck to the GY instead.</code> |
+ * Loss: [<code>CoSENTLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "pairwise_cos_sim"
+   }
+   ```
+ 
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+ 
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 16
+ - `per_device_eval_batch_size`: 16
+ - `num_train_epochs`: 5
+ - `warmup_ratio`: 0.1
+ - `fp16`: True
+ - `batch_sampler`: no_duplicates
+
266
+ #### All Hyperparameters
267
+ <details><summary>Click to expand</summary>
268
+
269
+ - `overwrite_output_dir`: False
270
+ - `do_predict`: False
271
+ - `eval_strategy`: steps
272
+ - `prediction_loss_only`: True
273
+ - `per_device_train_batch_size`: 16
274
+ - `per_device_eval_batch_size`: 16
275
+ - `per_gpu_train_batch_size`: None
276
+ - `per_gpu_eval_batch_size`: None
277
+ - `gradient_accumulation_steps`: 1
278
+ - `eval_accumulation_steps`: None
279
+ - `torch_empty_cache_steps`: None
280
+ - `learning_rate`: 5e-05
281
+ - `weight_decay`: 0.0
282
+ - `adam_beta1`: 0.9
283
+ - `adam_beta2`: 0.999
284
+ - `adam_epsilon`: 1e-08
285
+ - `max_grad_norm`: 1.0
286
+ - `num_train_epochs`: 5
287
+ - `max_steps`: -1
288
+ - `lr_scheduler_type`: linear
289
+ - `lr_scheduler_kwargs`: {}
290
+ - `warmup_ratio`: 0.1
291
+ - `warmup_steps`: 0
292
+ - `log_level`: passive
293
+ - `log_level_replica`: warning
294
+ - `log_on_each_node`: True
295
+ - `logging_nan_inf_filter`: True
296
+ - `save_safetensors`: True
297
+ - `save_on_each_node`: False
298
+ - `save_only_model`: False
299
+ - `restore_callback_states_from_checkpoint`: False
300
+ - `no_cuda`: False
301
+ - `use_cpu`: False
302
+ - `use_mps_device`: False
303
+ - `seed`: 42
304
+ - `data_seed`: None
305
+ - `jit_mode_eval`: False
306
+ - `use_ipex`: False
307
+ - `bf16`: False
308
+ - `fp16`: True
309
+ - `fp16_opt_level`: O1
310
+ - `half_precision_backend`: auto
311
+ - `bf16_full_eval`: False
312
+ - `fp16_full_eval`: False
313
+ - `tf32`: None
314
+ - `local_rank`: 0
315
+ - `ddp_backend`: None
316
+ - `tpu_num_cores`: None
317
+ - `tpu_metrics_debug`: False
318
+ - `debug`: []
319
+ - `dataloader_drop_last`: False
320
+ - `dataloader_num_workers`: 0
321
+ - `dataloader_prefetch_factor`: None
322
+ - `past_index`: -1
323
+ - `disable_tqdm`: False
324
+ - `remove_unused_columns`: True
325
+ - `label_names`: None
326
+ - `load_best_model_at_end`: False
327
+ - `ignore_data_skip`: False
328
+ - `fsdp`: []
329
+ - `fsdp_min_num_params`: 0
330
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
331
+ - `fsdp_transformer_layer_cls_to_wrap`: None
332
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
333
+ - `deepspeed`: None
334
+ - `label_smoothing_factor`: 0.0
335
+ - `optim`: adamw_torch
336
+ - `optim_args`: None
337
+ - `adafactor`: False
338
+ - `group_by_length`: False
339
+ - `length_column_name`: length
340
+ - `ddp_find_unused_parameters`: None
341
+ - `ddp_bucket_cap_mb`: None
342
+ - `ddp_broadcast_buffers`: False
343
+ - `dataloader_pin_memory`: True
344
+ - `dataloader_persistent_workers`: False
345
+ - `skip_memory_metrics`: True
346
+ - `use_legacy_prediction_loop`: False
347
+ - `push_to_hub`: False
348
+ - `resume_from_checkpoint`: None
349
+ - `hub_model_id`: None
350
+ - `hub_strategy`: every_save
351
+ - `hub_private_repo`: None
352
+ - `hub_always_push`: False
353
+ - `gradient_checkpointing`: False
354
+ - `gradient_checkpointing_kwargs`: None
355
+ - `include_inputs_for_metrics`: False
356
+ - `include_for_metrics`: []
357
+ - `eval_do_concat_batches`: True
358
+ - `fp16_backend`: auto
359
+ - `push_to_hub_model_id`: None
360
+ - `push_to_hub_organization`: None
361
+ - `mp_parameters`:
362
+ - `auto_find_batch_size`: False
363
+ - `full_determinism`: False
364
+ - `torchdynamo`: None
365
+ - `ray_scope`: last
366
+ - `ddp_timeout`: 1800
367
+ - `torch_compile`: False
368
+ - `torch_compile_backend`: None
369
+ - `torch_compile_mode`: None
370
+ - `dispatch_batches`: None
371
+ - `split_batches`: None
372
+ - `include_tokens_per_second`: False
373
+ - `include_num_input_tokens_seen`: False
374
+ - `neftune_noise_alpha`: None
375
+ - `optim_target_modules`: None
376
+ - `batch_eval_metrics`: False
377
+ - `eval_on_start`: False
378
+ - `use_liger_kernel`: False
379
+ - `eval_use_gather_object`: False
380
+ - `average_tokens_across_devices`: False
381
+ - `prompts`: None
382
+ - `batch_sampler`: no_duplicates
383
+ - `multi_dataset_batch_sampler`: proportional
384
+
385
+ </details>
+ 
+ ### Training Logs
+ | Epoch  | Step | Training Loss | Validation Loss |
+ |:------:|:----:|:-------------:|:---------------:|
+ | 0.1786 | 100  | 3.8917        | 3.7898          |
+ | 0.3571 | 200  | 3.7289        | 3.7576          |
+ | 0.5357 | 300  | 3.6719        | 3.7211          |
+ | 0.7143 | 400  | 3.6294        | 3.6751          |
+ | 0.8929 | 500  | 3.5188        | 3.6291          |
+ | 1.0714 | 600  | 3.6794        | 3.5768          |
+ | 1.2500 | 700  | 3.4962        | 3.5798          |
+ | 1.4286 | 800  | 3.4325        | 3.6149          |
+ | 1.6071 | 900  | 3.3956        | 3.6151          |
+ | 1.7857 | 1000 | 3.2907        | 3.7533          |
+ | 1.9643 | 1100 | 3.3685        | 3.5106          |
+ | 2.1429 | 1200 | 3.5020        | 3.4844          |
+ | 2.3214 | 1300 | 3.3796        | 3.6363          |
+ | 2.5000 | 1400 | 3.2383        | 3.5744          |
+ | 2.6786 | 1500 | 3.1346        | 3.6568          |
+ | 2.8571 | 1600 | 3.1808        | 3.6278          |
+ | 3.0357 | 1700 | 3.3241        | 3.4786          |
+ | 3.2143 | 1800 | 3.2864        | 3.4705          |
+ | 3.3929 | 1900 | 3.2056        | 3.5290          |
+ | 3.5714 | 2000 | 3.1519        | 3.6228          |
+ | 3.7500 | 2100 | 3.0889        | 3.5919          |
+ | 3.9286 | 2200 | 2.9385        | 3.6148          |
+ | 4.1071 | 2300 | 3.2051        | 3.5180          |
+ | 4.2857 | 2400 | 3.2581        | 3.5216          |
+ | 4.4643 | 2500 | 3.0765        | 3.5968          |
+ | 4.6429 | 2600 | 2.9497        | 3.6496          |
+ | 4.8214 | 2700 | 2.8502        | 3.6804          |
+ | 5.0000 | 2800 | 3.1919        | 3.6668          |
418
+
419
+
420
+ ### Framework Versions
421
+ - Python: 3.10.12
422
+ - Sentence Transformers: 3.3.1
423
+ - Transformers: 4.47.1
424
+ - PyTorch: 2.5.1+cu121
425
+ - Accelerate: 1.2.1
426
+ - Datasets: 3.2.0
427
+ - Tokenizers: 0.21.0
428
+
429
+ ## Citation
430
+
431
+ ### BibTeX
432
+
433
+ #### Sentence Transformers
434
+ ```bibtex
435
+ @inproceedings{reimers-2019-sentence-bert,
436
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
437
+ author = "Reimers, Nils and Gurevych, Iryna",
438
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
439
+ month = "11",
440
+ year = "2019",
441
+ publisher = "Association for Computational Linguistics",
442
+ url = "https://arxiv.org/abs/1908.10084",
443
+ }
444
+ ```
445
+
446
+ #### CoSENTLoss
447
+ ```bibtex
448
+ @online{kexuefm-8847,
449
+ title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
450
+ author={Su Jianlin},
451
+ year={2022},
452
+ month={Jan},
453
+ url={https://kexue.fm/archives/8847},
454
+ }
455
+ ```
456
+
457
+ <!--
458
+ ## Glossary
459
+
460
+ *Clearly define terms in order to be accessible across audiences.*
461
+ -->
462
+
463
+ <!--
464
+ ## Model Card Authors
465
+
466
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
467
+ -->
468
+
469
+ <!--
470
+ ## Model Card Contact
471
+
472
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
473
+ -->
config.json ADDED
@@ -0,0 +1,25 @@
+ {
+     "_name_or_path": "prajjwal1/bert-tiny",
+     "architectures": [
+         "BertModel"
+     ],
+     "attention_probs_dropout_prob": 0.1,
+     "classifier_dropout": null,
+     "hidden_act": "gelu",
+     "hidden_dropout_prob": 0.1,
+     "hidden_size": 128,
+     "initializer_range": 0.02,
+     "intermediate_size": 512,
+     "layer_norm_eps": 1e-12,
+     "max_position_embeddings": 512,
+     "model_type": "bert",
+     "num_attention_heads": 2,
+     "num_hidden_layers": 2,
+     "pad_token_id": 0,
+     "position_embedding_type": "absolute",
+     "torch_dtype": "float32",
+     "transformers_version": "4.47.1",
+     "type_vocab_size": 2,
+     "use_cache": true,
+     "vocab_size": 30522
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+     "__version__": {
+         "sentence_transformers": "3.3.1",
+         "transformers": "4.47.1",
+         "pytorch": "2.5.1+cu121"
+     },
+     "prompts": {},
+     "default_prompt_name": null,
+     "similarity_fn_name": "cosine"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7524d25984f13d0abb20cfb4a54654649c8a941371d3bfbd57a2f81feedac362
+ size 17547912
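As a sanity check, the 17,547,912-byte float32 checkpoint lines up with the parameter count implied by the config.json above (2 layers, hidden size 128, intermediate size 512, vocabulary 30522). A back-of-envelope tally, assuming the standard `BertModel` layout including the pooler head:

```python
# Shapes taken from config.json; the breakdown below assumes the usual BERT layout.
vocab, hidden, layers, inter, max_pos, type_vocab = 30522, 128, 2, 512, 512, 2

embeddings = vocab * hidden + max_pos * hidden + type_vocab * hidden + 2 * hidden  # + LayerNorm
per_layer = (4 * (hidden * hidden + hidden)  # Q, K, V, and attention output projections
             + 2 * hidden                    # attention LayerNorm
             + hidden * inter + inter        # intermediate (feed-forward up) dense
             + inter * hidden + hidden       # output (feed-forward down) dense
             + 2 * hidden)                   # output LayerNorm
pooler = hidden * hidden + hidden
total = embeddings + layers * per_layer + pooler
print(total)      # 4385920 parameters
print(total * 4)  # 17543680 bytes of float32 weights
# The 17,547,912-byte file adds a few KB of safetensors header metadata on top.
```

The close match (within ~4 KB) suggests the pooler weights are included in the file, though that detail is an inference from the size, not something the card states.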
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+     {
+         "idx": 0,
+         "name": "0",
+         "path": "",
+         "type": "sentence_transformers.models.Transformer"
+     },
+     {
+         "idx": 1,
+         "name": "1",
+         "path": "1_Pooling",
+         "type": "sentence_transformers.models.Pooling"
+     }
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+     "max_seq_length": 512,
+     "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
+ {
+     "cls_token": "[CLS]",
+     "mask_token": "[MASK]",
+     "pad_token": "[PAD]",
+     "sep_token": "[SEP]",
+     "unk_token": "[UNK]"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,58 @@
+ {
+     "added_tokens_decoder": {
+         "0": {
+             "content": "[PAD]",
+             "lstrip": false,
+             "normalized": false,
+             "rstrip": false,
+             "single_word": false,
+             "special": true
+         },
+         "100": {
+             "content": "[UNK]",
+             "lstrip": false,
+             "normalized": false,
+             "rstrip": false,
+             "single_word": false,
+             "special": true
+         },
+         "101": {
+             "content": "[CLS]",
+             "lstrip": false,
+             "normalized": false,
+             "rstrip": false,
+             "single_word": false,
+             "special": true
+         },
+         "102": {
+             "content": "[SEP]",
+             "lstrip": false,
+             "normalized": false,
+             "rstrip": false,
+             "single_word": false,
+             "special": true
+         },
+         "103": {
+             "content": "[MASK]",
+             "lstrip": false,
+             "normalized": false,
+             "rstrip": false,
+             "single_word": false,
+             "special": true
+         }
+     },
+     "clean_up_tokenization_spaces": true,
+     "cls_token": "[CLS]",
+     "do_basic_tokenize": true,
+     "do_lower_case": true,
+     "extra_special_tokens": {},
+     "mask_token": "[MASK]",
+     "model_max_length": 1000000000000000019884624838656,
+     "never_split": null,
+     "pad_token": "[PAD]",
+     "sep_token": "[SEP]",
+     "strip_accents": null,
+     "tokenize_chinese_chars": true,
+     "tokenizer_class": "BertTokenizer",
+     "unk_token": "[UNK]"
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff