Gonalb committed
Commit f876541 · verified · 1 Parent(s): 25a3700

Add new SentenceTransformer model
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 1024,
  "pooling_mode_cls_token": true,
  "pooling_mode_mean_tokens": false,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
README.md ADDED
@@ -0,0 +1,770 @@
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:400
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: Snowflake/snowflake-arctic-embed-l
widget:
- source_sentence: 'QUESTION #2: What percentage of patients in the study reported
    experiencing "chills" and "feverish discomfort"?'
  sentences:
  - "been proven superior. Annual influenza vaccination is recommended for all people\
    \ six months and older who do not have \ncontraindications. ( Am Fam Physician.\
    \ 2019; 100(12):751-758. Copyright © 2019 American Academy of Family Physicians.)\n\
    BEST PRACTICES IN INFECTIOUS DISEASE \nRecommendations from the Choosing \nWisely\
    \ Campaign\nRecommendation Sponsoring organization\nDo not routinely avoid \n\
    influenza vaccination in \negg-allergic patients.\nAmerican Academy of Allergy,\
    \ \nAsthma, and Immunology\nSource: For more information on the Choosing Wisely\
    \ Campaign,"
  - 'Review

    722 Vol 5 November 2005

    accompanied by fever and some subjects have a transient

    fall in body temperature during the early stages of

    common cold. In a study of 272 patients with sore throat

    associated with URTIs, the mean aural temperature was

    36·8ºC and around 35% of these patients said they were

    suffering from “chills” and “feverish discomfort”.49 The

    sensation of chilliness may be unrelated to any change in

    skin or body temperature. In a study of human

    volunteers, a sensation of chill still develops on

    administration of exogenous pyrogen even though the'
  - "ered when the results will modify management or when a \npatient with signs or\
    \ symptoms of influenza is hospitalized.19 \nTABLE 2\nComplications of Influenza\n\
    Cardiovascular 26\nCerebrovascular accidents\nIschemic heart disease\nMyocarditis\n\
    Hematologic 26\nHemolytic uremic syndrome\nHemophagocytic syndrome\nThrombotic\
    \ thrombocytope -\nnic purpura\nMusculoskeletal 19,26\nMyositis\nRhabdomyolysis\n\
    Neurologic 26\nAcute disseminated \nencephalomyelitis\nEncephalitis\nGuillain-Barré\
    \ syndrome\nPostinfluenza encephalopathy \n(neurologic symptoms occur -\nring\
    \ after resolution but within"
- source_sentence: How do cytokines interact with the body's systems to influence
    the hypothalamus and affect body temperature?
  sentences:
  - 'interleukin 1, interleukin 6, and tumour necrosis factor

    alpha, as well as the anti-inflammatory cytokines

    interleukin-1 receptor antagonist and interleukin 10

    have been investigated for their pyrogenic or antipyretic

    action.17 Interleukin 1 and interleukin 6 are believed to

    be the most important cytokines that induce fever. 55

    Cytokines are believed to cross the blood–brain barrier

    or interact with the vagus nerve endings to signal the

    temperature control centre of the hypothalamus to

    increase the thermal set point.55,56 The hypothalamus

    then initiates shivering, constriction of skin blood'
  - "mended human dose; possible \nrisk of embryo-fetal toxicity with \ncontinuous\
    \ intravenous infusion \nbased on limited animal data\nBaloxavir (Xofluza), \n\
    available as oral \ntablets\nNA ($160) Adults and children 12 years \nand older:\
    \ \n88 to 174 lb (40 to 79 kg): \nsingle dose of 40 mg \n≥ 175 lb (80 kg):\
    \ single dose \nof 80 mg\nTreatment of uncom-\nplicated acute \ninfluenza in\
    \ patients \n12 years and older who \nhave been symptom -\natic for no more than\
    \ \n48 hours\nContraindicated in people with \na history of hypersensitivity to\
    \ \nbaloxavir or any component of the \nproduct"
  - "CME This clinical content conforms to AAFP criteria for con-\ntinuing medical\
    \ education (CME). See CME Quiz on page 271.\nAuthor disclosure: No relevant\
    \ financial affiliations.\nPatient information: Handouts on this topic, written\
    \ by the \nauthors of this article, are available at https:// www.aafp.org/\n\
    afp/2019/0901/p281-s1.html and https:// www.aafp.org/\nafp/2019/0901/p281-s2.html.\n\
    Acute upper respiratory tract infections are extremely common in adults and children,\
    \ but only a few safe and effective treat-"
- source_sentence: What are the limitations of using adamantanes (amantadine and rimantadine)
    for influenza treatment according to the context?
  sentences:
  - "December 15, 2019 ◆ Volume 100, Number 12 www.aafp.org/afp American Family Physician\
    \ 755\nINFLUENZA\nClinicians caring for high-risk patients can also be consid\
    \ -\nered for treatment.28\nFour antiviral drugs have been approved for the treat\
    \ -\nment of influenza (Table 4): the NA inhibitors oseltamivir \n(Tamiflu),\
    \ zanamivir (Relenza), and peramivir (Rapivab), \nand the cap-dependent endonuclease\
    \ inhibitor baloxa -\nvir (Xofluza). 18,37 Any of these agents can be used in\
    \ age- \nappropriate, otherwise healthy outpatients with uncom -\nplicated influenza\
    \ and no contraindications. 18 Baloxavir is"
  - "756 American Family Physician www.aafp.org/afp Volume 100, Number 12 ◆ December\
    \ 15, 2019\nINFLUENZA\nthe risk of bronchospasm. 18,28 Adamantanes (amantadine\
    \ \nand rimantadine [Flumadine]) are approved for influenza \ntreatment but are\
    \ not currently recommended. These med -\nications are not active against influenza\
    \ B, and most influ -\nenza A strains have shown adamantane resistance for the\
    \ \npast 10 years.18\nThere is no demonstrated benefit to treating patients \n\
    with more than one antiviral agent or using higher than \nrecommended dosages.\
    \ 28 However, extended treatment"
  - "distress syndrome\nDiffuse alveolar \nhemorrhage\nHypoxic respiratory \nfailure\n\
    Primary viral pneumonia\nSecondary bacterial \npneumonia\nRenal 26\nAcute kidney\
    \ injury \n(e.g., acute tubulo- \ninterstitial nephritis, \nglomerulonephritis,\
    \ \nminimal change disease)\nMultiorgan failure\nInformation from references 8,\
    \ 19, and 25-27.\nSORT: KEY RECOMMENDATIONS FOR PRACTICE\nClinical recommendation\n\
    Evidence \nrating Comments\nAnnual influenza vaccination is recommended for all\
    \ people 6 months and older. 15,16 A Reports of expert committees"
- source_sentence: Which symptoms of colds and flu are now better understood due to
    new knowledge in molecular biology?
  sentences:
  - 'mechanisms that generate the familiar symptoms is poor compared with the amount
    of knowledge available on the

    molecular biology of the viruses involved. New knowledge of the effects of cytokines
    in human beings now helps to

    explain some of the symptoms of colds and flu that were previously in the realm
    of folklore rather than medicine—

    eg, fever, anorexia, malaise, chilliness, headache, and muscle aches and pains.
    The mechanisms of symptoms of

    sore throat, rhinorrhoea, sneezing, nasal congestion, cough, watery eyes, and
    sinus pain are discussed, since these'
  - 'medicines such as ipratropium. These studies have

    demonstrated that nasal secretions in the first 4 days of a

    common cold are inhibited by intranasal administration

    of ipratropium.25 The nasal discharge also consists of a

    protein-rich plasma exudate derived from subepithelial

    capillaries,28 which may explain why anticholinergics

    only partly inhibit nasal discharge associated with

    URTIs.27

    The colour of nasal discharge and sputum is often

    used as a clinical marker to determine whether or not to

    prescribe antibiotics but there is no evidence from the'
  - "ing diffuse alveolar hemorrhage in immunocompetent patients: a state-\nof-the-art\
    \ review. Lung. 2013; 191(1): 9-18.\n 28. Uyeki TM, Bernstein HH, Bradley JS,\
    \ et al. Clinical practice guidelines by \nthe Infectious Diseases Society of\
    \ America: 2018 update on diagnosis, \ntreatment, chemoprophylaxis, and institutional\
    \ outbreak management \nof seasonal influenza. Clin Infect Dis. 2019; 68(6): 895-902.\n\
    \ 29. Ebell MH, Afonso AM, Gonzales R, et al. Development and validation of \n\
    a clinical decision rule for the diagnosis of influenza. J Am Board Fam \nMed.\
    \ 2012; 25(1): 55-62."
- source_sentence: 'QUESTION #2: How does the sneeze centre in the brainstem coordinate
    the actions involved in sneezing?'
  sentences:
  - "stroke, seizure disorder, dementia)\nAsthma or other chronic pulmonary disease\n\
    Chronic kidney disease\nChronic liver disease\nHeart disease (acquired or congenital)\n\
    Immunosuppression (e.g., HIV infection, cancer, transplant \nrecipients, use of\
    \ immunosuppressive medications)\nLong-term aspirin therapy in patients younger\
    \ than 19 years\nMetabolic disorders (acquired [e.g., diabetes mellitus] or \n\
    inherited [e.g., mitochondrial disorders])\nMorbid obesity\nSickle cell anemia\
    \ and other hemoglobinopathies\nSpecial groups\nAdults 65 years and older\nAmerican\
    \ Indians and Alaska Natives"
  - 'causes sneezing.23 The trigeminal nerves relay

    information to the sneeze centre in the brainstem and

    cause reflex activation of motor and parasympathetic

    branches of the facial nerve and activate respiratory

    muscles. A model of the sneeze reflex is illustrated in

    figure 1. The sneeze centre coordinates the inspiratory

    and expiratory actions of sneezing via respiratory

    muscles, and lacrimation and nasal congestion via

    parasympathetic branches of the facial nerve. The eyes

    are always closed during sneezing by activation of facial

    muscles, indicating a close relation between the'
  - 'during experimental rhinovirus infections have not

    been able to find any morphological changes in the

    nasal epithelium of infected volunteers, apart from a

    substantial increase in polymorphonuclear leucocytes

    early in the course of the infection.11 The major cell

    monitoring the host for the invasion of pathogens is

    the macrophage, which has the ability to trigger an

    acute phase response when stimulated with

    components of viruses or bacteria—eg, viral RNA and

    bacterial cell wall components.12 The surface of the

    macrophage exhibits toll-like receptors that combine'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on Snowflake/snowflake-arctic-embed-l
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: Unknown
      type: unknown
    metrics:
    - type: cosine_accuracy@1
      value: 0.6122448979591837
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.8877551020408163
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.9387755102040817
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9897959183673469
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.6122448979591837
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.29591836734693877
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.1877551020408163
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09897959183673469
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.6122448979591837
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.8877551020408163
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.9387755102040817
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.9897959183673469
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8165441473931409
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7593091998704244
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.7600380628441854
      name: Cosine Map@100
    - type: cosine_accuracy@1
      value: 0.61
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.86
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.91
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.98
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.61
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.2866666666666666
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.18199999999999997
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09799999999999998
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.61
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.86
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.91
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.98
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8056804227184741
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7489960317460317
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.7504795482295481
      name: Cosine Map@100
---

# SentenceTransformer based on Snowflake/snowflake-arctic-embed-l

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Snowflake/snowflake-arctic-embed-l](https://huggingface.co/Snowflake/snowflake-arctic-embed-l). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Snowflake/snowflake-arctic-embed-l](https://huggingface.co/Snowflake/snowflake-arctic-embed-l) <!-- at revision d8fb21ca8d905d2832ee8b96c894d3298964346b -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Gonalb/flucold-ft-v2")
# Run inference
sentences = [
    'QUESTION #2: How does the sneeze centre in the brainstem coordinate the actions involved in sneezing?',
    'causes sneezing.23 The trigeminal nerves relay\ninformation to the sneeze centre in the brainstem and\ncause reflex activation of motor and parasympathetic\nbranches of the facial nerve and activate respiratory\nmuscles. A model of the sneeze reflex is illustrated in\nfigure 1. The sneeze centre coordinates the inspiratory\nand expiratory actions of sneezing via respiratory\nmuscles, and lacrimation and nasal congestion via\nparasympathetic branches of the facial nerve. The eyes\nare always closed during sneezing by activation of facial\nmuscles, indicating a close relation between the',
    'stroke, seizure disorder, dementia)\nAsthma or other chronic pulmonary disease\nChronic kidney disease\nChronic liver disease\nHeart disease (acquired or congenital)\nImmunosuppression (e.g., HIV infection, cancer, transplant \nrecipients, use of immunosuppressive medications)\nLong-term aspirin therapy in patients younger than 19 years\nMetabolic disorders (acquired [e.g., diabetes mellitus] or \ninherited [e.g., mitochondrial disorders])\nMorbid obesity\nSickle cell anemia and other hemoglobinopathies\nSpecial groups\nAdults 65 years and older\nAmerican Indians and Alaska Natives',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
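
For retrieval-style use, the accompanying `config_sentence_transformers.json` defines a `query` prompt ("Represent this sentence for searching relevant passages: "), so queries are typically encoded with that prompt while passages are encoded as-is. A minimal sketch under that assumption; the example texts are lifted from the training samples shown further below:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Gonalb/flucold-ft-v2")

# Encode the query with the "query" prompt declared in
# config_sentence_transformers.json; passages get no prompt.
query_embeddings = model.encode(
    ["How long are people with RSV typically contagious?"],
    prompt_name="query",
)
passage_embeddings = model.encode([
    "People with RSV are usually contagious for 3 to 8 days and may become "
    "contagious a day or two before they start showing signs of illness.",
])

# Embeddings are L2-normalized by the Normalize module, so cosine
# similarity is the intended scoring function.
scores = model.similarity(query_embeddings, passage_embeddings)
print(scores)
```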

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.6122     |
| cosine_accuracy@3   | 0.8878     |
| cosine_accuracy@5   | 0.9388     |
| cosine_accuracy@10  | 0.9898     |
| cosine_precision@1  | 0.6122     |
| cosine_precision@3  | 0.2959     |
| cosine_precision@5  | 0.1878     |
| cosine_precision@10 | 0.099      |
| cosine_recall@1     | 0.6122     |
| cosine_recall@3     | 0.8878     |
| cosine_recall@5     | 0.9388     |
| cosine_recall@10    | 0.9898     |
| **cosine_ndcg@10**  | **0.8165** |
| cosine_mrr@10       | 0.7593     |
| cosine_map@100      | 0.76       |

#### Information Retrieval

* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.61       |
| cosine_accuracy@3   | 0.86       |
| cosine_accuracy@5   | 0.91       |
| cosine_accuracy@10  | 0.98       |
| cosine_precision@1  | 0.61       |
| cosine_precision@3  | 0.2867     |
| cosine_precision@5  | 0.182      |
| cosine_precision@10 | 0.098      |
| cosine_recall@1     | 0.61       |
| cosine_recall@3     | 0.86       |
| cosine_recall@5     | 0.91       |
| cosine_recall@10    | 0.98       |
| **cosine_ndcg@10**  | **0.8057** |
| cosine_mrr@10       | 0.749      |
| cosine_map@100      | 0.7505     |

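A similar evaluation can be reproduced with `InformationRetrievalEvaluator`, which takes query texts, a corpus, and the set of relevant document ids per query. This is only a sketch: the ids and texts below are placeholders, not the held-out set behind the numbers above.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("Gonalb/flucold-ft-v2")

# Placeholder evaluation data: query id -> text, doc id -> text,
# and query id -> set of relevant doc ids.
queries = {"q1": "How long are people with RSV typically contagious?"}
corpus = {
    "d1": "People with RSV are usually contagious for 3 to 8 days ...",
    "d2": "Annual influenza vaccination is recommended for all people ...",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="flucold-eval",
)
results = evaluator(model)
print(results)  # accuracy@k, precision@k, recall@k, NDCG@10, MRR@10, MAP@100
```
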
<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 400 training samples
* Columns: <code>sentence_0</code> and <code>sentence_1</code>
* Approximate statistics based on the first 400 samples:
  |         | sentence_0 | sentence_1 |
  |:--------|:-----------|:-----------|
  | type    | string     | string     |
  | details | <ul><li>min: 2 tokens</li><li>mean: 23.07 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 122.33 tokens</li><li>max: 296 tokens</li></ul> |
* Samples:
  | sentence_0 | sentence_1 |
  |:-----------|:-----------|
  | <code>What should individuals with asthma do if they experience flu symptoms?</code> | <code>People with asthma who get flu symptoms should call their health care provider right<br>away. There are antiviral drugs that can treat flu illness and help prevent serious flu<br>complications.<br>About asthma<br>Asthma is a lung disease that is caused by chronic inflammation of the airways. It is one of the most common long-term diseases among<br>children, but adults can have asthma, too. Asthma attacks occur when the lung airways tighten due to inflammation. Asthma attacks can be</code> |
  | <code>What causes asthma attacks to occur in individuals with asthma?</code> | <code>People with asthma who get flu symptoms should call their health care provider right<br>away. There are antiviral drugs that can treat flu illness and help prevent serious flu<br>complications.<br>About asthma<br>Asthma is a lung disease that is caused by chronic inflammation of the airways. It is one of the most common long-term diseases among<br>children, but adults can have asthma, too. Asthma attacks occur when the lung airways tighten due to inflammation. Asthma attacks can be</code> |
  | <code>QUESTION #1: How long are people with RSV typically contagious?</code> | <code>second birthday. However, repeat infections may occur throughout life.<br>People with RSV are usually contagious for 3 to 8 days and may become contagious a day or two before they start showing signs of illness.<br>However, some infants and people with weakened immune systems can continue to spread the virus for 4 weeks or longer, even after they stop<br>showing symptoms. Children are often exposed to and infected with RSV outside the home, such as in school or childcare centers. They can then<br>transmit the virus to other members of the family.</code> |
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesRankingLoss",
      "matryoshka_dims": [
          768,
          512,
          256,
          128,
          64
      ],
      "matryoshka_weights": [
          1,
          1,
          1,
          1,
          1
      ],
      "n_dims_per_step": -1
  }
  ```
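
In code, this configuration corresponds roughly to wrapping `MultipleNegativesRankingLoss` in `MatryoshkaLoss` with the dimensions listed above. A minimal sketch; the two training rows are illustrative placeholders, not the actual dataset:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("Snowflake/snowflake-arctic-embed-l")

# Two-column (sentence_0, sentence_1) pairs, as in the dataset described above.
train_dataset = Dataset.from_dict({
    "sentence_0": ["What causes asthma attacks to occur in individuals with asthma?"],
    "sentence_1": ["Asthma attacks occur when the lung airways tighten due to inflammation."],
})

# MultipleNegativesRankingLoss treats other in-batch pairs as negatives;
# MatryoshkaLoss additionally applies it at each truncated dimensionality.
base_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, base_loss, matryoshka_dims=[768, 512, 256, 128, 64])
```

Because of the Matryoshka objective, embeddings truncated to any of these dimensions should remain usable; with a recent sentence-transformers release that supports it, `SentenceTransformer("Gonalb/flucold-ft-v2", truncate_dim=256)` is one way to load a smaller-dimensional variant.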

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 10
- `per_device_eval_batch_size`: 10
- `num_train_epochs`: 10
- `multi_dataset_batch_sampler`: round_robin
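
Expressed as trainer arguments, these non-default values would look roughly like the sketch below; `output_dir` and the trainer wiring (reusing `model`, `train_dataset`, `loss`, and `evaluator` from the sketches above) are illustrative assumptions rather than settings taken from this card.

```python
from sentence_transformers import SentenceTransformerTrainer, SentenceTransformerTrainingArguments
from sentence_transformers.training_args import MultiDatasetBatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="flucold-ft-v2",  # assumed output path
    eval_strategy="steps",
    per_device_train_batch_size=10,
    per_device_eval_batch_size=10,
    num_train_epochs=10,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
    evaluator=evaluator,
)
trainer.train()
```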

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 10
- `per_device_eval_batch_size`: 10
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1
- `num_train_epochs`: 10
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin

</details>

### Training Logs
| Epoch  | Step | Training Loss | cosine_ndcg@10 |
|:------:|:----:|:-------------:|:--------------:|
| 1.0    | 40   | -             | 0.8359         |
| 1.25   | 50   | -             | 0.8312         |
| 2.0    | 80   | -             | 0.8304         |
| 2.5    | 100  | -             | 0.8156         |
| 3.0    | 120  | -             | 0.8016         |
| 3.75   | 150  | -             | 0.7952         |
| 4.0    | 160  | -             | 0.7880         |
| 5.0    | 200  | -             | 0.8021         |
| 6.0    | 240  | -             | 0.8215         |
| 6.25   | 250  | -             | 0.8286         |
| 7.0    | 280  | -             | 0.8079         |
| 7.5    | 300  | -             | 0.8043         |
| 8.0    | 320  | -             | 0.8126         |
| 8.75   | 350  | -             | 0.8099         |
| 9.0    | 360  | -             | 0.8126         |
| 10.0   | 400  | -             | 0.8165         |
| 0.6173 | 50   | -             | 0.8138         |
| 1.0    | 81   | -             | 0.8158         |
| 1.2346 | 100  | -             | 0.7932         |
| 1.8519 | 150  | -             | 0.7989         |
| 2.0    | 162  | -             | 0.7866         |
| 2.4691 | 200  | -             | 0.8012         |
| 3.0    | 243  | -             | 0.7803         |
| 3.0864 | 250  | -             | 0.7969         |
| 3.7037 | 300  | -             | 0.8030         |
| 4.0    | 324  | -             | 0.7993         |
| 4.3210 | 350  | -             | 0.7848         |
| 4.9383 | 400  | -             | 0.7852         |
| 5.0    | 405  | -             | 0.7814         |
| 5.5556 | 450  | -             | 0.7975         |
| 6.0    | 486  | -             | 0.7846         |
| 6.1728 | 500  | 0.314         | 0.7925         |
| 6.7901 | 550  | -             | 0.7994         |
| 7.0    | 567  | -             | 0.8069         |
| 7.4074 | 600  | -             | 0.8048         |
| 8.0    | 648  | -             | 0.8063         |
| 8.0247 | 650  | -             | 0.8062         |
| 8.6420 | 700  | -             | 0.7992         |
| 9.0    | 729  | -             | 0.8115         |
| 9.2593 | 750  | -             | 0.8118         |
| 9.8765 | 800  | -             | 0.8057         |
| 10.0   | 810  | -             | 0.8057         |


### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.4.1
- Transformers: 4.48.3
- PyTorch: 2.5.1+cu124
- Accelerate: 1.3.0
- Datasets: 3.3.2
- Tokenizers: 0.21.0

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
config.json ADDED
@@ -0,0 +1,25 @@
{
  "_name_or_path": "Snowflake/snowflake-arctic-embed-l",
  "architectures": [
    "BertModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 1024,
  "initializer_range": 0.02,
  "intermediate_size": 4096,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 16,
  "num_hidden_layers": 24,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.48.3",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 30522
}
config_sentence_transformers.json ADDED
@@ -0,0 +1,12 @@
{
  "__version__": {
    "sentence_transformers": "3.4.1",
    "transformers": "4.48.3",
    "pytorch": "2.5.1+cu124"
  },
  "prompts": {
    "query": "Represent this sentence for searching relevant passages: "
  },
  "default_prompt_name": null,
  "similarity_fn_name": "cosine"
}
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:42d4fe1e80d1b2d9c105a16bd6d8c6212d646fa7da08c42d27dcf7eaf07ebb44
size 1336413848
modules.json ADDED
@@ -0,0 +1,20 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  },
  {
    "idx": 2,
    "name": "2",
    "path": "2_Normalize",
    "type": "sentence_transformers.models.Normalize"
  }
]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 512,
  "do_lower_case": false
}
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
{
  "cls_token": {
    "content": "[CLS]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "mask_token": {
    "content": "[MASK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "[PAD]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": {
    "content": "[SEP]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "[UNK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,63 @@
{
  "added_tokens_decoder": {
    "0": {
      "content": "[PAD]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "100": {
      "content": "[UNK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "101": {
      "content": "[CLS]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "102": {
      "content": "[SEP]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "103": {
      "content": "[MASK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    }
  },
  "clean_up_tokenization_spaces": true,
  "cls_token": "[CLS]",
  "do_lower_case": true,
  "extra_special_tokens": {},
  "mask_token": "[MASK]",
  "max_length": 512,
  "model_max_length": 512,
  "pad_to_multiple_of": null,
  "pad_token": "[PAD]",
  "pad_token_type_id": 0,
  "padding_side": "right",
  "sep_token": "[SEP]",
  "stride": 0,
  "strip_accents": null,
  "tokenize_chinese_chars": true,
  "tokenizer_class": "BertTokenizer",
  "truncation_side": "right",
  "truncation_strategy": "longest_first",
  "unk_token": "[UNK]"
}
vocab.txt ADDED
The diff for this file is too large to render. See raw diff