alunadiderot committed
Commit 3a9636c · verified · Parent: e62d070

Upload folder using huggingface_hub

.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
33
  *.zip filter=lfs diff=lfs merge=lfs -text
34
  *.zst filter=lfs diff=lfs merge=lfs -text
35
  *tfevents* filter=lfs diff=lfs merge=lfs -text
36
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "word_embedding_dimension": 768,
3
+ "pooling_mode_cls_token": false,
4
+ "pooling_mode_mean_tokens": true,
5
+ "pooling_mode_max_tokens": false,
6
+ "pooling_mode_mean_sqrt_len_tokens": false,
7
+ "pooling_mode_weightedmean_tokens": false,
8
+ "pooling_mode_lasttoken": false,
9
+ "include_prompt": true
10
+ }
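This pooling config enables mean-token pooling over the 768-dimensional token embeddings (CLS, max, and last-token pooling are all disabled). As a rough illustration of what that setting does, here is a minimal mask-aware mean-pooling sketch in PyTorch; the function name and tensor shapes are assumptions for illustration, not part of the uploaded files:

```python
import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 768); attention_mask: (batch, seq_len)
    mask = attention_mask.unsqueeze(-1).float()      # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)    # sum embeddings of real (non-padding) tokens
    counts = mask.sum(dim=1).clamp(min=1e-9)         # number of real tokens per sentence
    return summed / counts                           # (batch, 768) sentence embeddings
```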
README.md ADDED
@@ -0,0 +1,678 @@
1
+ ---
2
+ tags:
3
+ - setfit
4
+ - sentence-transformers
5
+ - text-classification
6
+ - generated_from_setfit_trainer
7
+ widget:
8
+ - text: Just add speakers I tested this amazing-looking streaming amplifier, and
9
+ its filled with futuristic features to make your music sound epic. 3,000 of premium
10
+ Wi-Fi sound
11
+ - text: Alon Aboutboul Dies The Dark Knight Snowfall Actor was 60. The actor
12
+ was swimming on a beach when the scary moment happened.
13
+ - text: Karnataka Train Services Disrupted By Boulder Fall Near Yedakumari Normalcy
14
+ Restored By Morning. Bengaluru Sakleshpur Train operations on the South Western
15
+ Railway route were briefly disrupted early Saturday morning after boulders fell
16
+ onto the track
17
+ - text: Ari Paparo on Google s Digital Dominance. Our guest is Ari Paparo.
18
+ - text: Seen elsewhere The hill of crosses. I don t want to hear about this. He says
19
+ it again and again. 1,775 people murdered on South African farms from 1991 to
20
+ 2006. I want to go away and never come back.
21
+ metrics:
22
+ - accuracy
23
+ pipeline_tag: text-classification
24
+ library_name: setfit
25
+ inference: true
26
+ base_model: intfloat/multilingual-e5-base
27
+ model-index:
28
+ - name: SetFit with intfloat/multilingual-e5-base
29
+ results:
30
+ - task:
31
+ type: text-classification
32
+ name: Text Classification
33
+ dataset:
34
+ name: Unknown
35
+ type: unknown
36
+ split: test
37
+ metrics:
38
+ - type: accuracy
39
+ value: 0.8421052631578947
40
+ name: Accuracy
41
+ ---
42
+
43
+ # SetFit with intfloat/multilingual-e5-base
44
+
45
+ This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
46
+
47
+ The model has been trained using an efficient few-shot learning technique that involves:
48
+
49
+ 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
50
+ 2. Training a classification head with features from the fine-tuned Sentence Transformer (sketched below).
51
+
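A minimal sketch of what these two stages amount to, using plain `sentence-transformers` and scikit-learn rather than the SetFit trainer that actually produced this checkpoint; `train_texts` and `train_labels` below are hypothetical placeholders:

```python
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

# Placeholder few-shot data (illustrative only)
train_texts = ["Shares rallied after strong quarterly earnings.", "The striker scored twice in the final."]
train_labels = ["Business", "Sports"]

# Stage 1 (simplified): start from the base embedding model; SetFit additionally
# fine-tunes this body with contrastive sentence pairs before the next step.
body = SentenceTransformer("intfloat/multilingual-e5-base")

# Stage 2: fit a LogisticRegression head on the sentence embeddings.
head = LogisticRegression()
head.fit(body.encode(train_texts), train_labels)

print(head.predict(body.encode(["Parliament passed the new budget bill."])))
```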
52
+ ## Model Details
53
+
54
+ ### Model Description
55
+ - **Model Type:** SetFit
56
+ - **Sentence Transformer body:** [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base)
57
+ - **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
58
+ - **Maximum Sequence Length:** 512 tokens
59
+ - **Number of Classes:** 12 classes
60
+ <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
61
+ <!-- - **Language:** Unknown -->
62
+ <!-- - **License:** Unknown -->
63
+
64
+ ### Model Sources
65
+
66
+ - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
67
+ - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
68
+ - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
69
+
70
+ ### Model Labels
71
+ | Label | Examples |
72
+ |:--------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
73
+ | General News | <ul><li>'VCNBGOMEZ RUPERT.jpg. Click here to view this image from abqjournal.'</li><li>'Breadlines and Bare Shelves Suweida Faces Deepening Humanitarian Crisis. Despite calm on the military front, residents see little hope for immediate improvement, al-Hal writes.'</li><li>'Sarawak protects 4 million hectares, or 30 pct of State, as water catchments to ensure water security. By DayakDaily Team KUCHING, June 24 A total of 28 areas, covering four million hectares or about 30 per cent of Sarawak, have been gazetted as water catchments in one of the State s key measures t'</li></ul> |
74
+ | Politics | <ul><li>'House committee issues subpoenas for Epstein files and depositions with the Clintons. Local news, sports, obituaries, photos, videos and more by journalists covering Butler County, Pennsylvania. Subscribe today.'</li><li>'44 candidates running for major offices in 2025 Manchester municipal election. The number of candidates running for alderman and school board seats in the 2025 Manchester municipal election continues to grow, with another three individuals filing paperwork at City Hall on'</li><li>'Liberals to upend housing policies and make peace with business Bragg. The Coalition s new housing and productivity spokesman says no housing policy will escape critical review, the Coalition will oppose Labor s crazy housing policies, and he wants to rebuild relations with business.'</li></ul> |
75
+ | Sports | <ul><li>'Playing Alongside Ravindra Jadeja A Great Honour Kuldeep Yadav. India s left-arm wrist spinner also talked about the new Test skipper, Shubman Gill, and said that the youngster is continuing the good work done by Virat Kohli and Rohit Sharma'</li><li>'Xabi Alonso s Real Madrid Lineup Still Undecided. Xabi Alonso is keeping us all on the edge of our seats as he tinkers with his squad in Austria against WSG Tirol, giving us a tantalizing glimpse of what might be his starting XI for his grand debut'</li><li>'20 40 Uttoxeter - Racecard - Quinnbet Handicap Chase 4. 20 40 Uttoxeter Quinnbet Handicap Chase 4 full racecard - tips, runners, odds and analysts reviews from Timeform..'</li></ul> |
76
+ | Health | <ul><li>'WaterSafe urges plumbers to use only lead-free solder to protect UK drinking water. WaterSafe is issuing a strong call to all plumbing professionals make the switch to lead-free solder exclusively.'</li><li>'FDA Approves Novavax COVID-19 Vaccine. The company s Nuvaxovid vaccine is the only recombinant protein-based, non-mRNA COVID-19 immunization available in the US.'</li><li>'Doctors Urge Better Health Checks for Psychiatric Patients. A major new Lancet Commission has called for people taking psychiatric medications - such as antidepressants, antipsychotics, and mood stabilisers'</li></ul> |
77
+ | Lifestyle | <ul><li>'Charleston s Queer Bloomsbury festival programme announced, featuring Bimini, Jodie Harsh, Charlie Porter, and an exhibition of the UK AIDS Memorial Quilt. The full programme for Charleston s annual Queer Bloomsbury festival is now live a celebration of queer creativity, bringing together bold performance, joyful activism and powerful LGBTQIA voices to remember, resist and reimagine.'</li><li>'Free Family Fun Day at American Quarter Horse Hall of Fame and Museum set for July 19. Mark your calendars for Saturday, July 19, and bring the whole family to the American Quarter Horse Hall of Fame and Museum for a free day of fun, discovery and hands-on activities.'</li><li>'Anant Ambani Radhika Merchant s Swiss Getaway Must-Visit Spots in Switzerland. Join Anant Ambani and Radhika Merchant on their stunning Swiss holiday complete with romantic cities, scenic lakes, and luxurious alpine escapes.'</li></ul> |
78
+ | Entertainment | <ul><li>'The 8 Best Game Series That Died On The PS2. The PS2 was home to a lot of gaming franchises, and while some continue to thrive today, others died on the console and have yet to resurface.'</li><li>'Borderlands 4 Is More Vertical and Lets You Go Anywhere, And That s Why It Doesn t Have a Minimap. Gearbox Software founder and president Randy Pitchford explained why the upcoming game Borderlands 4 won t include a minimap.'</li><li>'Coleen Nolan supported as she introduces latest member of our family. The TV star was flooded with messages of praise and support as she revealed the latest member of our family'</li></ul> |
79
+ | Business | <ul><li>'Inchcape Director Increases Stake with ADR Purchase. Inchcape GB INCH has provided an announcement. Inchcape plc announced that Stuart Rowley, a Non-Executive Director of the company, purchased 3,000 American d...'</li><li>'Nvidia s May surge signals investors love AI, even if you don t. Nvidia posted a staggering 24 monthly jump its best performance since May 2024 after delivering stronger-than-expected quarterly earnings.'</li><li>'Obituary Sunjay Kapur was the man who saw tomorrow and made Sona Comstar future-ready. Sunjay Kapur, the force behind Sona Comstar s EV pivot and IPO, died during a private polo match. Industry mourns a leader who future-proofed Indian auto components.'</li></ul> |
80
+ | Technology | <ul><li>'Malicious Code Digest Monthly Recap July. Explore Xygeni s July 2025 Malicious Code Digest with insights on 160 compromised packages. Stay ahead of software threats!'</li><li>'What are Decentralized Applications dApps ? The 2025 Guide. Discover the benefits and drawbacks of decentralized applications DApps , and how they compare to centralized apps.'</li><li>'iPhone 17 Pro Leaks New Colors, Price, and Everything We Know So Far. Language Selector English'</li></ul> |
81
+ | Religion | <ul><li>'World Religions Christianity. Christianity'</li><li>'RELIGION Crying out to God from the depths. Out of the depths I have cried to you, O LORD. Lord, hear my voice. Let your ears be attentive to the voice of my supplications. If you, LORD, should mark iniquities, O Lord, who shall stand? But there is forgiveness with you so that you may be feared. Psalm 130 1-4'</li><li>'What Buddhist monks can teach you about pain management. Much of the global population suffers from chronic pain. Here s what we can learn from Buddhism about how to manage it.'</li></ul> |
82
+ | Crime | <ul><li>'Gold wars in Peru town leave Amazon nature defenders vulnerable. Illegal gold mining in one of Peru s most ecologically significant areas has unleashed environmental destruction and gang violence'</li><li>'Stablehand dies after tragic incident at Cranbourne Training Centre. A female stable worker has died after being kicked by a horse at the Cranbourne Training Centre on Tuesday morning. Racing Victoria confirmed the fatal incident'</li><li>'Thirteen-year-old girl charged with rioting in Ballymena and man arrested for Larne leisure centre attack. The teenager was arrested for rioting in Ballymena on Wednesday, and is one of 28 people detained after last week s race-fuelled disorder.'</li></ul> |
83
+ | Science | <ul><li>'The Kessler Effect The Haunted Housewives. Author Christopher Lee Jones spoke about the Kessler Effect-- the problem of accumulating space debris. Followed by Theresa Argie and Cathi Weber, known as the Haunted Housewives, with insights from their paranormal investigations.'</li><li>'Episode 27 lava geysers skyrocket from Halemaʻumaʻu Crater at Big Island s Kīlauea volcano. Fountains started out about 500 feet tall by 9 a.m. when the eruptive episode began, but by about 10 30 a.m., a view of any of the three livestreams provided by U.S. Geological Survey watching the volcano s summit molten rock pumping likely close to or even well more than 1,000 feet high.'</li><li>'More than 500 years later, the Beaver is back in Portugal. After centuries of absence, evidence of the presence of the European beaver Castor fibre in Portuguese territory has finally emerged'</li></ul> |
84
+ | Education | <ul><li>'Southside School Hops Charter Networks. Rocketship takes Stellar from Carmen and everyone is happy.'</li><li>'New school year to be filled with opportunity for junior high students. MATTAPOISETT As the 2025-26 school year approaches, educators at Old Rochester Regional Junior High School are gearing up to help kids transition from elementary school to junior high. I think parents and kids are in good hands at the junior high...'</li><li>'Anatomy education at central Europe medical schools a qualitative analysis of educators pedagogical knowledge, methods, practices, and challenges. Globally, there has been a growing demand for a unified education standard, spurred by sustainability initiatives such as the United Nations Agenda 2030 and the increasing internationalisation of higher education. The World Federation for Medical Education WFME promote accreditation process for international medical education institutions that provide curricula in English. However, some Central European Medical Schools offering such curricula are not fully aligned with WFME accreditation standards. Organisers of human anatomy courses at these schools are seeking to improve their skills and abilities to deliver high-quality teaching effectively in multicultural and multilingual environments. A survey conducted by the Erasmus Strategic Partnership project LEANbody, which aims to reach for quality management tools to teach human anatomy effectively in a multicultural and multilingual learning space, revealed that over 70 49 69 of anatomists in Hungary, the Czech Republic, and Croatia are unfamiliar with international quality standards for medical education and the concept of student-centred pedagogy. This study seeks to understand educators perceptions of pedagogical knowledge and concepts frameworks, such as constructive alignment CA , Intended Learning Outcomes ILOs , in relation to student-centred pedagogy and their anatomy teaching practices. The study also investigates perceived gaps at the institutional, departmental, and individual levels concerning anatomy teaching and the pedagogical practices that should be promoted. A descriptive cross-sectional study using a qualitative approach was used for this purpose. In 2022, face-to-face or online interviews were conducted with 14 anatomy educators including course organisers from Zagreb, Masaryk and Pécs Universities. We found that most educators had not received formal teacher training on teaching methods prior to starting anatomy teaching and were unfamiliar with such pedagogical frameworks as CA, even though they were familiar with the concept of ILOs. Thematic analysis was applied to open-ended questions and the umbrella theme that emerged was Transforming Anatomy Teaching and Learning in the Glocal Classroom Navigating the Intersections of Pedagogical Practice, Constructive Alignment, and Student-Centred frameworks . Two themes and 5 subthemes were identified from the data. The study presents recommendations and a novel framework linking student-centred approaches, CA, and global educational sustainability agendas, such as the sustainable development goal 4 target 7 SDG4.7 to enhance the quality of anatomy teaching.'</li></ul> |
85
+
86
+ ## Evaluation
87
+
88
+ ### Metrics
89
+ | Label | Accuracy |
90
+ |:--------|:---------|
91
+ | **all** | 0.8421 |
92
+
93
+ ## Uses
94
+
95
+ ### Direct Use for Inference
96
+
97
+ First install the SetFit library:
98
+
99
+ ```bash
100
+ pip install setfit
101
+ ```
102
+
103
+ Then you can load this model and run inference.
104
+
105
+ ```python
106
+ from setfit import SetFitModel
107
+
108
+ # Download from the 🤗 Hub
109
+ model = SetFitModel.from_pretrained("setfit_model_id")
110
+ # Run inference
111
+ preds = model("Ari Paparo on Google s Digital Dominance. Our guest is Ari Paparo.")
112
+ ```
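Because the checkpoint ships a `config_setfit.json` listing the twelve label names, `preds` should come back as label strings (e.g. `'General News'`). To also inspect per-class probabilities, something along these lines should work (a sketch; `predict_proba` and `model.labels` are standard SetFit attributes, but exact return types can vary between versions):

```python
# Per-class probabilities for a single input (illustrative)
probs = model.predict_proba(["Ari Paparo on Google s Digital Dominance. Our guest is Ari Paparo."])
for label, p in zip(model.labels, probs[0]):
    print(f"{label}: {float(p):.3f}")
```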
113
+
114
+ <!--
115
+ ### Downstream Use
116
+
117
+ *List how someone could finetune this model on their own dataset.*
118
+ -->
119
+
120
+ <!--
121
+ ### Out-of-Scope Use
122
+
123
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
124
+ -->
125
+
126
+ <!--
127
+ ## Bias, Risks and Limitations
128
+
129
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
130
+ -->
131
+
132
+ <!--
133
+ ### Recommendations
134
+
135
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
136
+ -->
137
+
138
+ ## Training Details
139
+
140
+ ### Training Set Metrics
141
+ | Training set | Min | Median | Max |
142
+ |:-------------|:----|:--------|:----|
143
+ | Word count | 2 | 46.3786 | 454 |
144
+
145
+ | Label | Training Sample Count |
146
+ |:--------------|:----------------------|
147
+ | Business | 302 |
148
+ | Sports | 302 |
149
+ | Politics | 302 |
150
+ | Lifestyle | 302 |
151
+ | General News | 302 |
152
+ | Entertainment | 302 |
153
+ | Crime | 302 |
154
+ | Technology | 302 |
155
+ | Health | 302 |
156
+ | Science | 302 |
157
+ | Religion | 302 |
158
+ | Education | 302 |
159
+
160
+ ### Training Hyperparameters
161
+ - batch_size: (16, 16)
162
+ - num_epochs: (5, 5)
163
+ - max_steps: -1
164
+ - sampling_strategy: oversampling
165
+ - num_iterations: 10
166
+ - body_learning_rate: (2e-05, 1e-05)
167
+ - head_learning_rate: 0.01
168
+ - loss: CosineSimilarityLoss
169
+ - distance_metric: cosine_distance
170
+ - margin: 0.25
171
+ - end_to_end: False
172
+ - use_amp: False
173
+ - warmup_proportion: 0.1
174
+ - l2_weight: 0.01
175
+ - seed: 42
176
+ - eval_max_steps: -1
177
+ - load_best_model_at_end: False
178
+
179
+ ### Training Results
180
+ | Epoch | Step | Training Loss | Validation Loss |
181
+ |:------:|:-----:|:-------------:|:---------------:|
182
+ | 0.0002 | 1 | 0.2487 | - |
183
+ | 0.0110 | 50 | 0.3115 | - |
184
+ | 0.0221 | 100 | 0.2961 | - |
185
+ | 0.0331 | 150 | 0.2719 | - |
186
+ | 0.0442 | 200 | 0.2379 | - |
187
+ | 0.0552 | 250 | 0.222 | - |
188
+ | 0.0662 | 300 | 0.2096 | - |
189
+ | 0.0773 | 350 | 0.1889 | - |
190
+ | 0.0883 | 400 | 0.1645 | - |
191
+ | 0.0993 | 450 | 0.1465 | - |
192
+ | 0.1104 | 500 | 0.1197 | - |
193
+ | 0.1214 | 550 | 0.0931 | - |
194
+ | 0.1325 | 600 | 0.0885 | - |
195
+ | 0.1435 | 650 | 0.0695 | - |
196
+ | 0.1545 | 700 | 0.0673 | - |
197
+ | 0.1656 | 750 | 0.0648 | - |
198
+ | 0.1766 | 800 | 0.0538 | - |
199
+ | 0.1876 | 850 | 0.0485 | - |
200
+ | 0.1987 | 900 | 0.041 | - |
201
+ | 0.2097 | 950 | 0.0328 | - |
202
+ | 0.2208 | 1000 | 0.0285 | - |
203
+ | 0.2318 | 1050 | 0.0222 | - |
204
+ | 0.2428 | 1100 | 0.0192 | - |
205
+ | 0.2539 | 1150 | 0.0179 | - |
206
+ | 0.2649 | 1200 | 0.0144 | - |
207
+ | 0.2759 | 1250 | 0.0174 | - |
208
+ | 0.2870 | 1300 | 0.0119 | - |
209
+ | 0.2980 | 1350 | 0.0187 | - |
210
+ | 0.3091 | 1400 | 0.0156 | - |
211
+ | 0.3201 | 1450 | 0.0068 | - |
212
+ | 0.3311 | 1500 | 0.0068 | - |
213
+ | 0.3422 | 1550 | 0.0067 | - |
214
+ | 0.3532 | 1600 | 0.0061 | - |
215
+ | 0.3642 | 1650 | 0.0073 | - |
216
+ | 0.3753 | 1700 | 0.0047 | - |
217
+ | 0.3863 | 1750 | 0.0047 | - |
218
+ | 0.3974 | 1800 | 0.0054 | - |
219
+ | 0.4084 | 1850 | 0.0043 | - |
220
+ | 0.4194 | 1900 | 0.0022 | - |
221
+ | 0.4305 | 1950 | 0.0046 | - |
222
+ | 0.4415 | 2000 | 0.0018 | - |
223
+ | 0.4525 | 2050 | 0.0035 | - |
224
+ | 0.4636 | 2100 | 0.0007 | - |
225
+ | 0.4746 | 2150 | 0.003 | - |
226
+ | 0.4857 | 2200 | 0.0009 | - |
227
+ | 0.4967 | 2250 | 0.0042 | - |
228
+ | 0.5077 | 2300 | 0.0023 | - |
229
+ | 0.5188 | 2350 | 0.0005 | - |
230
+ | 0.5298 | 2400 | 0.0031 | - |
231
+ | 0.5408 | 2450 | 0.0016 | - |
232
+ | 0.5519 | 2500 | 0.001 | - |
233
+ | 0.5629 | 2550 | 0.0028 | - |
234
+ | 0.5740 | 2600 | 0.0011 | - |
235
+ | 0.5850 | 2650 | 0.0004 | - |
236
+ | 0.5960 | 2700 | 0.0003 | - |
237
+ | 0.6071 | 2750 | 0.0003 | - |
238
+ | 0.6181 | 2800 | 0.0017 | - |
239
+ | 0.6291 | 2850 | 0.001 | - |
240
+ | 0.6402 | 2900 | 0.0011 | - |
241
+ | 0.6512 | 2950 | 0.0004 | - |
242
+ | 0.6623 | 3000 | 0.0015 | - |
243
+ | 0.6733 | 3050 | 0.0006 | - |
244
+ | 0.6843 | 3100 | 0.0003 | - |
245
+ | 0.6954 | 3150 | 0.0002 | - |
246
+ | 0.7064 | 3200 | 0.0017 | - |
247
+ | 0.7174 | 3250 | 0.0005 | - |
248
+ | 0.7285 | 3300 | 0.0011 | - |
249
+ | 0.7395 | 3350 | 0.0006 | - |
250
+ | 0.7506 | 3400 | 0.0015 | - |
251
+ | 0.7616 | 3450 | 0.0004 | - |
252
+ | 0.7726 | 3500 | 0.0009 | - |
253
+ | 0.7837 | 3550 | 0.0016 | - |
254
+ | 0.7947 | 3600 | 0.0008 | - |
255
+ | 0.8057 | 3650 | 0.0004 | - |
256
+ | 0.8168 | 3700 | 0.0016 | - |
257
+ | 0.8278 | 3750 | 0.0003 | - |
258
+ | 0.8389 | 3800 | 0.0002 | - |
259
+ | 0.8499 | 3850 | 0.0001 | - |
260
+ | 0.8609 | 3900 | 0.0027 | - |
261
+ | 0.8720 | 3950 | 0.0029 | - |
262
+ | 0.8830 | 4000 | 0.0019 | - |
263
+ | 0.8940 | 4050 | 0.0036 | - |
264
+ | 0.9051 | 4100 | 0.0018 | - |
265
+ | 0.9161 | 4150 | 0.0018 | - |
266
+ | 0.9272 | 4200 | 0.0021 | - |
267
+ | 0.9382 | 4250 | 0.0003 | - |
268
+ | 0.9492 | 4300 | 0.0002 | - |
269
+ | 0.9603 | 4350 | 0.0001 | - |
270
+ | 0.9713 | 4400 | 0.0002 | - |
271
+ | 0.9823 | 4450 | 0.0016 | - |
272
+ | 0.9934 | 4500 | 0.0003 | - |
273
+ | 1.0044 | 4550 | 0.0015 | - |
274
+ | 1.0155 | 4600 | 0.0008 | - |
275
+ | 1.0265 | 4650 | 0.0002 | - |
276
+ | 1.0375 | 4700 | 0.0001 | - |
277
+ | 1.0486 | 4750 | 0.0007 | - |
278
+ | 1.0596 | 4800 | 0.0007 | - |
279
+ | 1.0706 | 4850 | 0.0001 | - |
280
+ | 1.0817 | 4900 | 0.0001 | - |
281
+ | 1.0927 | 4950 | 0.0001 | - |
282
+ | 1.1038 | 5000 | 0.0001 | - |
283
+ | 1.1148 | 5050 | 0.0001 | - |
284
+ | 1.1258 | 5100 | 0.0001 | - |
285
+ | 1.1369 | 5150 | 0.0001 | - |
286
+ | 1.1479 | 5200 | 0.0001 | - |
287
+ | 1.1589 | 5250 | 0.0001 | - |
288
+ | 1.1700 | 5300 | 0.0001 | - |
289
+ | 1.1810 | 5350 | 0.0001 | - |
290
+ | 1.1921 | 5400 | 0.0015 | - |
291
+ | 1.2031 | 5450 | 0.0045 | - |
292
+ | 1.2141 | 5500 | 0.0037 | - |
293
+ | 1.2252 | 5550 | 0.005 | - |
294
+ | 1.2362 | 5600 | 0.0006 | - |
295
+ | 1.2472 | 5650 | 0.0001 | - |
296
+ | 1.2583 | 5700 | 0.001 | - |
297
+ | 1.2693 | 5750 | 0.0001 | - |
298
+ | 1.2804 | 5800 | 0.0001 | - |
299
+ | 1.2914 | 5850 | 0.0022 | - |
300
+ | 1.3024 | 5900 | 0.0003 | - |
301
+ | 1.3135 | 5950 | 0.0016 | - |
302
+ | 1.3245 | 6000 | 0.0003 | - |
303
+ | 1.3355 | 6050 | 0.0001 | - |
304
+ | 1.3466 | 6100 | 0.0001 | - |
305
+ | 1.3576 | 6150 | 0.0001 | - |
306
+ | 1.3687 | 6200 | 0.0001 | - |
307
+ | 1.3797 | 6250 | 0.0002 | - |
308
+ | 1.3907 | 6300 | 0.0001 | - |
309
+ | 1.4018 | 6350 | 0.0001 | - |
310
+ | 1.4128 | 6400 | 0.0011 | - |
311
+ | 1.4238 | 6450 | 0.0003 | - |
312
+ | 1.4349 | 6500 | 0.0004 | - |
313
+ | 1.4459 | 6550 | 0.0001 | - |
314
+ | 1.4570 | 6600 | 0.0021 | - |
315
+ | 1.4680 | 6650 | 0.0013 | - |
316
+ | 1.4790 | 6700 | 0.0038 | - |
317
+ | 1.4901 | 6750 | 0.0002 | - |
318
+ | 1.5011 | 6800 | 0.0007 | - |
319
+ | 1.5121 | 6850 | 0.0001 | - |
320
+ | 1.5232 | 6900 | 0.0002 | - |
321
+ | 1.5342 | 6950 | 0.0014 | - |
322
+ | 1.5453 | 7000 | 0.0003 | - |
323
+ | 1.5563 | 7050 | 0.0001 | - |
324
+ | 1.5673 | 7100 | 0.0001 | - |
325
+ | 1.5784 | 7150 | 0.0001 | - |
326
+ | 1.5894 | 7200 | 0.0011 | - |
327
+ | 1.6004 | 7250 | 0.0001 | - |
328
+ | 1.6115 | 7300 | 0.0001 | - |
329
+ | 1.6225 | 7350 | 0.0001 | - |
330
+ | 1.6336 | 7400 | 0.0001 | - |
331
+ | 1.6446 | 7450 | 0.0 | - |
332
+ | 1.6556 | 7500 | 0.0 | - |
333
+ | 1.6667 | 7550 | 0.0 | - |
334
+ | 1.6777 | 7600 | 0.0 | - |
335
+ | 1.6887 | 7650 | 0.0 | - |
336
+ | 1.6998 | 7700 | 0.0 | - |
337
+ | 1.7108 | 7750 | 0.0 | - |
338
+ | 1.7219 | 7800 | 0.0 | - |
339
+ | 1.7329 | 7850 | 0.0 | - |
340
+ | 1.7439 | 7900 | 0.0001 | - |
341
+ | 1.7550 | 7950 | 0.0 | - |
342
+ | 1.7660 | 8000 | 0.0 | - |
343
+ | 1.7770 | 8050 | 0.0 | - |
344
+ | 1.7881 | 8100 | 0.0 | - |
345
+ | 1.7991 | 8150 | 0.0 | - |
346
+ | 1.8102 | 8200 | 0.0 | - |
347
+ | 1.8212 | 8250 | 0.0 | - |
348
+ | 1.8322 | 8300 | 0.0 | - |
349
+ | 1.8433 | 8350 | 0.0001 | - |
350
+ | 1.8543 | 8400 | 0.0018 | - |
351
+ | 1.8653 | 8450 | 0.0017 | - |
352
+ | 1.8764 | 8500 | 0.0001 | - |
353
+ | 1.8874 | 8550 | 0.0001 | - |
354
+ | 1.8985 | 8600 | 0.0001 | - |
355
+ | 1.9095 | 8650 | 0.0 | - |
356
+ | 1.9205 | 8700 | 0.0 | - |
357
+ | 1.9316 | 8750 | 0.0 | - |
358
+ | 1.9426 | 8800 | 0.0001 | - |
359
+ | 1.9536 | 8850 | 0.0001 | - |
360
+ | 1.9647 | 8900 | 0.0007 | - |
361
+ | 1.9757 | 8950 | 0.0015 | - |
362
+ | 1.9868 | 9000 | 0.0012 | - |
363
+ | 1.9978 | 9050 | 0.0015 | - |
364
+ | 2.0088 | 9100 | 0.0017 | - |
365
+ | 2.0199 | 9150 | 0.0021 | - |
366
+ | 2.0309 | 9200 | 0.0008 | - |
367
+ | 2.0419 | 9250 | 0.0033 | - |
368
+ | 2.0530 | 9300 | 0.0019 | - |
369
+ | 2.0640 | 9350 | 0.0002 | - |
370
+ | 2.0751 | 9400 | 0.0001 | - |
371
+ | 2.0861 | 9450 | 0.0 | - |
372
+ | 2.0971 | 9500 | 0.0 | - |
373
+ | 2.1082 | 9550 | 0.0 | - |
374
+ | 2.1192 | 9600 | 0.0 | - |
375
+ | 2.1302 | 9650 | 0.0 | - |
376
+ | 2.1413 | 9700 | 0.0 | - |
377
+ | 2.1523 | 9750 | 0.0 | - |
378
+ | 2.1634 | 9800 | 0.0001 | - |
379
+ | 2.1744 | 9850 | 0.0 | - |
380
+ | 2.1854 | 9900 | 0.0008 | - |
381
+ | 2.1965 | 9950 | 0.0143 | - |
382
+ | 2.2075 | 10000 | 0.0043 | - |
383
+ | 2.2185 | 10050 | 0.0067 | - |
384
+ | 2.2296 | 10100 | 0.0043 | - |
385
+ | 2.2406 | 10150 | 0.0017 | - |
386
+ | 2.2517 | 10200 | 0.0002 | - |
387
+ | 2.2627 | 10250 | 0.0022 | - |
388
+ | 2.2737 | 10300 | 0.0024 | - |
389
+ | 2.2848 | 10350 | 0.0004 | - |
390
+ | 2.2958 | 10400 | 0.0001 | - |
391
+ | 2.3068 | 10450 | 0.002 | - |
392
+ | 2.3179 | 10500 | 0.0001 | - |
393
+ | 2.3289 | 10550 | 0.001 | - |
394
+ | 2.3400 | 10600 | 0.0002 | - |
395
+ | 2.3510 | 10650 | 0.0002 | - |
396
+ | 2.3620 | 10700 | 0.0001 | - |
397
+ | 2.3731 | 10750 | 0.0 | - |
398
+ | 2.3841 | 10800 | 0.0016 | - |
399
+ | 2.3951 | 10850 | 0.0002 | - |
400
+ | 2.4062 | 10900 | 0.0012 | - |
401
+ | 2.4172 | 10950 | 0.0 | - |
402
+ | 2.4283 | 11000 | 0.0001 | - |
403
+ | 2.4393 | 11050 | 0.0002 | - |
404
+ | 2.4503 | 11100 | 0.0001 | - |
405
+ | 2.4614 | 11150 | 0.0001 | - |
406
+ | 2.4724 | 11200 | 0.0 | - |
407
+ | 2.4834 | 11250 | 0.0 | - |
408
+ | 2.4945 | 11300 | 0.0001 | - |
409
+ | 2.5055 | 11350 | 0.0 | - |
410
+ | 2.5166 | 11400 | 0.0 | - |
411
+ | 2.5276 | 11450 | 0.0 | - |
412
+ | 2.5386 | 11500 | 0.0 | - |
413
+ | 2.5497 | 11550 | 0.0 | - |
414
+ | 2.5607 | 11600 | 0.0 | - |
415
+ | 2.5717 | 11650 | 0.0 | - |
416
+ | 2.5828 | 11700 | 0.0 | - |
417
+ | 2.5938 | 11750 | 0.0 | - |
418
+ | 2.6049 | 11800 | 0.0 | - |
419
+ | 2.6159 | 11850 | 0.0 | - |
420
+ | 2.6269 | 11900 | 0.0 | - |
421
+ | 2.6380 | 11950 | 0.0 | - |
422
+ | 2.6490 | 12000 | 0.0 | - |
423
+ | 2.6600 | 12050 | 0.0 | - |
424
+ | 2.6711 | 12100 | 0.0 | - |
425
+ | 2.6821 | 12150 | 0.0 | - |
426
+ | 2.6932 | 12200 | 0.0 | - |
427
+ | 2.7042 | 12250 | 0.0 | - |
428
+ | 2.7152 | 12300 | 0.0 | - |
429
+ | 2.7263 | 12350 | 0.0 | - |
430
+ | 2.7373 | 12400 | 0.0 | - |
431
+ | 2.7483 | 12450 | 0.0 | - |
432
+ | 2.7594 | 12500 | 0.0 | - |
433
+ | 2.7704 | 12550 | 0.0 | - |
434
+ | 2.7815 | 12600 | 0.0 | - |
435
+ | 2.7925 | 12650 | 0.0 | - |
436
+ | 2.8035 | 12700 | 0.0 | - |
437
+ | 2.8146 | 12750 | 0.0 | - |
438
+ | 2.8256 | 12800 | 0.0 | - |
439
+ | 2.8366 | 12850 | 0.0 | - |
440
+ | 2.8477 | 12900 | 0.0 | - |
441
+ | 2.8587 | 12950 | 0.0 | - |
442
+ | 2.8698 | 13000 | 0.0 | - |
443
+ | 2.8808 | 13050 | 0.0 | - |
444
+ | 2.8918 | 13100 | 0.0 | - |
445
+ | 2.9029 | 13150 | 0.0 | - |
446
+ | 2.9139 | 13200 | 0.0 | - |
447
+ | 2.9249 | 13250 | 0.0 | - |
448
+ | 2.9360 | 13300 | 0.0 | - |
449
+ | 2.9470 | 13350 | 0.0 | - |
450
+ | 2.9581 | 13400 | 0.0 | - |
451
+ | 2.9691 | 13450 | 0.0 | - |
452
+ | 2.9801 | 13500 | 0.0 | - |
453
+ | 2.9912 | 13550 | 0.0 | - |
454
+ | 3.0022 | 13600 | 0.0 | - |
455
+ | 3.0132 | 13650 | 0.0 | - |
456
+ | 3.0243 | 13700 | 0.0 | - |
457
+ | 3.0353 | 13750 | 0.0 | - |
458
+ | 3.0464 | 13800 | 0.0 | - |
459
+ | 3.0574 | 13850 | 0.0 | - |
460
+ | 3.0684 | 13900 | 0.0 | - |
461
+ | 3.0795 | 13950 | 0.0 | - |
462
+ | 3.0905 | 14000 | 0.0 | - |
463
+ | 3.1015 | 14050 | 0.0 | - |
464
+ | 3.1126 | 14100 | 0.0 | - |
465
+ | 3.1236 | 14150 | 0.0 | - |
466
+ | 3.1347 | 14200 | 0.0 | - |
467
+ | 3.1457 | 14250 | 0.0 | - |
468
+ | 3.1567 | 14300 | 0.0 | - |
469
+ | 3.1678 | 14350 | 0.0 | - |
470
+ | 3.1788 | 14400 | 0.0 | - |
471
+ | 3.1898 | 14450 | 0.0 | - |
472
+ | 3.2009 | 14500 | 0.0 | - |
473
+ | 3.2119 | 14550 | 0.0 | - |
474
+ | 3.2230 | 14600 | 0.0 | - |
475
+ | 3.2340 | 14650 | 0.0 | - |
476
+ | 3.2450 | 14700 | 0.0 | - |
477
+ | 3.2561 | 14750 | 0.0 | - |
478
+ | 3.2671 | 14800 | 0.0 | - |
479
+ | 3.2781 | 14850 | 0.0 | - |
480
+ | 3.2892 | 14900 | 0.0 | - |
481
+ | 3.3002 | 14950 | 0.0 | - |
482
+ | 3.3113 | 15000 | 0.0 | - |
483
+ | 3.3223 | 15050 | 0.0 | - |
484
+ | 3.3333 | 15100 | 0.0 | - |
485
+ | 3.3444 | 15150 | 0.0 | - |
486
+ | 3.3554 | 15200 | 0.0 | - |
487
+ | 3.3664 | 15250 | 0.0 | - |
488
+ | 3.3775 | 15300 | 0.0 | - |
489
+ | 3.3885 | 15350 | 0.0 | - |
490
+ | 3.3996 | 15400 | 0.0 | - |
491
+ | 3.4106 | 15450 | 0.0 | - |
492
+ | 3.4216 | 15500 | 0.0 | - |
493
+ | 3.4327 | 15550 | 0.0 | - |
494
+ | 3.4437 | 15600 | 0.0 | - |
495
+ | 3.4547 | 15650 | 0.0 | - |
496
+ | 3.4658 | 15700 | 0.0 | - |
497
+ | 3.4768 | 15750 | 0.0 | - |
498
+ | 3.4879 | 15800 | 0.0 | - |
499
+ | 3.4989 | 15850 | 0.0 | - |
500
+ | 3.5099 | 15900 | 0.0 | - |
501
+ | 3.5210 | 15950 | 0.0 | - |
502
+ | 3.5320 | 16000 | 0.0 | - |
503
+ | 3.5430 | 16050 | 0.0 | - |
504
+ | 3.5541 | 16100 | 0.0 | - |
505
+ | 3.5651 | 16150 | 0.0 | - |
506
+ | 3.5762 | 16200 | 0.0 | - |
507
+ | 3.5872 | 16250 | 0.0 | - |
508
+ | 3.5982 | 16300 | 0.0 | - |
509
+ | 3.6093 | 16350 | 0.0 | - |
510
+ | 3.6203 | 16400 | 0.0 | - |
511
+ | 3.6313 | 16450 | 0.0 | - |
512
+ | 3.6424 | 16500 | 0.0 | - |
513
+ | 3.6534 | 16550 | 0.0 | - |
514
+ | 3.6645 | 16600 | 0.0 | - |
515
+ | 3.6755 | 16650 | 0.0 | - |
516
+ | 3.6865 | 16700 | 0.0 | - |
517
+ | 3.6976 | 16750 | 0.0 | - |
518
+ | 3.7086 | 16800 | 0.0 | - |
519
+ | 3.7196 | 16850 | 0.0 | - |
520
+ | 3.7307 | 16900 | 0.0 | - |
521
+ | 3.7417 | 16950 | 0.0 | - |
522
+ | 3.7528 | 17000 | 0.0 | - |
523
+ | 3.7638 | 17050 | 0.0 | - |
524
+ | 3.7748 | 17100 | 0.0 | - |
525
+ | 3.7859 | 17150 | 0.0 | - |
526
+ | 3.7969 | 17200 | 0.0 | - |
527
+ | 3.8079 | 17250 | 0.0 | - |
528
+ | 3.8190 | 17300 | 0.0 | - |
529
+ | 3.8300 | 17350 | 0.0 | - |
530
+ | 3.8411 | 17400 | 0.0 | - |
531
+ | 3.8521 | 17450 | 0.0 | - |
532
+ | 3.8631 | 17500 | 0.0 | - |
533
+ | 3.8742 | 17550 | 0.0 | - |
534
+ | 3.8852 | 17600 | 0.0 | - |
535
+ | 3.8962 | 17650 | 0.0 | - |
536
+ | 3.9073 | 17700 | 0.0 | - |
537
+ | 3.9183 | 17750 | 0.0 | - |
538
+ | 3.9294 | 17800 | 0.0 | - |
539
+ | 3.9404 | 17850 | 0.0 | - |
540
+ | 3.9514 | 17900 | 0.0 | - |
541
+ | 3.9625 | 17950 | 0.0 | - |
542
+ | 3.9735 | 18000 | 0.0 | - |
543
+ | 3.9845 | 18050 | 0.0 | - |
544
+ | 3.9956 | 18100 | 0.0 | - |
545
+ | 4.0066 | 18150 | 0.0 | - |
546
+ | 4.0177 | 18200 | 0.0 | - |
547
+ | 4.0287 | 18250 | 0.0 | - |
548
+ | 4.0397 | 18300 | 0.0 | - |
549
+ | 4.0508 | 18350 | 0.0 | - |
550
+ | 4.0618 | 18400 | 0.0 | - |
551
+ | 4.0728 | 18450 | 0.0 | - |
552
+ | 4.0839 | 18500 | 0.0 | - |
553
+ | 4.0949 | 18550 | 0.0 | - |
554
+ | 4.1060 | 18600 | 0.0 | - |
555
+ | 4.1170 | 18650 | 0.0 | - |
556
+ | 4.1280 | 18700 | 0.0 | - |
557
+ | 4.1391 | 18750 | 0.0 | - |
558
+ | 4.1501 | 18800 | 0.0 | - |
559
+ | 4.1611 | 18850 | 0.0 | - |
560
+ | 4.1722 | 18900 | 0.0 | - |
561
+ | 4.1832 | 18950 | 0.0 | - |
562
+ | 4.1943 | 19000 | 0.0 | - |
563
+ | 4.2053 | 19050 | 0.0 | - |
564
+ | 4.2163 | 19100 | 0.0 | - |
565
+ | 4.2274 | 19150 | 0.0 | - |
566
+ | 4.2384 | 19200 | 0.0 | - |
567
+ | 4.2494 | 19250 | 0.0 | - |
568
+ | 4.2605 | 19300 | 0.0 | - |
569
+ | 4.2715 | 19350 | 0.0 | - |
570
+ | 4.2826 | 19400 | 0.0 | - |
571
+ | 4.2936 | 19450 | 0.0 | - |
572
+ | 4.3046 | 19500 | 0.0 | - |
573
+ | 4.3157 | 19550 | 0.0 | - |
574
+ | 4.3267 | 19600 | 0.0 | - |
575
+ | 4.3377 | 19650 | 0.0 | - |
576
+ | 4.3488 | 19700 | 0.0 | - |
577
+ | 4.3598 | 19750 | 0.0 | - |
578
+ | 4.3709 | 19800 | 0.0 | - |
579
+ | 4.3819 | 19850 | 0.0 | - |
580
+ | 4.3929 | 19900 | 0.0 | - |
581
+ | 4.4040 | 19950 | 0.0 | - |
582
+ | 4.4150 | 20000 | 0.0 | - |
583
+ | 4.4260 | 20050 | 0.0 | - |
584
+ | 4.4371 | 20100 | 0.0 | - |
585
+ | 4.4481 | 20150 | 0.0 | - |
586
+ | 4.4592 | 20200 | 0.0 | - |
587
+ | 4.4702 | 20250 | 0.0 | - |
588
+ | 4.4812 | 20300 | 0.0 | - |
589
+ | 4.4923 | 20350 | 0.0 | - |
590
+ | 4.5033 | 20400 | 0.0 | - |
591
+ | 4.5143 | 20450 | 0.0 | - |
592
+ | 4.5254 | 20500 | 0.0 | - |
593
+ | 4.5364 | 20550 | 0.0 | - |
594
+ | 4.5475 | 20600 | 0.0 | - |
595
+ | 4.5585 | 20650 | 0.0 | - |
596
+ | 4.5695 | 20700 | 0.0 | - |
597
+ | 4.5806 | 20750 | 0.0 | - |
598
+ | 4.5916 | 20800 | 0.0 | - |
599
+ | 4.6026 | 20850 | 0.0 | - |
600
+ | 4.6137 | 20900 | 0.0 | - |
601
+ | 4.6247 | 20950 | 0.0 | - |
602
+ | 4.6358 | 21000 | 0.0 | - |
603
+ | 4.6468 | 21050 | 0.0 | - |
604
+ | 4.6578 | 21100 | 0.0 | - |
605
+ | 4.6689 | 21150 | 0.0 | - |
606
+ | 4.6799 | 21200 | 0.0 | - |
607
+ | 4.6909 | 21250 | 0.0 | - |
608
+ | 4.7020 | 21300 | 0.0 | - |
609
+ | 4.7130 | 21350 | 0.0 | - |
610
+ | 4.7241 | 21400 | 0.0 | - |
611
+ | 4.7351 | 21450 | 0.0 | - |
612
+ | 4.7461 | 21500 | 0.0 | - |
613
+ | 4.7572 | 21550 | 0.0 | - |
614
+ | 4.7682 | 21600 | 0.0 | - |
615
+ | 4.7792 | 21650 | 0.0 | - |
616
+ | 4.7903 | 21700 | 0.0 | - |
617
+ | 4.8013 | 21750 | 0.0 | - |
618
+ | 4.8124 | 21800 | 0.0 | - |
619
+ | 4.8234 | 21850 | 0.0 | - |
620
+ | 4.8344 | 21900 | 0.0 | - |
621
+ | 4.8455 | 21950 | 0.0 | - |
622
+ | 4.8565 | 22000 | 0.0 | - |
623
+ | 4.8675 | 22050 | 0.0 | - |
624
+ | 4.8786 | 22100 | 0.0 | - |
625
+ | 4.8896 | 22150 | 0.0 | - |
626
+ | 4.9007 | 22200 | 0.0 | - |
627
+ | 4.9117 | 22250 | 0.0 | - |
628
+ | 4.9227 | 22300 | 0.0 | - |
629
+ | 4.9338 | 22350 | 0.0 | - |
630
+ | 4.9448 | 22400 | 0.0 | - |
631
+ | 4.9558 | 22450 | 0.0 | - |
632
+ | 4.9669 | 22500 | 0.0 | - |
633
+ | 4.9779 | 22550 | 0.0 | - |
634
+ | 4.9890 | 22600 | 0.0 | - |
635
+ | 5.0 | 22650 | 0.0 | - |
636
+
637
+ ### Framework Versions
638
+ - Python: 3.10.12
639
+ - SetFit: 1.1.3
640
+ - Sentence Transformers: 5.1.0
641
+ - Transformers: 4.55.2
642
+ - PyTorch: 2.7.1+cu118
643
+ - Datasets: 4.0.0
644
+ - Tokenizers: 0.21.4
645
+
646
+ ## Citation
647
+
648
+ ### BibTeX
649
+ ```bibtex
650
+ @article{https://doi.org/10.48550/arxiv.2209.11055,
651
+ doi = {10.48550/ARXIV.2209.11055},
652
+ url = {https://arxiv.org/abs/2209.11055},
653
+ author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
654
+ keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
655
+ title = {Efficient Few-Shot Learning Without Prompts},
656
+ publisher = {arXiv},
657
+ year = {2022},
658
+ copyright = {Creative Commons Attribution 4.0 International}
659
+ }
660
+ ```
661
+
662
+ <!--
663
+ ## Glossary
664
+
665
+ *Clearly define terms in order to be accessible across audiences.*
666
+ -->
667
+
668
+ <!--
669
+ ## Model Card Authors
670
+
671
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
672
+ -->
673
+
674
+ <!--
675
+ ## Model Card Contact
676
+
677
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
678
+ -->
config.json ADDED
@@ -0,0 +1,27 @@
1
+ {
2
+ "architectures": [
3
+ "XLMRobertaModel"
4
+ ],
5
+ "attention_probs_dropout_prob": 0.1,
6
+ "bos_token_id": 0,
7
+ "classifier_dropout": null,
8
+ "eos_token_id": 2,
9
+ "hidden_act": "gelu",
10
+ "hidden_dropout_prob": 0.1,
11
+ "hidden_size": 768,
12
+ "initializer_range": 0.02,
13
+ "intermediate_size": 3072,
14
+ "layer_norm_eps": 1e-05,
15
+ "max_position_embeddings": 514,
16
+ "model_type": "xlm-roberta",
17
+ "num_attention_heads": 12,
18
+ "num_hidden_layers": 12,
19
+ "output_past": true,
20
+ "pad_token_id": 1,
21
+ "position_embedding_type": "absolute",
22
+ "torch_dtype": "float32",
23
+ "transformers_version": "4.55.2",
24
+ "type_vocab_size": 1,
25
+ "use_cache": true,
26
+ "vocab_size": 250002
27
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,14 @@
1
+ {
2
+ "model_type": "SentenceTransformer",
3
+ "__version__": {
4
+ "sentence_transformers": "5.1.0",
5
+ "transformers": "4.55.2",
6
+ "pytorch": "2.7.1+cu118"
7
+ },
8
+ "prompts": {
9
+ "query": "",
10
+ "document": ""
11
+ },
12
+ "default_prompt_name": null,
13
+ "similarity_fn_name": "cosine"
14
+ }
config_setfit.json ADDED
@@ -0,0 +1,17 @@
1
+ {
2
+ "normalize_embeddings": false,
3
+ "labels": [
4
+ "Business",
5
+ "Sports",
6
+ "Politics",
7
+ "Lifestyle",
8
+ "General News",
9
+ "Entertainment",
10
+ "Crime",
11
+ "Technology",
12
+ "Health",
13
+ "Science",
14
+ "Religion",
15
+ "Education"
16
+ ]
17
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:02818c78510fe9231ba9d6271c7170c42b2aa63ef5caf87481fcb64325dd6819
3
+ size 1112197096
model_head.pkl ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:66aa0481aeeb8d58dd4ab19dccceecd63cbef8021b8287e5b18d58a3625fc021
3
+ size 75287
modules.json ADDED
@@ -0,0 +1,20 @@
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ },
14
+ {
15
+ "idx": 2,
16
+ "name": "2",
17
+ "path": "2_Normalize",
18
+ "type": "sentence_transformers.models.Normalize"
19
+ }
20
+ ]
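modules.json declares the three-stage Sentence Transformers pipeline used as the SetFit body: a Transformer encoder, mean pooling (per `1_Pooling/config.json`), and L2 normalization. A sketch of assembling the equivalent stack by hand, which is normally unnecessary since `SetFitModel.from_pretrained` loads these modules automatically:

```python
from sentence_transformers import SentenceTransformer, models

encoder = models.Transformer("intfloat/multilingual-e5-base", max_seq_length=512)
pooling = models.Pooling(encoder.get_word_embedding_dimension(), pooling_mode="mean")
normalize = models.Normalize()

body = SentenceTransformer(modules=[encoder, pooling, normalize])
embeddings = body.encode(["A short example sentence."])  # unit-length 768-dim vectors
```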
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
1
+ {
2
+ "max_seq_length": 512,
3
+ "do_lower_case": false
4
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<s>",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "cls_token": {
10
+ "content": "<s>",
11
+ "lstrip": false,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "eos_token": {
17
+ "content": "</s>",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "mask_token": {
24
+ "content": "<mask>",
25
+ "lstrip": true,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "pad_token": {
31
+ "content": "<pad>",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ },
37
+ "sep_token": {
38
+ "content": "</s>",
39
+ "lstrip": false,
40
+ "normalized": false,
41
+ "rstrip": false,
42
+ "single_word": false
43
+ },
44
+ "unk_token": {
45
+ "content": "<unk>",
46
+ "lstrip": false,
47
+ "normalized": false,
48
+ "rstrip": false,
49
+ "single_word": false
50
+ }
51
+ }
tokenizer.json ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:883b037111086fd4dfebbbc9b7cee11e1517b5e0c0514879478661440f137085
3
+ size 17082987
tokenizer_config.json ADDED
@@ -0,0 +1,55 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "<s>",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "1": {
12
+ "content": "<pad>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "2": {
20
+ "content": "</s>",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "3": {
28
+ "content": "<unk>",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "250001": {
36
+ "content": "<mask>",
37
+ "lstrip": true,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ }
43
+ },
44
+ "bos_token": "<s>",
45
+ "clean_up_tokenization_spaces": true,
46
+ "cls_token": "<s>",
47
+ "eos_token": "</s>",
48
+ "extra_special_tokens": {},
49
+ "mask_token": "<mask>",
50
+ "model_max_length": 512,
51
+ "pad_token": "<pad>",
52
+ "sep_token": "</s>",
53
+ "tokenizer_class": "XLMRobertaTokenizer",
54
+ "unk_token": "<unk>"
55
+ }