diff --git "a/README.md" "b/README.md"
--- "a/README.md"
+++ "b/README.md"
@@ -1,6 +1,4 @@
---
-language:
-- tr
tags:
- sentence-transformers
- sentence-similarity
@@ -10,66 +8,11 @@ tags:
- loss:MSELoss
base_model: BAAI/bge-m3
widget:
-- source_sentence: Ak Hunlar'ın kültürel etkileşimleri ve mirasları hakkında ne söyleyebiliriz?
- Ak Hunlar'ın diğer kültürler üzerindeki etkileri ve izleri nelerdir?
+- source_sentence: That is a happy person
sentences:
- - Film, hangi oyun yazarının hayatını konu almaktadır?
- - Bir Eskişehir-Afyonkarahisar tren yolculuğu ne kadar sürmektedir?
- - Mektupta, Türkiye'nin adaya tek taraflı müdahalesinin Türk ve Yunan tarafları
- arasında savaşa yol açabileceği ve NATO üyesi olan bu iki ülkenin savaşmasının
- kabul edilemez olduğu ifade edilmiştir. Türkiye'nin müdahale kararı almadan önce
- müttefiklerine danışması gerektiği anımsatılmıştır. Ayrıca bu savaşın Sovyetler
- Birliği'nin de Türkiye'ye müdahale ihtimalini doğuracağı ve NATO'nun böyle bir
- durumda Türkiye'yi savunma konusunda isteksiz olacağı ima edilmiştir. ABD'nin
- Türkiye'ye sağladığı askeri malzemenin bu müdahalede kullanılmasına izin verilmeyeceği
- belirtilmiştir. Mektubun ardından Türkiye müdahale kararından vazgeçmiştir. İsmet
- İnönü 21 Haziran 1964'te ABD'ye giderek başkan Johnson ile bir görüşmede bulunmuştur.
-- source_sentence: Evet, metinde teslimiyetçilik, edilgenlik veya boyun eğme olarak
- da tanımlanmaktadır.
- sentences:
- - Cezary Kucharski'nin doğduğu tarih nedir?
- - Beylerbeyi Camii, 2013 yılında yapılan restorasyon çalışmaları sonrasında ne durumda?
- - "İkinci Dünya Savaşı esnasında ve sonrasında elektroniklerin doğasından kaynaklanan\
- \ birçok güvenilir olmama durumu ve ürün yorgunluğu gündeme geldi. 1945'te M.A.\
- \ Miner, ASME (Amerikan Makine Mühendisleri Topluluğu) Dergisi içerisinde \"Yorulma\
- \ Esnasında Birikimli Hasar\" adında taslak bir yazı paylaştı. Ordu için uygulanan\
- \ ilk güvenilirlik hususu, Radar Sistemleri ve diğer elektronik parçalarda kullanılan,\
- \ yine güvenilirlik analizi sayesinde kanıtlanmış, oldukça arıza çıkarmaya yatkın\
- \ ve maliyetli bir vakum silindiri idi. Elektrik ve Elektronik Mühendisleri Enstitüsü,\
- \ 1948 yılında Güvenilirlik Topluluğunu kurmuştur. 1950 yılı içerisinde, asker\
- \ tarafında, Elektronik Ekipman Güvenilirliği Tavsiye Grubu kurulmuştur. Bu grup,\
- \ 3 ana çalışma yolu tavsiye etmiştir. Bunlar:\n\n Parça güvenilirliğinin arttırılması,\n\
- \ Tedarikçiler için kalite ve güvenilirlik gereksinimlerinin tanımlanması,\n Saha\
- \ verilerinin toplanması ve kök analiz yapılması."
-- source_sentence: Belgrad'ın ele geçirilmesinde Klingenberg'in rolü nedir ve bu olay
- nasıl gerçekleşti?
- sentences:
- - Jimmy White ve Peter Ebdon.
- - DualSense kontrolörünün titreşim özelliği hakkında detaylı bilgi verir misiniz?
- - "Kozluk, Kocaeli ilinin İzmit ilçesine bağlı bir mahalledir.\n\nNüfus\n\nKaynakça\
- \ \n\nİzmit'in mahalleleri"
-- source_sentence: 1996 yılında kurulmuştur. Ağırlıklı olarak standart caz repertuvarından
- parçalar sunmuşlardır.
- sentences:
- - San Leucio'nun coğrafi konumu hakkında bilgi verir misiniz?
- - Kinik felsefesinin öncüsüdür.
- - Aydın Doğu Demirkol'un vizyona girmesi planlanan sinema filmleri nelerdir ve yönetmenleri
- kimlerdir?
-- source_sentence: Serbest pazar prensiplerinin varlıklı ve yoksul futbol kulüpleri
- arasındaki farkı büyütmesine yönelik kaygılar nedeniyle bu durum önemlidir.
- sentences:
- - Yazar, 12 Mart baskınlarının ve işkencelerinin sonucunda, ideolojik kimlikleriyle
- küçük burjuva kimlikleri arasında çelişkiye düşen devrimcilerin rejime boyun eğmelerini
- gösterme çabasındadır.
- - "Verilen kesin süre \niçinde şikayetçi tarafından ilgili masraflar yatırıldığından\
- \ PTT’ce söz konusu \nkeşfa.va.nsınıngeri önd.e-rilmesi sonucu talimat \nmahkemesince\
- \ keşf yapılmamış ise de burada şikayetçiye atfedilebilecek bir kusur \nbulunmadığından,\
- \ keşif avansının ilgili mahkemeye tekrar gönderilerek keşfin \nyapılmasının sağlanarak\
- \ oluşacak sonuca göre bir karar verilmesi gerekir."
- - This Kind of Bird Flies Backwards (Bu Cins Kuş Tersten Uçar) adlı ilk kitabı,
- LeRoy Jones ve Hettie Jones'un kurduğu Totem Press tarafından 1958 yılında yayınlandı.
-datasets:
-- altaidevorg/tr-sentences
+ - That is a happy dog
+ - That is a very happy person
+ - Today is a sunny day
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
@@ -117,9 +60,17 @@ model-index:
name: Spearman Cosine
---
-# SentenceTransformer based on BAAI/bge-m3
+# 8-layer distillation from BAAI/bge-m3 with 2.5x speedup
+
+This is an embedding model distilled from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) on a combination of public and proprietary datasets. It is an 8-layer model (instead of the original 24 layers) with 366M parameters, and it achieves a 2.5x speedup with little to no loss in retrieval performance.
+
+## Motivation
+
+We are a team that has developed several real-world semantic search and RAG applications, and no model other than `BAAI/bge-m3` has proved useful across our variety of domains and use cases, especially in multilingual settings. However, it is large and prohibitively expensive to serve to large user groups at low latency or to index large volumes of data. We therefore wanted the same retrieval performance in a smaller, faster model. We composed a large and diverse dataset of 10M texts and applied a knowledge distillation technique that reduced the number of layers from 24 to 8. The results were surprisingly promising: we achieved a Spearman cosine score of 0.965 and an MSE of 0.006 on the test subset, which is arguably within numerical error. Nor did we observe any considerable degradation in our qualitative tests. Finally, we measured a 2.5x throughput increase (454 texts/sec instead of 175 texts/sec, measured on a T4 Colab GPU).
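
The layer-reduction idea described above can be sketched with a toy PyTorch encoder. This is an illustration only, not our training code: the toy 24-layer stack, the every-third-layer initialization, and all sizes are assumptions for the demo; the real training used BAAI/bge-m3 via sentence-transformers with MSELoss on 10M texts.

```python
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
dim = 32

# Toy 24-layer "teacher" encoder standing in for bge-m3 (illustration only).
teacher = nn.ModuleList([nn.Linear(dim, dim) for _ in range(24)])

# Student keeps copies of every third teacher layer -> 8 layers.
student = nn.ModuleList([copy.deepcopy(teacher[i]) for i in range(0, 24, 3)])

def encode(layers, x):
    for layer in layers:
        x = torch.tanh(layer(x))
    return x

x = torch.randn(64, dim)
with torch.no_grad():
    target = encode(teacher, x)  # teacher embeddings are the regression targets

loss_fn = nn.MSELoss()
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

initial_loss = loss_fn(encode(student, x), target).item()
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(encode(student, x), target)
    loss.backward()
    opt.step()
final_loss = loss_fn(encode(student, x), target).item()
```

The key point is that the student is trained to regress the teacher's embeddings directly, so no labeled pairs are needed, only raw texts.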
-This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) on the [tr-sentences](https://huggingface.co/datasets/altaidevorg/tr-sentences) dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+## Future Work
+
+Even though our training dataset was composed of diverse texts in Turkish, the model retained considerable performance in other languages as well: for example, we measured a Spearman cosine score of 0.938 on a collection of 10k English texts. This performance retention motivated us to work on a second version of this distillation, trained on a larger, multilingual dataset, as well as an even smaller distillation. Stay tuned for these updates, and feel free to reach out to us about collaboration options.
## Model Details
@@ -130,9 +81,7 @@ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [B
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- - [tr-sentences](https://huggingface.co/datasets/altaidevorg/tr-sentences)
-- **Language:** tr
-
+- **License:** Proprietary
### Model Sources
@@ -168,9 +117,9 @@ from sentence_transformers import SentenceTransformer
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
- 'Serbest pazar prensiplerinin varlıklı ve yoksul futbol kulüpleri arasındaki farkı büyütmesine yönelik kaygılar nedeniyle bu durum önemlidir.',
- 'Yazar, 12 Mart baskınlarının ve işkencelerinin sonucunda, ideolojik kimlikleriyle küçük burjuva kimlikleri arasında çelişkiye düşen devrimcilerin rejime boyun eğmelerini gösterme çabasındadır.',
- "This Kind of Bird Flies Backwards (Bu Cins Kuş Tersten Uçar) adlı ilk kitabı, LeRoy Jones ve Hettie Jones'un kurduğu Totem Press tarafından 1958 yılında yayınlandı.",
+ 'That is a happy person',
+ 'That is a happy dog',
+ 'That is a very happy person',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
@@ -182,30 +131,6 @@ print(similarities.shape)
# [3, 3]
```
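
The throughput figures quoted above (454 vs. 175 texts/sec) come from timing batch encoding. A minimal timing harness might look like the following; the dummy encoder and the `warmup` parameter are assumptions for the sketch, and you would swap in `model.encode` to reproduce real numbers on your hardware.

```python
import time

def throughput(encode_fn, texts, warmup=1):
    # Texts per second for a given encoder. A warmup pass avoids counting
    # one-time costs (JIT, CUDA context, caches) in the measurement.
    for _ in range(warmup):
        encode_fn(texts)
    start = time.perf_counter()
    encode_fn(texts)
    return len(texts) / (time.perf_counter() - start)

# Dummy encoder for illustration; use model.encode for real measurements.
texts = ["That is a happy person"] * 1000
rate = throughput(lambda batch: [t.lower() for t in batch], texts)
```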
-
-
-
-
-
-
## Evaluation
### Metrics
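
The Spearman cosine metric reported for this model is the Spearman rank correlation between the teacher's and the student's cosine similarities over the same sentence pairs. A self-contained sketch with synthetic embeddings (the noise scale and dimensions are arbitrary assumptions for the demo):

```python
import numpy as np

def spearman(x, y):
    # Spearman correlation = Pearson correlation of the ranks.
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

def cosine(a, b):
    # Row-wise cosine similarity between two batches of vectors.
    return np.sum(a * b, axis=1) / (np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1))

rng = np.random.default_rng(0)
# Synthetic stand-ins: "student" embeddings are noisy copies of "teacher" ones.
ta, tb = rng.normal(size=(2, 200, 1024))
sa = ta + rng.normal(scale=0.05, size=ta.shape)
sb = tb + rng.normal(scale=0.05, size=tb.shape)

score = spearman(cosine(ta, tb), cosine(sa, sb))  # close to 1 for a good student
```

A score near 1 means the student ranks sentence pairs by similarity almost exactly as the teacher does, which is what matters for retrieval.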
@@ -244,9 +169,6 @@ You can finetune this model on your own dataset.
### Training Dataset
-#### tr-sentences
-
-* Dataset: [tr-sentences](https://huggingface.co/datasets/altaidevorg/tr-sentences) at [f5ebc52](https://huggingface.co/datasets/altaidevorg/tr-sentences/tree/f5ebc522ed687664c812bf5789714aead7a5842c)
* Size: 9,623,924 training samples
* Columns: sentence and label
* Approximate statistics based on the first 1000 samples:
@@ -262,779 +184,6 @@ You can finetune this model on your own dataset.
| Evet, Nasuhlar ismi Adapazarı, Kandıra ve Yenipazar ilçelerinde farklı yer isimlerine aittir. | [0.0020795632153749466, -0.013080586679279804, -0.018256550654768944, 0.022429518401622772, -0.03087380714714527, ...] |
* Loss: [MSELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#mseloss)
-### Evaluation Dataset
-
-#### tr-sentences
-
-* Dataset: [tr-sentences](https://huggingface.co/datasets/altaidevorg/tr-sentences) at [f5ebc52](https://huggingface.co/datasets/altaidevorg/tr-sentences/tree/f5ebc522ed687664c812bf5789714aead7a5842c)
-* Size: 9,623,924 evaluation samples
-* Columns: sentence
and label
-* Approximate statistics based on the first 1000 samples:
- | | sentence | label |
- |:--------|:-----------------------------------------------------------------------------------|:--------------------------------------|
- | type | string | list |
- | details |
- min: 3 tokens
- mean: 51.95 tokens
- max: 614 tokens
| |
-* Samples:
- | sentence | label |
- |:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------|
- | Bernhard, şiirle yazarlık hayatına başlamış ve 1963'te "Frost" (Don) adlı ilk romanını yayınlamıştır. 1957'den itibaren serbest yazarlık yapmaya başlamış ve hayatı boyunca yazarlık sayesinde geçimini sağlamıştır.
| [-0.019921669736504555, -0.007309767417609692, 0.01690034568309784, -0.03302725777029991, -0.003539217868819833, ...]
|
- | Sonraki maçta AJ Styles ile Kevin Owens, WWE Birleşik Devletler Şampiyonluğu kemeri için maça çıktı. Shane McMahon, maçın özel konuk hakemliğini yaptı. As Shane, Owens'ı kontrol etti. Styles, Owens'a Springboard 450 Splash yapmaya çalışırken yanlışlıkla Shane'e de yaptı. Owens, Styles'a Pop Up Powerbomb yaptıktan sonra Styles'ı tuşlamaya çalıştı ancak Styles son anda kurtuldu. Owens, Shane'in kararını beğenmeyince ikisi arasında kısa süreli bir tartışma oldu. Owens, Styles'ın Calf Crusher hareketini karşıladıktan sonra Styles'tan tekme yiyince Shane'in üzerine düştü. Styles, Owens'ı Calf Crusher ile pes ettirse de ringin dışında aşağıda yatan Shane bunu göremedi. Bunun üzerine Styles da Shane ile tartıştı. Styles, Owens'a Styles Clash yaptıktan sonra tuşa gitti ancak Owens son anda kurtuldu. Owens'ın yaptığı Pop Up Powerbomb'dan sonra Styles'ı tuşladı ancak Shane son anda Styles'ın ayağının iplerde olduğunu fark edince tuşu iptal etti. Owens ve Shane tartışmaya başladı ve Shane,
| [0.04532943293452263, -0.007217255420982838, -0.019380981102585793, -0.0026675150729715824, 0.018997980281710625, ...]
|
- | Leylek yavruları, anne ve babaları tarafından yiyip kısmen sindirdikleri besinleri kusarak beslenirler. Anne leylek yavruları yağmur, fırtına ve güneşten korurken, baba leylek yavrularını beslemekle yükümlüdür.
| [-0.055585864931344986, 0.045432090759277344, -0.04405859857797623, 0.0009241091320291162, -0.0689476728439331, ...]
|
-* Loss: [MSELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#mseloss)
-
-### Training Hyperparameters
-#### Non-Default Hyperparameters
-
-- `eval_strategy`: steps
-- `per_device_train_batch_size`: 32
-- `per_device_eval_batch_size`: 32
-- `learning_rate`: 0.0001
-- `num_train_epochs`: 1
-- `warmup_ratio`: 0.1
-- `bf16`: True
-- `load_best_model_at_end`: True
-
-#### All Hyperparameters
-Click to expand
-
-- `overwrite_output_dir`: False
-- `do_predict`: False
-- `eval_strategy`: steps
-- `prediction_loss_only`: True
-- `per_device_train_batch_size`: 32
-- `per_device_eval_batch_size`: 32
-- `per_gpu_train_batch_size`: None
-- `per_gpu_eval_batch_size`: None
-- `gradient_accumulation_steps`: 1
-- `eval_accumulation_steps`: None
-- `torch_empty_cache_steps`: None
-- `learning_rate`: 0.0001
-- `weight_decay`: 0.0
-- `adam_beta1`: 0.9
-- `adam_beta2`: 0.999
-- `adam_epsilon`: 1e-08
-- `max_grad_norm`: 1.0
-- `num_train_epochs`: 1
-- `max_steps`: -1
-- `lr_scheduler_type`: linear
-- `lr_scheduler_kwargs`: {}
-- `warmup_ratio`: 0.1
-- `warmup_steps`: 0
-- `log_level`: passive
-- `log_level_replica`: warning
-- `log_on_each_node`: True
-- `logging_nan_inf_filter`: True
-- `save_safetensors`: True
-- `save_on_each_node`: False
-- `save_only_model`: False
-- `restore_callback_states_from_checkpoint`: False
-- `no_cuda`: False
-- `use_cpu`: False
-- `use_mps_device`: False
-- `seed`: 42
-- `data_seed`: None
-- `jit_mode_eval`: False
-- `use_ipex`: False
-- `bf16`: True
-- `fp16`: False
-- `fp16_opt_level`: O1
-- `half_precision_backend`: auto
-- `bf16_full_eval`: False
-- `fp16_full_eval`: False
-- `tf32`: None
-- `local_rank`: 0
-- `ddp_backend`: None
-- `tpu_num_cores`: None
-- `tpu_metrics_debug`: False
-- `debug`: []
-- `dataloader_drop_last`: False
-- `dataloader_num_workers`: 0
-- `dataloader_prefetch_factor`: None
-- `past_index`: -1
-- `disable_tqdm`: False
-- `remove_unused_columns`: True
-- `label_names`: None
-- `load_best_model_at_end`: True
-- `ignore_data_skip`: False
-- `fsdp`: []
-- `fsdp_min_num_params`: 0
-- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
-- `fsdp_transformer_layer_cls_to_wrap`: None
-- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
-- `deepspeed`: None
-- `label_smoothing_factor`: 0.0
-- `optim`: adamw_torch
-- `optim_args`: None
-- `adafactor`: False
-- `group_by_length`: False
-- `length_column_name`: length
-- `ddp_find_unused_parameters`: None
-- `ddp_bucket_cap_mb`: None
-- `ddp_broadcast_buffers`: False
-- `dataloader_pin_memory`: True
-- `dataloader_persistent_workers`: False
-- `skip_memory_metrics`: True
-- `use_legacy_prediction_loop`: False
-- `push_to_hub`: False
-- `resume_from_checkpoint`: None
-- `hub_model_id`: None
-- `hub_strategy`: every_save
-- `hub_private_repo`: None
-- `hub_always_push`: False
-- `gradient_checkpointing`: False
-- `gradient_checkpointing_kwargs`: None
-- `include_inputs_for_metrics`: False
-- `include_for_metrics`: []
-- `eval_do_concat_batches`: True
-- `fp16_backend`: auto
-- `push_to_hub_model_id`: None
-- `push_to_hub_organization`: None
-- `mp_parameters`:
-- `auto_find_batch_size`: False
-- `full_determinism`: False
-- `torchdynamo`: None
-- `ray_scope`: last
-- `ddp_timeout`: 1800
-- `torch_compile`: False
-- `torch_compile_backend`: None
-- `torch_compile_mode`: None
-- `dispatch_batches`: None
-- `split_batches`: None
-- `include_tokens_per_second`: False
-- `include_num_input_tokens_seen`: False
-- `neftune_noise_alpha`: None
-- `optim_target_modules`: None
-- `batch_eval_metrics`: False
-- `eval_on_start`: False
-- `use_liger_kernel`: False
-- `eval_use_gather_object`: False
-- `average_tokens_across_devices`: False
-- `prompts`: None
-- `batch_sampler`: batch_sampler
-- `multi_dataset_batch_sampler`: proportional
-
-
-
-### Training Logs
-Click to expand
-
-| Epoch | Step | Training Loss | Validation Loss | sts-dev_spearman_cosine | negative_mse | sts-test_spearman_cosine |
-|:----------:|:----------:|:-------------:|:---------------:|:-----------------------:|:------------:|:------------------------:|
-| 0 | 0 | - | - | 0.0074 | -0.1913 | - |
-| 0.0017 | 500 | - | 0.0009 | 0.3279 | -0.0860 | - |
-| 0.0033 | 1000 | 0.001 | 0.0007 | 0.5478 | -0.0651 | - |
-| 0.0050 | 1500 | - | 0.0006 | 0.6221 | -0.0573 | - |
-| 0.0067 | 2000 | 0.0007 | 0.0005 | 0.6635 | -0.0523 | - |
-| 0.0083 | 2500 | - | 0.0005 | 0.6916 | -0.0485 | - |
-| 0.0100 | 3000 | 0.0006 | 0.0005 | 0.7148 | -0.0455 | - |
-| 0.0117 | 3500 | - | 0.0004 | 0.7319 | -0.0429 | - |
-| 0.0133 | 4000 | 0.0005 | 0.0004 | 0.7485 | -0.0406 | - |
-| 0.0150 | 4500 | - | 0.0004 | 0.7622 | -0.0385 | - |
-| 0.0167 | 5000 | 0.0005 | 0.0004 | 0.7722 | -0.0368 | - |
-| 0.0183 | 5500 | - | 0.0004 | 0.7856 | -0.0352 | - |
-| 0.0200 | 6000 | 0.0004 | 0.0003 | 0.7999 | -0.0336 | - |
-| 0.0217 | 6500 | - | 0.0003 | 0.8074 | -0.0323 | - |
-| 0.0233 | 7000 | 0.0004 | 0.0003 | 0.8155 | -0.0311 | - |
-| 0.0250 | 7500 | - | 0.0003 | 0.8237 | -0.0299 | - |
-| 0.0267 | 8000 | 0.0004 | 0.0003 | 0.8308 | -0.0289 | - |
-| 0.0283 | 8500 | - | 0.0003 | 0.8322 | -0.0280 | - |
-| 0.0300 | 9000 | 0.0004 | 0.0003 | 0.8409 | -0.0270 | - |
-| 0.0317 | 9500 | - | 0.0003 | 0.8446 | -0.0262 | - |
-| 0.0333 | 10000 | 0.0003 | 0.0003 | 0.8513 | -0.0254 | - |
-| 0.0350 | 10500 | - | 0.0002 | 0.8519 | -0.0247 | - |
-| 0.0367 | 11000 | 0.0003 | 0.0002 | 0.8591 | -0.0240 | - |
-| 0.0383 | 11500 | - | 0.0002 | 0.8623 | -0.0233 | - |
-| 0.0400 | 12000 | 0.0003 | 0.0002 | 0.8674 | -0.0228 | - |
-| 0.0416 | 12500 | - | 0.0002 | 0.8659 | -0.0222 | - |
-| 0.0433 | 13000 | 0.0003 | 0.0002 | 0.8724 | -0.0215 | - |
-| 0.0450 | 13500 | - | 0.0002 | 0.8725 | -0.0212 | - |
-| 0.0466 | 14000 | 0.0003 | 0.0002 | 0.8793 | -0.0206 | - |
-| 0.0483 | 14500 | - | 0.0002 | 0.8834 | -0.0202 | - |
-| 0.0500 | 15000 | 0.0003 | 0.0002 | 0.8817 | -0.0197 | - |
-| 0.0516 | 15500 | - | 0.0002 | 0.8860 | -0.0194 | - |
-| 0.0533 | 16000 | 0.0003 | 0.0002 | 0.8842 | -0.0188 | - |
-| 0.0550 | 16500 | - | 0.0002 | 0.8893 | -0.0185 | - |
-| 0.0566 | 17000 | 0.0002 | 0.0002 | 0.8880 | -0.0181 | - |
-| 0.0583 | 17500 | - | 0.0002 | 0.8932 | -0.0179 | - |
-| 0.0600 | 18000 | 0.0002 | 0.0002 | 0.8913 | -0.0176 | - |
-| 0.0616 | 18500 | - | 0.0002 | 0.8963 | -0.0172 | - |
-| 0.0633 | 19000 | 0.0002 | 0.0002 | 0.8915 | -0.0170 | - |
-| 0.0650 | 19500 | - | 0.0002 | 0.8969 | -0.0167 | - |
-| 0.0666 | 20000 | 0.0002 | 0.0002 | 0.8984 | -0.0165 | - |
-| 0.0683 | 20500 | - | 0.0002 | 0.9021 | -0.0162 | - |
-| 0.0700 | 21000 | 0.0002 | 0.0002 | 0.9027 | -0.0160 | - |
-| 0.0716 | 21500 | - | 0.0002 | 0.9018 | -0.0158 | - |
-| 0.0733 | 22000 | 0.0002 | 0.0002 | 0.9043 | -0.0156 | - |
-| 0.0750 | 22500 | - | 0.0002 | 0.9028 | -0.0154 | - |
-| 0.0766 | 23000 | 0.0002 | 0.0002 | 0.9024 | -0.0153 | - |
-| 0.0783 | 23500 | - | 0.0002 | 0.9049 | -0.0152 | - |
-| 0.0800 | 24000 | 0.0002 | 0.0001 | 0.9087 | -0.0150 | - |
-| 0.0816 | 24500 | - | 0.0001 | 0.9079 | -0.0148 | - |
-| 0.0833 | 25000 | 0.0002 | 0.0001 | 0.9080 | -0.0147 | - |
-| 0.0850 | 25500 | - | 0.0001 | 0.9096 | -0.0145 | - |
-| 0.0866 | 26000 | 0.0002 | 0.0001 | 0.9061 | -0.0145 | - |
-| 0.0883 | 26500 | - | 0.0001 | 0.9098 | -0.0143 | - |
-| 0.0900 | 27000 | 0.0002 | 0.0001 | 0.9122 | -0.0142 | - |
-| 0.0916 | 27500 | - | 0.0001 | 0.9131 | -0.0140 | - |
-| 0.0933 | 28000 | 0.0002 | 0.0001 | 0.9114 | -0.0139 | - |
-| 0.0950 | 28500 | - | 0.0001 | 0.9126 | -0.0139 | - |
-| 0.0966 | 29000 | 0.0002 | 0.0001 | 0.9163 | -0.0138 | - |
-| 0.0983 | 29500 | - | 0.0001 | 0.9140 | -0.0137 | - |
-| 0.1000 | 30000 | 0.0002 | 0.0001 | 0.9141 | -0.0136 | - |
-| 0.1016 | 30500 | - | 0.0001 | 0.9163 | -0.0135 | - |
-| 0.1033 | 31000 | 0.0002 | 0.0001 | 0.9159 | -0.0135 | - |
-| 0.1050 | 31500 | - | 0.0001 | 0.9153 | -0.0132 | - |
-| 0.1066 | 32000 | 0.0002 | 0.0001 | 0.9194 | -0.0131 | - |
-| 0.1083 | 32500 | - | 0.0001 | 0.9203 | -0.0131 | - |
-| 0.1100 | 33000 | 0.0002 | 0.0001 | 0.9187 | -0.0129 | - |
-| 0.1116 | 33500 | - | 0.0001 | 0.9218 | -0.0129 | - |
-| 0.1133 | 34000 | 0.0002 | 0.0001 | 0.9204 | -0.0127 | - |
-| 0.1150 | 34500 | - | 0.0001 | 0.9216 | -0.0127 | - |
-| 0.1166 | 35000 | 0.0002 | 0.0001 | 0.9232 | -0.0125 | - |
-| 0.1183 | 35500 | - | 0.0001 | 0.9212 | -0.0125 | - |
-| 0.1200 | 36000 | 0.0002 | 0.0001 | 0.9227 | -0.0125 | - |
-| 0.1216 | 36500 | - | 0.0001 | 0.9233 | -0.0124 | - |
-| 0.1233 | 37000 | 0.0002 | 0.0001 | 0.9261 | -0.0123 | - |
-| 0.1249 | 37500 | - | 0.0001 | 0.9256 | -0.0122 | - |
-| 0.1266 | 38000 | 0.0002 | 0.0001 | 0.9273 | -0.0121 | - |
-| 0.1283 | 38500 | - | 0.0001 | 0.9274 | -0.0120 | - |
-| 0.1299 | 39000 | 0.0002 | 0.0001 | 0.9273 | -0.0119 | - |
-| 0.1316 | 39500 | - | 0.0001 | 0.9287 | -0.0119 | - |
-| 0.1333 | 40000 | 0.0002 | 0.0001 | 0.9266 | -0.0118 | - |
-| 0.1349 | 40500 | - | 0.0001 | 0.9283 | -0.0118 | - |
-| 0.1366 | 41000 | 0.0002 | 0.0001 | 0.9307 | -0.0117 | - |
-| 0.1383 | 41500 | - | 0.0001 | 0.9277 | -0.0117 | - |
-| 0.1399 | 42000 | 0.0002 | 0.0001 | 0.9281 | -0.0115 | - |
-| 0.1416 | 42500 | - | 0.0001 | 0.9299 | -0.0115 | - |
-| 0.1433 | 43000 | 0.0002 | 0.0001 | 0.9306 | -0.0115 | - |
-| 0.1449 | 43500 | - | 0.0001 | 0.9301 | -0.0114 | - |
-| 0.1466 | 44000 | 0.0002 | 0.0001 | 0.9302 | -0.0114 | - |
-| 0.1483 | 44500 | - | 0.0001 | 0.9321 | -0.0114 | - |
-| 0.1499 | 45000 | 0.0002 | 0.0001 | 0.9320 | -0.0113 | - |
-| 0.1516 | 45500 | - | 0.0001 | 0.9333 | -0.0112 | - |
-| 0.1533 | 46000 | 0.0002 | 0.0001 | 0.9343 | -0.0111 | - |
-| 0.1549 | 46500 | - | 0.0001 | 0.9315 | -0.0111 | - |
-| 0.1566 | 47000 | 0.0002 | 0.0001 | 0.9326 | -0.0111 | - |
-| 0.1583 | 47500 | - | 0.0001 | 0.9324 | -0.0110 | - |
-| 0.1599 | 48000 | 0.0001 | 0.0001 | 0.9362 | -0.0110 | - |
-| 0.1616 | 48500 | - | 0.0001 | 0.9370 | -0.0109 | - |
-| 0.1633 | 49000 | 0.0001 | 0.0001 | 0.9348 | -0.0109 | - |
-| 0.1649 | 49500 | - | 0.0001 | 0.9352 | -0.0108 | - |
-| 0.1666 | 50000 | 0.0001 | 0.0001 | 0.9364 | -0.0107 | - |
-| 0.1683 | 50500 | - | 0.0001 | 0.9351 | -0.0107 | - |
-| 0.1699 | 51000 | 0.0001 | 0.0001 | 0.9372 | -0.0108 | - |
-| 0.1716 | 51500 | - | 0.0001 | 0.9357 | -0.0108 | - |
-| 0.1733 | 52000 | 0.0001 | 0.0001 | 0.9384 | -0.0106 | - |
-| 0.1749 | 52500 | - | 0.0001 | 0.9366 | -0.0106 | - |
-| 0.1766 | 53000 | 0.0001 | 0.0001 | 0.9375 | -0.0106 | - |
-| 0.1783 | 53500 | - | 0.0001 | 0.9381 | -0.0105 | - |
-| 0.1799 | 54000 | 0.0001 | 0.0001 | 0.9382 | -0.0105 | - |
-| 0.1816 | 54500 | - | 0.0001 | 0.9368 | -0.0106 | - |
-| 0.1833 | 55000 | 0.0001 | 0.0001 | 0.9383 | -0.0105 | - |
-| 0.1849 | 55500 | - | 0.0001 | 0.9393 | -0.0104 | - |
-| 0.1866 | 56000 | 0.0001 | 0.0001 | 0.9383 | -0.0104 | - |
-| 0.1883 | 56500 | - | 0.0001 | 0.9397 | -0.0104 | - |
-| 0.1899 | 57000 | 0.0001 | 0.0001 | 0.9404 | -0.0103 | - |
-| 0.1916 | 57500 | - | 0.0001 | 0.9378 | -0.0103 | - |
-| 0.1933 | 58000 | 0.0001 | 0.0001 | 0.9379 | -0.0103 | - |
-| 0.1949 | 58500 | - | 0.0001 | 0.9397 | -0.0102 | - |
-| 0.1966 | 59000 | 0.0001 | 0.0001 | 0.9406 | -0.0102 | - |
-| 0.1983 | 59500 | - | 0.0001 | 0.9402 | -0.0102 | - |
-| 0.1999 | 60000 | 0.0001 | 0.0001 | 0.9408 | -0.0101 | - |
-| 0.2016 | 60500 | - | 0.0001 | 0.9410 | -0.0101 | - |
-| 0.2033 | 61000 | 0.0001 | 0.0001 | 0.9409 | -0.0101 | - |
-| 0.2049 | 61500 | - | 0.0001 | 0.9405 | -0.0101 | - |
-| 0.2066 | 62000 | 0.0001 | 0.0001 | 0.9424 | -0.0100 | - |
-| 0.2082 | 62500 | - | 0.0001 | 0.9378 | -0.0101 | - |
-| 0.2099 | 63000 | 0.0001 | 0.0001 | 0.9408 | -0.0099 | - |
-| 0.2116 | 63500 | - | 0.0001 | 0.9404 | -0.0100 | - |
-| 0.2132 | 64000 | 0.0001 | 0.0001 | 0.9397 | -0.0099 | - |
-| 0.2149 | 64500 | - | 0.0001 | 0.9411 | -0.0099 | - |
-| 0.2166 | 65000 | 0.0001 | 0.0001 | 0.9401 | -0.0099 | - |
-| 0.2182 | 65500 | - | 0.0001 | 0.9415 | -0.0098 | - |
-| 0.2199 | 66000 | 0.0001 | 0.0001 | 0.9413 | -0.0098 | - |
-| 0.2216 | 66500 | - | 0.0001 | 0.9417 | -0.0098 | - |
-| 0.2232 | 67000 | 0.0001 | 0.0001 | 0.9411 | -0.0097 | - |
-| 0.2249 | 67500 | - | 0.0001 | 0.9423 | -0.0097 | - |
-| 0.2266 | 68000 | 0.0001 | 0.0001 | 0.9424 | -0.0097 | - |
-| 0.2282 | 68500 | - | 0.0001 | 0.9424 | -0.0098 | - |
-| 0.2299 | 69000 | 0.0001 | 0.0001 | 0.9439 | -0.0096 | - |
-| 0.2316 | 69500 | - | 0.0001 | 0.9423 | -0.0097 | - |
-| 0.2332 | 70000 | 0.0001 | 0.0001 | 0.9420 | -0.0096 | - |
-| 0.2349 | 70500 | - | 0.0001 | 0.9429 | -0.0096 | - |
-| 0.2366 | 71000 | 0.0001 | 0.0001 | 0.9440 | -0.0096 | - |
-| 0.2382 | 71500 | - | 0.0001 | 0.9425 | -0.0096 | - |
-| 0.2399 | 72000 | 0.0001 | 0.0001 | 0.9438 | -0.0096 | - |
-| 0.2416 | 72500 | - | 0.0001 | 0.9442 | -0.0095 | - |
-| 0.2432 | 73000 | 0.0001 | 0.0001 | 0.9451 | -0.0095 | - |
-| 0.2449 | 73500 | - | 0.0001 | 0.9432 | -0.0095 | - |
-| 0.2466 | 74000 | 0.0001 | 0.0001 | 0.9441 | -0.0095 | - |
-| 0.2482 | 74500 | - | 0.0001 | 0.9442 | -0.0094 | - |
-| 0.2499 | 75000 | 0.0001 | 0.0001 | 0.9436 | -0.0094 | - |
-| 0.2516 | 75500 | - | 0.0001 | 0.9450 | -0.0094 | - |
-| 0.2532 | 76000 | 0.0001 | 0.0001 | 0.9455 | -0.0094 | - |
-| 0.2549 | 76500 | - | 0.0001 | 0.9439 | -0.0094 | - |
-| 0.2566 | 77000 | 0.0001 | 0.0001 | 0.9444 | -0.0094 | - |
-| 0.2582 | 77500 | - | 0.0001 | 0.9449 | -0.0093 | - |
-| 0.2599 | 78000 | 0.0001 | 0.0001 | 0.9444 | -0.0093 | - |
-| 0.2616 | 78500 | - | 0.0001 | 0.9454 | -0.0093 | - |
-| 0.2632 | 79000 | 0.0001 | 0.0001 | 0.9452 | -0.0093 | - |
-| 0.2649 | 79500 | - | 0.0001 | 0.9465 | -0.0093 | - |
-| 0.2666 | 80000 | 0.0001 | 0.0001 | 0.9450 | -0.0093 | - |
-| 0.2682 | 80500 | - | 0.0001 | 0.9467 | -0.0092 | - |
-| 0.2699 | 81000 | 0.0001 | 0.0001 | 0.9470 | -0.0092 | - |
-| 0.2716 | 81500 | - | 0.0001 | 0.9447 | -0.0092 | - |
-| 0.2732 | 82000 | 0.0001 | 0.0001 | 0.9477 | -0.0092 | - |
-| 0.2749 | 82500 | - | 0.0001 | 0.9442 | -0.0092 | - |
-| 0.2766 | 83000 | 0.0001 | 0.0001 | 0.9482 | -0.0091 | - |
-| 0.2782 | 83500 | - | 0.0001 | 0.9475 | -0.0091 | - |
-| 0.2799 | 84000 | 0.0001 | 0.0001 | 0.9451 | -0.0091 | - |
-| 0.2816 | 84500 | - | 0.0001 | 0.9471 | -0.0091 | - |
-| 0.2832 | 85000 | 0.0001 | 0.0001 | 0.9470 | -0.0090 | - |
-| 0.2849 | 85500 | - | 0.0001 | 0.9468 | -0.0091 | - |
-| 0.2865 | 86000 | 0.0001 | 0.0001 | 0.9464 | -0.0090 | - |
-| 0.2882 | 86500 | - | 0.0001 | 0.9482 | -0.0090 | - |
-| 0.2899 | 87000 | 0.0001 | 0.0001 | 0.9466 | -0.0090 | - |
-| 0.2915 | 87500 | - | 0.0001 | 0.9474 | -0.0090 | - |
-| 0.2932 | 88000 | 0.0001 | 0.0001 | 0.9476 | -0.0090 | - |
-| 0.2949 | 88500 | - | 0.0001 | 0.9480 | -0.0089 | - |
-| 0.2965 | 89000 | 0.0001 | 0.0001 | 0.9489 | -0.0090 | - |
-| 0.2982 | 89500 | - | 0.0001 | 0.9475 | -0.0089 | - |
-| 0.2999 | 90000 | 0.0001 | 0.0001 | 0.9483 | -0.0089 | - |
-| 0.3015 | 90500 | - | 0.0001 | 0.9478 | -0.0089 | - |
-| 0.3032 | 91000 | 0.0001 | 0.0001 | 0.9471 | -0.0090 | - |
-| 0.3049 | 91500 | - | 0.0001 | 0.9470 | -0.0089 | - |
-| 0.3065 | 92000 | 0.0001 | 0.0001 | 0.9472 | -0.0089 | - |
-| 0.3082 | 92500 | - | 0.0001 | 0.9485 | -0.0089 | - |
-| 0.3099 | 93000 | 0.0001 | 0.0001 | 0.9468 | -0.0089 | - |
-| 0.3115 | 93500 | - | 0.0001 | 0.9484 | -0.0088 | - |
-| 0.3132 | 94000 | 0.0001 | 0.0001 | 0.9482 | -0.0088 | - |
-| 0.3149 | 94500 | - | 0.0001 | 0.9503 | -0.0088 | - |
-| 0.3165 | 95000 | 0.0001 | 0.0001 | 0.9485 | -0.0088 | - |
-| 0.3182 | 95500 | - | 0.0001 | 0.9509 | -0.0087 | - |
-| 0.3199 | 96000 | 0.0001 | 0.0001 | 0.9492 | -0.0088 | - |
-| 0.3215 | 96500 | - | 0.0001 | 0.9488 | -0.0087 | - |
-| 0.3232 | 97000 | 0.0001 | 0.0001 | 0.9500 | -0.0087 | - |
-| 0.3249 | 97500 | - | 0.0001 | 0.9495 | -0.0087 | - |
-| 0.3265 | 98000 | 0.0001 | 0.0001 | 0.9499 | -0.0087 | - |
-| 0.3282 | 98500 | - | 0.0001 | 0.9496 | -0.0087 | - |
-| 0.3299 | 99000 | 0.0001 | 0.0001 | 0.9493 | -0.0087 | - |
-| 0.3315 | 99500 | - | 0.0001 | 0.9497 | -0.0087 | - |
-| 0.3332 | 100000 | 0.0001 | 0.0001 | 0.9511 | -0.0086 | - |
-| 0.3349 | 100500 | - | 0.0001 | 0.9508 | -0.0086 | - |
-| 0.3365 | 101000 | 0.0001 | 0.0001 | 0.9502 | -0.0086 | - |
-| 0.3382 | 101500 | - | 0.0001 | 0.9488 | -0.0087 | - |
-| 0.3399 | 102000 | 0.0001 | 0.0001 | 0.9505 | -0.0086 | - |
-| 0.3415 | 102500 | - | 0.0001 | 0.9497 | -0.0086 | - |
-| 0.3432 | 103000 | 0.0001 | 0.0001 | 0.9500 | -0.0085 | - |
-| 0.3449 | 103500 | - | 0.0001 | 0.9497 | -0.0086 | - |
-| 0.3465 | 104000 | 0.0001 | 0.0001 | 0.9521 | -0.0085 | - |
-| 0.3482 | 104500 | - | 0.0001 | 0.9499 | -0.0085 | - |
-| 0.3499 | 105000 | 0.0001 | 0.0001 | 0.9488 | -0.0085 | - |
-| 0.3515 | 105500 | - | 0.0001 | 0.9490 | -0.0085 | - |
-| 0.3532 | 106000 | 0.0001 | 0.0001 | 0.9503 | -0.0085 | - |
-| 0.3549 | 106500 | - | 0.0001 | 0.9504 | -0.0085 | - |
-| 0.3565 | 107000 | 0.0001 | 0.0001 | 0.9503 | -0.0085 | - |
-| 0.3582 | 107500 | - | 0.0001 | 0.9514 | -0.0085 | - |
-| 0.3599 | 108000 | 0.0001 | 0.0001 | 0.9509 | -0.0084 | - |
-| 0.3615 | 108500 | - | 0.0001 | 0.9513 | -0.0084 | - |
-| 0.3632 | 109000 | 0.0001 | 0.0001 | 0.9512 | -0.0084 | - |
-| 0.3649 | 109500 | - | 0.0001 | 0.9515 | -0.0084 | - |
-| 0.3665 | 110000 | 0.0001 | 0.0001 | 0.9509 | -0.0084 | - |
-| 0.3682 | 110500 | - | 0.0001 | 0.9495 | -0.0084 | - |
-| 0.3698 | 111000 | 0.0001 | 0.0001 | 0.9507 | -0.0084 | - |
-| 0.3715 | 111500 | - | 0.0001 | 0.9512 | -0.0083 | - |
-| 0.3732 | 112000 | 0.0001 | 0.0001 | 0.9519 | -0.0084 | - |
-| 0.3748 | 112500 | - | 0.0001 | 0.9512 | -0.0084 | - |
-| 0.3765 | 113000 | 0.0001 | 0.0001 | 0.9511 | -0.0083 | - |
-| 0.3782 | 113500 | - | 0.0001 | 0.9513 | -0.0083 | - |
-| 0.3798 | 114000 | 0.0001 | 0.0001 | 0.9512 | -0.0084 | - |
-| 0.3815 | 114500 | - | 0.0001 | 0.9501 | -0.0083 | - |
-| 0.3832 | 115000 | 0.0001 | 0.0001 | 0.9515 | -0.0083 | - |
-| 0.3848 | 115500 | - | 0.0001 | 0.9526 | -0.0083 | - |
-| 0.3865 | 116000 | 0.0001 | 0.0001 | 0.9518 | -0.0083 | - |
-| 0.3882 | 116500 | - | 0.0001 | 0.9521 | -0.0083 | - |
-| 0.3898 | 117000 | 0.0001 | 0.0001 | 0.9515 | -0.0083 | - |
-| 0.3915 | 117500 | - | 0.0001 | 0.9515 | -0.0083 | - |
-| 0.3932 | 118000 | 0.0001 | 0.0001 | 0.9530 | -0.0082 | - |
-| 0.3948 | 118500 | - | 0.0001 | 0.9533 | -0.0082 | - |
-| 0.3965 | 119000 | 0.0001 | 0.0001 | 0.9523 | -0.0082 | - |
-| 0.3982 | 119500 | - | 0.0001 | 0.9520 | -0.0082 | - |
-| 0.3998 | 120000 | 0.0001 | 0.0001 | 0.9511 | -0.0082 | - |
-| 0.4015 | 120500 | - | 0.0001 | 0.9530 | -0.0083 | - |
-| 0.4032 | 121000 | 0.0001 | 0.0001 | 0.9525 | -0.0082 | - |
-| 0.4048 | 121500 | - | 0.0001 | 0.9526 | -0.0082 | - |
-| 0.4065 | 122000 | 0.0001 | 0.0001 | 0.9527 | -0.0082 | - |
-| 0.4082 | 122500 | - | 0.0001 | 0.9522 | -0.0082 | - |
-| 0.4098 | 123000 | 0.0001 | 0.0001 | 0.9535 | -0.0081 | - |
-| 0.4115 | 123500 | - | 0.0001 | 0.9527 | -0.0081 | - |
-| 0.4132 | 124000 | 0.0001 | 0.0001 | 0.9530 | -0.0082 | - |
-| 0.4148 | 124500 | - | 0.0001 | 0.9520 | -0.0082 | - |
-| 0.4165 | 125000 | 0.0001 | 0.0001 | 0.9526 | -0.0081 | - |
-| 0.4182 | 125500 | - | 0.0001 | 0.9528 | -0.0081 | - |
-| 0.4198 | 126000 | 0.0001 | 0.0001 | 0.9535 | -0.0081 | - |
-| 0.4215 | 126500 | - | 0.0001 | 0.9530 | -0.0081 | - |
-| 0.4232 | 127000 | 0.0001 | 0.0001 | 0.9539 | -0.0081 | - |
-| 0.4248 | 127500 | - | 0.0001 | 0.9531 | -0.0081 | - |
-| 0.4265 | 128000 | 0.0001 | 0.0001 | 0.9540 | -0.0081 | - |
-| 0.4282 | 128500 | - | 0.0001 | 0.9534 | -0.0081 | - |
-| 0.4298 | 129000 | 0.0001 | 0.0001 | 0.9536 | -0.0080 | - |
-| 0.4315 | 129500 | - | 0.0001 | 0.9536 | -0.0081 | - |
-| 0.4332 | 130000 | 0.0001 | 0.0001 | 0.9547 | -0.0080 | - |
-| 0.4348 | 130500 | - | 0.0001 | 0.9535 | -0.0080 | - |
-| 0.4365 | 131000 | 0.0001 | 0.0001 | 0.9541 | -0.0080 | - |
-| 0.4382 | 131500 | - | 0.0001 | 0.9542 | -0.0080 | - |
-| 0.4398 | 132000 | 0.0001 | 0.0001 | 0.9540 | -0.0080 | - |
-| 0.4415 | 132500 | - | 0.0001 | 0.9537 | -0.0080 | - |
-| 0.4432 | 133000 | 0.0001 | 0.0001 | 0.9538 | -0.0080 | - |
-| 0.4448 | 133500 | - | 0.0001 | 0.9540 | -0.0079 | - |
-| 0.4465 | 134000 | 0.0001 | 0.0001 | 0.9540 | -0.0080 | - |
-| 0.4481 | 134500 | - | 0.0001 | 0.9544 | -0.0080 | - |
-| 0.4498 | 135000 | 0.0001 | 0.0001 | 0.9535 | -0.0079 | - |
-| 0.4515 | 135500 | - | 0.0001 | 0.9541 | -0.0079 | - |
-| 0.4531 | 136000 | 0.0001 | 0.0001 | 0.9546 | -0.0079 | - |
-| 0.4548 | 136500 | - | 0.0001 | 0.9543 | -0.0079 | - |
-| 0.4565 | 137000 | 0.0001 | 0.0001 | 0.9548 | -0.0079 | - |
-| 0.4581 | 137500 | - | 0.0001 | 0.9555 | -0.0079 | - |
-| 0.4598 | 138000 | 0.0001 | 0.0001 | 0.9548 | -0.0079 | - |
-| 0.4615 | 138500 | - | 0.0001 | 0.9542 | -0.0079 | - |
-| 0.4631 | 139000 | 0.0001 | 0.0001 | 0.9548 | -0.0079 | - |
-| 0.4648 | 139500 | - | 0.0001 | 0.9544 | -0.0079 | - |
-| 0.4665 | 140000 | 0.0001 | 0.0001 | 0.9546 | -0.0079 | - |
-| 0.4681 | 140500 | - | 0.0001 | 0.9553 | -0.0078 | - |
-| 0.4698 | 141000 | 0.0001 | 0.0001 | 0.9542 | -0.0078 | - |
-| 0.4715 | 141500 | - | 0.0001 | 0.9553 | -0.0078 | - |
-| 0.4731 | 142000 | 0.0001 | 0.0001 | 0.9548 | -0.0079 | - |
-| 0.4748 | 142500 | - | 0.0001 | 0.9545 | -0.0078 | - |
-| 0.4765 | 143000 | 0.0001 | 0.0001 | 0.9553 | -0.0079 | - |
-| 0.4781 | 143500 | - | 0.0001 | 0.9561 | -0.0078 | - |
-| 0.4798 | 144000 | 0.0001 | 0.0001 | 0.9551 | -0.0078 | - |
-| 0.4815 | 144500 | - | 0.0001 | 0.9550 | -0.0078 | - |
-| 0.4831 | 145000 | 0.0001 | 0.0001 | 0.9557 | -0.0078 | - |
-| 0.4848 | 145500 | - | 0.0001 | 0.9557 | -0.0077 | - |
-| 0.4865 | 146000 | 0.0001 | 0.0001 | 0.9552 | -0.0077 | - |
-| 0.4881 | 146500 | - | 0.0001 | 0.9553 | -0.0078 | - |
-| 0.4898 | 147000 | 0.0001 | 0.0001 | 0.9555 | -0.0077 | - |
-| 0.4915 | 147500 | - | 0.0001 | 0.9561 | -0.0077 | - |
-| 0.4931 | 148000 | 0.0001 | 0.0001 | 0.9558 | -0.0077 | - |
-| 0.4948 | 148500 | - | 0.0001 | 0.9558 | -0.0077 | - |
-| 0.4965 | 149000 | 0.0001 | 0.0001 | 0.9560 | -0.0077 | - |
-| 0.4981 | 149500 | - | 0.0001 | 0.9558 | -0.0077 | - |
-| 0.4998 | 150000 | 0.0001 | 0.0001 | 0.9553 | -0.0077 | - |
-| 0.5015 | 150500 | - | 0.0001 | 0.9557 | -0.0077 | - |
-| 0.5031 | 151000 | 0.0001 | 0.0001 | 0.9562 | -0.0077 | - |
-| 0.5048 | 151500 | - | 0.0001 | 0.9558 | -0.0077 | - |
-| 0.5065 | 152000 | 0.0001 | 0.0001 | 0.9553 | -0.0077 | - |
-| 0.5081 | 152500 | - | 0.0001 | 0.9553 | -0.0076 | - |
-| 0.5098 | 153000 | 0.0001 | 0.0001 | 0.9559 | -0.0077 | - |
-| 0.5115 | 153500 | - | 0.0001 | 0.9560 | -0.0076 | - |
-| 0.5131 | 154000 | 0.0001 | 0.0001 | 0.9557 | -0.0076 | - |
-| 0.5148 | 154500 | - | 0.0001 | 0.9563 | -0.0076 | - |
-| 0.5165 | 155000 | 0.0001 | 0.0001 | 0.9567 | -0.0076 | - |
-| 0.5181 | 155500 | - | 0.0001 | 0.9559 | -0.0076 | - |
-| 0.5198 | 156000 | 0.0001 | 0.0001 | 0.9565 | -0.0076 | - |
-| 0.5215 | 156500 | - | 0.0001 | 0.9563 | -0.0076 | - |
-| 0.5231 | 157000 | 0.0001 | 0.0001 | 0.9569 | -0.0076 | - |
-| 0.5248 | 157500 | - | 0.0001 | 0.9571 | -0.0076 | - |
-| 0.5265 | 158000 | 0.0001 | 0.0001 | 0.9560 | -0.0076 | - |
-| 0.5281 | 158500 | - | 0.0001 | 0.9562 | -0.0076 | - |
-| 0.5298 | 159000 | 0.0001 | 0.0001 | 0.9569 | -0.0076 | - |
-| 0.5314 | 159500 | - | 0.0001 | 0.9556 | -0.0076 | - |
-| 0.5331 | 160000 | 0.0001 | 0.0001 | 0.9560 | -0.0075 | - |
-| 0.5348 | 160500 | - | 0.0001 | 0.9555 | -0.0075 | - |
-| 0.5364 | 161000 | 0.0001 | 0.0001 | 0.9555 | -0.0076 | - |
-| 0.5381 | 161500 | - | 0.0001 | 0.9564 | -0.0075 | - |
-| 0.5398 | 162000 | 0.0001 | 0.0001 | 0.9574 | -0.0076 | - |
-| 0.5414 | 162500 | - | 0.0001 | 0.9569 | -0.0075 | - |
-| 0.5431 | 163000 | 0.0001 | 0.0001 | 0.9578 | -0.0075 | - |
-| 0.5448 | 163500 | - | 0.0001 | 0.9571 | -0.0075 | - |
-| 0.5464 | 164000 | 0.0001 | 0.0001 | 0.9578 | -0.0075 | - |
-| 0.5481 | 164500 | - | 0.0001 | 0.9580 | -0.0075 | - |
-| 0.5498 | 165000 | 0.0001 | 0.0001 | 0.9568 | -0.0075 | - |
-| 0.5514 | 165500 | - | 0.0001 | 0.9582 | -0.0075 | - |
-| 0.5531 | 166000 | 0.0001 | 0.0001 | 0.9578 | -0.0075 | - |
-| 0.5548 | 166500 | - | 0.0001 | 0.9569 | -0.0075 | - |
-| 0.5564 | 167000 | 0.0001 | 0.0001 | 0.9568 | -0.0075 | - |
-| 0.5581 | 167500 | - | 0.0001 | 0.9576 | -0.0075 | - |
-| 0.5598 | 168000 | 0.0001 | 0.0001 | 0.9581 | -0.0075 | - |
-| 0.5614 | 168500 | - | 0.0001 | 0.9581 | -0.0075 | - |
-| 0.5631 | 169000 | 0.0001 | 0.0001 | 0.9573 | -0.0075 | - |
-| 0.5648 | 169500 | - | 0.0001 | 0.9581 | -0.0074 | - |
-| 0.5664 | 170000 | 0.0001 | 0.0001 | 0.9568 | -0.0074 | - |
-| 0.5681 | 170500 | - | 0.0001 | 0.9573 | -0.0075 | - |
-| 0.5698 | 171000 | 0.0001 | 0.0001 | 0.9579 | -0.0074 | - |
-| 0.5714 | 171500 | - | 0.0001 | 0.9578 | -0.0074 | - |
-| 0.5731 | 172000 | 0.0001 | 0.0001 | 0.9581 | -0.0074 | - |
-| 0.5748 | 172500 | - | 0.0001 | 0.9567 | -0.0074 | - |
-| 0.5764 | 173000 | 0.0001 | 0.0001 | 0.9581 | -0.0074 | - |
-| 0.5781 | 173500 | - | 0.0001 | 0.9584 | -0.0074 | - |
-| 0.5798 | 174000 | 0.0001 | 0.0001 | 0.9585 | -0.0074 | - |
-| 0.5814 | 174500 | - | 0.0001 | 0.9583 | -0.0074 | - |
-| 0.5831 | 175000 | 0.0001 | 0.0001 | 0.9590 | -0.0074 | - |
-| 0.5848 | 175500 | - | 0.0001 | 0.9580 | -0.0074 | - |
-| 0.5864 | 176000 | 0.0001 | 0.0001 | 0.9580 | -0.0073 | - |
-| 0.5881 | 176500 | - | 0.0001 | 0.9584 | -0.0073 | - |
-| 0.5898 | 177000 | 0.0001 | 0.0001 | 0.9591 | -0.0074 | - |
-| 0.5914 | 177500 | - | 0.0001 | 0.9592 | -0.0073 | - |
-| 0.5931 | 178000 | 0.0001 | 0.0001 | 0.9582 | -0.0073 | - |
-| 0.5948 | 178500 | - | 0.0001 | 0.9585 | -0.0073 | - |
-| 0.5964 | 179000 | 0.0001 | 0.0001 | 0.9590 | -0.0074 | - |
-| 0.5981 | 179500 | - | 0.0001 | 0.9586 | -0.0073 | - |
-| 0.5998 | 180000 | 0.0001 | 0.0001 | 0.9588 | -0.0073 | - |
-| 0.6014 | 180500 | - | 0.0001 | 0.9584 | -0.0073 | - |
-| 0.6031 | 181000 | 0.0001 | 0.0001 | 0.9588 | -0.0073 | - |
-| 0.6048 | 181500 | - | 0.0001 | 0.9581 | -0.0073 | - |
-| 0.6064 | 182000 | 0.0001 | 0.0001 | 0.9585 | -0.0073 | - |
-| 0.6081 | 182500 | - | 0.0001 | 0.9588 | -0.0073 | - |
-| 0.6098 | 183000 | 0.0001 | 0.0001 | 0.9589 | -0.0073 | - |
-| 0.6114 | 183500 | - | 0.0001 | 0.9590 | -0.0073 | - |
-| 0.6131 | 184000 | 0.0001 | 0.0001 | 0.9592 | -0.0073 | - |
-| 0.6147 | 184500 | - | 0.0001 | 0.9585 | -0.0072 | - |
-| 0.6164 | 185000 | 0.0001 | 0.0001 | 0.9591 | -0.0073 | - |
-| 0.6181 | 185500 | - | 0.0001 | 0.9581 | -0.0072 | - |
-| 0.6197 | 186000 | 0.0001 | 0.0001 | 0.9583 | -0.0072 | - |
-| 0.6214 | 186500 | - | 0.0001 | 0.9592 | -0.0072 | - |
-| 0.6231 | 187000 | 0.0001 | 0.0001 | 0.9594 | -0.0072 | - |
-| 0.6247 | 187500 | - | 0.0001 | 0.9596 | -0.0072 | - |
-| 0.6264 | 188000 | 0.0001 | 0.0001 | 0.9599 | -0.0072 | - |
-| 0.6281 | 188500 | - | 0.0001 | 0.9598 | -0.0072 | - |
-| 0.6297 | 189000 | 0.0001 | 0.0001 | 0.9597 | -0.0072 | - |
-| 0.6314 | 189500 | - | 0.0001 | 0.9596 | -0.0072 | - |
-| 0.6331 | 190000 | 0.0001 | 0.0001 | 0.9603 | -0.0072 | - |
-| 0.6347 | 190500 | - | 0.0001 | 0.9600 | -0.0072 | - |
-| 0.6364 | 191000 | 0.0001 | 0.0001 | 0.9591 | -0.0072 | - |
-| 0.6381 | 191500 | - | 0.0001 | 0.9590 | -0.0072 | - |
-| 0.6397 | 192000 | 0.0001 | 0.0001 | 0.9586 | -0.0072 | - |
-| 0.6414 | 192500 | - | 0.0001 | 0.9591 | -0.0072 | - |
-| 0.6431 | 193000 | 0.0001 | 0.0001 | 0.9595 | -0.0072 | - |
-| 0.6447 | 193500 | - | 0.0001 | 0.9599 | -0.0071 | - |
-| 0.6464 | 194000 | 0.0001 | 0.0001 | 0.9598 | -0.0072 | - |
-| 0.6481 | 194500 | - | 0.0001 | 0.9591 | -0.0072 | - |
-| 0.6497 | 195000 | 0.0001 | 0.0001 | 0.9589 | -0.0071 | - |
-| 0.6514 | 195500 | - | 0.0001 | 0.9597 | -0.0071 | - |
-| 0.6531 | 196000 | 0.0001 | 0.0001 | 0.9596 | -0.0071 | - |
-| 0.6547 | 196500 | - | 0.0001 | 0.9602 | -0.0071 | - |
-| 0.6564 | 197000 | 0.0001 | 0.0001 | 0.9598 | -0.0071 | - |
-| 0.6581 | 197500 | - | 0.0001 | 0.9599 | -0.0071 | - |
-| 0.6597 | 198000 | 0.0001 | 0.0001 | 0.9602 | -0.0071 | - |
-| 0.6614 | 198500 | - | 0.0001 | 0.9604 | -0.0071 | - |
-| 0.6631 | 199000 | 0.0001 | 0.0001 | 0.9601 | -0.0071 | - |
-| 0.6647 | 199500 | - | 0.0001 | 0.9606 | -0.0071 | - |
-| 0.6664 | 200000 | 0.0001 | 0.0001 | 0.9598 | -0.0071 | - |
-| 0.6681 | 200500 | - | 0.0001 | 0.9601 | -0.0071 | - |
-| 0.6697 | 201000 | 0.0001 | 0.0001 | 0.9599 | -0.0071 | - |
-| 0.6714 | 201500 | - | 0.0001 | 0.9602 | -0.0071 | - |
-| 0.6731 | 202000 | 0.0001 | 0.0001 | 0.9595 | -0.0071 | - |
-| 0.6747 | 202500 | - | 0.0001 | 0.9607 | -0.0071 | - |
-| 0.6764 | 203000 | 0.0001 | 0.0001 | 0.9607 | -0.0071 | - |
-| 0.6781 | 203500 | - | 0.0001 | 0.9603 | -0.0071 | - |
-| 0.6797 | 204000 | 0.0001 | 0.0001 | 0.9612 | -0.0070 | - |
-| 0.6814 | 204500 | - | 0.0001 | 0.9605 | -0.0071 | - |
-| 0.6831 | 205000 | 0.0001 | 0.0001 | 0.9611 | -0.0070 | - |
-| 0.6847 | 205500 | - | 0.0001 | 0.9607 | -0.0070 | - |
-| 0.6864 | 206000 | 0.0001 | 0.0001 | 0.9601 | -0.0070 | - |
-| 0.6881 | 206500 | - | 0.0001 | 0.9606 | -0.0070 | - |
-| 0.6897 | 207000 | 0.0001 | 0.0001 | 0.9601 | -0.0070 | - |
-| 0.6914 | 207500 | - | 0.0001 | 0.9611 | -0.0070 | - |
-| 0.6930 | 208000 | 0.0001 | 0.0001 | 0.9613 | -0.0070 | - |
-| 0.6947 | 208500 | - | 0.0001 | 0.9607 | -0.0070 | - |
-| 0.6964 | 209000 | 0.0001 | 0.0001 | 0.9605 | -0.0070 | - |
-| 0.6980 | 209500 | - | 0.0001 | 0.9611 | -0.0070 | - |
-| 0.6997 | 210000 | 0.0001 | 0.0001 | 0.9604 | -0.0070 | - |
-| 0.7014 | 210500 | - | 0.0001 | 0.9609 | -0.0070 | - |
-| 0.7030 | 211000 | 0.0001 | 0.0001 | 0.9611 | -0.0070 | - |
-| 0.7047 | 211500 | - | 0.0001 | 0.9611 | -0.0070 | - |
-| 0.7064 | 212000 | 0.0001 | 0.0001 | 0.9612 | -0.0070 | - |
-| 0.7080 | 212500 | - | 0.0001 | 0.9610 | -0.0070 | - |
-| 0.7097 | 213000 | 0.0001 | 0.0001 | 0.9614 | -0.0070 | - |
-| 0.7114 | 213500 | - | 0.0001 | 0.9613 | -0.0069 | - |
-| 0.7130 | 214000 | 0.0001 | 0.0001 | 0.9619 | -0.0070 | - |
-| 0.7147 | 214500 | - | 0.0001 | 0.9612 | -0.0070 | - |
-| 0.7164 | 215000 | 0.0001 | 0.0001 | 0.9615 | -0.0069 | - |
-| 0.7180 | 215500 | - | 0.0001 | 0.9614 | -0.0069 | - |
-| 0.7197 | 216000 | 0.0001 | 0.0001 | 0.9614 | -0.0070 | - |
-| 0.7214 | 216500 | - | 0.0001 | 0.9613 | -0.0069 | - |
-| 0.7230 | 217000 | 0.0001 | 0.0001 | 0.9612 | -0.0069 | - |
-| 0.7247 | 217500 | - | 0.0001 | 0.9608 | -0.0069 | - |
-| 0.7264 | 218000 | 0.0001 | 0.0001 | 0.9619 | -0.0069 | - |
-| 0.7280 | 218500 | - | 0.0001 | 0.9612 | -0.0069 | - |
-| 0.7297 | 219000 | 0.0001 | 0.0001 | 0.9613 | -0.0069 | - |
-| 0.7314 | 219500 | - | 0.0001 | 0.9617 | -0.0069 | - |
-| 0.7330 | 220000 | 0.0001 | 0.0001 | 0.9620 | -0.0069 | - |
-| 0.7347 | 220500 | - | 0.0001 | 0.9621 | -0.0069 | - |
-| 0.7364 | 221000 | 0.0001 | 0.0001 | 0.9616 | -0.0069 | - |
-| 0.7380 | 221500 | - | 0.0001 | 0.9622 | -0.0069 | - |
-| 0.7397 | 222000 | 0.0001 | 0.0001 | 0.9620 | -0.0069 | - |
-| 0.7414 | 222500 | - | 0.0001 | 0.9612 | -0.0069 | - |
-| 0.7430 | 223000 | 0.0001 | 0.0001 | 0.9615 | -0.0069 | - |
-| 0.7447 | 223500 | - | 0.0001 | 0.9615 | -0.0069 | - |
-| 0.7464 | 224000 | 0.0001 | 0.0001 | 0.9621 | -0.0069 | - |
-| 0.7480 | 224500 | - | 0.0001 | 0.9622 | -0.0068 | - |
-| 0.7497 | 225000 | 0.0001 | 0.0001 | 0.9616 | -0.0069 | - |
-| 0.7514 | 225500 | - | 0.0001 | 0.9616 | -0.0069 | - |
-| 0.7530 | 226000 | 0.0001 | 0.0001 | 0.9614 | -0.0069 | - |
-| 0.7547 | 226500 | - | 0.0001 | 0.9614 | -0.0069 | - |
-| 0.7564 | 227000 | 0.0001 | 0.0001 | 0.9614 | -0.0068 | - |
-| 0.7580 | 227500 | - | 0.0001 | 0.9613 | -0.0069 | - |
-| 0.7597 | 228000 | 0.0001 | 0.0001 | 0.9620 | -0.0068 | - |
-| 0.7614 | 228500 | - | 0.0001 | 0.9616 | -0.0068 | - |
-| 0.7630 | 229000 | 0.0001 | 0.0001 | 0.9621 | -0.0068 | - |
-| 0.7647 | 229500 | - | 0.0001 | 0.9620 | -0.0069 | - |
-| 0.7664 | 230000 | 0.0001 | 0.0001 | 0.9618 | -0.0068 | - |
-| 0.7680 | 230500 | - | 0.0001 | 0.9616 | -0.0068 | - |
-| 0.7697 | 231000 | 0.0001 | 0.0001 | 0.9624 | -0.0068 | - |
-| 0.7714 | 231500 | - | 0.0001 | 0.9618 | -0.0068 | - |
-| 0.7730 | 232000 | 0.0001 | 0.0001 | 0.9621 | -0.0068 | - |
-| 0.7747 | 232500 | - | 0.0001 | 0.9618 | -0.0068 | - |
-| 0.7763 | 233000 | 0.0001 | 0.0001 | 0.9617 | -0.0068 | - |
-| 0.7780 | 233500 | - | 0.0001 | 0.9620 | -0.0068 | - |
-| 0.7797 | 234000 | 0.0001 | 0.0001 | 0.9620 | -0.0068 | - |
-| 0.7813 | 234500 | - | 0.0001 | 0.9624 | -0.0068 | - |
-| 0.7830 | 235000 | 0.0001 | 0.0001 | 0.9624 | -0.0068 | - |
-| 0.7847 | 235500 | - | 0.0001 | 0.9624 | -0.0068 | - |
-| 0.7863 | 236000 | 0.0001 | 0.0001 | 0.9627 | -0.0068 | - |
-| 0.7880 | 236500 | - | 0.0001 | 0.9620 | -0.0068 | - |
-| 0.7897 | 237000 | 0.0001 | 0.0001 | 0.9626 | -0.0068 | - |
-| 0.7913 | 237500 | - | 0.0001 | 0.9629 | -0.0068 | - |
-| 0.7930 | 238000 | 0.0001 | 0.0001 | 0.9621 | -0.0067 | - |
-| 0.7947 | 238500 | - | 0.0001 | 0.9630 | -0.0067 | - |
-| 0.7963 | 239000 | 0.0001 | 0.0001 | 0.9627 | -0.0067 | - |
-| 0.7980 | 239500 | - | 0.0001 | 0.9628 | -0.0068 | - |
-| 0.7997 | 240000 | 0.0001 | 0.0001 | 0.9626 | -0.0067 | - |
-| 0.8013 | 240500 | - | 0.0001 | 0.9624 | -0.0067 | - |
-| 0.8030 | 241000 | 0.0001 | 0.0001 | 0.9623 | -0.0067 | - |
-| 0.8047 | 241500 | - | 0.0001 | 0.9622 | -0.0067 | - |
-| 0.8063 | 242000 | 0.0001 | 0.0001 | 0.9620 | -0.0067 | - |
-| 0.8080 | 242500 | - | 0.0001 | 0.9622 | -0.0067 | - |
-| 0.8097 | 243000 | 0.0001 | 0.0001 | 0.9626 | -0.0067 | - |
-| 0.8113 | 243500 | - | 0.0001 | 0.9634 | -0.0067 | - |
-| 0.8130 | 244000 | 0.0001 | 0.0001 | 0.9623 | -0.0067 | - |
-| 0.8147 | 244500 | - | 0.0001 | 0.9632 | -0.0067 | - |
-| 0.8163 | 245000 | 0.0001 | 0.0001 | 0.9630 | -0.0067 | - |
-| 0.8180 | 245500 | - | 0.0001 | 0.9634 | -0.0067 | - |
-| 0.8197 | 246000 | 0.0001 | 0.0001 | 0.9627 | -0.0067 | - |
-| 0.8213 | 246500 | - | 0.0001 | 0.9625 | -0.0067 | - |
-| 0.8230 | 247000 | 0.0001 | 0.0001 | 0.9629 | -0.0067 | - |
-| 0.8247 | 247500 | - | 0.0001 | 0.9633 | -0.0067 | - |
-| 0.8263 | 248000 | 0.0001 | 0.0001 | 0.9628 | -0.0067 | - |
-| 0.8280 | 248500 | - | 0.0001 | 0.9636 | -0.0067 | - |
-| 0.8297 | 249000 | 0.0001 | 0.0001 | 0.9632 | -0.0067 | - |
-| 0.8313 | 249500 | - | 0.0001 | 0.9630 | -0.0067 | - |
-| 0.8330 | 250000 | 0.0001 | 0.0001 | 0.9639 | -0.0067 | - |
-| 0.8347 | 250500 | - | 0.0001 | 0.9633 | -0.0067 | - |
-| 0.8363 | 251000 | 0.0001 | 0.0001 | 0.9635 | -0.0066 | - |
-| 0.8380 | 251500 | - | 0.0001 | 0.9637 | -0.0066 | - |
-| 0.8397 | 252000 | 0.0001 | 0.0001 | 0.9632 | -0.0067 | - |
-| 0.8413 | 252500 | - | 0.0001 | 0.9638 | -0.0066 | - |
-| 0.8430 | 253000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
-| 0.8447 | 253500 | - | 0.0001 | 0.9635 | -0.0066 | - |
-| 0.8463 | 254000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
-| 0.8480 | 254500 | - | 0.0001 | 0.9630 | -0.0066 | - |
-| 0.8497 | 255000 | 0.0001 | 0.0001 | 0.9633 | -0.0066 | - |
-| 0.8513 | 255500 | - | 0.0001 | 0.9636 | -0.0066 | - |
-| 0.8530 | 256000 | 0.0001 | 0.0001 | 0.9635 | -0.0066 | - |
-| 0.8546 | 256500 | - | 0.0001 | 0.9640 | -0.0066 | - |
-| 0.8563 | 257000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
-| 0.8580 | 257500 | - | 0.0001 | 0.9636 | -0.0066 | - |
-| 0.8596 | 258000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
-| 0.8613 | 258500 | - | 0.0001 | 0.9636 | -0.0066 | - |
-| 0.8630 | 259000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
-| 0.8646 | 259500 | - | 0.0001 | 0.9635 | -0.0066 | - |
-| 0.8663 | 260000 | 0.0001 | 0.0001 | 0.9637 | -0.0066 | - |
-| 0.8680 | 260500 | - | 0.0001 | 0.9637 | -0.0066 | - |
-| 0.8696 | 261000 | 0.0001 | 0.0001 | 0.9639 | -0.0066 | - |
-| 0.8713 | 261500 | - | 0.0001 | 0.9640 | -0.0066 | - |
-| 0.8730 | 262000 | 0.0001 | 0.0001 | 0.9640 | -0.0066 | - |
-| 0.8746 | 262500 | - | 0.0001 | 0.9642 | -0.0066 | - |
-| 0.8763 | 263000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
-| 0.8780 | 263500 | - | 0.0001 | 0.9640 | -0.0066 | - |
-| 0.8796 | 264000 | 0.0001 | 0.0001 | 0.9642 | -0.0066 | - |
-| 0.8813 | 264500 | - | 0.0001 | 0.9640 | -0.0066 | - |
-| 0.8830 | 265000 | 0.0001 | 0.0001 | 0.9642 | -0.0066 | - |
-| 0.8846 | 265500 | - | 0.0001 | 0.9645 | -0.0066 | - |
-| 0.8863 | 266000 | 0.0001 | 0.0001 | 0.9637 | -0.0066 | - |
-| 0.8880 | 266500 | - | 0.0001 | 0.9640 | -0.0066 | - |
-| 0.8896 | 267000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.8913 | 267500 | - | 0.0001 | 0.9641 | -0.0065 | - |
-| 0.8930 | 268000 | 0.0001 | 0.0001 | 0.9639 | -0.0065 | - |
-| 0.8946 | 268500 | - | 0.0001 | 0.9642 | -0.0065 | - |
-| 0.8963 | 269000 | 0.0001 | 0.0001 | 0.9642 | -0.0065 | - |
-| 0.8980 | 269500 | - | 0.0001 | 0.9640 | -0.0065 | - |
-| 0.8996 | 270000 | 0.0001 | 0.0001 | 0.9642 | -0.0065 | - |
-| 0.9013 | 270500 | - | 0.0001 | 0.9639 | -0.0065 | - |
-| 0.9030 | 271000 | 0.0001 | 0.0001 | 0.9641 | -0.0065 | - |
-| 0.9046 | 271500 | - | 0.0001 | 0.9640 | -0.0065 | - |
-| 0.9063 | 272000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9080 | 272500 | - | 0.0001 | 0.9645 | -0.0065 | - |
-| 0.9096 | 273000 | 0.0001 | 0.0001 | 0.9645 | -0.0065 | - |
-| 0.9113 | 273500 | - | 0.0001 | 0.9645 | -0.0065 | - |
-| 0.9130 | 274000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9146 | 274500 | - | 0.0001 | 0.9645 | -0.0065 | - |
-| 0.9163 | 275000 | 0.0001 | 0.0001 | 0.9642 | -0.0065 | - |
-| 0.9180 | 275500 | - | 0.0001 | 0.9645 | -0.0065 | - |
-| 0.9196 | 276000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9213 | 276500 | - | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9230 | 277000 | 0.0001 | 0.0001 | 0.9644 | -0.0065 | - |
-| 0.9246 | 277500 | - | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9263 | 278000 | 0.0001 | 0.0001 | 0.9644 | -0.0065 | - |
-| 0.9280 | 278500 | - | 0.0001 | 0.9646 | -0.0065 | - |
-| 0.9296 | 279000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9313 | 279500 | - | 0.0001 | 0.9644 | -0.0065 | - |
-| 0.9330 | 280000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9346 | 280500 | - | 0.0001 | 0.9644 | -0.0065 | - |
-| 0.9363 | 281000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9379 | 281500 | - | 0.0001 | 0.9645 | -0.0065 | - |
-| 0.9396 | 282000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9413 | 282500 | - | 0.0001 | 0.9643 | -0.0065 | - |
-| 0.9429 | 283000 | 0.0001 | 0.0001 | 0.9646 | -0.0065 | - |
-| 0.9446 | 283500 | - | 0.0001 | 0.9644 | -0.0064 | - |
-| 0.9463 | 284000 | 0.0001 | 0.0001 | 0.9646 | -0.0065 | - |
-| 0.9479 | 284500 | - | 0.0001 | 0.9648 | -0.0064 | - |
-| 0.9496 | 285000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
-| 0.9513 | 285500 | - | 0.0001 | 0.9647 | -0.0064 | - |
-| 0.9529 | 286000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
-| 0.9546 | 286500 | - | 0.0001 | 0.9645 | -0.0064 | - |
-| 0.9563 | 287000 | 0.0001 | 0.0001 | 0.9646 | -0.0064 | - |
-| 0.9579 | 287500 | - | 0.0001 | 0.9647 | -0.0064 | - |
-| 0.9596 | 288000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
-| 0.9613 | 288500 | - | 0.0001 | 0.9647 | -0.0064 | - |
-| 0.9629 | 289000 | 0.0001 | 0.0001 | 0.9647 | -0.0064 | - |
-| 0.9646 | 289500 | - | 0.0001 | 0.9647 | -0.0064 | - |
-| 0.9663 | 290000 | 0.0001 | 0.0001 | 0.9649 | -0.0064 | - |
-| 0.9679 | 290500 | - | 0.0001 | 0.9648 | -0.0064 | - |
-| 0.9696 | 291000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
-| 0.9713 | 291500 | - | 0.0001 | 0.9649 | -0.0064 | - |
-| 0.9729 | 292000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
-| 0.9746 | 292500 | - | 0.0001 | 0.9649 | -0.0064 | - |
-| 0.9763 | 293000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
-| 0.9779 | 293500 | - | 0.0001 | 0.9648 | -0.0064 | - |
-| 0.9796 | 294000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
-| 0.9813 | 294500 | - | 0.0001 | 0.9650 | -0.0064 | - |
-| 0.9829 | 295000 | 0.0001 | 0.0001 | 0.9649 | -0.0064 | - |
-| 0.9846 | 295500 | - | 0.0001 | 0.9650 | -0.0064 | - |
-| 0.9863 | 296000 | 0.0001 | 0.0001 | 0.9649 | -0.0064 | - |
-| 0.9879 | 296500 | - | 0.0001 | 0.9650 | -0.0064 | - |
-| 0.9896 | 297000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
-| **0.9913** | **297500** | **-** | **0.0001** | **0.9651** | **-0.0064** | **-** |
-| 0.9929 | 298000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
-| 0.9946 | 298500 | - | 0.0001 | 0.9650 | -0.0064 | - |
-| 0.9963 | 299000 | 0.0001 | 0.0001 | 0.9651 | -0.0064 | - |
-| 0.9979 | 299500 | - | 0.0001 | 0.9650 | -0.0064 | - |
-| 0.9996 | 300000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
-| 1.0 | 300123 | - | - | - | - | 0.9651 |
-
-* The bold row denotes the saved checkpoint.
-
-
-### Framework Versions
-- Python: 3.12.4
-- Sentence Transformers: 3.3.1
-- Transformers: 4.48.0
-- PyTorch: 2.4.1+cu121
-- Accelerate: 1.0.1
-- Datasets: 2.19.0
-- Tokenizers: 0.21.0
-
## Citation
### BibTeX
@@ -1065,20 +214,14 @@ You can finetune this model on your own dataset.
}
```
-
-
-
-
-
\ No newline at end of file
+#### bge-m3
+```bibtex
+@misc{bge-m3,
+ title={BGE M3-Embedding: Multi-Lingual, Multi-Functionality, Multi-Granularity Text Embeddings Through Self-Knowledge Distillation},
+ author={Jianlv Chen and Shitao Xiao and Peitian Zhang and Kun Luo and Defu Lian and Zheng Liu},
+ year={2024},
+ eprint={2402.03216},
+ archivePrefix={arXiv},
+ primaryClass={cs.CL}
+}
+```
\ No newline at end of file