metadata
language:
- tr
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:9623924
- loss:MSELoss
base_model: BAAI/bge-m3
widget:
- source_sentence: >-
Ak Hunlar'ın kültürel etkileşimleri ve mirasları hakkında ne
söyleyebiliriz? Ak Hunlar'ın diğer kültürler üzerindeki etkileri ve izleri
nelerdir?
sentences:
- Film, hangi oyun yazarının hayatını konu almaktadır?
- Bir Eskişehir-Afyonkarahisar tren yolculuğu ne kadar sürmektedir?
- >-
Mektupta, Türkiye'nin adaya tek taraflı müdahalesinin Türk ve Yunan
tarafları arasında savaşa yol açabileceği ve NATO üyesi olan bu iki
ülkenin savaşmasının kabul edilemez olduğu ifade edilmiştir. Türkiye'nin
müdahale kararı almadan önce müttefiklerine danışması gerektiği
anımsatılmıştır. Ayrıca bu savaşın Sovyetler Birliği'nin de Türkiye'ye
müdahale ihtimalini doğuracağı ve NATO'nun böyle bir durumda Türkiye'yi
savunma konusunda isteksiz olacağı ima edilmiştir. ABD'nin Türkiye'ye
sağladığı askeri malzemenin bu müdahalede kullanılmasına izin
verilmeyeceği belirtilmiştir. Mektubun ardından Türkiye müdahale
kararından vazgeçmiştir. İsmet İnönü 21 Haziran 1964'te ABD'ye giderek
başkan Johnson ile bir görüşmede bulunmuştur.
- source_sentence: >-
Evet, metinde teslimiyetçilik, edilgenlik veya boyun eğme olarak da
tanımlanmaktadır.
sentences:
- Cezary Kucharski'nin doğduğu tarih nedir?
- >-
Beylerbeyi Camii, 2013 yılında yapılan restorasyon çalışmaları
sonrasında ne durumda?
- >-
İkinci Dünya Savaşı esnasında ve sonrasında elektroniklerin doğasından
kaynaklanan birçok güvenilir olmama durumu ve ürün yorgunluğu gündeme
geldi. 1945'te M.A. Miner, ASME (Amerikan Makine Mühendisleri Topluluğu)
Dergisi içerisinde "Yorulma Esnasında Birikimli Hasar" adında taslak bir
yazı paylaştı. Ordu için uygulanan ilk güvenilirlik hususu, Radar
Sistemleri ve diğer elektronik parçalarda kullanılan, yine güvenilirlik
analizi sayesinde kanıtlanmış, oldukça arıza çıkarmaya yatkın ve
maliyetli bir vakum silindiri idi. Elektrik ve Elektronik Mühendisleri
Enstitüsü, 1948 yılında Güvenilirlik Topluluğunu kurmuştur. 1950 yılı
içerisinde, asker tarafında, Elektronik Ekipman Güvenilirliği Tavsiye
Grubu kurulmuştur. Bu grup, 3 ana çalışma yolu tavsiye etmiştir. Bunlar:
Parça güvenilirliğinin arttırılması,
Tedarikçiler için kalite ve güvenilirlik gereksinimlerinin tanımlanması,
Saha verilerinin toplanması ve kök analiz yapılması.
- source_sentence: >-
Belgrad'ın ele geçirilmesinde Klingenberg'in rolü nedir ve bu olay nasıl
gerçekleşti?
sentences:
- Jimmy White ve Peter Ebdon.
- >-
DualSense kontrolörünün titreşim özelliği hakkında detaylı bilgi verir
misiniz?
- |-
Kozluk, Kocaeli ilinin İzmit ilçesine bağlı bir mahalledir.
Nüfus
Kaynakça
İzmit'in mahalleleri
- source_sentence: >-
1996 yılında kurulmuştur. Ağırlıklı olarak standart caz repertuvarından
parçalar sunmuşlardır.
sentences:
- San Leucio'nun coğrafi konumu hakkında bilgi verir misiniz?
- Kinik felsefesinin öncüsüdür.
- >-
Aydın Doğu Demirkol'un vizyona girmesi planlanan sinema filmleri
nelerdir ve yönetmenleri kimlerdir?
- source_sentence: >-
Serbest pazar prensiplerinin varlıklı ve yoksul futbol kulüpleri
arasındaki farkı büyütmesine yönelik kaygılar nedeniyle bu durum
önemlidir.
sentences:
- >-
Yazar, 12 Mart baskınlarının ve işkencelerinin sonucunda, ideolojik
kimlikleriyle küçük burjuva kimlikleri arasında çelişkiye düşen
devrimcilerin rejime boyun eğmelerini gösterme çabasındadır.
- >-
Verilen kesin süre
içinde şikayetçi tarafından ilgili masraflar yatırıldığından PTT’ce söz
konusu
keşif avansının geri gönderilmesi sonucu talimat
mahkemesince keşif yapılmamış ise de burada şikayetçiye atfedilebilecek
bir kusur
bulunmadığından, keşif avansının ilgili mahkemeye tekrar gönderilerek
keşfin
yapılmasının sağlanarak oluşacak sonuca göre bir karar verilmesi
gerekir.
- >-
This Kind of Bird Flies Backwards (Bu Cins Kuş Tersten Uçar) adlı ilk
kitabı, LeRoy Jones ve Hettie Jones'un kurduğu Totem Press tarafından
1958 yılında yayınlandı.
datasets:
- altaidevorg/tr-sentences
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- negative_mse
model-index:
- name: SentenceTransformer based on BAAI/bge-m3
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts dev
type: sts-dev
metrics:
- type: pearson_cosine
value: 0.9691269661048901
name: Pearson Cosine
- type: spearman_cosine
value: 0.9650087926361528
name: Spearman Cosine
- task:
type: knowledge-distillation
name: Knowledge Distillation
dataset:
name: Unknown
type: unknown
metrics:
- type: negative_mse
value: -0.006388394831446931
name: Negative Mse
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test
type: sts-test
metrics:
- type: pearson_cosine
value: 0.9691398285942048
name: Pearson Cosine
- type: spearman_cosine
value: 0.9650683134098942
name: Spearman Cosine
SentenceTransformer based on BAAI/bge-m3
This is a sentence-transformers model finetuned from BAAI/bge-m3 on the tr-sentences dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: BAAI/bge-m3
- Maximum Sequence Length: 8192 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: tr-sentences
- Language: tr
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
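The module list above can be inspected directly after loading the model. A minimal sketch follows (using the same placeholder model id as in the Usage section below); because of the final Normalize() module, every embedding comes out unit-length, so cosine similarity reduces to a dot product.
from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("sentence_transformers_model_id")  # placeholder id, as in Usage below

# SentenceTransformer behaves like an nn.Sequential: Transformer -> CLS Pooling -> Normalize
for idx, module in enumerate(model):
    print(idx, module.__class__.__name__)

emb = model.encode(["Merhaba dünya"])
print(emb.shape)               # (1, 1024)
print(np.linalg.norm(emb[0]))  # ~1.0, thanks to the Normalize() module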
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'Serbest pazar prensiplerinin varlıklı ve yoksul futbol kulüpleri arasındaki farkı büyütmesine yönelik kaygılar nedeniyle bu durum önemlidir.',
'Yazar, 12 Mart baskınlarının ve işkencelerinin sonucunda, ideolojik kimlikleriyle küçük burjuva kimlikleri arasında çelişkiye düşen devrimcilerin rejime boyun eğmelerini gösterme çabasındadır.',
"This Kind of Bird Flies Backwards (Bu Cins Kuş Tersten Uçar) adlı ilk kitabı, LeRoy Jones ve Hettie Jones'un kurduğu Totem Press tarafından 1958 yılında yayınlandı.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
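As a follow-up sketch, the same embeddings can drive semantic search over a small corpus. The corpus and query below are made-up Turkish examples, and util.semantic_search is the standard Sentence Transformers helper for cosine-similarity retrieval.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence_transformers_model_id")

# Made-up corpus and query, for illustration only
corpus = [
    "Kozluk, Kocaeli ilinin İzmit ilçesine bağlı bir mahalledir.",
    "Beylerbeyi Camii 2013 yılında restore edilmiştir.",
    "DualSense kontrolcüsü gelişmiş titreşim özelliklerine sahiptir.",
]
query = "İzmit'e bağlı mahalleler hangileridir?"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Top-2 most similar corpus entries for the query
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(corpus[hit["corpus_id"]], round(hit["score"], 3))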
Evaluation
Metrics
Semantic Similarity
- Datasets: sts-dev and sts-test
- Evaluated with EmbeddingSimilarityEvaluator
Metric | sts-dev | sts-test |
---|---|---|
pearson_cosine | 0.9691 | 0.9691 |
spearman_cosine | 0.965 | 0.9651 |
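These scores are produced by Sentence Transformers' EmbeddingSimilarityEvaluator. The actual sts-dev/sts-test pairs are not included in this card, so the snippet below is only a hedged sketch with placeholder sentence pairs and gold scores.
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator, SimilarityFunction

model = SentenceTransformer("sentence_transformers_model_id")

# Placeholder STS-style data: paired sentences with gold similarity scores in [0, 1]
sentences1 = ["Kedi halının üzerinde uyuyor.", "Ankara Türkiye'nin başkentidir."]
sentences2 = ["Bir kedi halıda uyumaktadır.", "İstanbul Türkiye'nin en kalabalık şehridir."]
gold_scores = [0.95, 0.30]

evaluator = EmbeddingSimilarityEvaluator(
    sentences1,
    sentences2,
    gold_scores,
    main_similarity=SimilarityFunction.COSINE,
    name="sts-dev",
)
print(evaluator(model))  # dict with keys such as sts-dev_pearson_cosine / sts-dev_spearman_cosine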
Knowledge Distillation
- Evaluated with MSEEvaluator
Metric | Value |
---|---|
negative_mse | -0.0064 |
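The negative MSE comes from MSEEvaluator, which compares the student's embeddings against a teacher's on the same sentences. Below is a rough, hypothetical reproduction, assuming the base BAAI/bge-m3 model acted as the teacher (the card does not state this explicitly).
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import MSEEvaluator

student = SentenceTransformer("sentence_transformers_model_id")
teacher = SentenceTransformer("BAAI/bge-m3")  # assumed teacher

# A few held-out Turkish sentences; the real evaluation split is not published here
eval_sentences = [
    "Ak Hunlar'ın kültürel etkileşimleri ve mirasları nelerdir?",
    "Bir Eskişehir-Afyonkarahisar tren yolculuğu ne kadar sürmektedir?",
]

mse_evaluator = MSEEvaluator(
    source_sentences=eval_sentences,
    target_sentences=eval_sentences,
    teacher_model=teacher,
    name="tr-sentences-dev",
)
print(mse_evaluator(student))  # negative MSE; values closer to 0 are better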
Training Details
Training Dataset
tr-sentences
- Dataset: tr-sentences at f5ebc52
- Size: 9,623,924 training samples
- Columns: sentence and label
- Approximate statistics based on the first 1000 samples:
 | sentence | label |
---|---|---|
type | string | list |
details | min: 5 tokens, mean: 55.78 tokens, max: 468 tokens | size: 1024 elements |
- Samples:
sentence | label |
---|---|
NBA tarihinde bu ödülü en çok kaç kez kim kazanmıştır? | [-0.027497457340359688, -0.024517377838492393, -0.013820995576679707, 0.00024465256137773395, -0.020534219220280647, ...] |
Romero ve yapımcı Richard P. Rubinstein, yeni bir proje için herhangi bir yerli yatırımcılara temin koyamadıklarını söyledi. Romero Şans eseri, İtalyan korku yönetmeni Dario Argento'ya ulaştı. bu film Yaşayan Ölülerin Gecesi filmin'in kritik savunucusudur, Argento filmin korku klasik arasında yer almasına yardımcı olmak için istekliydi. uluslararası dağıtım hakları karşılığında finansman sağlamak için, Romero ve Rubinstein bir araya geldi. Senaryoyu yazarken bir sahnede değişiklik yapmak için Argento Roma'yı Romero filme davet etti. İkisi de daha sonra arsa gelişmelerini tartışmak için bir olabilirdi. Romero Monroeville Mall'ın durumunun yanı sıra Oxford Kalkınma'da alışveriş merkezi sahipleri ile bağlantıları ile ek bir güvenli finansman başardı. Döküm tamamlandıktan sonra, başlıca çekim tarihinin 13 Kasım, 1977 tarihinde film'in Pensilvanya'da başlaması planlanıyordu. | [-0.02431895025074482, -0.03177526593208313, -0.010546382516622543, 0.0393124595284462, -0.03390512242913246, ...] |
Evet, Nasuhlar ismi Adapazarı, Kandıra ve Yenipazar ilçelerinde farklı yer isimlerine aittir. | [0.0020795632153749466, -0.013080586679279804, -0.018256550654768944, 0.022429518401622772, -0.03087380714714527, ...] |
- Loss: MSELoss
Evaluation Dataset
tr-sentences
- Dataset: tr-sentences at f5ebc52
- Size: 9,623,924 evaluation samples
- Columns: sentence and label
- Approximate statistics based on the first 1000 samples:
 | sentence | label |
---|---|---|
type | string | list |
details | min: 3 tokens, mean: 51.95 tokens, max: 614 tokens | size: 1024 elements |
- Samples:
sentence | label |
---|---|
Bernhard, şiirle yazarlık hayatına başlamış ve 1963'te "Frost" (Don) adlı ilk romanını yayınlamıştır. 1957'den itibaren serbest yazarlık yapmaya başlamış ve hayatı boyunca yazarlık sayesinde geçimini sağlamıştır. | [-0.019921669736504555, -0.007309767417609692, 0.01690034568309784, -0.03302725777029991, -0.003539217868819833, ...] |
Sonraki maçta AJ Styles ile Kevin Owens, WWE Birleşik Devletler Şampiyonluğu kemeri için maça çıktı. Shane McMahon, maçın özel konuk hakemliğini yaptı. As Shane, Owens'ı kontrol etti. Styles, Owens'a Springboard 450 Splash yapmaya çalışırken yanlışlıkla Shane'e de yaptı. Owens, Styles'a Pop Up Powerbomb yaptıktan sonra Styles'ı tuşlamaya çalıştı ancak Styles son anda kurtuldu. Owens, Shane'in kararını beğenmeyince ikisi arasında kısa süreli bir tartışma oldu. Owens, Styles'ın Calf Crusher hareketini karşıladıktan sonra Styles'tan tekme yiyince Shane'in üzerine düştü. Styles, Owens'ı Calf Crusher ile pes ettirse de ringin dışında aşağıda yatan Shane bunu göremedi. Bunun üzerine Styles da Shane ile tartıştı. Styles, Owens'a Styles Clash yaptıktan sonra tuşa gitti ancak Owens son anda kurtuldu. Owens'ın yaptığı Pop Up Powerbomb'dan sonra Styles'ı tuşladı ancak Shane son anda Styles'ın ayağının iplerde olduğunu fark edince tuşu iptal etti. Owens ve Shane tartışmaya başladı ve Shane, | [0.04532943293452263, -0.007217255420982838, -0.019380981102585793, -0.0026675150729715824, 0.018997980281710625, ...] |
Leylek yavruları, anne ve babaları tarafından yiyip kısmen sindirdikleri besinleri kusarak beslenirler. Anne leylek yavruları yağmur, fırtına ve güneşten korurken, baba leylek yavrularını beslemekle yükümlüdür. | [-0.055585864931344986, 0.045432090759277344, -0.04405859857797623, 0.0009241091320291162, -0.0689476728439331, ...] |
- Loss: MSELoss
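In both splits the label column is a 1024-element vector, i.e. the same dimensionality the base model outputs, which fits a knowledge-distillation setup where precomputed teacher embeddings act as regression targets for MSELoss. Here is a hedged sketch of how such (sentence, label) rows could be built, assuming BAAI/bge-m3 as the teacher; the dataset's actual construction is documented with tr-sentences itself.
from datasets import Dataset
from sentence_transformers import SentenceTransformer

teacher = SentenceTransformer("BAAI/bge-m3")  # assumed teacher, for illustration

sentences = [
    "Kinik felsefesinin öncüsüdür.",
    "San Leucio'nun coğrafi konumu hakkında bilgi verir misiniz?",
]

# Teacher embeddings are computed once and stored as the "label" column,
# which MSELoss later regresses the student's embeddings onto.
labels = teacher.encode(sentences)
train_dataset = Dataset.from_dict({
    "sentence": sentences,
    "label": [emb.tolist() for emb in labels],
})
print(train_dataset)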
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- learning_rate: 0.0001
- num_train_epochs: 1
- warmup_ratio: 0.1
- bf16: True
- load_best_model_at_end: True
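Under the same assumptions, the non-default hyperparameters above map onto a SentenceTransformerTrainer run roughly as follows; this is a sketch rather than the exact training script, and the split handling for altaidevorg/tr-sentences is a guess.
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MSELoss

student = SentenceTransformer("BAAI/bge-m3")
loss = MSELoss(student)

# The card's training data; split names and eval size here are assumptions
dataset = load_dataset("altaidevorg/tr-sentences", split="train")
dataset = dataset.train_test_split(test_size=10_000, seed=42)

args = SentenceTransformerTrainingArguments(
    output_dir="bge-m3-tr-distill",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    learning_rate=1e-4,
    num_train_epochs=1,
    warmup_ratio=0.1,
    bf16=True,
    load_best_model_at_end=True,
)

trainer = SentenceTransformerTrainer(
    model=student,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    loss=loss,
)
trainer.train()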
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 0.0001
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
Training Logs
Click to expand
Epoch | Step | Training Loss | Validation Loss | sts-dev_spearman_cosine | negative_mse | sts-test_spearman_cosine |
---|---|---|---|---|---|---|
0 | 0 | - | - | 0.0074 | -0.1913 | - |
0.0017 | 500 | - | 0.0009 | 0.3279 | -0.0860 | - |
0.0033 | 1000 | 0.001 | 0.0007 | 0.5478 | -0.0651 | - |
0.0050 | 1500 | - | 0.0006 | 0.6221 | -0.0573 | - |
0.0067 | 2000 | 0.0007 | 0.0005 | 0.6635 | -0.0523 | - |
0.0083 | 2500 | - | 0.0005 | 0.6916 | -0.0485 | - |
0.0100 | 3000 | 0.0006 | 0.0005 | 0.7148 | -0.0455 | - |
0.0117 | 3500 | - | 0.0004 | 0.7319 | -0.0429 | - |
0.0133 | 4000 | 0.0005 | 0.0004 | 0.7485 | -0.0406 | - |
0.0150 | 4500 | - | 0.0004 | 0.7622 | -0.0385 | - |
0.0167 | 5000 | 0.0005 | 0.0004 | 0.7722 | -0.0368 | - |
0.0183 | 5500 | - | 0.0004 | 0.7856 | -0.0352 | - |
0.0200 | 6000 | 0.0004 | 0.0003 | 0.7999 | -0.0336 | - |
0.0217 | 6500 | - | 0.0003 | 0.8074 | -0.0323 | - |
0.0233 | 7000 | 0.0004 | 0.0003 | 0.8155 | -0.0311 | - |
0.0250 | 7500 | - | 0.0003 | 0.8237 | -0.0299 | - |
0.0267 | 8000 | 0.0004 | 0.0003 | 0.8308 | -0.0289 | - |
0.0283 | 8500 | - | 0.0003 | 0.8322 | -0.0280 | - |
0.0300 | 9000 | 0.0004 | 0.0003 | 0.8409 | -0.0270 | - |
0.0317 | 9500 | - | 0.0003 | 0.8446 | -0.0262 | - |
0.0333 | 10000 | 0.0003 | 0.0003 | 0.8513 | -0.0254 | - |
0.0350 | 10500 | - | 0.0002 | 0.8519 | -0.0247 | - |
0.0367 | 11000 | 0.0003 | 0.0002 | 0.8591 | -0.0240 | - |
0.0383 | 11500 | - | 0.0002 | 0.8623 | -0.0233 | - |
0.0400 | 12000 | 0.0003 | 0.0002 | 0.8674 | -0.0228 | - |
0.0416 | 12500 | - | 0.0002 | 0.8659 | -0.0222 | - |
0.0433 | 13000 | 0.0003 | 0.0002 | 0.8724 | -0.0215 | - |
0.0450 | 13500 | - | 0.0002 | 0.8725 | -0.0212 | - |
0.0466 | 14000 | 0.0003 | 0.0002 | 0.8793 | -0.0206 | - |
0.0483 | 14500 | - | 0.0002 | 0.8834 | -0.0202 | - |
0.0500 | 15000 | 0.0003 | 0.0002 | 0.8817 | -0.0197 | - |
0.0516 | 15500 | - | 0.0002 | 0.8860 | -0.0194 | - |
0.0533 | 16000 | 0.0003 | 0.0002 | 0.8842 | -0.0188 | - |
0.0550 | 16500 | - | 0.0002 | 0.8893 | -0.0185 | - |
0.0566 | 17000 | 0.0002 | 0.0002 | 0.8880 | -0.0181 | - |
0.0583 | 17500 | - | 0.0002 | 0.8932 | -0.0179 | - |
0.0600 | 18000 | 0.0002 | 0.0002 | 0.8913 | -0.0176 | - |
0.0616 | 18500 | - | 0.0002 | 0.8963 | -0.0172 | - |
0.0633 | 19000 | 0.0002 | 0.0002 | 0.8915 | -0.0170 | - |
0.0650 | 19500 | - | 0.0002 | 0.8969 | -0.0167 | - |
0.0666 | 20000 | 0.0002 | 0.0002 | 0.8984 | -0.0165 | - |
0.0683 | 20500 | - | 0.0002 | 0.9021 | -0.0162 | - |
0.0700 | 21000 | 0.0002 | 0.0002 | 0.9027 | -0.0160 | - |
0.0716 | 21500 | - | 0.0002 | 0.9018 | -0.0158 | - |
0.0733 | 22000 | 0.0002 | 0.0002 | 0.9043 | -0.0156 | - |
0.0750 | 22500 | - | 0.0002 | 0.9028 | -0.0154 | - |
0.0766 | 23000 | 0.0002 | 0.0002 | 0.9024 | -0.0153 | - |
0.0783 | 23500 | - | 0.0002 | 0.9049 | -0.0152 | - |
0.0800 | 24000 | 0.0002 | 0.0001 | 0.9087 | -0.0150 | - |
0.0816 | 24500 | - | 0.0001 | 0.9079 | -0.0148 | - |
0.0833 | 25000 | 0.0002 | 0.0001 | 0.9080 | -0.0147 | - |
0.0850 | 25500 | - | 0.0001 | 0.9096 | -0.0145 | - |
0.0866 | 26000 | 0.0002 | 0.0001 | 0.9061 | -0.0145 | - |
0.0883 | 26500 | - | 0.0001 | 0.9098 | -0.0143 | - |
0.0900 | 27000 | 0.0002 | 0.0001 | 0.9122 | -0.0142 | - |
0.0916 | 27500 | - | 0.0001 | 0.9131 | -0.0140 | - |
0.0933 | 28000 | 0.0002 | 0.0001 | 0.9114 | -0.0139 | - |
0.0950 | 28500 | - | 0.0001 | 0.9126 | -0.0139 | - |
0.0966 | 29000 | 0.0002 | 0.0001 | 0.9163 | -0.0138 | - |
0.0983 | 29500 | - | 0.0001 | 0.9140 | -0.0137 | - |
0.1000 | 30000 | 0.0002 | 0.0001 | 0.9141 | -0.0136 | - |
0.1016 | 30500 | - | 0.0001 | 0.9163 | -0.0135 | - |
0.1033 | 31000 | 0.0002 | 0.0001 | 0.9159 | -0.0135 | - |
0.1050 | 31500 | - | 0.0001 | 0.9153 | -0.0132 | - |
0.1066 | 32000 | 0.0002 | 0.0001 | 0.9194 | -0.0131 | - |
0.1083 | 32500 | - | 0.0001 | 0.9203 | -0.0131 | - |
0.1100 | 33000 | 0.0002 | 0.0001 | 0.9187 | -0.0129 | - |
0.1116 | 33500 | - | 0.0001 | 0.9218 | -0.0129 | - |
0.1133 | 34000 | 0.0002 | 0.0001 | 0.9204 | -0.0127 | - |
0.1150 | 34500 | - | 0.0001 | 0.9216 | -0.0127 | - |
0.1166 | 35000 | 0.0002 | 0.0001 | 0.9232 | -0.0125 | - |
0.1183 | 35500 | - | 0.0001 | 0.9212 | -0.0125 | - |
0.1200 | 36000 | 0.0002 | 0.0001 | 0.9227 | -0.0125 | - |
0.1216 | 36500 | - | 0.0001 | 0.9233 | -0.0124 | - |
0.1233 | 37000 | 0.0002 | 0.0001 | 0.9261 | -0.0123 | - |
0.1249 | 37500 | - | 0.0001 | 0.9256 | -0.0122 | - |
0.1266 | 38000 | 0.0002 | 0.0001 | 0.9273 | -0.0121 | - |
0.1283 | 38500 | - | 0.0001 | 0.9274 | -0.0120 | - |
0.1299 | 39000 | 0.0002 | 0.0001 | 0.9273 | -0.0119 | - |
0.1316 | 39500 | - | 0.0001 | 0.9287 | -0.0119 | - |
0.1333 | 40000 | 0.0002 | 0.0001 | 0.9266 | -0.0118 | - |
0.1349 | 40500 | - | 0.0001 | 0.9283 | -0.0118 | - |
0.1366 | 41000 | 0.0002 | 0.0001 | 0.9307 | -0.0117 | - |
0.1383 | 41500 | - | 0.0001 | 0.9277 | -0.0117 | - |
0.1399 | 42000 | 0.0002 | 0.0001 | 0.9281 | -0.0115 | - |
0.1416 | 42500 | - | 0.0001 | 0.9299 | -0.0115 | - |
0.1433 | 43000 | 0.0002 | 0.0001 | 0.9306 | -0.0115 | - |
0.1449 | 43500 | - | 0.0001 | 0.9301 | -0.0114 | - |
0.1466 | 44000 | 0.0002 | 0.0001 | 0.9302 | -0.0114 | - |
0.1483 | 44500 | - | 0.0001 | 0.9321 | -0.0114 | - |
0.1499 | 45000 | 0.0002 | 0.0001 | 0.9320 | -0.0113 | - |
0.1516 | 45500 | - | 0.0001 | 0.9333 | -0.0112 | - |
0.1533 | 46000 | 0.0002 | 0.0001 | 0.9343 | -0.0111 | - |
0.1549 | 46500 | - | 0.0001 | 0.9315 | -0.0111 | - |
0.1566 | 47000 | 0.0002 | 0.0001 | 0.9326 | -0.0111 | - |
0.1583 | 47500 | - | 0.0001 | 0.9324 | -0.0110 | - |
0.1599 | 48000 | 0.0001 | 0.0001 | 0.9362 | -0.0110 | - |
0.1616 | 48500 | - | 0.0001 | 0.9370 | -0.0109 | - |
0.1633 | 49000 | 0.0001 | 0.0001 | 0.9348 | -0.0109 | - |
0.1649 | 49500 | - | 0.0001 | 0.9352 | -0.0108 | - |
0.1666 | 50000 | 0.0001 | 0.0001 | 0.9364 | -0.0107 | - |
0.1683 | 50500 | - | 0.0001 | 0.9351 | -0.0107 | - |
0.1699 | 51000 | 0.0001 | 0.0001 | 0.9372 | -0.0108 | - |
0.1716 | 51500 | - | 0.0001 | 0.9357 | -0.0108 | - |
0.1733 | 52000 | 0.0001 | 0.0001 | 0.9384 | -0.0106 | - |
0.1749 | 52500 | - | 0.0001 | 0.9366 | -0.0106 | - |
0.1766 | 53000 | 0.0001 | 0.0001 | 0.9375 | -0.0106 | - |
0.1783 | 53500 | - | 0.0001 | 0.9381 | -0.0105 | - |
0.1799 | 54000 | 0.0001 | 0.0001 | 0.9382 | -0.0105 | - |
0.1816 | 54500 | - | 0.0001 | 0.9368 | -0.0106 | - |
0.1833 | 55000 | 0.0001 | 0.0001 | 0.9383 | -0.0105 | - |
0.1849 | 55500 | - | 0.0001 | 0.9393 | -0.0104 | - |
0.1866 | 56000 | 0.0001 | 0.0001 | 0.9383 | -0.0104 | - |
0.1883 | 56500 | - | 0.0001 | 0.9397 | -0.0104 | - |
0.1899 | 57000 | 0.0001 | 0.0001 | 0.9404 | -0.0103 | - |
0.1916 | 57500 | - | 0.0001 | 0.9378 | -0.0103 | - |
0.1933 | 58000 | 0.0001 | 0.0001 | 0.9379 | -0.0103 | - |
0.1949 | 58500 | - | 0.0001 | 0.9397 | -0.0102 | - |
0.1966 | 59000 | 0.0001 | 0.0001 | 0.9406 | -0.0102 | - |
0.1983 | 59500 | - | 0.0001 | 0.9402 | -0.0102 | - |
0.1999 | 60000 | 0.0001 | 0.0001 | 0.9408 | -0.0101 | - |
0.2016 | 60500 | - | 0.0001 | 0.9410 | -0.0101 | - |
0.2033 | 61000 | 0.0001 | 0.0001 | 0.9409 | -0.0101 | - |
0.2049 | 61500 | - | 0.0001 | 0.9405 | -0.0101 | - |
0.2066 | 62000 | 0.0001 | 0.0001 | 0.9424 | -0.0100 | - |
0.2082 | 62500 | - | 0.0001 | 0.9378 | -0.0101 | - |
0.2099 | 63000 | 0.0001 | 0.0001 | 0.9408 | -0.0099 | - |
0.2116 | 63500 | - | 0.0001 | 0.9404 | -0.0100 | - |
0.2132 | 64000 | 0.0001 | 0.0001 | 0.9397 | -0.0099 | - |
0.2149 | 64500 | - | 0.0001 | 0.9411 | -0.0099 | - |
0.2166 | 65000 | 0.0001 | 0.0001 | 0.9401 | -0.0099 | - |
0.2182 | 65500 | - | 0.0001 | 0.9415 | -0.0098 | - |
0.2199 | 66000 | 0.0001 | 0.0001 | 0.9413 | -0.0098 | - |
0.2216 | 66500 | - | 0.0001 | 0.9417 | -0.0098 | - |
0.2232 | 67000 | 0.0001 | 0.0001 | 0.9411 | -0.0097 | - |
0.2249 | 67500 | - | 0.0001 | 0.9423 | -0.0097 | - |
0.2266 | 68000 | 0.0001 | 0.0001 | 0.9424 | -0.0097 | - |
0.2282 | 68500 | - | 0.0001 | 0.9424 | -0.0098 | - |
0.2299 | 69000 | 0.0001 | 0.0001 | 0.9439 | -0.0096 | - |
0.2316 | 69500 | - | 0.0001 | 0.9423 | -0.0097 | - |
0.2332 | 70000 | 0.0001 | 0.0001 | 0.9420 | -0.0096 | - |
0.2349 | 70500 | - | 0.0001 | 0.9429 | -0.0096 | - |
0.2366 | 71000 | 0.0001 | 0.0001 | 0.9440 | -0.0096 | - |
0.2382 | 71500 | - | 0.0001 | 0.9425 | -0.0096 | - |
0.2399 | 72000 | 0.0001 | 0.0001 | 0.9438 | -0.0096 | - |
0.2416 | 72500 | - | 0.0001 | 0.9442 | -0.0095 | - |
0.2432 | 73000 | 0.0001 | 0.0001 | 0.9451 | -0.0095 | - |
0.2449 | 73500 | - | 0.0001 | 0.9432 | -0.0095 | - |
0.2466 | 74000 | 0.0001 | 0.0001 | 0.9441 | -0.0095 | - |
0.2482 | 74500 | - | 0.0001 | 0.9442 | -0.0094 | - |
0.2499 | 75000 | 0.0001 | 0.0001 | 0.9436 | -0.0094 | - |
0.2516 | 75500 | - | 0.0001 | 0.9450 | -0.0094 | - |
0.2532 | 76000 | 0.0001 | 0.0001 | 0.9455 | -0.0094 | - |
0.2549 | 76500 | - | 0.0001 | 0.9439 | -0.0094 | - |
0.2566 | 77000 | 0.0001 | 0.0001 | 0.9444 | -0.0094 | - |
0.2582 | 77500 | - | 0.0001 | 0.9449 | -0.0093 | - |
0.2599 | 78000 | 0.0001 | 0.0001 | 0.9444 | -0.0093 | - |
0.2616 | 78500 | - | 0.0001 | 0.9454 | -0.0093 | - |
0.2632 | 79000 | 0.0001 | 0.0001 | 0.9452 | -0.0093 | - |
0.2649 | 79500 | - | 0.0001 | 0.9465 | -0.0093 | - |
0.2666 | 80000 | 0.0001 | 0.0001 | 0.9450 | -0.0093 | - |
0.2682 | 80500 | - | 0.0001 | 0.9467 | -0.0092 | - |
0.2699 | 81000 | 0.0001 | 0.0001 | 0.9470 | -0.0092 | - |
0.2716 | 81500 | - | 0.0001 | 0.9447 | -0.0092 | - |
0.2732 | 82000 | 0.0001 | 0.0001 | 0.9477 | -0.0092 | - |
0.2749 | 82500 | - | 0.0001 | 0.9442 | -0.0092 | - |
0.2766 | 83000 | 0.0001 | 0.0001 | 0.9482 | -0.0091 | - |
0.2782 | 83500 | - | 0.0001 | 0.9475 | -0.0091 | - |
0.2799 | 84000 | 0.0001 | 0.0001 | 0.9451 | -0.0091 | - |
0.2816 | 84500 | - | 0.0001 | 0.9471 | -0.0091 | - |
0.2832 | 85000 | 0.0001 | 0.0001 | 0.9470 | -0.0090 | - |
0.2849 | 85500 | - | 0.0001 | 0.9468 | -0.0091 | - |
0.2865 | 86000 | 0.0001 | 0.0001 | 0.9464 | -0.0090 | - |
0.2882 | 86500 | - | 0.0001 | 0.9482 | -0.0090 | - |
0.2899 | 87000 | 0.0001 | 0.0001 | 0.9466 | -0.0090 | - |
0.2915 | 87500 | - | 0.0001 | 0.9474 | -0.0090 | - |
0.2932 | 88000 | 0.0001 | 0.0001 | 0.9476 | -0.0090 | - |
0.2949 | 88500 | - | 0.0001 | 0.9480 | -0.0089 | - |
0.2965 | 89000 | 0.0001 | 0.0001 | 0.9489 | -0.0090 | - |
0.2982 | 89500 | - | 0.0001 | 0.9475 | -0.0089 | - |
0.2999 | 90000 | 0.0001 | 0.0001 | 0.9483 | -0.0089 | - |
0.3015 | 90500 | - | 0.0001 | 0.9478 | -0.0089 | - |
0.3032 | 91000 | 0.0001 | 0.0001 | 0.9471 | -0.0090 | - |
0.3049 | 91500 | - | 0.0001 | 0.9470 | -0.0089 | - |
0.3065 | 92000 | 0.0001 | 0.0001 | 0.9472 | -0.0089 | - |
0.3082 | 92500 | - | 0.0001 | 0.9485 | -0.0089 | - |
0.3099 | 93000 | 0.0001 | 0.0001 | 0.9468 | -0.0089 | - |
0.3115 | 93500 | - | 0.0001 | 0.9484 | -0.0088 | - |
0.3132 | 94000 | 0.0001 | 0.0001 | 0.9482 | -0.0088 | - |
0.3149 | 94500 | - | 0.0001 | 0.9503 | -0.0088 | - |
0.3165 | 95000 | 0.0001 | 0.0001 | 0.9485 | -0.0088 | - |
0.3182 | 95500 | - | 0.0001 | 0.9509 | -0.0087 | - |
0.3199 | 96000 | 0.0001 | 0.0001 | 0.9492 | -0.0088 | - |
0.3215 | 96500 | - | 0.0001 | 0.9488 | -0.0087 | - |
0.3232 | 97000 | 0.0001 | 0.0001 | 0.9500 | -0.0087 | - |
0.3249 | 97500 | - | 0.0001 | 0.9495 | -0.0087 | - |
0.3265 | 98000 | 0.0001 | 0.0001 | 0.9499 | -0.0087 | - |
0.3282 | 98500 | - | 0.0001 | 0.9496 | -0.0087 | - |
0.3299 | 99000 | 0.0001 | 0.0001 | 0.9493 | -0.0087 | - |
0.3315 | 99500 | - | 0.0001 | 0.9497 | -0.0087 | - |
0.3332 | 100000 | 0.0001 | 0.0001 | 0.9511 | -0.0086 | - |
0.3349 | 100500 | - | 0.0001 | 0.9508 | -0.0086 | - |
0.3365 | 101000 | 0.0001 | 0.0001 | 0.9502 | -0.0086 | - |
0.3382 | 101500 | - | 0.0001 | 0.9488 | -0.0087 | - |
0.3399 | 102000 | 0.0001 | 0.0001 | 0.9505 | -0.0086 | - |
0.3415 | 102500 | - | 0.0001 | 0.9497 | -0.0086 | - |
0.3432 | 103000 | 0.0001 | 0.0001 | 0.9500 | -0.0085 | - |
0.3449 | 103500 | - | 0.0001 | 0.9497 | -0.0086 | - |
0.3465 | 104000 | 0.0001 | 0.0001 | 0.9521 | -0.0085 | - |
0.3482 | 104500 | - | 0.0001 | 0.9499 | -0.0085 | - |
0.3499 | 105000 | 0.0001 | 0.0001 | 0.9488 | -0.0085 | - |
0.3515 | 105500 | - | 0.0001 | 0.9490 | -0.0085 | - |
0.3532 | 106000 | 0.0001 | 0.0001 | 0.9503 | -0.0085 | - |
0.3549 | 106500 | - | 0.0001 | 0.9504 | -0.0085 | - |
0.3565 | 107000 | 0.0001 | 0.0001 | 0.9503 | -0.0085 | - |
0.3582 | 107500 | - | 0.0001 | 0.9514 | -0.0085 | - |
0.3599 | 108000 | 0.0001 | 0.0001 | 0.9509 | -0.0084 | - |
0.3615 | 108500 | - | 0.0001 | 0.9513 | -0.0084 | - |
0.3632 | 109000 | 0.0001 | 0.0001 | 0.9512 | -0.0084 | - |
0.3649 | 109500 | - | 0.0001 | 0.9515 | -0.0084 | - |
0.3665 | 110000 | 0.0001 | 0.0001 | 0.9509 | -0.0084 | - |
0.3682 | 110500 | - | 0.0001 | 0.9495 | -0.0084 | - |
0.3698 | 111000 | 0.0001 | 0.0001 | 0.9507 | -0.0084 | - |
0.3715 | 111500 | - | 0.0001 | 0.9512 | -0.0083 | - |
0.3732 | 112000 | 0.0001 | 0.0001 | 0.9519 | -0.0084 | - |
0.3748 | 112500 | - | 0.0001 | 0.9512 | -0.0084 | - |
0.3765 | 113000 | 0.0001 | 0.0001 | 0.9511 | -0.0083 | - |
0.3782 | 113500 | - | 0.0001 | 0.9513 | -0.0083 | - |
0.3798 | 114000 | 0.0001 | 0.0001 | 0.9512 | -0.0084 | - |
0.3815 | 114500 | - | 0.0001 | 0.9501 | -0.0083 | - |
0.3832 | 115000 | 0.0001 | 0.0001 | 0.9515 | -0.0083 | - |
0.3848 | 115500 | - | 0.0001 | 0.9526 | -0.0083 | - |
0.3865 | 116000 | 0.0001 | 0.0001 | 0.9518 | -0.0083 | - |
0.3882 | 116500 | - | 0.0001 | 0.9521 | -0.0083 | - |
0.3898 | 117000 | 0.0001 | 0.0001 | 0.9515 | -0.0083 | - |
0.3915 | 117500 | - | 0.0001 | 0.9515 | -0.0083 | - |
0.3932 | 118000 | 0.0001 | 0.0001 | 0.9530 | -0.0082 | - |
0.3948 | 118500 | - | 0.0001 | 0.9533 | -0.0082 | - |
0.3965 | 119000 | 0.0001 | 0.0001 | 0.9523 | -0.0082 | - |
0.3982 | 119500 | - | 0.0001 | 0.9520 | -0.0082 | - |
0.3998 | 120000 | 0.0001 | 0.0001 | 0.9511 | -0.0082 | - |
0.4015 | 120500 | - | 0.0001 | 0.9530 | -0.0083 | - |
0.4032 | 121000 | 0.0001 | 0.0001 | 0.9525 | -0.0082 | - |
0.4048 | 121500 | - | 0.0001 | 0.9526 | -0.0082 | - |
0.4065 | 122000 | 0.0001 | 0.0001 | 0.9527 | -0.0082 | - |
0.4082 | 122500 | - | 0.0001 | 0.9522 | -0.0082 | - |
0.4098 | 123000 | 0.0001 | 0.0001 | 0.9535 | -0.0081 | - |
0.4115 | 123500 | - | 0.0001 | 0.9527 | -0.0081 | - |
0.4132 | 124000 | 0.0001 | 0.0001 | 0.9530 | -0.0082 | - |
0.4148 | 124500 | - | 0.0001 | 0.9520 | -0.0082 | - |
0.4165 | 125000 | 0.0001 | 0.0001 | 0.9526 | -0.0081 | - |
0.4182 | 125500 | - | 0.0001 | 0.9528 | -0.0081 | - |
0.4198 | 126000 | 0.0001 | 0.0001 | 0.9535 | -0.0081 | - |
0.4215 | 126500 | - | 0.0001 | 0.9530 | -0.0081 | - |
0.4232 | 127000 | 0.0001 | 0.0001 | 0.9539 | -0.0081 | - |
0.4248 | 127500 | - | 0.0001 | 0.9531 | -0.0081 | - |
0.4265 | 128000 | 0.0001 | 0.0001 | 0.9540 | -0.0081 | - |
0.4282 | 128500 | - | 0.0001 | 0.9534 | -0.0081 | - |
0.4298 | 129000 | 0.0001 | 0.0001 | 0.9536 | -0.0080 | - |
0.4315 | 129500 | - | 0.0001 | 0.9536 | -0.0081 | - |
0.4332 | 130000 | 0.0001 | 0.0001 | 0.9547 | -0.0080 | - |
0.4348 | 130500 | - | 0.0001 | 0.9535 | -0.0080 | - |
0.4365 | 131000 | 0.0001 | 0.0001 | 0.9541 | -0.0080 | - |
0.4382 | 131500 | - | 0.0001 | 0.9542 | -0.0080 | - |
0.4398 | 132000 | 0.0001 | 0.0001 | 0.9540 | -0.0080 | - |
0.4415 | 132500 | - | 0.0001 | 0.9537 | -0.0080 | - |
0.4432 | 133000 | 0.0001 | 0.0001 | 0.9538 | -0.0080 | - |
0.4448 | 133500 | - | 0.0001 | 0.9540 | -0.0079 | - |
0.4465 | 134000 | 0.0001 | 0.0001 | 0.9540 | -0.0080 | - |
0.4481 | 134500 | - | 0.0001 | 0.9544 | -0.0080 | - |
0.4498 | 135000 | 0.0001 | 0.0001 | 0.9535 | -0.0079 | - |
0.4515 | 135500 | - | 0.0001 | 0.9541 | -0.0079 | - |
0.4531 | 136000 | 0.0001 | 0.0001 | 0.9546 | -0.0079 | - |
0.4548 | 136500 | - | 0.0001 | 0.9543 | -0.0079 | - |
0.4565 | 137000 | 0.0001 | 0.0001 | 0.9548 | -0.0079 | - |
0.4581 | 137500 | - | 0.0001 | 0.9555 | -0.0079 | - |
0.4598 | 138000 | 0.0001 | 0.0001 | 0.9548 | -0.0079 | - |
0.4615 | 138500 | - | 0.0001 | 0.9542 | -0.0079 | - |
0.4631 | 139000 | 0.0001 | 0.0001 | 0.9548 | -0.0079 | - |
0.4648 | 139500 | - | 0.0001 | 0.9544 | -0.0079 | - |
0.4665 | 140000 | 0.0001 | 0.0001 | 0.9546 | -0.0079 | - |
0.4681 | 140500 | - | 0.0001 | 0.9553 | -0.0078 | - |
0.4698 | 141000 | 0.0001 | 0.0001 | 0.9542 | -0.0078 | - |
0.4715 | 141500 | - | 0.0001 | 0.9553 | -0.0078 | - |
0.4731 | 142000 | 0.0001 | 0.0001 | 0.9548 | -0.0079 | - |
0.4748 | 142500 | - | 0.0001 | 0.9545 | -0.0078 | - |
0.4765 | 143000 | 0.0001 | 0.0001 | 0.9553 | -0.0079 | - |
0.4781 | 143500 | - | 0.0001 | 0.9561 | -0.0078 | - |
0.4798 | 144000 | 0.0001 | 0.0001 | 0.9551 | -0.0078 | - |
0.4815 | 144500 | - | 0.0001 | 0.9550 | -0.0078 | - |
0.4831 | 145000 | 0.0001 | 0.0001 | 0.9557 | -0.0078 | - |
0.4848 | 145500 | - | 0.0001 | 0.9557 | -0.0077 | - |
0.4865 | 146000 | 0.0001 | 0.0001 | 0.9552 | -0.0077 | - |
0.4881 | 146500 | - | 0.0001 | 0.9553 | -0.0078 | - |
0.4898 | 147000 | 0.0001 | 0.0001 | 0.9555 | -0.0077 | - |
0.4915 | 147500 | - | 0.0001 | 0.9561 | -0.0077 | - |
0.4931 | 148000 | 0.0001 | 0.0001 | 0.9558 | -0.0077 | - |
0.4948 | 148500 | - | 0.0001 | 0.9558 | -0.0077 | - |
0.4965 | 149000 | 0.0001 | 0.0001 | 0.9560 | -0.0077 | - |
0.4981 | 149500 | - | 0.0001 | 0.9558 | -0.0077 | - |
0.4998 | 150000 | 0.0001 | 0.0001 | 0.9553 | -0.0077 | - |
0.5015 | 150500 | - | 0.0001 | 0.9557 | -0.0077 | - |
0.5031 | 151000 | 0.0001 | 0.0001 | 0.9562 | -0.0077 | - |
0.5048 | 151500 | - | 0.0001 | 0.9558 | -0.0077 | - |
0.5065 | 152000 | 0.0001 | 0.0001 | 0.9553 | -0.0077 | - |
0.5081 | 152500 | - | 0.0001 | 0.9553 | -0.0076 | - |
0.5098 | 153000 | 0.0001 | 0.0001 | 0.9559 | -0.0077 | - |
0.5115 | 153500 | - | 0.0001 | 0.9560 | -0.0076 | - |
0.5131 | 154000 | 0.0001 | 0.0001 | 0.9557 | -0.0076 | - |
0.5148 | 154500 | - | 0.0001 | 0.9563 | -0.0076 | - |
0.5165 | 155000 | 0.0001 | 0.0001 | 0.9567 | -0.0076 | - |
0.5181 | 155500 | - | 0.0001 | 0.9559 | -0.0076 | - |
0.5198 | 156000 | 0.0001 | 0.0001 | 0.9565 | -0.0076 | - |
0.5215 | 156500 | - | 0.0001 | 0.9563 | -0.0076 | - |
0.5231 | 157000 | 0.0001 | 0.0001 | 0.9569 | -0.0076 | - |
0.5248 | 157500 | - | 0.0001 | 0.9571 | -0.0076 | - |
0.5265 | 158000 | 0.0001 | 0.0001 | 0.9560 | -0.0076 | - |
0.5281 | 158500 | - | 0.0001 | 0.9562 | -0.0076 | - |
0.5298 | 159000 | 0.0001 | 0.0001 | 0.9569 | -0.0076 | - |
0.5314 | 159500 | - | 0.0001 | 0.9556 | -0.0076 | - |
0.5331 | 160000 | 0.0001 | 0.0001 | 0.9560 | -0.0075 | - |
0.5348 | 160500 | - | 0.0001 | 0.9555 | -0.0075 | - |
0.5364 | 161000 | 0.0001 | 0.0001 | 0.9555 | -0.0076 | - |
0.5381 | 161500 | - | 0.0001 | 0.9564 | -0.0075 | - |
0.5398 | 162000 | 0.0001 | 0.0001 | 0.9574 | -0.0076 | - |
0.5414 | 162500 | - | 0.0001 | 0.9569 | -0.0075 | - |
0.5431 | 163000 | 0.0001 | 0.0001 | 0.9578 | -0.0075 | - |
0.5448 | 163500 | - | 0.0001 | 0.9571 | -0.0075 | - |
0.5464 | 164000 | 0.0001 | 0.0001 | 0.9578 | -0.0075 | - |
0.5481 | 164500 | - | 0.0001 | 0.9580 | -0.0075 | - |
0.5498 | 165000 | 0.0001 | 0.0001 | 0.9568 | -0.0075 | - |
0.5514 | 165500 | - | 0.0001 | 0.9582 | -0.0075 | - |
0.5531 | 166000 | 0.0001 | 0.0001 | 0.9578 | -0.0075 | - |
0.5548 | 166500 | - | 0.0001 | 0.9569 | -0.0075 | - |
0.5564 | 167000 | 0.0001 | 0.0001 | 0.9568 | -0.0075 | - |
0.5581 | 167500 | - | 0.0001 | 0.9576 | -0.0075 | - |
0.5598 | 168000 | 0.0001 | 0.0001 | 0.9581 | -0.0075 | - |
0.5614 | 168500 | - | 0.0001 | 0.9581 | -0.0075 | - |
0.5631 | 169000 | 0.0001 | 0.0001 | 0.9573 | -0.0075 | - |
0.5648 | 169500 | - | 0.0001 | 0.9581 | -0.0074 | - |
0.5664 | 170000 | 0.0001 | 0.0001 | 0.9568 | -0.0074 | - |
0.5681 | 170500 | - | 0.0001 | 0.9573 | -0.0075 | - |
0.5698 | 171000 | 0.0001 | 0.0001 | 0.9579 | -0.0074 | - |
0.5714 | 171500 | - | 0.0001 | 0.9578 | -0.0074 | - |
0.5731 | 172000 | 0.0001 | 0.0001 | 0.9581 | -0.0074 | - |
0.5748 | 172500 | - | 0.0001 | 0.9567 | -0.0074 | - |
0.5764 | 173000 | 0.0001 | 0.0001 | 0.9581 | -0.0074 | - |
0.5781 | 173500 | - | 0.0001 | 0.9584 | -0.0074 | - |
0.5798 | 174000 | 0.0001 | 0.0001 | 0.9585 | -0.0074 | - |
0.5814 | 174500 | - | 0.0001 | 0.9583 | -0.0074 | - |
0.5831 | 175000 | 0.0001 | 0.0001 | 0.9590 | -0.0074 | - |
0.5848 | 175500 | - | 0.0001 | 0.9580 | -0.0074 | - |
0.5864 | 176000 | 0.0001 | 0.0001 | 0.9580 | -0.0073 | - |
0.5881 | 176500 | - | 0.0001 | 0.9584 | -0.0073 | - |
0.5898 | 177000 | 0.0001 | 0.0001 | 0.9591 | -0.0074 | - |
0.5914 | 177500 | - | 0.0001 | 0.9592 | -0.0073 | - |
0.5931 | 178000 | 0.0001 | 0.0001 | 0.9582 | -0.0073 | - |
0.5948 | 178500 | - | 0.0001 | 0.9585 | -0.0073 | - |
0.5964 | 179000 | 0.0001 | 0.0001 | 0.9590 | -0.0074 | - |
0.5981 | 179500 | - | 0.0001 | 0.9586 | -0.0073 | - |
0.5998 | 180000 | 0.0001 | 0.0001 | 0.9588 | -0.0073 | - |
0.6014 | 180500 | - | 0.0001 | 0.9584 | -0.0073 | - |
0.6031 | 181000 | 0.0001 | 0.0001 | 0.9588 | -0.0073 | - |
0.6048 | 181500 | - | 0.0001 | 0.9581 | -0.0073 | - |
0.6064 | 182000 | 0.0001 | 0.0001 | 0.9585 | -0.0073 | - |
0.6081 | 182500 | - | 0.0001 | 0.9588 | -0.0073 | - |
0.6098 | 183000 | 0.0001 | 0.0001 | 0.9589 | -0.0073 | - |
0.6114 | 183500 | - | 0.0001 | 0.9590 | -0.0073 | - |
0.6131 | 184000 | 0.0001 | 0.0001 | 0.9592 | -0.0073 | - |
0.6147 | 184500 | - | 0.0001 | 0.9585 | -0.0072 | - |
0.6164 | 185000 | 0.0001 | 0.0001 | 0.9591 | -0.0073 | - |
0.6181 | 185500 | - | 0.0001 | 0.9581 | -0.0072 | - |
0.6197 | 186000 | 0.0001 | 0.0001 | 0.9583 | -0.0072 | - |
0.6214 | 186500 | - | 0.0001 | 0.9592 | -0.0072 | - |
0.6231 | 187000 | 0.0001 | 0.0001 | 0.9594 | -0.0072 | - |
0.6247 | 187500 | - | 0.0001 | 0.9596 | -0.0072 | - |
0.6264 | 188000 | 0.0001 | 0.0001 | 0.9599 | -0.0072 | - |
0.6281 | 188500 | - | 0.0001 | 0.9598 | -0.0072 | - |
0.6297 | 189000 | 0.0001 | 0.0001 | 0.9597 | -0.0072 | - |
0.6314 | 189500 | - | 0.0001 | 0.9596 | -0.0072 | - |
0.6331 | 190000 | 0.0001 | 0.0001 | 0.9603 | -0.0072 | - |
0.6347 | 190500 | - | 0.0001 | 0.9600 | -0.0072 | - |
0.6364 | 191000 | 0.0001 | 0.0001 | 0.9591 | -0.0072 | - |
0.6381 | 191500 | - | 0.0001 | 0.9590 | -0.0072 | - |
0.6397 | 192000 | 0.0001 | 0.0001 | 0.9586 | -0.0072 | - |
0.6414 | 192500 | - | 0.0001 | 0.9591 | -0.0072 | - |
0.6431 | 193000 | 0.0001 | 0.0001 | 0.9595 | -0.0072 | - |
0.6447 | 193500 | - | 0.0001 | 0.9599 | -0.0071 | - |
0.6464 | 194000 | 0.0001 | 0.0001 | 0.9598 | -0.0072 | - |
0.6481 | 194500 | - | 0.0001 | 0.9591 | -0.0072 | - |
0.6497 | 195000 | 0.0001 | 0.0001 | 0.9589 | -0.0071 | - |
0.6514 | 195500 | - | 0.0001 | 0.9597 | -0.0071 | - |
0.6531 | 196000 | 0.0001 | 0.0001 | 0.9596 | -0.0071 | - |
0.6547 | 196500 | - | 0.0001 | 0.9602 | -0.0071 | - |
0.6564 | 197000 | 0.0001 | 0.0001 | 0.9598 | -0.0071 | - |
0.6581 | 197500 | - | 0.0001 | 0.9599 | -0.0071 | - |
0.6597 | 198000 | 0.0001 | 0.0001 | 0.9602 | -0.0071 | - |
0.6614 | 198500 | - | 0.0001 | 0.9604 | -0.0071 | - |
0.6631 | 199000 | 0.0001 | 0.0001 | 0.9601 | -0.0071 | - |
0.6647 | 199500 | - | 0.0001 | 0.9606 | -0.0071 | - |
0.6664 | 200000 | 0.0001 | 0.0001 | 0.9598 | -0.0071 | - |
0.6681 | 200500 | - | 0.0001 | 0.9601 | -0.0071 | - |
0.6697 | 201000 | 0.0001 | 0.0001 | 0.9599 | -0.0071 | - |
0.6714 | 201500 | - | 0.0001 | 0.9602 | -0.0071 | - |
0.6731 | 202000 | 0.0001 | 0.0001 | 0.9595 | -0.0071 | - |
0.6747 | 202500 | - | 0.0001 | 0.9607 | -0.0071 | - |
0.6764 | 203000 | 0.0001 | 0.0001 | 0.9607 | -0.0071 | - |
0.6781 | 203500 | - | 0.0001 | 0.9603 | -0.0071 | - |
0.6797 | 204000 | 0.0001 | 0.0001 | 0.9612 | -0.0070 | - |
0.6814 | 204500 | - | 0.0001 | 0.9605 | -0.0071 | - |
0.6831 | 205000 | 0.0001 | 0.0001 | 0.9611 | -0.0070 | - |
0.6847 | 205500 | - | 0.0001 | 0.9607 | -0.0070 | - |
0.6864 | 206000 | 0.0001 | 0.0001 | 0.9601 | -0.0070 | - |
0.6881 | 206500 | - | 0.0001 | 0.9606 | -0.0070 | - |
0.6897 | 207000 | 0.0001 | 0.0001 | 0.9601 | -0.0070 | - |
0.6914 | 207500 | - | 0.0001 | 0.9611 | -0.0070 | - |
0.6930 | 208000 | 0.0001 | 0.0001 | 0.9613 | -0.0070 | - |
0.6947 | 208500 | - | 0.0001 | 0.9607 | -0.0070 | - |
0.6964 | 209000 | 0.0001 | 0.0001 | 0.9605 | -0.0070 | - |
0.6980 | 209500 | - | 0.0001 | 0.9611 | -0.0070 | - |
0.6997 | 210000 | 0.0001 | 0.0001 | 0.9604 | -0.0070 | - |
0.7014 | 210500 | - | 0.0001 | 0.9609 | -0.0070 | - |
0.7030 | 211000 | 0.0001 | 0.0001 | 0.9611 | -0.0070 | - |
0.7047 | 211500 | - | 0.0001 | 0.9611 | -0.0070 | - |
0.7064 | 212000 | 0.0001 | 0.0001 | 0.9612 | -0.0070 | - |
0.7080 | 212500 | - | 0.0001 | 0.9610 | -0.0070 | - |
0.7097 | 213000 | 0.0001 | 0.0001 | 0.9614 | -0.0070 | - |
0.7114 | 213500 | - | 0.0001 | 0.9613 | -0.0069 | - |
0.7130 | 214000 | 0.0001 | 0.0001 | 0.9619 | -0.0070 | - |
0.7147 | 214500 | - | 0.0001 | 0.9612 | -0.0070 | - |
0.7164 | 215000 | 0.0001 | 0.0001 | 0.9615 | -0.0069 | - |
0.7180 | 215500 | - | 0.0001 | 0.9614 | -0.0069 | - |
0.7197 | 216000 | 0.0001 | 0.0001 | 0.9614 | -0.0070 | - |
0.7214 | 216500 | - | 0.0001 | 0.9613 | -0.0069 | - |
0.7230 | 217000 | 0.0001 | 0.0001 | 0.9612 | -0.0069 | - |
0.7247 | 217500 | - | 0.0001 | 0.9608 | -0.0069 | - |
0.7264 | 218000 | 0.0001 | 0.0001 | 0.9619 | -0.0069 | - |
0.7280 | 218500 | - | 0.0001 | 0.9612 | -0.0069 | - |
0.7297 | 219000 | 0.0001 | 0.0001 | 0.9613 | -0.0069 | - |
0.7314 | 219500 | - | 0.0001 | 0.9617 | -0.0069 | - |
0.7330 | 220000 | 0.0001 | 0.0001 | 0.9620 | -0.0069 | - |
0.7347 | 220500 | - | 0.0001 | 0.9621 | -0.0069 | - |
0.7364 | 221000 | 0.0001 | 0.0001 | 0.9616 | -0.0069 | - |
0.7380 | 221500 | - | 0.0001 | 0.9622 | -0.0069 | - |
0.7397 | 222000 | 0.0001 | 0.0001 | 0.9620 | -0.0069 | - |
0.7414 | 222500 | - | 0.0001 | 0.9612 | -0.0069 | - |
0.7430 | 223000 | 0.0001 | 0.0001 | 0.9615 | -0.0069 | - |
0.7447 | 223500 | - | 0.0001 | 0.9615 | -0.0069 | - |
0.7464 | 224000 | 0.0001 | 0.0001 | 0.9621 | -0.0069 | - |
0.7480 | 224500 | - | 0.0001 | 0.9622 | -0.0068 | - |
0.7497 | 225000 | 0.0001 | 0.0001 | 0.9616 | -0.0069 | - |
0.7514 | 225500 | - | 0.0001 | 0.9616 | -0.0069 | - |
0.7530 | 226000 | 0.0001 | 0.0001 | 0.9614 | -0.0069 | - |
0.7547 | 226500 | - | 0.0001 | 0.9614 | -0.0069 | - |
0.7564 | 227000 | 0.0001 | 0.0001 | 0.9614 | -0.0068 | - |
0.7580 | 227500 | - | 0.0001 | 0.9613 | -0.0069 | - |
0.7597 | 228000 | 0.0001 | 0.0001 | 0.9620 | -0.0068 | - |
0.7614 | 228500 | - | 0.0001 | 0.9616 | -0.0068 | - |
0.7630 | 229000 | 0.0001 | 0.0001 | 0.9621 | -0.0068 | - |
0.7647 | 229500 | - | 0.0001 | 0.9620 | -0.0069 | - |
0.7664 | 230000 | 0.0001 | 0.0001 | 0.9618 | -0.0068 | - |
0.7680 | 230500 | - | 0.0001 | 0.9616 | -0.0068 | - |
0.7697 | 231000 | 0.0001 | 0.0001 | 0.9624 | -0.0068 | - |
0.7714 | 231500 | - | 0.0001 | 0.9618 | -0.0068 | - |
0.7730 | 232000 | 0.0001 | 0.0001 | 0.9621 | -0.0068 | - |
0.7747 | 232500 | - | 0.0001 | 0.9618 | -0.0068 | - |
0.7763 | 233000 | 0.0001 | 0.0001 | 0.9617 | -0.0068 | - |
0.7780 | 233500 | - | 0.0001 | 0.9620 | -0.0068 | - |
0.7797 | 234000 | 0.0001 | 0.0001 | 0.9620 | -0.0068 | - |
0.7813 | 234500 | - | 0.0001 | 0.9624 | -0.0068 | - |
0.7830 | 235000 | 0.0001 | 0.0001 | 0.9624 | -0.0068 | - |
0.7847 | 235500 | - | 0.0001 | 0.9624 | -0.0068 | - |
0.7863 | 236000 | 0.0001 | 0.0001 | 0.9627 | -0.0068 | - |
0.7880 | 236500 | - | 0.0001 | 0.9620 | -0.0068 | - |
0.7897 | 237000 | 0.0001 | 0.0001 | 0.9626 | -0.0068 | - |
0.7913 | 237500 | - | 0.0001 | 0.9629 | -0.0068 | - |
0.7930 | 238000 | 0.0001 | 0.0001 | 0.9621 | -0.0067 | - |
0.7947 | 238500 | - | 0.0001 | 0.9630 | -0.0067 | - |
0.7963 | 239000 | 0.0001 | 0.0001 | 0.9627 | -0.0067 | - |
0.7980 | 239500 | - | 0.0001 | 0.9628 | -0.0068 | - |
0.7997 | 240000 | 0.0001 | 0.0001 | 0.9626 | -0.0067 | - |
0.8013 | 240500 | - | 0.0001 | 0.9624 | -0.0067 | - |
0.8030 | 241000 | 0.0001 | 0.0001 | 0.9623 | -0.0067 | - |
0.8047 | 241500 | - | 0.0001 | 0.9622 | -0.0067 | - |
0.8063 | 242000 | 0.0001 | 0.0001 | 0.9620 | -0.0067 | - |
0.8080 | 242500 | - | 0.0001 | 0.9622 | -0.0067 | - |
0.8097 | 243000 | 0.0001 | 0.0001 | 0.9626 | -0.0067 | - |
0.8113 | 243500 | - | 0.0001 | 0.9634 | -0.0067 | - |
0.8130 | 244000 | 0.0001 | 0.0001 | 0.9623 | -0.0067 | - |
0.8147 | 244500 | - | 0.0001 | 0.9632 | -0.0067 | - |
0.8163 | 245000 | 0.0001 | 0.0001 | 0.9630 | -0.0067 | - |
0.8180 | 245500 | - | 0.0001 | 0.9634 | -0.0067 | - |
0.8197 | 246000 | 0.0001 | 0.0001 | 0.9627 | -0.0067 | - |
0.8213 | 246500 | - | 0.0001 | 0.9625 | -0.0067 | - |
0.8230 | 247000 | 0.0001 | 0.0001 | 0.9629 | -0.0067 | - |
0.8247 | 247500 | - | 0.0001 | 0.9633 | -0.0067 | - |
0.8263 | 248000 | 0.0001 | 0.0001 | 0.9628 | -0.0067 | - |
0.8280 | 248500 | - | 0.0001 | 0.9636 | -0.0067 | - |
0.8297 | 249000 | 0.0001 | 0.0001 | 0.9632 | -0.0067 | - |
0.8313 | 249500 | - | 0.0001 | 0.9630 | -0.0067 | - |
0.8330 | 250000 | 0.0001 | 0.0001 | 0.9639 | -0.0067 | - |
0.8347 | 250500 | - | 0.0001 | 0.9633 | -0.0067 | - |
0.8363 | 251000 | 0.0001 | 0.0001 | 0.9635 | -0.0066 | - |
0.8380 | 251500 | - | 0.0001 | 0.9637 | -0.0066 | - |
0.8397 | 252000 | 0.0001 | 0.0001 | 0.9632 | -0.0067 | - |
0.8413 | 252500 | - | 0.0001 | 0.9638 | -0.0066 | - |
0.8430 | 253000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
0.8447 | 253500 | - | 0.0001 | 0.9635 | -0.0066 | - |
0.8463 | 254000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
0.8480 | 254500 | - | 0.0001 | 0.9630 | -0.0066 | - |
0.8497 | 255000 | 0.0001 | 0.0001 | 0.9633 | -0.0066 | - |
0.8513 | 255500 | - | 0.0001 | 0.9636 | -0.0066 | - |
0.8530 | 256000 | 0.0001 | 0.0001 | 0.9635 | -0.0066 | - |
0.8546 | 256500 | - | 0.0001 | 0.9640 | -0.0066 | - |
0.8563 | 257000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
0.8580 | 257500 | - | 0.0001 | 0.9636 | -0.0066 | - |
0.8596 | 258000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
0.8613 | 258500 | - | 0.0001 | 0.9636 | -0.0066 | - |
0.8630 | 259000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
0.8646 | 259500 | - | 0.0001 | 0.9635 | -0.0066 | - |
0.8663 | 260000 | 0.0001 | 0.0001 | 0.9637 | -0.0066 | - |
0.8680 | 260500 | - | 0.0001 | 0.9637 | -0.0066 | - |
0.8696 | 261000 | 0.0001 | 0.0001 | 0.9639 | -0.0066 | - |
0.8713 | 261500 | - | 0.0001 | 0.9640 | -0.0066 | - |
0.8730 | 262000 | 0.0001 | 0.0001 | 0.9640 | -0.0066 | - |
0.8746 | 262500 | - | 0.0001 | 0.9642 | -0.0066 | - |
0.8763 | 263000 | 0.0001 | 0.0001 | 0.9636 | -0.0066 | - |
0.8780 | 263500 | - | 0.0001 | 0.9640 | -0.0066 | - |
0.8796 | 264000 | 0.0001 | 0.0001 | 0.9642 | -0.0066 | - |
0.8813 | 264500 | - | 0.0001 | 0.9640 | -0.0066 | - |
0.8830 | 265000 | 0.0001 | 0.0001 | 0.9642 | -0.0066 | - |
0.8846 | 265500 | - | 0.0001 | 0.9645 | -0.0066 | - |
0.8863 | 266000 | 0.0001 | 0.0001 | 0.9637 | -0.0066 | - |
0.8880 | 266500 | - | 0.0001 | 0.9640 | -0.0066 | - |
0.8896 | 267000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
0.8913 | 267500 | - | 0.0001 | 0.9641 | -0.0065 | - |
0.8930 | 268000 | 0.0001 | 0.0001 | 0.9639 | -0.0065 | - |
0.8946 | 268500 | - | 0.0001 | 0.9642 | -0.0065 | - |
0.8963 | 269000 | 0.0001 | 0.0001 | 0.9642 | -0.0065 | - |
0.8980 | 269500 | - | 0.0001 | 0.9640 | -0.0065 | - |
0.8996 | 270000 | 0.0001 | 0.0001 | 0.9642 | -0.0065 | - |
0.9013 | 270500 | - | 0.0001 | 0.9639 | -0.0065 | - |
0.9030 | 271000 | 0.0001 | 0.0001 | 0.9641 | -0.0065 | - |
0.9046 | 271500 | - | 0.0001 | 0.9640 | -0.0065 | - |
0.9063 | 272000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
0.9080 | 272500 | - | 0.0001 | 0.9645 | -0.0065 | - |
0.9096 | 273000 | 0.0001 | 0.0001 | 0.9645 | -0.0065 | - |
0.9113 | 273500 | - | 0.0001 | 0.9645 | -0.0065 | - |
0.9130 | 274000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
0.9146 | 274500 | - | 0.0001 | 0.9645 | -0.0065 | - |
0.9163 | 275000 | 0.0001 | 0.0001 | 0.9642 | -0.0065 | - |
0.9180 | 275500 | - | 0.0001 | 0.9645 | -0.0065 | - |
0.9196 | 276000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
0.9213 | 276500 | - | 0.0001 | 0.9643 | -0.0065 | - |
0.9230 | 277000 | 0.0001 | 0.0001 | 0.9644 | -0.0065 | - |
0.9246 | 277500 | - | 0.0001 | 0.9643 | -0.0065 | - |
0.9263 | 278000 | 0.0001 | 0.0001 | 0.9644 | -0.0065 | - |
0.9280 | 278500 | - | 0.0001 | 0.9646 | -0.0065 | - |
0.9296 | 279000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
0.9313 | 279500 | - | 0.0001 | 0.9644 | -0.0065 | - |
0.9330 | 280000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
0.9346 | 280500 | - | 0.0001 | 0.9644 | -0.0065 | - |
0.9363 | 281000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
0.9379 | 281500 | - | 0.0001 | 0.9645 | -0.0065 | - |
0.9396 | 282000 | 0.0001 | 0.0001 | 0.9643 | -0.0065 | - |
0.9413 | 282500 | - | 0.0001 | 0.9643 | -0.0065 | - |
0.9429 | 283000 | 0.0001 | 0.0001 | 0.9646 | -0.0065 | - |
0.9446 | 283500 | - | 0.0001 | 0.9644 | -0.0064 | - |
0.9463 | 284000 | 0.0001 | 0.0001 | 0.9646 | -0.0065 | - |
0.9479 | 284500 | - | 0.0001 | 0.9648 | -0.0064 | - |
0.9496 | 285000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
0.9513 | 285500 | - | 0.0001 | 0.9647 | -0.0064 | - |
0.9529 | 286000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
0.9546 | 286500 | - | 0.0001 | 0.9645 | -0.0064 | - |
0.9563 | 287000 | 0.0001 | 0.0001 | 0.9646 | -0.0064 | - |
0.9579 | 287500 | - | 0.0001 | 0.9647 | -0.0064 | - |
0.9596 | 288000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
0.9613 | 288500 | - | 0.0001 | 0.9647 | -0.0064 | - |
0.9629 | 289000 | 0.0001 | 0.0001 | 0.9647 | -0.0064 | - |
0.9646 | 289500 | - | 0.0001 | 0.9647 | -0.0064 | - |
0.9663 | 290000 | 0.0001 | 0.0001 | 0.9649 | -0.0064 | - |
0.9679 | 290500 | - | 0.0001 | 0.9648 | -0.0064 | - |
0.9696 | 291000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
0.9713 | 291500 | - | 0.0001 | 0.9649 | -0.0064 | - |
0.9729 | 292000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
0.9746 | 292500 | - | 0.0001 | 0.9649 | -0.0064 | - |
0.9763 | 293000 | 0.0001 | 0.0001 | 0.9648 | -0.0064 | - |
0.9779 | 293500 | - | 0.0001 | 0.9648 | -0.0064 | - |
0.9796 | 294000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
0.9813 | 294500 | - | 0.0001 | 0.9650 | -0.0064 | - |
0.9829 | 295000 | 0.0001 | 0.0001 | 0.9649 | -0.0064 | - |
0.9846 | 295500 | - | 0.0001 | 0.9650 | -0.0064 | - |
0.9863 | 296000 | 0.0001 | 0.0001 | 0.9649 | -0.0064 | - |
0.9879 | 296500 | - | 0.0001 | 0.9650 | -0.0064 | - |
0.9896 | 297000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
0.9913 | 297500 | - | 0.0001 | 0.9651 | -0.0064 | - |
0.9929 | 298000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
0.9946 | 298500 | - | 0.0001 | 0.9650 | -0.0064 | - |
0.9963 | 299000 | 0.0001 | 0.0001 | 0.9651 | -0.0064 | - |
0.9979 | 299500 | - | 0.0001 | 0.9650 | -0.0064 | - |
0.9996 | 300000 | 0.0001 | 0.0001 | 0.9650 | -0.0064 | - |
1.0 | 300123 | - | - | - | - | 0.9651 |
- The bold row denotes the saved checkpoint.
Framework Versions
- Python: 3.12.4
- Sentence Transformers: 3.3.1
- Transformers: 4.48.0
- PyTorch: 2.4.1+cu121
- Accelerate: 1.0.1
- Datasets: 2.19.0
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MSELoss
@inproceedings{reimers-2020-multilingual-sentence-bert,
title = "Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2020",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/2004.09813",
}