SentenceTransformer
This is a sentence-transformers model trained on the parquet dataset. It maps sentences & paragraphs to a 512-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 512 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset:
- parquet
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 512, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
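The pooling module above mean-pools the BertModel's token embeddings (CLS, max, and last-token pooling are all disabled) into a single 512-dimensional sentence vector. Below is a minimal sketch of that pooling step computed by hand with PyTorch, running only the Transformer module of this model; the input sentence is illustrative:

import torch
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("pankajrajdeo/Bioformer-8L-UMLS-Pubmed_PMC-Forward_TCE-Epoch-3")

# Tokenize and run only module (0), the Transformer, to get token embeddings
features = model.tokenize(["Incarcerated femoral hernia containing the appendix."])
with torch.no_grad():
    out = model[0](features)

token_embeddings = out["token_embeddings"]          # [1, seq_len, 512]
mask = out["attention_mask"].unsqueeze(-1).float()  # [1, seq_len, 1]

# Mean pooling: sum embeddings at non-padding positions, divide by their count
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 512])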
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("pankajrajdeo/Bioformer-8L-UMLS-Pubmed_PMC-Forward_TCE-Epoch-3")
# Run inference
sentences = [
'De Garengeot Hernia, an acute appendicitis in the right femoral hernia canal, and successful management with transabdominal closure and appendectomy: a case Report.',
'Incarceration of the appendix within a femoral hernia is a rare condition of abdominal wall hernia about 0.1 to 0.5% in reported femoral hernia. We report a case of a 56-year-old female whose appendix was trapped in the right femoral canal. There are few reports in the literature on entrapment of the appendix within a femoral hernia. The management of this condition includes antibiotics, drainage appendectomy, hernioplasty and mesh repair.',
"With the increasing population worldwide more wastewater is created by human activities and discharged into the waterbodies. This is causing the contamination of aquatic bodies, thus disturbing the marine ecosystems. The rising population is also posing a challenge to meet the demands of fresh drinking water in the water-scarce regions of the world, where drinking water is made available to people by desalination process. The fouling of composite membranes remains a major challenge in water desalination. In this innovative study, we present a novel probabilistic approach to analyse and anticipate the predominant fouling mechanisms in the filtration process. Our establishment of a robust theoretical framework hinges upon the utilization of both the geometric law and the Hermia model, elucidating the concept of resistance in series (RIS). By manipulating the transmembrane pressure, we demonstrate effective management of permeate flux rate and overall product quality. Our investigations reveal a decrease in permeate flux in three distinct phases over time, with the final stage marked by a significant reduction due to the accumulation of a denser cake layer. Additionally, an increase in transmembrane pressure leads to a correlative rise in permeate flux, while also exerting negative effects such as membrane ruptures. Our study highlights the minimal immediate impact of the intermediate blocking mechanism (n = 1) on permeate flux, necessitating continuous monitoring for potential long-term effects. Additionally, we note a reduced membrane selectivity across all three fouling types (n = 0, n = 1.5, n = 2). Ultimately, our findings indicate that the membrane undergoes complete fouling with a probability of P = 0.9 in the presence of all three fouling mechanisms. This situation renders the membrane unable to produce water at its previous flow rate, resulting in a significant reduction in the desalination plant's productivity. I have demonstrated that higher pressure values notably correlate with increased permeate flux across all four membrane types. This correlation highlights the significant role of TMP in enhancing the production rate of purified water or desired substances through membrane filtration systems. Our innovative approach opens new perspectives for water desalination management and optimization, providing crucial insights into fouling mechanisms and proposing potential strategies to address associated challenges.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 512]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
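Beyond pairwise similarity, the embeddings can also back semantic search over a corpus. A minimal sketch using util.semantic_search; the corpus and query below are illustrative:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("pankajrajdeo/Bioformer-8L-UMLS-Pubmed_PMC-Forward_TCE-Epoch-3")

corpus = [
    "Incarceration of the appendix within a femoral hernia is a rare condition.",
    "Membrane fouling remains a major challenge in water desalination.",
]
queries = ["De Garengeot hernia management"]

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embeddings = model.encode(queries, convert_to_tensor=True)

# For each query, return the top-k corpus entries ranked by cosine similarity
hits = util.semantic_search(query_embeddings, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")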
Training Details
Training Dataset
parquet
- Dataset: parquet
- Size: 33,870,508 training samples
- Columns: anchor and positive
- Approximate statistics based on the first 1000 samples:

 | anchor | positive |
---|---|---|
type | string | string |
details | min: 5 tokens, mean: 36.24 tokens, max: 106 tokens | min: 30 tokens, mean: 282.68 tokens, max: 512 tokens |
- Samples:

anchor | positive |
---|---|
How TO OBTAIN THE BRAIN OF THE CAT. | How to obtain the Brain of the Cat, (Wilder).-Correction: Page 158, second column, line 7, "grains," should be "grams;" page 159, near middle of 2nd column, "successily," should be "successively;" page 161, the number of Flower's paper is 3. |
ADDRESS OF COL. GARRICK MALLERY, U. S. ARMY. | It may be conceded that after man had all his present faculties, he did not choose between the adoption of voice and gesture, and never with those faculties, was in a state where the one was used, to the absolute exclusion of the other. The epoch, however, to which our speculations relate is that in which he had not reached the present symmetric development of his intellect and of his bodily organs, and the inquiry is: Which mode of communication was earliest adopted to his single wants and informed intelligence? With the voice he could imitate distinictively but few sounds of nature, while with gesture he could exhibit actions, motions, positions, forms, dimensions, directions and distances, with their derivations and analogues. It would seem from this unequal division of capacity that oral speech remained rudimentary long after gesture had become an efficient mode of communication. With due allowance for all purely imitative sounds, and for the spontaneous action of vocal organs unde... |
DOLBEAR ON THE NATURE AND CONSTITUTION OF MATTER. | Mr. Dopp desires to make the following correction in his paper in the last issue: "In my article on page 200 of "Science", the expression and should have been and being the velocity of light. |
- Loss: MultipleNegativesRankingLoss with these parameters:
  { "scale": 20.0, "similarity_fct": "cos_sim" }
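As a rough illustration, the dataset and loss described above could be wired together as follows. The parquet file name is a placeholder; only the column names, the scale, and the similarity function come from this card:

from datasets import load_dataset
from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Placeholder file name; the card only states that training data came from parquet
train_dataset = load_dataset("parquet", data_files="train.parquet", split="train")
# Expected columns: "anchor" and "positive"

model = SentenceTransformer("pankajrajdeo/Bioformer-8L-UMLS-Pubmed_PMC-Forward_TCE-Epoch-3")

# In-batch negatives: every other positive in the batch serves as a negative
loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)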
Evaluation Dataset
parquet
- Dataset: parquet
- Size: 33,870,508 evaluation samples
- Columns: anchor and positive
- Approximate statistics based on the first 1000 samples:

 | anchor | positive |
---|---|---|
type | string | string |
details | min: 6 tokens, mean: 24.64 tokens, max: 64 tokens | min: 9 tokens, mean: 279.56 tokens, max: 512 tokens |
- Samples:

anchor | positive |
---|---|
Noticing education campaigns or public health messages about vaping among youth in the United States, Canada and England from 2018 to 2022. | Public health campaigns have the potential to correct vaping misperceptions. However, campaigns highlighting vaping harms to youth may increase misperceptions that vaping is equally/more harmful than smoking. Vaping campaigns have been implemented in the United States and Canada since 2018 and in England since 2017 but with differing focus: youth vaping prevention. Over half of youth reported noticing vaping campaigns, and noticing increased from August 2018 to February 2020. Consistent with implementation of youth vaping prevention campaigns in the United States and Canada, most youth reported noticing vaping campaigns/messages, and most were perceived to negatively portray vaping. |
Comprehensive performance evaluation of six bioaerosol samplers based on an aerosol wind tunnel. | Choosing a suitable bioaerosol sampler for atmospheric microbial monitoring has been a challenge to researchers interested in environmental microbiology, especially during a pandemic. However, a comprehensive and integrated evaluation method to fully assess bioaerosol sampler performance is still lacking. Herein, we constructed a customized wind tunnel operated at 2-20 km/h wind speed to systematically and efficiently evaluate the performance of six frequently used samplers, where various aerosols, including Arizona test dust, bacterial spores, gram-positive and gram-negative bacteria, phages, and viruses, were generated. After 10 or 60 min of sampling, the physical and biological sampling efficiency and short or long-term sampling capabilities were determined by performing aerodynamic particle size analysis, live microbial culturing, and a qPCR assay. The results showed that AGI-30 and BioSampler impingers have good physical and biological sampling efficiencies for short-term sampling... |
The occurrence, sources, and health risks of substituted polycyclic aromatic hydrocarbons (SPAHs) cannot be ignored. | Similar to parent polycyclic aromatic hydrocarbons (PPAHs), substituted PAHs (SPAHs) are prevalent in the environment and harmful to humans. However, they have not received much attention. This study investigated the occurrence, distribution, and sources of 10 PPAHs and 15 SPAHs in soil, water, and indoor and outdoor PM2.5 and dust in high-exposure areas (EAH) near industrial parks and low-exposure areas (EAL) far from industrial parks. PAH pollution in all media was more severe in the EAH than in the EAL. All SPAHs were detected in this study, with alkylated and oxygenated PAHs being predominant. Additionally, 3-OH-BaP and 1-OH-Pyr were detected in all dust samples in this study, and 6-N-Chr, a compound with carcinogenicity 10 times higher than that of BaP, was detected at high levels in all tap water samples. According to the indoor-outdoor ratio, PAHs in indoor PM2.5 in the EAH mainly originated from indoor pollution sources; however, those in the EAL were simultaneously affected by... |
- Loss: MultipleNegativesRankingLoss with these parameters:
  { "scale": 20.0, "similarity_fct": "cos_sim" }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 128
- learning_rate: 2e-05
- max_steps: 754146
- log_level: info
- fp16: True
- dataloader_num_workers: 16
- load_best_model_at_end: True
- resume_from_checkpoint: True
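A minimal sketch of how these non-default values map onto SentenceTransformerTrainingArguments; output_dir is a placeholder, and resume_from_checkpoint is normally passed to trainer.train() rather than to the arguments object:

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder; not stated in this card
    eval_strategy="steps",
    per_device_train_batch_size=128,
    learning_rate=2e-05,
    max_steps=754146,
    log_level="info",
    fp16=True,
    dataloader_num_workers=16,
    load_best_model_at_end=True,
)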
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3
- max_steps: 754146
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: info
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 16
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: True
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
Training Logs
Click to expand
Epoch | Step | Training Loss | Validation Loss |
---|---|---|---|
0.0000 | 1 | 4.0337 | - |
0.0040 | 1000 | 1.0731 | - |
0.0080 | 2000 | 0.2694 | - |
0.0119 | 3000 | 0.204 | - |
0.0159 | 4000 | 0.2225 | - |
0.0199 | 5000 | 0.1825 | - |
0.0239 | 6000 | 0.1539 | - |
0.0278 | 7000 | 0.1408 | - |
0.0318 | 8000 | 0.1364 | - |
0.0358 | 9000 | 0.1276 | - |
0.0398 | 10000 | 0.1185 | - |
0.0438 | 11000 | 0.1028 | - |
0.0477 | 12000 | 0.1086 | - |
0.0517 | 13000 | 0.1172 | - |
0.0557 | 14000 | 0.1129 | - |
0.0597 | 15000 | 0.1063 | - |
0.0636 | 16000 | 0.1774 | - |
0.0676 | 17000 | 0.0872 | - |
0.0716 | 18000 | 0.1404 | - |
0.0756 | 19000 | 0.0798 | - |
0.0796 | 20000 | 0.1644 | - |
0.0835 | 21000 | 0.0848 | - |
0.0875 | 22000 | 0.1551 | - |
0.0915 | 23000 | 0.0884 | - |
0.0955 | 24000 | 0.0923 | - |
0.0994 | 25000 | 0.0754 | - |
0.1034 | 26000 | 0.1395 | - |
0.1074 | 27000 | 0.0813 | - |
0.1114 | 28000 | 0.1574 | - |
0.1154 | 29000 | 0.09 | - |
0.1193 | 30000 | 0.1319 | - |
0.1233 | 31000 | 0.116 | - |
0.1273 | 32000 | 0.0806 | - |
0.1313 | 33000 | 0.2095 | - |
0.1353 | 34000 | 0.0716 | - |
0.1392 | 35000 | 0.0909 | - |
0.1432 | 36000 | 0.1333 | - |
0.1472 | 37000 | 0.067 | - |
0.1512 | 38000 | 0.1183 | - |
0.1551 | 39000 | 0.0739 | - |
0.1591 | 40000 | 0.0662 | - |
0.1631 | 41000 | 0.1371 | - |
0.1671 | 42000 | 0.0913 | - |
0.1711 | 43000 | 0.0867 | - |
0.1750 | 44000 | 0.1184 | - |
0.1790 | 45000 | 0.0913 | - |
0.1830 | 46000 | 0.0857 | - |
0.1870 | 47000 | 0.1223 | - |
0.1909 | 48000 | 0.0731 | - |
0.1949 | 49000 | 0.1028 | - |
0.1989 | 50000 | 0.1107 | - |
0.2029 | 51000 | 0.0726 | - |
0.2069 | 52000 | 0.076 | - |
0.2108 | 53000 | 0.0923 | - |
0.2148 | 54000 | 0.0896 | - |
0.2188 | 55000 | 0.0755 | - |
0.2228 | 56000 | 0.0627 | - |
0.2267 | 57000 | 0.0837 | - |
0.2307 | 58000 | 0.0732 | - |
0.2347 | 59000 | 0.0655 | - |
0.2387 | 60000 | 0.0653 | - |
0.2427 | 61000 | 0.0845 | - |
0.2466 | 62000 | 0.0568 | - |
0.2506 | 63000 | 0.0534 | - |
0.2546 | 64000 | 0.0723 | - |
0.2586 | 65000 | 0.0873 | - |
0.2625 | 66000 | 0.0615 | - |
0.2665 | 67000 | 0.0598 | - |
0.2705 | 68000 | 0.0573 | - |
0.2745 | 69000 | 0.062 | - |
0.2785 | 70000 | 0.1152 | - |
0.2824 | 71000 | 0.0945 | - |
0.2864 | 72000 | 0.0853 | - |
0.2904 | 73000 | 0.0457 | - |
0.2944 | 74000 | 0.0604 | - |
0.2983 | 75000 | 0.1008 | - |
0.3023 | 76000 | 0.0564 | - |
0.3063 | 77000 | 0.1009 | - |
0.3103 | 78000 | 0.0531 | - |
0.3143 | 79000 | 0.0966 | - |
0.3182 | 80000 | 0.0991 | - |
0.3222 | 81000 | 0.0617 | - |
0.3262 | 82000 | 0.0685 | - |
0.3302 | 83000 | 0.0895 | - |
0.3342 | 84000 | 0.0424 | - |
0.3381 | 85000 | 0.0646 | - |
0.3421 | 86000 | 0.0796 | - |
0.3461 | 87000 | 0.0599 | - |
0.3501 | 88000 | 0.1033 | - |
0.3540 | 89000 | 0.0475 | - |
0.3580 | 90000 | 0.0366 | - |
0.3620 | 91000 | 0.0402 | - |
0.3660 | 92000 | 0.0587 | - |
0.3700 | 93000 | 0.0661 | - |
0.3739 | 94000 | 0.077 | - |
0.3779 | 95000 | 0.0906 | - |
0.3819 | 96000 | 0.05 | - |
0.3859 | 97000 | 0.0505 | - |
0.3898 | 98000 | 0.0413 | - |
0.3938 | 99000 | 0.038 | - |
0.3978 | 100000 | 0.0478 | - |
0.4018 | 101000 | 0.073 | - |
0.4058 | 102000 | 0.0527 | - |
0.4097 | 103000 | 0.0351 | - |
0.4137 | 104000 | 0.0377 | - |
0.4177 | 105000 | 0.0347 | - |
0.4217 | 106000 | 0.0431 | - |
0.4256 | 107000 | 0.0613 | - |
0.4296 | 108000 | 0.0825 | - |
0.4336 | 109000 | 0.0546 | - |
0.4376 | 110000 | 0.0335 | - |
0.4416 | 111000 | 0.0232 | - |
0.4455 | 112000 | 0.0525 | - |
0.4495 | 113000 | 0.0473 | - |
0.4535 | 114000 | 0.0342 | - |
0.4575 | 115000 | 0.0346 | - |
0.4614 | 116000 | 0.0279 | - |
0.4654 | 117000 | 0.034 | - |
0.4694 | 118000 | 0.0778 | - |
0.4734 | 119000 | 0.0788 | - |
0.4774 | 120000 | 0.0703 | - |
0.4813 | 121000 | 0.0708 | - |
0.4853 | 122000 | 0.0393 | - |
0.4893 | 123000 | 0.037 | - |
0.4933 | 124000 | 0.0426 | - |
0.4972 | 125000 | 0.0335 | - |
0.5012 | 126000 | 0.0317 | - |
0.5052 | 127000 | 0.0406 | - |
0.5092 | 128000 | 0.0302 | - |
0.5132 | 129000 | 0.0284 | - |
0.5171 | 130000 | 0.0416 | - |
0.5211 | 131000 | 0.065 | - |
0.5251 | 132000 | 0.0402 | - |
0.5291 | 133000 | 0.0348 | - |
0.5331 | 134000 | 0.033 | - |
0.5370 | 135000 | 0.0485 | - |
0.5410 | 136000 | 0.0364 | - |
0.5450 | 137000 | 0.0686 | - |
0.5490 | 138000 | 0.0648 | - |
0.5529 | 139000 | 0.0652 | - |
0.5569 | 140000 | 0.0626 | - |
0.5609 | 141000 | 0.0684 | - |
0.5649 | 142000 | 0.0482 | - |
0.5689 | 143000 | 0.0517 | - |
0.5728 | 144000 | 0.0389 | - |
0.5768 | 145000 | 0.0435 | - |
0.5808 | 146000 | 0.044 | - |
0.5848 | 147000 | 0.03 | - |
0.5887 | 148000 | 0.0254 | - |
0.5927 | 149000 | 0.0268 | - |
0.5967 | 150000 | 0.0409 | - |
0.6007 | 151000 | 0.0401 | - |
0.6047 | 152000 | 0.0317 | - |
0.6086 | 153000 | 0.0309 | - |
0.6126 | 154000 | 0.0389 | - |
0.6166 | 155000 | 0.0368 | - |
0.6206 | 156000 | 0.0434 | - |
0.6245 | 157000 | 0.0469 | - |
0.6285 | 158000 | 0.0734 | - |
0.6325 | 159000 | 0.0544 | - |
0.6365 | 160000 | 0.0498 | - |
0.6405 | 161000 | 0.0256 | - |
0.6444 | 162000 | 0.0302 | - |
0.6484 | 163000 | 0.0398 | - |
0.6524 | 164000 | 0.0657 | - |
0.6564 | 165000 | 0.0548 | - |
0.6603 | 166000 | 0.0638 | - |
0.6643 | 167000 | 0.0288 | - |
0.6683 | 168000 | 0.0273 | - |
0.6723 | 169000 | 0.0323 | - |
0.6763 | 170000 | 0.045 | - |
0.6802 | 171000 | 0.0416 | - |
0.6842 | 172000 | 0.0281 | - |
0.6882 | 173000 | 0.0554 | - |
0.6922 | 174000 | 0.0435 | - |
0.6961 | 175000 | 0.0375 | - |
0.7001 | 176000 | 0.0354 | - |
0.7041 | 177000 | 0.038 | - |
0.7081 | 178000 | 0.0319 | - |
0.7121 | 179000 | 0.0329 | - |
0.7160 | 180000 | 0.0492 | - |
0.7200 | 181000 | 0.0545 | - |
0.7240 | 182000 | 0.051 | - |
0.7280 | 183000 | 0.045 | - |
0.7320 | 184000 | 0.0342 | - |
0.7359 | 185000 | 0.0237 | - |
0.7399 | 186000 | 0.0369 | - |
0.7439 | 187000 | 0.0437 | - |
0.7479 | 188000 | 0.0467 | - |
0.7518 | 189000 | 0.0424 | - |
0.7558 | 190000 | 0.0458 | - |
0.7598 | 191000 | 0.0434 | - |
0.7638 | 192000 | 0.0471 | - |
0.7678 | 193000 | 0.0404 | - |
0.7717 | 194000 | 0.0373 | - |
0.7757 | 195000 | 0.0254 | - |
0.7797 | 196000 | 0.0235 | - |
0.7837 | 197000 | 0.0212 | - |
0.7876 | 198000 | 0.0236 | - |
0.7916 | 199000 | 0.0226 | - |
0.7956 | 200000 | 0.0208 | - |
0.7996 | 201000 | 0.0276 | - |
0.8036 | 202000 | 0.0235 | - |
0.8075 | 203000 | 0.0358 | - |
0.8115 | 204000 | 0.0451 | - |
0.8155 | 205000 | 0.0442 | - |
0.8195 | 206000 | 0.0411 | - |
0.8234 | 207000 | 0.0447 | - |
0.8274 | 208000 | 0.0427 | - |
0.8314 | 209000 | 0.0259 | - |
0.8354 | 210000 | 0.0219 | - |
0.8394 | 211000 | 0.0253 | - |
0.8433 | 212000 | 0.0253 | - |
0.8473 | 213000 | 0.0242 | - |
0.8513 | 214000 | 0.0251 | - |
0.8553 | 215000 | 0.0223 | - |
0.8592 | 216000 | 0.0253 | - |
0.8632 | 217000 | 0.024 | - |
0.8672 | 218000 | 0.0249 | - |
0.8712 | 219000 | 0.0234 | - |
0.8752 | 220000 | 0.0232 | - |
0.8791 | 221000 | 0.0231 | - |
0.8831 | 222000 | 0.0264 | - |
0.8871 | 223000 | 0.051 | - |
0.8911 | 224000 | 0.0436 | - |
0.8950 | 225000 | 0.0429 | - |
0.8990 | 226000 | 0.0406 | - |
0.9030 | 227000 | 0.0363 | - |
0.9070 | 228000 | 0.0316 | - |
0.9110 | 229000 | 0.023 | - |
0.9149 | 230000 | 0.0239 | - |
0.9189 | 231000 | 0.0225 | - |
0.9229 | 232000 | 0.0223 | - |
0.9269 | 233000 | 0.0225 | - |
0.9309 | 234000 | 0.0223 | - |
0.9348 | 235000 | 0.0191 | - |
0.9388 | 236000 | 0.0202 | - |
0.9428 | 237000 | 0.0149 | - |
0.9468 | 238000 | 0.0215 | - |
0.9507 | 239000 | 0.012 | - |
0.9547 | 240000 | 0.0141 | - |
0.9587 | 241000 | 0.0152 | - |
0.9627 | 242000 | 0.0311 | - |
0.9667 | 243000 | 0.0344 | - |
0.9706 | 244000 | 0.0209 | - |
0.9746 | 245000 | 0.0145 | - |
0.9786 | 246000 | 0.0128 | - |
0.9826 | 247000 | 0.0124 | - |
0.9865 | 248000 | 0.0123 | - |
0.9905 | 249000 | 0.0141 | - |
0.9945 | 250000 | 0.0115 | - |
0.9985 | 251000 | 0.0131 | - |
1.0000 | 251382 | - | 0.0022 |
1.0025 | 252000 | 0.3338 | - |
1.0064 | 253000 | 0.1293 | - |
1.0104 | 254000 | 0.0893 | - |
1.0144 | 255000 | 0.0811 | - |
1.0184 | 256000 | 0.0863 | - |
1.0223 | 257000 | 0.0872 | - |
1.0263 | 258000 | 0.0693 | - |
1.0303 | 259000 | 0.0643 | - |
1.0343 | 260000 | 0.0638 | - |
1.0383 | 261000 | 0.0511 | - |
1.0422 | 262000 | 0.0554 | - |
1.0462 | 263000 | 0.0643 | - |
1.0502 | 264000 | 0.0492 | - |
1.0542 | 265000 | 0.0583 | - |
1.0581 | 266000 | 0.0738 | - |
1.0621 | 267000 | 0.079 | - |
1.0661 | 268000 | 0.0823 | - |
1.0701 | 269000 | 0.0673 | - |
1.0741 | 270000 | 0.059 | - |
1.0780 | 271000 | 0.0946 | - |
1.0820 | 272000 | 0.0494 | - |
1.0860 | 273000 | 0.0958 | - |
1.0900 | 274000 | 0.0561 | - |
1.0939 | 275000 | 0.0517 | - |
1.0979 | 276000 | 0.0502 | - |
1.1019 | 277000 | 0.0874 | - |
1.1059 | 278000 | 0.0463 | - |
1.1099 | 279000 | 0.0472 | - |
1.1138 | 280000 | 0.1038 | - |
1.1178 | 281000 | 0.0542 | - |
1.1218 | 282000 | 0.1036 | - |
1.1258 | 283000 | 0.05 | - |
1.1298 | 284000 | 0.1052 | - |
1.1337 | 285000 | 0.073 | - |
1.1377 | 286000 | 0.041 | - |
1.1417 | 287000 | 0.0981 | - |
1.1457 | 288000 | 0.0415 | - |
1.1496 | 289000 | 0.0561 | - |
1.1536 | 290000 | 0.0612 | - |
1.1576 | 291000 | 0.0426 | - |
1.1616 | 292000 | 0.0711 | - |
1.1656 | 293000 | 0.0703 | - |
1.1695 | 294000 | 0.0508 | - |
1.1735 | 295000 | 0.0756 | - |
1.1775 | 296000 | 0.053 | - |
1.1815 | 297000 | 0.0581 | - |
1.1854 | 298000 | 0.0793 | - |
1.1894 | 299000 | 0.0476 | - |
1.1934 | 300000 | 0.0601 | - |
1.1974 | 301000 | 0.0713 | - |
1.2014 | 302000 | 0.0501 | - |
1.2053 | 303000 | 0.0457 | - |
1.2093 | 304000 | 0.0471 | - |
1.2133 | 305000 | 0.0682 | - |
1.2173 | 306000 | 0.0544 | - |
1.2212 | 307000 | 0.0359 | - |
1.2252 | 308000 | 0.047 | - |
1.2292 | 309000 | 0.0527 | - |
1.2332 | 310000 | 0.0437 | - |
1.2372 | 311000 | 0.0384 | - |
1.2411 | 312000 | 0.0508 | - |
1.2451 | 313000 | 0.0383 | - |
1.2491 | 314000 | 0.0376 | - |
1.2531 | 315000 | 0.0376 | - |
1.2570 | 316000 | 0.0441 | - |
1.2610 | 317000 | 0.0557 | - |
1.2650 | 318000 | 0.0384 | - |
1.2690 | 319000 | 0.0295 | - |
1.2730 | 320000 | 0.0409 | - |
1.2769 | 321000 | 0.0671 | - |
1.2809 | 322000 | 0.0603 | - |
1.2849 | 323000 | 0.0668 | - |
1.2889 | 324000 | 0.026 | - |
1.2928 | 325000 | 0.0264 | - |
1.2968 | 326000 | 0.056 | - |
1.3008 | 327000 | 0.0399 | - |
1.3048 | 328000 | 0.074 | - |
1.3088 | 329000 | 0.0364 | - |
1.3127 | 330000 | 0.0306 | - |
1.3167 | 331000 | 0.0727 | - |
1.3207 | 332000 | 0.0325 | - |
1.3247 | 333000 | 0.0439 | - |
1.3286 | 334000 | 0.0563 | - |
1.3326 | 335000 | 0.0274 | - |
1.3366 | 336000 | 0.0274 | - |
1.3406 | 337000 | 0.0721 | - |
1.3446 | 338000 | 0.0286 | - |
1.3485 | 339000 | 0.0558 | - |
1.3525 | 340000 | 0.0452 | - |
1.3565 | 341000 | 0.0201 | - |
1.3605 | 342000 | 0.0275 | - |
1.3645 | 343000 | 0.0189 | - |
1.3684 | 344000 | 0.0543 | - |
1.3724 | 345000 | 0.0469 | - |
1.3764 | 346000 | 0.067 | - |
1.3804 | 347000 | 0.0289 | - |
1.3843 | 348000 | 0.0366 | - |
1.3883 | 349000 | 0.026 | - |
1.3923 | 350000 | 0.0286 | - |
1.3963 | 351000 | 0.024 | - |
1.4003 | 352000 | 0.0403 | - |
1.4042 | 353000 | 0.0429 | - |
1.4082 | 354000 | 0.0216 | - |
1.4122 | 355000 | 0.027 | - |
1.4162 | 356000 | 0.0184 | - |
1.4201 | 357000 | 0.0266 | - |
1.4241 | 358000 | 0.0332 | - |
1.4281 | 359000 | 0.0427 | - |
1.4321 | 360000 | 0.0456 | - |
1.4361 | 361000 | 0.0229 | - |
1.4400 | 362000 | 0.0161 | - |
1.4440 | 363000 | 0.025 | - |
1.4480 | 364000 | 0.031 | - |
1.4520 | 365000 | 0.0256 | - |
1.4559 | 366000 | 0.0195 | - |
1.4599 | 367000 | 0.0224 | - |
1.4639 | 368000 | 0.0163 | - |
1.4679 | 369000 | 0.037 | - |
1.4719 | 370000 | 0.0471 | - |
1.4758 | 371000 | 0.0452 | - |
1.4798 | 372000 | 0.0448 | - |
1.4838 | 373000 | 0.034 | - |
1.4878 | 374000 | 0.022 | - |
1.4917 | 375000 | 0.0216 | - |
1.4957 | 376000 | 0.0247 | - |
1.4997 | 377000 | 0.0172 | - |
1.5037 | 378000 | 0.0218 | - |
1.5077 | 379000 | 0.023 | - |
1.5116 | 380000 | 0.0138 | - |
1.5156 | 381000 | 0.025 | - |
1.5196 | 382000 | 0.0361 | - |
1.5236 | 383000 | 0.0295 | - |
1.5275 | 384000 | 0.0257 | - |
1.5315 | 385000 | 0.0162 | - |
1.5355 | 386000 | 0.0283 | - |
1.5395 | 387000 | 0.0246 | - |
1.5435 | 388000 | 0.0379 | - |
1.5474 | 389000 | 0.0411 | - |
1.5514 | 390000 | 0.0424 | - |
1.5554 | 391000 | 0.0384 | - |
1.5594 | 392000 | 0.0458 | - |
1.5634 | 393000 | 0.039 | - |
1.5673 | 394000 | 0.0273 | - |
1.5713 | 395000 | 0.0338 | - |
1.5753 | 396000 | 0.0276 | - |
1.5793 | 397000 | 0.0277 | - |
1.5832 | 398000 | 0.0231 | - |
1.5872 | 399000 | 0.0168 | - |
1.5912 | 400000 | 0.0135 | - |
1.5952 | 401000 | 0.0262 | - |
1.5992 | 402000 | 0.0253 | - |
1.6031 | 403000 | 0.0235 | - |
1.6071 | 404000 | 0.0189 | - |
1.6111 | 405000 | 0.024 | - |
1.6151 | 406000 | 0.0244 | - |
1.6190 | 407000 | 0.0281 | - |
1.6230 | 408000 | 0.0223 | - |
1.6270 | 409000 | 0.0515 | - |
1.6310 | 410000 | 0.0374 | - |
1.6350 | 411000 | 0.0419 | - |
1.6389 | 412000 | 0.0151 | - |
1.6429 | 413000 | 0.0196 | - |
1.6469 | 414000 | 0.0164 | - |
1.6509 | 415000 | 0.0413 | - |
1.6548 | 416000 | 0.0375 | - |
1.6588 | 417000 | 0.0431 | - |
1.6628 | 418000 | 0.0287 | - |
1.6668 | 419000 | 0.0189 | - |
1.6708 | 420000 | 0.0175 | - |
1.6747 | 421000 | 0.0286 | - |
1.6787 | 422000 | 0.0291 | - |
1.6827 | 423000 | 0.0215 | - |
1.6867 | 424000 | 0.027 | - |
1.6906 | 425000 | 0.0366 | - |
1.6946 | 426000 | 0.0285 | - |
1.6986 | 427000 | 0.0216 | - |
1.7026 | 428000 | 0.0268 | - |
1.7066 | 429000 | 0.0255 | - |
1.7105 | 430000 | 0.0209 | - |
1.7145 | 431000 | 0.0257 | - |
1.7185 | 432000 | 0.0407 | - |
1.7225 | 433000 | 0.0349 | - |
1.7264 | 434000 | 0.0342 | - |
1.7304 | 435000 | 0.0235 | - |
1.7344 | 436000 | 0.0216 | - |
1.7384 | 437000 | 0.0201 | - |
1.7424 | 438000 | 0.0257 | - |
1.7463 | 439000 | 0.0269 | - |
1.7503 | 440000 | 0.0305 | - |
1.7543 | 441000 | 0.0319 | - |
1.7583 | 442000 | 0.0292 | - |
1.7623 | 443000 | 0.0311 | - |
1.7662 | 444000 | 0.0304 | - |
1.7702 | 445000 | 0.0263 | - |
1.7742 | 446000 | 0.0195 | - |
1.7782 | 447000 | 0.0162 | - |
1.7821 | 448000 | 0.0122 | - |
1.7861 | 449000 | 0.0156 | - |
1.7901 | 450000 | 0.0148 | - |
1.7941 | 451000 | 0.0126 | - |
1.7981 | 452000 | 0.0162 | - |
1.8020 | 453000 | 0.0154 | - |
1.8060 | 454000 | 0.0157 | - |
1.8100 | 455000 | 0.0321 | - |
1.8140 | 456000 | 0.0296 | - |
1.8179 | 457000 | 0.0296 | - |
1.8219 | 458000 | 0.0305 | - |
1.8259 | 459000 | 0.0285 | - |
1.8299 | 460000 | 0.0263 | - |
1.8339 | 461000 | 0.0116 | - |
1.8378 | 462000 | 0.0156 | - |
1.8418 | 463000 | 0.0172 | - |
1.8458 | 464000 | 0.0154 | - |
1.8498 | 465000 | 0.0162 | - |
1.8537 | 466000 | 0.0151 | - |
1.8577 | 467000 | 0.0162 | - |
1.8617 | 468000 | 0.0149 | - |
1.8657 | 469000 | 0.0157 | - |
1.8697 | 470000 | 0.0161 | - |
1.8736 | 471000 | 0.0159 | - |
1.8776 | 472000 | 0.0152 | - |
1.8816 | 473000 | 0.0148 | - |
1.8856 | 474000 | 0.0315 | - |
1.8895 | 475000 | 0.0282 | - |
1.8935 | 476000 | 0.0331 | - |
1.8975 | 477000 | 0.0284 | - |
1.9015 | 478000 | 0.0272 | - |
1.9055 | 479000 | 0.026 | - |
1.9094 | 480000 | 0.0159 | - |
1.9134 | 481000 | 0.0153 | - |
1.9174 | 482000 | 0.0158 | - |
1.9214 | 483000 | 0.014 | - |
1.9253 | 484000 | 0.0153 | - |
1.9293 | 485000 | 0.0146 | - |
1.9333 | 486000 | 0.0137 | - |
1.9373 | 487000 | 0.0129 | - |
1.9413 | 488000 | 0.0112 | - |
1.9452 | 489000 | 0.0139 | - |
1.9492 | 490000 | 0.0076 | - |
1.9532 | 491000 | 0.0074 | - |
1.9572 | 492000 | 0.0097 | - |
1.9612 | 493000 | 0.0136 | - |
1.9651 | 494000 | 0.0269 | - |
1.9691 | 495000 | 0.0192 | - |
1.9731 | 496000 | 0.0092 | - |
1.9771 | 497000 | 0.0086 | - |
1.9810 | 498000 | 0.0078 | - |
1.9850 | 499000 | 0.0073 | - |
1.9890 | 500000 | 0.0093 | - |
1.9930 | 501000 | 0.0071 | - |
1.9970 | 502000 | 0.0075 | - |
2.0000 | 502764 | - | 0.0018 |
2.0009 | 503000 | 0.1807 | - |
2.0049 | 504000 | 0.1676 | - |
2.0089 | 505000 | 0.0704 | - |
2.0129 | 506000 | 0.0733 | - |
2.0168 | 507000 | 0.0713 | - |
2.0208 | 508000 | 0.0633 | - |
2.0248 | 509000 | 0.0562 | - |
2.0288 | 510000 | 0.0521 | - |
2.0328 | 511000 | 0.0517 | - |
2.0367 | 512000 | 0.0516 | - |
2.0407 | 513000 | 0.047 | - |
2.0447 | 514000 | 0.0411 | - |
2.0487 | 515000 | 0.0445 | - |
2.0526 | 516000 | 0.0493 | - |
2.0566 | 517000 | 0.0618 | - |
2.0606 | 518000 | 0.0335 | - |
2.0646 | 519000 | 0.1001 | - |
2.0686 | 520000 | 0.0349 | - |
2.0725 | 521000 | 0.0675 | - |
2.0765 | 522000 | 0.0376 | - |
2.0805 | 523000 | 0.0838 | - |
2.0845 | 524000 | 0.0411 | - |
2.0884 | 525000 | 0.0798 | - |
2.0924 | 526000 | 0.0401 | - |
2.0964 | 527000 | 0.0455 | - |
2.1004 | 528000 | 0.0312 | - |
2.1044 | 529000 | 0.0778 | - |
2.1083 | 530000 | 0.0368 | - |
2.1123 | 531000 | 0.0872 | - |
2.1163 | 532000 | 0.0411 | - |
2.1203 | 533000 | 0.0906 | - |
2.1242 | 534000 | 0.0409 | - |
2.1282 | 535000 | 0.0523 | - |
2.1322 | 536000 | 0.0986 | - |
2.1362 | 537000 | 0.0332 | - |
2.1402 | 538000 | 0.0668 | - |
2.1441 | 539000 | 0.0472 | - |
2.1481 | 540000 | 0.035 | - |
2.1521 | 541000 | 0.0588 | - |
2.1561 | 542000 | 0.0349 | - |
2.1601 | 543000 | 0.036 | - |
2.1640 | 544000 | 0.0706 | - |
2.1680 | 545000 | 0.0458 | - |
2.1720 | 546000 | 0.0525 | - |
2.1760 | 547000 | 0.0489 | - |
2.1799 | 548000 | 0.0493 | - |
2.1839 | 549000 | 0.044 | - |
2.1879 | 550000 | 0.064 | - |
2.1919 | 551000 | 0.0423 | - |
2.1959 | 552000 | 0.0503 | - |
2.1998 | 553000 | 0.0562 | - |
2.2038 | 554000 | 0.0335 | - |
2.2078 | 555000 | 0.0394 | - |
2.2118 | 556000 | 0.0527 | - |
2.2157 | 557000 | 0.0452 | - |
2.2197 | 558000 | 0.0342 | - |
2.2237 | 559000 | 0.0314 | - |
2.2277 | 560000 | 0.0458 | - |
2.2317 | 561000 | 0.0343 | - |
2.2356 | 562000 | 0.0295 | - |
2.2396 | 563000 | 0.0341 | - |
2.2436 | 564000 | 0.04 | - |
2.2476 | 565000 | 0.0302 | - |
2.2515 | 566000 | 0.0266 | - |
2.2555 | 567000 | 0.0373 | - |
2.2595 | 568000 | 0.0447 | - |
2.2635 | 569000 | 0.03 | - |
2.2675 | 570000 | 0.0252 | - |
2.2714 | 571000 | 0.0313 | - |
2.2754 | 572000 | 0.0472 | - |
2.2794 | 573000 | 0.0495 | - |
2.2834 | 574000 | 0.0588 | - |
2.2873 | 575000 | 0.0249 | - |
2.2913 | 576000 | 0.0221 | - |
2.2953 | 577000 | 0.0303 | - |
2.2993 | 578000 | 0.0416 | - |
2.3033 | 579000 | 0.064 | - |
2.3072 | 580000 | 0.0227 | - |
2.3112 | 581000 | 0.0263 | - |
2.3152 | 582000 | 0.051 | - |
2.3192 | 583000 | 0.0344 | - |
2.3231 | 584000 | 0.0342 | - |
2.3271 | 585000 | 0.0368 | - |
2.3311 | 586000 | 0.0303 | - |
2.3351 | 587000 | 0.0206 | - |
2.3391 | 588000 | 0.0597 | - |
2.3430 | 589000 | 0.0232 | - |
2.3470 | 590000 | 0.0342 | - |
2.3510 | 591000 | 0.0477 | - |
2.3550 | 592000 | 0.015 | - |
2.3590 | 593000 | 0.0169 | - |
2.3629 | 594000 | 0.0209 | - |
2.3669 | 595000 | 0.0392 | - |
2.3709 | 596000 | 0.0299 | - |
2.3749 | 597000 | 0.0362 | - |
2.3788 | 598000 | 0.0503 | - |
2.3828 | 599000 | 0.0262 | - |
2.3868 | 600000 | 0.0275 | - |
2.3908 | 601000 | 0.0217 | - |
2.3948 | 602000 | 0.0185 | - |
2.3987 | 603000 | 0.0251 | - |
2.4027 | 604000 | 0.036 | - |
2.4067 | 605000 | 0.0202 | - |
2.4107 | 606000 | 0.018 | - |
2.4146 | 607000 | 0.0178 | - |
2.4186 | 608000 | 0.0201 | - |
2.4226 | 609000 | 0.0183 | - |
2.4266 | 610000 | 0.0325 | - |
2.4306 | 611000 | 0.0391 | - |
2.4345 | 612000 | 0.0229 | - |
2.4385 | 613000 | 0.0165 | - |
2.4425 | 614000 | 0.0123 | - |
2.4465 | 615000 | 0.0257 | - |
2.4504 | 616000 | 0.022 | - |
2.4544 | 617000 | 0.0167 | - |
2.4584 | 618000 | 0.0167 | - |
2.4624 | 619000 | 0.0127 | - |
2.4664 | 620000 | 0.0201 | - |
2.4703 | 621000 | 0.0364 | - |
2.4743 | 622000 | 0.0368 | - |
2.4783 | 623000 | 0.0352 | - |
2.4823 | 624000 | 0.0319 | - |
2.4862 | 625000 | 0.0171 | - |
2.4902 | 626000 | 0.0185 | - |
2.4942 | 627000 | 0.0206 | - |
2.4982 | 628000 | 0.0138 | - |
2.5022 | 629000 | 0.0151 | - |
2.5061 | 630000 | 0.019 | - |
2.5101 | 631000 | 0.0122 | - |
2.5141 | 632000 | 0.0139 | - |
2.5181 | 633000 | 0.0203 | - |
2.5220 | 634000 | 0.0295 | - |
2.5260 | 635000 | 0.0217 | - |
2.5300 | 636000 | 0.0132 | - |
2.5340 | 637000 | 0.0183 | - |
2.5380 | 638000 | 0.0208 | - |
2.5419 | 639000 | 0.0217 | - |
2.5459 | 640000 | 0.0333 | - |
2.5499 | 641000 | 0.036 | - |
2.5539 | 642000 | 0.0317 | - |
2.5578 | 643000 | 0.034 | - |
2.5618 | 644000 | 0.0365 | - |
2.5658 | 645000 | 0.023 | - |
2.5698 | 646000 | 0.0288 | - |
2.5738 | 647000 | 0.0211 | - |
2.5777 | 648000 | 0.0211 | - |
2.5817 | 649000 | 0.023 | - |
2.5857 | 650000 | 0.0137 | - |
2.5897 | 651000 | 0.0115 | - |
2.5937 | 652000 | 0.0141 | - |
2.5976 | 653000 | 0.0209 | - |
2.6016 | 654000 | 0.0207 | - |
2.6056 | 655000 | 0.0149 | - |
2.6096 | 656000 | 0.0173 | - |
2.6135 | 657000 | 0.0196 | - |
2.6175 | 658000 | 0.0198 | - |
2.6215 | 659000 | 0.0217 | - |
2.6255 | 660000 | 0.0306 | - |
2.6295 | 661000 | 0.0383 | - |
2.6334 | 662000 | 0.0286 | - |
2.6374 | 663000 | 0.0228 | - |
2.6414 | 664000 | 0.0135 | - |
2.6454 | 665000 | 0.0148 | - |
2.6493 | 666000 | 0.0221 | - |
2.6533 | 667000 | 0.0347 | - |
2.6573 | 668000 | 0.0293 | - |
2.6613 | 669000 | 0.0333 | - |
2.6653 | 670000 | 0.015 | - |
2.6692 | 671000 | 0.0143 | - |
2.6732 | 672000 | 0.0195 | - |
2.6772 | 673000 | 0.0251 | - |
2.6812 | 674000 | 0.0212 | - |
2.6851 | 675000 | 0.0155 | - |
2.6891 | 676000 | 0.0306 | - |
2.6931 | 677000 | 0.0231 | - |
2.6971 | 678000 | 0.0202 | - |
2.7011 | 679000 | 0.0215 | - |
2.7050 | 680000 | 0.0212 | - |
2.7090 | 681000 | 0.0192 | - |
2.7130 | 682000 | 0.0187 | - |
2.7170 | 683000 | 0.0305 | - |
2.7209 | 684000 | 0.0286 | - |
2.7249 | 685000 | 0.0282 | - |
2.7289 | 686000 | 0.0229 | - |
2.7329 | 687000 | 0.0211 | - |
2.7369 | 688000 | 0.0124 | - |
2.7408 | 689000 | 0.0227 | - |
2.7448 | 690000 | 0.0205 | - |
2.7488 | 691000 | 0.0246 | - |
2.7528 | 692000 | 0.0234 | - |
2.7567 | 693000 | 0.0245 | - |
2.7607 | 694000 | 0.0239 | - |
2.7647 | 695000 | 0.0261 | - |
2.7687 | 696000 | 0.0227 | - |
2.7727 | 697000 | 0.0181 | - |
2.7766 | 698000 | 0.014 | - |
2.7806 | 699000 | 0.0103 | - |
2.7846 | 700000 | 0.0119 | - |
2.7886 | 701000 | 0.0115 | - |
2.7926 | 702000 | 0.0102 | - |
2.7965 | 703000 | 0.0113 | - |
2.8005 | 704000 | 0.0124 | - |
2.8045 | 705000 | 0.0119 | - |
2.8085 | 706000 | 0.0218 | - |
2.8124 | 707000 | 0.0239 | - |
2.8164 | 708000 | 0.024 | - |
2.8204 | 709000 | 0.0237 | - |
2.8244 | 710000 | 0.0242 | - |
2.8284 | 711000 | 0.0252 | - |
2.8323 | 712000 | 0.011 | - |
2.8363 | 713000 | 0.0129 | - |
2.8403 | 714000 | 0.0129 | - |
2.8443 | 715000 | 0.0133 | - |
2.8482 | 716000 | 0.0117 | - |
2.8522 | 717000 | 0.013 | - |
2.8562 | 718000 | 0.0133 | - |
2.8602 | 719000 | 0.0116 | - |
2.8642 | 720000 | 0.0126 | - |
2.8681 | 721000 | 0.0131 | - |
2.8721 | 722000 | 0.0124 | - |
2.8761 | 723000 | 0.0122 | - |
2.8801 | 724000 | 0.0109 | - |
2.8840 | 725000 | 0.0233 | - |
2.8880 | 726000 | 0.0241 | - |
2.8920 | 727000 | 0.0266 | - |
2.8960 | 728000 | 0.0244 | - |
2.9000 | 729000 | 0.0234 | - |
2.9039 | 730000 | 0.0215 | - |
2.9079 | 731000 | 0.0166 | - |
2.9119 | 732000 | 0.0135 | - |
2.9159 | 733000 | 0.013 | - |
2.9198 | 734000 | 0.0127 | - |
2.9238 | 735000 | 0.0118 | - |
2.9278 | 736000 | 0.0131 | - |
2.9318 | 737000 | 0.0108 | - |
2.9358 | 738000 | 0.0099 | - |
2.9397 | 739000 | 0.011 | - |
2.9437 | 740000 | 0.0068 | - |
2.9477 | 741000 | 0.0103 | - |
2.9517 | 742000 | 0.0054 | - |
2.9556 | 743000 | 0.0071 | - |
2.9596 | 744000 | 0.0076 | - |
2.9636 | 745000 | 0.0206 | - |
2.9676 | 746000 | 0.0212 | - |
2.9716 | 747000 | 0.0075 | - |
2.9755 | 748000 | 0.0078 | - |
2.9795 | 749000 | 0.0067 | - |
2.9835 | 750000 | 0.0057 | - |
2.9875 | 751000 | 0.0062 | - |
2.9915 | 752000 | 0.0075 | - |
2.9954 | 753000 | 0.006 | - |
2.9994 | 754000 | 0.0073 | - |
3.0000 | 754146 | - | 0.0013 |
Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.4.1
- Transformers: 4.48.2
- PyTorch: 2.6.0+cu124
- Accelerate: 1.3.0
- Datasets: 3.2.0
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}