CrossEncoder based on jhu-clsp/ettin-encoder-68m
This is a Cross Encoder model finetuned from jhu-clsp/ettin-encoder-68m on the ms_marco dataset using the sentence-transformers library. It computes scores for pairs of texts, which can be used for text reranking and semantic search.
Model Details
Model Description
- Model Type: Cross Encoder
- Base model: jhu-clsp/ettin-encoder-68m
- Maximum Sequence Length: 7999 tokens
- Number of Output Labels: 1 label
- Training Dataset: ms_marco
- Language: en
Model Sources
- Documentation: Sentence Transformers Documentation
- Documentation: Cross Encoder Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Cross Encoders on Hugging Face
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import CrossEncoder
# Download from the 🤗 Hub
model = CrossEncoder("rahulseetharaman/reranker-msmarco-v1.1-ettin-encoder-68m-listnet")
# Get scores for pairs of texts
pairs = [
['what is the difference between dna and rna', "1 DNA contains the sugar deoxyribose, while RNA contains the sugar ribose. 2 The only difference between ribose and deoxyribose is that ribose has one more-OH group than deoxyribose, which has-H attached to the second (2') carbon in the ring. Although DNA and RNA both carry genetic information, there are quite a few differences between them. This is a comparison of the differences between DNA versus RNA, including a quick summary and a detailed table of the differences."],
['what is the difference between dna and rna', 'Tweet. The difference between DNA and RNA in the most basic way is that DNA is double stranded whereas RNA is single stranded. The next difference is that DNA is made from deoxyribose and RNA is made from ribose. Ribose has a hydroxyl group attached to it, making it less stable. The third difference is in the complementary nucleotides that DNA and RNA encode for. DNA has thymine (T), guanine (G), adenine (A) and cytosine (C). '],
['what is the difference between dna and rna', "1 The only difference between ribose and deoxyribose is that ribose has one more-OH group than deoxyribose, which has-H attached to the second (2') carbon in the ring. 2 DNA is a double stranded molecule while RNA is a single stranded molecule. 3 DNA is stable under alkaline conditions while RNA is not stable. Although DNA and RNA both carry genetic information, there are quite a few differences between them. This is a comparison of the differences between DNA versus RNA, including a quick summary and a detailed table of the differences."],
['what is the difference between dna and rna', 'RNA (Ribonucleic Acid). RNA is a nucleic acid consisting long chain of nucleotide units. Like the DNA molecule, every nucleotide consists of a nitrogenous base, sugar and phosphates. RNA is created by a process known as Transcribing, which involves the following 4 steps: 1 DNA “unzips” as the bonds break. 2 The free nucleotides lead to the RNA pair up with the complementary bases'],
['what is the difference between dna and rna', 'Each nucleotide consists of a sugar, a phosphate and a nucleic acid base. The sugar in DNA is deoxyribose. The sugar in RNA is ribose, the same as deoxyribose but with one more OH (oxygen-hydrogen atom combination called a hydroxyl). This is the biggest difference between DNA and RNA. #in DNA the sugar is alpha 2 deoxyribose, whereas in RNA it is alpha ribose. #also one major difference is of the base. in DNA four types of bases are found namely, cytosine, adenine, guanine and thymine. In RNA all the bases are same except thymine.'],
]
scores = model.predict(pairs)
print(scores.shape)
# (5,)
# Or rank different texts based on similarity to a single text
ranks = model.rank(
'what is the difference between dna and rna',
[
"1 DNA contains the sugar deoxyribose, while RNA contains the sugar ribose. 2 The only difference between ribose and deoxyribose is that ribose has one more-OH group than deoxyribose, which has-H attached to the second (2') carbon in the ring. Although DNA and RNA both carry genetic information, there are quite a few differences between them. This is a comparison of the differences between DNA versus RNA, including a quick summary and a detailed table of the differences.",
'Tweet. The difference between DNA and RNA in the most basic way is that DNA is double stranded whereas RNA is single stranded. The next difference is that DNA is made from deoxyribose and RNA is made from ribose. Ribose has a hydroxyl group attached to it, making it less stable. The third difference is in the complementary nucleotides that DNA and RNA encode for. DNA has thymine (T), guanine (G), adenine (A) and cytosine (C). ',
"1 The only difference between ribose and deoxyribose is that ribose has one more-OH group than deoxyribose, which has-H attached to the second (2') carbon in the ring. 2 DNA is a double stranded molecule while RNA is a single stranded molecule. 3 DNA is stable under alkaline conditions while RNA is not stable. Although DNA and RNA both carry genetic information, there are quite a few differences between them. This is a comparison of the differences between DNA versus RNA, including a quick summary and a detailed table of the differences.",
'RNA (Ribonucleic Acid). RNA is a nucleic acid consisting long chain of nucleotide units. Like the DNA molecule, every nucleotide consists of a nitrogenous base, sugar and phosphates. RNA is created by a process known as Transcribing, which involves the following 4 steps: 1 DNA “unzips” as the bonds break. 2 The free nucleotides lead to the RNA pair up with the complementary bases',
'Each nucleotide consists of a sugar, a phosphate and a nucleic acid base. The sugar in DNA is deoxyribose. The sugar in RNA is ribose, the same as deoxyribose but with one more OH (oxygen-hydrogen atom combination called a hydroxyl). This is the biggest difference between DNA and RNA. #in DNA the sugar is alpha 2 deoxyribose, whereas in RNA it is alpha ribose. #also one major difference is of the base. in DNA four types of bases are found namely, cytosine, adenine, guanine and thymine. In RNA all the bases are same except thymine.',
]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
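You can also combine this reranker with a bi-encoder in a retrieve-and-rerank setup. Below is a minimal sketch, assuming sentence-transformers/all-MiniLM-L6-v2 as a hypothetical first-stage retriever and a toy corpus; substitute your own retriever and documents.

from sentence_transformers import SentenceTransformer, CrossEncoder, util

retriever = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # assumed first-stage model
reranker = CrossEncoder("rahulseetharaman/reranker-msmarco-v1.1-ettin-encoder-68m-listnet")

corpus = [
    "DNA is double stranded, while RNA is single stranded.",
    "The Bendix Corporation introduced the first automatic washing machine in 1937.",
    "Ecuador is located in western South America, on the Equator.",
]
query = "what is the difference between dna and rna"

# Stage 1: retrieve candidates with the bi-encoder
corpus_embeddings = retriever.encode(corpus, convert_to_tensor=True)
query_embedding = retriever.encode(query, convert_to_tensor=True)
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=3)[0]

# Stage 2: rerank the candidates with this cross-encoder
candidates = [corpus[hit["corpus_id"]] for hit in hits]
for entry in reranker.rank(query, candidates, return_documents=True):
    print(f"{entry['score']:.4f}\t{entry['text']}")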
Evaluation
Metrics
Cross Encoder Reranking
- Datasets: NanoMSMARCO_R100, NanoNFCorpus_R100 and NanoNQ_R100
- Evaluated with CrossEncoderRerankingEvaluator with these parameters: { "at_k": 10, "always_rerank_positives": true }
Metric | NanoMSMARCO_R100 | NanoNFCorpus_R100 | NanoNQ_R100 |
---|---|---|---|
map | 0.5289 (+0.0394) | 0.3581 (+0.0971) | 0.5743 (+0.1547) |
mrr@10 | 0.5255 (+0.0480) | 0.5497 (+0.0499) | 0.5841 (+0.1574) |
ndcg@10 | 0.6038 (+0.0634) | 0.3783 (+0.0532) | 0.6440 (+0.1433) |
Cross Encoder Nano BEIR
- Dataset: NanoBEIR_R100_mean
- Evaluated with CrossEncoderNanoBEIREvaluator with these parameters: { "dataset_names": ["msmarco", "nfcorpus", "nq"], "rerank_k": 100, "at_k": 10, "always_rerank_positives": true }
Metric | Value |
---|---|
map | 0.4871 (+0.0970) |
mrr@10 | 0.5531 (+0.0851) |
ndcg@10 | 0.5420 (+0.0866) |
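The NanoBEIR numbers above can be reproduced with the CrossEncoderNanoBEIREvaluator. A minimal sketch, using the evaluator parameters listed above; the import path assumes the sentence_transformers.cross_encoder.evaluation module, and the Nano BEIR subsets are downloaded from the Hub automatically.

from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.evaluation import CrossEncoderNanoBEIREvaluator

model = CrossEncoder("rahulseetharaman/reranker-msmarco-v1.1-ettin-encoder-68m-listnet")
evaluator = CrossEncoderNanoBEIREvaluator(
    dataset_names=["msmarco", "nfcorpus", "nq"],
    rerank_k=100,
    at_k=10,
    always_rerank_positives=True,
)
results = evaluator(model)
print(results)  # dictionary mapping metric names (map, mrr@10, ndcg@10 per dataset and mean) to values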
Training Details
Training Dataset
ms_marco
- Dataset: ms_marco at a47ee7a
- Size: 78,704 training samples
- Columns:
query
,docs
, andlabels
- Approximate statistics based on the first 1000 samples:
 | query | docs | labels |
---|---|---|---|
type | string | list | list |
details | min: 9 characters, mean: 34.01 characters, max: 99 characters | min: 3 elements, mean: 6.50 elements, max: 10 elements | min: 3 elements, mean: 6.50 elements, max: 10 elements |
- Samples (shown as query, docs, labels):
when was automatic washing machine invented
["Bendix Corporation introduced the first automatic washing machine in 1937, having …. In 1937 the Bendix Corporation introduced the world's first automatic washing machine and the lives of housewives was made easier from that day on. Alva J. Fisher invented the first electric washing machine in 1908. It was called the Thor and was manufactured by the Hurley Machine Company in Chicago.", "Brief History of Maytag & Washing Machine Innovations. 1951 - Production of Europe's first automatic washing machines. 1904 - Production of the first washing machines. One of the earliest patent for a clothes dryer(U.S. patent #476,416) was received by George T. Sampson on June 7, 1892. Samson also patented a sled propeller (U.S. patent #312,388) on February 17th, 1885.", "There is dispute over who was the first inventor of the automatic washer. A company called Nineteen Hundred Washing Machine Company of Binghamton, NY claims to have produced the first electric washer in 1906; a year before Thor's release.", 'The First Patents. It is impossible to know exactly who invented the first washing machine and dryer. Some of the patents are so old that nothing is known about the original patent holder. 1 The first British patent for a washing machine was issued in 1691.', 'The invention of the washing machine: Machines with washer and dryer. Bendix Deluxe, a machine loaded in the front, was introduced in 1947, accompanying General Electric’s top-loading automatic model. Some machines were semi-automatic, requiring users to intervene at one point or another.']
[1, 0, 0, 0, 0]
what is scylla and charybdis
['Being between Scylla and Charybdis is an idiom deriving from Greek mythology, meaning having to choose between two evils. Several other idioms, such as on the horns of a dilemma, between the devil and the deep blue sea , and between a rock and a hard place express the same meaning. ', 'Avoiding Charybdis meant that the ship would be swallowed by the giant sea monster Scylla, and vice versa. Odysseus has his men try to avoid Charybdis, and leads them to Scylla, he then loses many men. Later, when he is on a raft by himself, he comes back and faces the two monsters again, this time being sucked in by Charybdis he survives though, just by holding a tight grip on to a fig tree. Sirens, Scylla and Charybdis=Obstacles Odysseus & crew face while on the ship. Odysseus has now returned to Circe’s island where the goddess guides him through his next exploration, explaining how to avoid the dangers of the Sirens.', "Scylla. by Micha F. Lindemans. In Greek mythology, a sea monster who lived u...
[1, 0, 0, 0, 0, ...]
how long do you need to slow cook a small whole chicken
['4. Cover and cook. Place the cover on the slow cooker and turn it on to high for 4 hours or low for 6 to 8 hours. You do not need to add any liquid. Chickens today typically have some solution added, so they rarely need added liquid. At the end of the cooking time, the meat will be tender, practically falling off the bone. Roast a whole chicken in the slow cooker for tender, delicious cooked chicken. In my neck of the woods, whole chickens go as low as $0.57/pound. Yes, really. There’s generally a limit of three, but yes, the price is that low.', "Making the world better, one answer at a time. I just finished looking up a slow cooker recipe for whole chicken-cook on high for 4-5 hours, or on low for 6-8 hours. Do not remove the lid for the first two hours to prevent heat loss. Add the greens to the slow cooker along with the remaining ingredients. Turn the slow cooker to low and cook 4-6 hours or until the greens are tender and are a gray-green color. Remove the salt pork before serv...
[1, 0, 0, 0, 0, ...]
- Loss: ListNetLoss with these parameters: { "activation_fn": "torch.nn.modules.linear.Identity", "mini_batch_size": 16 }
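For intuition, ListNet's top-one approximation treats the relevance labels and the predicted scores of one query's document list as two softmax distributions and minimizes the cross entropy between them. A minimal PyTorch sketch of that computation (illustrative only, not the library's exact implementation):

import torch
import torch.nn.functional as F

scores = torch.tensor([2.1, 0.3, -0.5, 1.0])  # model scores for one query's docs
labels = torch.tensor([1.0, 0.0, 0.0, 1.0])   # relevance labels for the same docs

# Cross entropy between softmax(labels) and softmax(scores)
loss = -(F.softmax(labels, dim=0) * F.log_softmax(scores, dim=0)).sum()
print(loss)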
Evaluation Dataset
ms_marco
- Dataset: ms_marco at a47ee7a
- Size: 1,000 evaluation samples
- Columns:
query
,docs
, andlabels
- Approximate statistics based on the first 1000 samples:
 | query | docs | labels |
---|---|---|---|
type | string | list | list |
details | min: 11 characters, mean: 33.68 characters, max: 94 characters | min: 2 elements, mean: 6.00 elements, max: 10 elements | min: 2 elements, mean: 6.00 elements, max: 10 elements |
- Samples (shown as query, docs, labels):
what is the difference between dna and rna
["1 DNA contains the sugar deoxyribose, while RNA contains the sugar ribose. 2 The only difference between ribose and deoxyribose is that ribose has one more-OH group than deoxyribose, which has-H attached to the second (2') carbon in the ring. Although DNA and RNA both carry genetic information, there are quite a few differences between them. This is a comparison of the differences between DNA versus RNA, including a quick summary and a detailed table of the differences.", 'Tweet. The difference between DNA and RNA in the most basic way is that DNA is double stranded whereas RNA is single stranded. The next difference is that DNA is made from deoxyribose and RNA is made from ribose. Ribose has a hydroxyl group attached to it, making it less stable. The third difference is in the complementary nucleotides that DNA and RNA encode for. DNA has thymine (T), guanine (G), adenine (A) and cytosine (C). ', "1 The only difference between ribose and deoxyribose is that ribose has one more-OH g...
[1, 1, 0, 0, 0, ...]
ecuador location and geography
['Ecuador is a country in western South America, bordering the Pacific Ocean at the Equator, for which the country is named. Ecuador encompasses a wide range of natural formations and climates, from the desert-like southern coast to the snowcapped peaks of the Andes mountain range to the plains of the Amazon Basin. Ecuador is divided into three continental regions—the Costa (coast), Sierra (mountains), and Oriente (east)—and one insular region, the Galapagos Galápagos (islands Officially archipielago Archipiélago). De colon colón the continental regions extend the length of the country from north to south and are Separated By. the andes mountains', 'Map of Ecuador. GEOGRAPHY. Ecuador is located in the western corner at the top of the South American continent. Ecuador, the smallest country in South America, is named after the Equator, the imaginary line around the Earth that splits the country in two. Most of the country is in the Southern Hemisphere.', 'Location of Quito on a map. Quit...
[1, 0, 0, 0, 0, ...]
causes of pterygium
['The primary cause of pterygium is cumulative UV radiation, typically from sun exposure. Dry, windy, and dusty conditions can also contribute to its growth. ', 'To research the causes of Pterygium, consider researching the causes of these these diseases that may be similar, or associated with Pterygium: 1 Ocular mass. 2 Ocular lesion. 3 Corneal surface. 4 Corneal topography. 5 Diplopia. 6 Double vision. 7 Vision loss. ', 'Sometimes, a pterygium causes no symptoms other than its appearance. An enlarging pterygium, however, may cause redness and inflammation. A pterygium can progressively grow onto the cornea (the clear, outer layer of the eye). This can distort the shape of the cornea, causing a condition called astigmatism. The result can be blurred vision.', 'A pterygium, from the Greek word for “wing,” is an abnormal growth of tissue that extends from the conjunctiva (a membrane that covers the white of the eye) onto the cornea. Pterygia may be small, or grow large enough to ...
[1, 1, 1, 0, 0, ...]
- Loss: ListNetLoss with these parameters: { "activation_fn": "torch.nn.modules.linear.Identity", "mini_batch_size": 16 }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- learning_rate: 2e-05
- num_train_epochs: 5
- seed: 12
- bf16: True
- load_best_model_at_end: True
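For reference, a comparable training run could be launched roughly as sketched below, using the non-default hyperparameters above with the standard sentence-transformers CrossEncoderTrainer API. The dataset id is a placeholder: it assumes a Hub dataset that already provides the query, docs and labels columns described in the Training Dataset section.

from datasets import load_dataset
from sentence_transformers.cross_encoder import (
    CrossEncoder,
    CrossEncoderTrainer,
    CrossEncoderTrainingArguments,
)
from sentence_transformers.cross_encoder.losses import ListNetLoss

# Base model with a single output logit, as in this card
model = CrossEncoder("jhu-clsp/ettin-encoder-68m", num_labels=1)

# Placeholder dataset id: substitute a dataset with query / docs / labels columns
dataset = load_dataset("your-username/msmarco-listwise", split="train")  # hypothetical
dataset = dataset.train_test_split(test_size=1_000, seed=12)
train_dataset, eval_dataset = dataset["train"], dataset["test"]

# ListNet loss with the parameters listed in this card
loss = ListNetLoss(model, mini_batch_size=16)

args = CrossEncoderTrainingArguments(
    output_dir="reranker-msmarco-ettin-encoder-68m-listnet",
    num_train_epochs=5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    seed=12,
    bf16=True,
    eval_strategy="steps",
    load_best_model_at_end=True,
)

trainer = CrossEncoderTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()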
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 5
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 12
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- hub_revision: None
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- liger_kernel_config: None
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}
Training Logs
Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_R100_ndcg@10 | NanoNFCorpus_R100_ndcg@10 | NanoNQ_R100_ndcg@10 | NanoBEIR_R100_mean_ndcg@10 |
---|---|---|---|---|---|---|---|
-1 | -1 | - | - | 0.0442 (-0.4962) | 0.2555 (-0.0695) | 0.0464 (-0.4542) | 0.1154 (-0.3400) |
0.0002 | 1 | 2.6299 | - | - | - | - | - |
0.0203 | 100 | 2.0976 | 2.0794 | 0.1173 (-0.4231) | 0.2127 (-0.1123) | 0.1400 (-0.3607) | 0.1567 (-0.2987) |
0.0407 | 200 | 2.0936 | 2.0765 | 0.2364 (-0.3041) | 0.2546 (-0.0704) | 0.2815 (-0.2192) | 0.2575 (-0.1979) |
0.0610 | 300 | 2.0831 | 2.0736 | 0.3069 (-0.2335) | 0.3083 (-0.0168) | 0.3912 (-0.1095) | 0.3355 (-0.1199) |
0.0813 | 400 | 2.0652 | 2.0724 | 0.3448 (-0.1957) | 0.3274 (+0.0024) | 0.4448 (-0.0558) | 0.3723 (-0.0830) |
0.1016 | 500 | 2.0762 | 2.0701 | 0.4429 (-0.0975) | 0.3032 (-0.0219) | 0.4844 (-0.0163) | 0.4102 (-0.0452) |
0.1220 | 600 | 2.0778 | 2.0690 | 0.4909 (-0.0496) | 0.3097 (-0.0153) | 0.6015 (+0.1008) | 0.4674 (+0.0120) |
0.1423 | 700 | 2.0805 | 2.0687 | 0.4564 (-0.0840) | 0.2922 (-0.0329) | 0.5196 (+0.0190) | 0.4227 (-0.0326) |
0.1626 | 800 | 2.0752 | 2.0679 | 0.4472 (-0.0932) | 0.3165 (-0.0086) | 0.5350 (+0.0343) | 0.4329 (-0.0225) |
0.1830 | 900 | 2.0786 | 2.0674 | 0.5054 (-0.0350) | 0.3363 (+0.0112) | 0.5399 (+0.0393) | 0.4605 (+0.0052) |
0.2033 | 1000 | 2.0791 | 2.0668 | 0.4596 (-0.0808) | 0.3235 (-0.0015) | 0.5025 (+0.0019) | 0.4285 (-0.0268) |
0.2236 | 1100 | 2.0714 | 2.0669 | 0.5469 (+0.0065) | 0.3546 (+0.0296) | 0.5718 (+0.0711) | 0.4911 (+0.0357) |
0.2440 | 1200 | 2.0736 | 2.0668 | 0.5520 (+0.0116) | 0.3521 (+0.0270) | 0.5793 (+0.0786) | 0.4945 (+0.0391) |
0.2643 | 1300 | 2.0736 | 2.0669 | 0.4914 (-0.0490) | 0.3381 (+0.0131) | 0.5117 (+0.0111) | 0.4471 (-0.0083) |
0.2846 | 1400 | 2.0761 | 2.0660 | 0.5132 (-0.0272) | 0.3408 (+0.0158) | 0.5294 (+0.0288) | 0.4611 (+0.0058) |
0.3049 | 1500 | 2.0748 | 2.0671 | 0.5542 (+0.0138) | 0.3374 (+0.0123) | 0.5908 (+0.0902) | 0.4941 (+0.0388) |
0.3253 | 1600 | 2.07 | 2.0662 | 0.5528 (+0.0124) | 0.3474 (+0.0224) | 0.5693 (+0.0687) | 0.4899 (+0.0345) |
0.3456 | 1700 | 2.0779 | 2.0662 | 0.5249 (-0.0155) | 0.3428 (+0.0178) | 0.5655 (+0.0649) | 0.4778 (+0.0224) |
0.3659 | 1800 | 2.0712 | 2.0660 | 0.5347 (-0.0057) | 0.3450 (+0.0200) | 0.5988 (+0.0982) | 0.4929 (+0.0375) |
0.3863 | 1900 | 2.0753 | 2.0661 | 0.5407 (+0.0002) | 0.3199 (-0.0052) | 0.5705 (+0.0698) | 0.4770 (+0.0216) |
0.4066 | 2000 | 2.0753 | 2.0657 | 0.5704 (+0.0300) | 0.3487 (+0.0237) | 0.6416 (+0.1410) | 0.5203 (+0.0649) |
0.4269 | 2100 | 2.0724 | 2.0652 | 0.5492 (+0.0088) | 0.3430 (+0.0180) | 0.6159 (+0.1153) | 0.5027 (+0.0474) |
0.4472 | 2200 | 2.0725 | 2.0651 | 0.5205 (-0.0199) | 0.3445 (+0.0194) | 0.5965 (+0.0958) | 0.4872 (+0.0318) |
0.4676 | 2300 | 2.0661 | 2.0652 | 0.5510 (+0.0106) | 0.3285 (+0.0034) | 0.5664 (+0.0657) | 0.4819 (+0.0266) |
0.4879 | 2400 | 2.0754 | 2.0655 | 0.5724 (+0.0319) | 0.3575 (+0.0324) | 0.5413 (+0.0406) | 0.4904 (+0.0350) |
0.5082 | 2500 | 2.0787 | 2.0652 | 0.5638 (+0.0234) | 0.3214 (-0.0037) | 0.5865 (+0.0858) | 0.4905 (+0.0352) |
0.5286 | 2600 | 2.0675 | 2.0649 | 0.5611 (+0.0207) | 0.3389 (+0.0138) | 0.5819 (+0.0812) | 0.4940 (+0.0386) |
0.5489 | 2700 | 2.0773 | 2.0646 | 0.5572 (+0.0168) | 0.3275 (+0.0024) | 0.6053 (+0.1046) | 0.4967 (+0.0413) |
0.5692 | 2800 | 2.0676 | 2.0649 | 0.5705 (+0.0301) | 0.3277 (+0.0027) | 0.5990 (+0.0984) | 0.4991 (+0.0437) |
0.5896 | 2900 | 2.0683 | 2.0650 | 0.5334 (-0.0070) | 0.3314 (+0.0063) | 0.5347 (+0.0340) | 0.4665 (+0.0111) |
0.6099 | 3000 | 2.066 | 2.0647 | 0.5408 (+0.0004) | 0.3350 (+0.0100) | 0.5753 (+0.0747) | 0.4837 (+0.0284) |
0.6302 | 3100 | 2.0591 | 2.0649 | 0.5507 (+0.0103) | 0.3288 (+0.0038) | 0.5792 (+0.0786) | 0.4863 (+0.0309) |
0.6505 | 3200 | 2.0774 | 2.0644 | 0.5688 (+0.0284) | 0.3358 (+0.0108) | 0.5857 (+0.0851) | 0.4968 (+0.0414) |
0.6709 | 3300 | 2.065 | 2.0645 | 0.5572 (+0.0168) | 0.3567 (+0.0316) | 0.6152 (+0.1145) | 0.5097 (+0.0543) |
0.6912 | 3400 | 2.0725 | 2.0645 | 0.5604 (+0.0199) | 0.3502 (+0.0252) | 0.6303 (+0.1297) | 0.5136 (+0.0583) |
0.7115 | 3500 | 2.0683 | 2.0645 | 0.5384 (-0.0020) | 0.3335 (+0.0085) | 0.6037 (+0.1030) | 0.4919 (+0.0365) |
0.7319 | 3600 | 2.0726 | 2.0646 | 0.5702 (+0.0298) | 0.3299 (+0.0049) | 0.5757 (+0.0751) | 0.4920 (+0.0366) |
0.7522 | 3700 | 2.0668 | 2.0638 | 0.5794 (+0.0389) | 0.3390 (+0.0140) | 0.5556 (+0.0549) | 0.4913 (+0.0359) |
0.7725 | 3800 | 2.0681 | 2.0638 | 0.5842 (+0.0438) | 0.3302 (+0.0052) | 0.6060 (+0.1053) | 0.5068 (+0.0515) |
0.7928 | 3900 | 2.0699 | 2.0636 | 0.5557 (+0.0152) | 0.3491 (+0.0241) | 0.5862 (+0.0856) | 0.4970 (+0.0416) |
0.8132 | 4000 | 2.0751 | 2.0636 | 0.5563 (+0.0159) | 0.3472 (+0.0222) | 0.6149 (+0.1142) | 0.5061 (+0.0508) |
0.8335 | 4100 | 2.0744 | 2.0638 | 0.5685 (+0.0281) | 0.3424 (+0.0173) | 0.5695 (+0.0689) | 0.4935 (+0.0381) |
0.8538 | 4200 | 2.0717 | 2.0642 | 0.5406 (+0.0002) | 0.3155 (-0.0096) | 0.5957 (+0.0951) | 0.4839 (+0.0286) |
0.8742 | 4300 | 2.0639 | 2.0641 | 0.5637 (+0.0233) | 0.3351 (+0.0101) | 0.6055 (+0.1048) | 0.5015 (+0.0461) |
0.8945 | 4400 | 2.0778 | 2.0639 | 0.5627 (+0.0223) | 0.3429 (+0.0178) | 0.6575 (+0.1569) | 0.5210 (+0.0657) |
0.9148 | 4500 | 2.0733 | 2.0638 | 0.5432 (+0.0028) | 0.3306 (+0.0056) | 0.5608 (+0.0602) | 0.4782 (+0.0228) |
0.9351 | 4600 | 2.063 | 2.0638 | 0.5683 (+0.0279) | 0.3305 (+0.0055) | 0.5926 (+0.0919) | 0.4971 (+0.0418) |
0.9555 | 4700 | 2.0832 | 2.0635 | 0.5781 (+0.0377) | 0.3513 (+0.0262) | 0.6369 (+0.1363) | 0.5221 (+0.0667) |
0.9758 | 4800 | 2.064 | 2.0635 | 0.5704 (+0.0299) | 0.3236 (-0.0014) | 0.6039 (+0.1033) | 0.4993 (+0.0439) |
0.9961 | 4900 | 2.0681 | 2.0636 | 0.5755 (+0.0350) | 0.3538 (+0.0287) | 0.6151 (+0.1145) | 0.5148 (+0.0594) |
1.0165 | 5000 | 2.0637 | 2.0635 | 0.5915 (+0.0511) | 0.3704 (+0.0454) | 0.6417 (+0.1411) | 0.5346 (+0.0792) |
1.0368 | 5100 | 2.0584 | 2.0644 | 0.5549 (+0.0145) | 0.3453 (+0.0203) | 0.6383 (+0.1377) | 0.5129 (+0.0575) |
1.0571 | 5200 | 2.0607 | 2.0648 | 0.6057 (+0.0653) | 0.3655 (+0.0405) | 0.6150 (+0.1144) | 0.5287 (+0.0734) |
1.0775 | 5300 | 2.0665 | 2.0652 | 0.5657 (+0.0252) | 0.3521 (+0.0270) | 0.6126 (+0.1119) | 0.5101 (+0.0547) |
1.0978 | 5400 | 2.0629 | 2.0649 | 0.5252 (-0.0152) | 0.3084 (-0.0167) | 0.5580 (+0.0574) | 0.4639 (+0.0085) |
1.1181 | 5500 | 2.0689 | 2.0648 | 0.5810 (+0.0406) | 0.3715 (+0.0465) | 0.6543 (+0.1536) | 0.5356 (+0.0802) |
1.1384 | 5600 | 2.0665 | 2.0650 | 0.5535 (+0.0131) | 0.3593 (+0.0343) | 0.6004 (+0.0998) | 0.5044 (+0.0490) |
1.1588 | 5700 | 2.0674 | 2.0646 | 0.5610 (+0.0206) | 0.3700 (+0.0450) | 0.6448 (+0.1441) | 0.5253 (+0.0699) |
1.1791 | 5800 | 2.061 | 2.0646 | 0.5546 (+0.0142) | 0.3526 (+0.0275) | 0.6016 (+0.1009) | 0.5029 (+0.0475) |
1.1994 | 5900 | 2.0597 | 2.0648 | 0.5736 (+0.0332) | 0.3650 (+0.0399) | 0.5981 (+0.0974) | 0.5122 (+0.0569) |
1.2198 | 6000 | 2.0612 | 2.0642 | 0.5987 (+0.0583) | 0.3538 (+0.0287) | 0.5892 (+0.0885) | 0.5139 (+0.0585) |
1.2401 | 6100 | 2.0601 | 2.0637 | 0.5634 (+0.0229) | 0.3713 (+0.0463) | 0.6055 (+0.1048) | 0.5134 (+0.0580) |
1.2604 | 6200 | 2.0636 | 2.0637 | 0.5724 (+0.0320) | 0.3675 (+0.0424) | 0.6268 (+0.1262) | 0.5222 (+0.0669) |
1.2807 | 6300 | 2.0669 | 2.0642 | 0.5428 (+0.0024) | 0.3458 (+0.0208) | 0.6213 (+0.1206) | 0.5033 (+0.0479) |
1.3011 | 6400 | 2.0694 | 2.0654 | 0.6038 (+0.0634) | 0.3783 (+0.0532) | 0.6440 (+0.1433) | 0.5420 (+0.0866) |
1.3214 | 6500 | 2.0675 | 2.0656 | 0.5688 (+0.0284) | 0.3415 (+0.0165) | 0.6025 (+0.1018) | 0.5043 (+0.0489) |
1.3417 | 6600 | 2.063 | 2.0655 | 0.5470 (+0.0065) | 0.3446 (+0.0196) | 0.5891 (+0.0885) | 0.4936 (+0.0382) |
1.3621 | 6700 | 2.0617 | 2.0650 | 0.5760 (+0.0355) | 0.3629 (+0.0378) | 0.6094 (+0.1087) | 0.5161 (+0.0607) |
1.3824 | 6800 | 2.0631 | 2.0653 | 0.5677 (+0.0273) | 0.3241 (-0.0009) | 0.5949 (+0.0942) | 0.4956 (+0.0402) |
1.4027 | 6900 | 2.0669 | 2.0658 | 0.5650 (+0.0246) | 0.3589 (+0.0339) | 0.5598 (+0.0591) | 0.4946 (+0.0392) |
1.4231 | 7000 | 2.0636 | 2.0654 | 0.5404 (-0.0000) | 0.3481 (+0.0231) | 0.5739 (+0.0733) | 0.4875 (+0.0321) |
1.4434 | 7100 | 2.0645 | 2.0654 | 0.5668 (+0.0264) | 0.3328 (+0.0078) | 0.5804 (+0.0798) | 0.4934 (+0.0380) |
1.4637 | 7200 | 2.0659 | 2.0649 | 0.5637 (+0.0233) | 0.3198 (-0.0052) | 0.5477 (+0.0471) | 0.4771 (+0.0217) |
1.4840 | 7300 | 2.0643 | 2.0647 | 0.5565 (+0.0161) | 0.3238 (-0.0013) | 0.5310 (+0.0304) | 0.4704 (+0.0151) |
1.5044 | 7400 | 2.0714 | 2.0646 | 0.5846 (+0.0441) | 0.3531 (+0.0281) | 0.5996 (+0.0990) | 0.5124 (+0.0571) |
1.5247 | 7500 | 2.0594 | 2.0649 | 0.5619 (+0.0214) | 0.3537 (+0.0286) | 0.5659 (+0.0653) | 0.4938 (+0.0384) |
1.5450 | 7600 | 2.0664 | 2.0647 | 0.5799 (+0.0394) | 0.3506 (+0.0255) | 0.5901 (+0.0894) | 0.5068 (+0.0515) |
1.5654 | 7700 | 2.0629 | 2.0644 | 0.5458 (+0.0054) | 0.3489 (+0.0239) | 0.5311 (+0.0304) | 0.4753 (+0.0199) |
1.5857 | 7800 | 2.0616 | 2.0651 | 0.5665 (+0.0261) | 0.3610 (+0.0360) | 0.5450 (+0.0444) | 0.4908 (+0.0355) |
1.6060 | 7900 | 2.0678 | 2.0643 | 0.5698 (+0.0294) | 0.3756 (+0.0506) | 0.5562 (+0.0556) | 0.5006 (+0.0452) |
1.6263 | 8000 | 2.0655 | 2.0643 | 0.5692 (+0.0288) | 0.3642 (+0.0392) | 0.5601 (+0.0594) | 0.4978 (+0.0425) |
1.6467 | 8100 | 2.0642 | 2.0655 | 0.5572 (+0.0168) | 0.3546 (+0.0296) | 0.5773 (+0.0767) | 0.4964 (+0.0410) |
1.6670 | 8200 | 2.0609 | 2.0644 | 0.5716 (+0.0312) | 0.3705 (+0.0455) | 0.5508 (+0.0501) | 0.4976 (+0.0423) |
1.6873 | 8300 | 2.0617 | 2.0648 | 0.5468 (+0.0063) | 0.3489 (+0.0239) | 0.5482 (+0.0475) | 0.4813 (+0.0259) |
1.7077 | 8400 | 2.0608 | 2.0656 | 0.5677 (+0.0273) | 0.3512 (+0.0261) | 0.5960 (+0.0953) | 0.5050 (+0.0496) |
1.7280 | 8500 | 2.0682 | 2.0667 | 0.5796 (+0.0391) | 0.3318 (+0.0067) | 0.5862 (+0.0855) | 0.4992 (+0.0438) |
1.7483 | 8600 | 2.0658 | 2.0652 | 0.5655 (+0.0251) | 0.3403 (+0.0152) | 0.5936 (+0.0930) | 0.4998 (+0.0444) |
1.7687 | 8700 | 2.0593 | 2.0645 | 0.5365 (-0.0039) | 0.3454 (+0.0204) | 0.6317 (+0.1311) | 0.5046 (+0.0492) |
1.7890 | 8800 | 2.0601 | 2.0650 | 0.5384 (-0.0021) | 0.3516 (+0.0265) | 0.5958 (+0.0952) | 0.4953 (+0.0399) |
1.8093 | 8900 | 2.0654 | 2.0642 | 0.5608 (+0.0204) | 0.3340 (+0.0090) | 0.6094 (+0.1087) | 0.5014 (+0.0460) |
1.8296 | 9000 | 2.061 | 2.0645 | 0.5773 (+0.0369) | 0.3405 (+0.0155) | 0.6160 (+0.1153) | 0.5113 (+0.0559) |
1.8500 | 9100 | 2.0637 | 2.0646 | 0.5371 (-0.0033) | 0.3161 (-0.0090) | 0.6063 (+0.1057) | 0.4865 (+0.0311) |
1.8703 | 9200 | 2.0555 | 2.0654 | 0.5577 (+0.0173) | 0.3397 (+0.0147) | 0.5915 (+0.0908) | 0.4963 (+0.0409) |
1.8906 | 9300 | 2.0725 | 2.0643 | 0.5633 (+0.0229) | 0.3455 (+0.0205) | 0.5919 (+0.0913) | 0.5003 (+0.0449) |
1.9110 | 9400 | 2.0567 | 2.0646 | 0.5381 (-0.0023) | 0.3546 (+0.0295) | 0.5991 (+0.0985) | 0.4973 (+0.0419) |
1.9313 | 9500 | 2.0626 | 2.0645 | 0.5532 (+0.0128) | 0.3450 (+0.0200) | 0.6158 (+0.1152) | 0.5047 (+0.0493) |
1.9516 | 9600 | 2.0674 | 2.0644 | 0.5642 (+0.0238) | 0.3540 (+0.0289) | 0.5974 (+0.0968) | 0.5052 (+0.0498) |
1.9719 | 9700 | 2.0577 | 2.0640 | 0.5489 (+0.0085) | 0.3493 (+0.0243) | 0.5965 (+0.0959) | 0.4983 (+0.0429) |
1.9923 | 9800 | 2.0692 | 2.0646 | 0.5319 (-0.0085) | 0.3461 (+0.0211) | 0.5936 (+0.0929) | 0.4905 (+0.0352) |
2.0126 | 9900 | 2.0467 | 2.0680 | 0.5408 (+0.0003) | 0.3646 (+0.0395) | 0.6232 (+0.1225) | 0.5095 (+0.0541) |
2.0329 | 10000 | 2.0507 | 2.0688 | 0.5189 (-0.0215) | 0.3667 (+0.0417) | 0.5921 (+0.0915) | 0.4926 (+0.0372) |
2.0533 | 10100 | 2.0508 | 2.0715 | 0.5391 (-0.0013) | 0.3619 (+0.0369) | 0.6070 (+0.1063) | 0.5027 (+0.0473) |
2.0736 | 10200 | 2.058 | 2.0687 | 0.5351 (-0.0053) | 0.3560 (+0.0309) | 0.5843 (+0.0836) | 0.4918 (+0.0364) |
2.0939 | 10300 | 2.0412 | 2.0713 | 0.5201 (-0.0204) | 0.3474 (+0.0223) | 0.5567 (+0.0561) | 0.4747 (+0.0193) |
2.1143 | 10400 | 2.0448 | 2.0697 | 0.5271 (-0.0134) | 0.3529 (+0.0279) | 0.5823 (+0.0817) | 0.4874 (+0.0321) |
2.1346 | 10500 | 2.0466 | 2.0714 | 0.5140 (-0.0264) | 0.3549 (+0.0298) | 0.5575 (+0.0569) | 0.4755 (+0.0201) |
2.1549 | 10600 | 2.0436 | 2.0702 | 0.5165 (-0.0239) | 0.3561 (+0.0311) | 0.5354 (+0.0348) | 0.4693 (+0.0140) |
2.1752 | 10700 | 2.0452 | 2.0679 | 0.5200 (-0.0205) | 0.3754 (+0.0504) | 0.5239 (+0.0232) | 0.4731 (+0.0177) |
2.1956 | 10800 | 2.0528 | 2.0702 | 0.5413 (+0.0009) | 0.3445 (+0.0195) | 0.5386 (+0.0379) | 0.4748 (+0.0194) |
2.2159 | 10900 | 2.0509 | 2.0693 | 0.5357 (-0.0047) | 0.3488 (+0.0238) | 0.5594 (+0.0588) | 0.4813 (+0.0260) |
2.2362 | 11000 | 2.0558 | 2.0692 | 0.5426 (+0.0022) | 0.3538 (+0.0288) | 0.5411 (+0.0404) | 0.4792 (+0.0238) |
2.2566 | 11100 | 2.0376 | 2.0701 | 0.5144 (-0.0261) | 0.3308 (+0.0057) | 0.4771 (-0.0235) | 0.4408 (-0.0146) |
2.2769 | 11200 | 2.0491 | 2.0689 | 0.5112 (-0.0292) | 0.3331 (+0.0080) | 0.5029 (+0.0022) | 0.4490 (-0.0063) |
2.2972 | 11300 | 2.0456 | 2.0709 | 0.5185 (-0.0219) | 0.3489 (+0.0239) | 0.4797 (-0.0209) | 0.4490 (-0.0063) |
2.3175 | 11400 | 2.0432 | 2.0718 | 0.5242 (-0.0163) | 0.3528 (+0.0278) | 0.5098 (+0.0092) | 0.4623 (+0.0069) |
2.3379 | 11500 | 2.0561 | 2.0701 | 0.5291 (-0.0113) | 0.3553 (+0.0303) | 0.5532 (+0.0526) | 0.4792 (+0.0238) |
2.3582 | 11600 | 2.0493 | 2.0704 | 0.5404 (-0.0001) | 0.3422 (+0.0172) | 0.5461 (+0.0454) | 0.4762 (+0.0208) |
2.3785 | 11700 | 2.0484 | 2.0703 | 0.5367 (-0.0037) | 0.3548 (+0.0298) | 0.5600 (+0.0594) | 0.4839 (+0.0285) |
2.3989 | 11800 | 2.0536 | 2.0708 | 0.5367 (-0.0037) | 0.3626 (+0.0376) | 0.5235 (+0.0229) | 0.4743 (+0.0189) |
2.4192 | 11900 | 2.0517 | 2.0710 | 0.5515 (+0.0111) | 0.3413 (+0.0163) | 0.5232 (+0.0225) | 0.4720 (+0.0166) |
2.4395 | 12000 | 2.0491 | 2.0707 | 0.5267 (-0.0137) | 0.3488 (+0.0238) | 0.5296 (+0.0289) | 0.4684 (+0.0130) |
2.4598 | 12100 | 2.0448 | 2.0710 | 0.5252 (-0.0152) | 0.3543 (+0.0292) | 0.5164 (+0.0157) | 0.4653 (+0.0099) |
2.4802 | 12200 | 2.0443 | 2.0711 | 0.5377 (-0.0027) | 0.3477 (+0.0227) | 0.5152 (+0.0146) | 0.4669 (+0.0115) |
2.5005 | 12300 | 2.0477 | 2.0699 | 0.5087 (-0.0317) | 0.3396 (+0.0146) | 0.5152 (+0.0146) | 0.4545 (-0.0009) |
2.5208 | 12400 | 2.0474 | 2.0698 | 0.5283 (-0.0121) | 0.3473 (+0.0223) | 0.5336 (+0.0330) | 0.4697 (+0.0144) |
2.5412 | 12500 | 2.0489 | 2.0692 | 0.5246 (-0.0159) | 0.3575 (+0.0324) | 0.5376 (+0.0370) | 0.4732 (+0.0178) |
2.5615 | 12600 | 2.0516 | 2.0711 | 0.5360 (-0.0044) | 0.3498 (+0.0247) | 0.5496 (+0.0489) | 0.4785 (+0.0231) |
2.5818 | 12700 | 2.0417 | 2.0705 | 0.5438 (+0.0034) | 0.3569 (+0.0318) | 0.5264 (+0.0258) | 0.4757 (+0.0203) |
2.6022 | 12800 | 2.0508 | 2.0703 | 0.5186 (-0.0218) | 0.3494 (+0.0243) | 0.5340 (+0.0334) | 0.4673 (+0.0120) |
2.6225 | 12900 | 2.049 | 2.0728 | 0.5161 (-0.0244) | 0.3444 (+0.0193) | 0.5397 (+0.0391) | 0.4667 (+0.0114) |
2.6428 | 13000 | 2.0493 | 2.0725 | 0.5319 (-0.0085) | 0.3467 (+0.0216) | 0.5565 (+0.0558) | 0.4784 (+0.0230) |
2.6631 | 13100 | 2.0532 | 2.0731 | 0.5318 (-0.0087) | 0.3426 (+0.0175) | 0.5512 (+0.0506) | 0.4752 (+0.0198) |
2.6835 | 13200 | 2.0502 | 2.0712 | 0.5292 (-0.0112) | 0.3403 (+0.0152) | 0.5532 (+0.0525) | 0.4742 (+0.0189) |
2.7038 | 13300 | 2.0449 | 2.0745 | 0.5240 (-0.0164) | 0.3545 (+0.0295) | 0.5192 (+0.0185) | 0.4659 (+0.0105) |
2.7241 | 13400 | 2.0487 | 2.0709 | 0.5272 (-0.0132) | 0.3593 (+0.0343) | 0.5299 (+0.0292) | 0.4721 (+0.0168) |
2.7445 | 13500 | 2.0397 | 2.0709 | 0.5364 (-0.0040) | 0.3647 (+0.0397) | 0.5284 (+0.0278) | 0.4765 (+0.0211) |
2.7648 | 13600 | 2.0394 | 2.0711 | 0.5524 (+0.0119) | 0.3607 (+0.0356) | 0.5300 (+0.0293) | 0.4810 (+0.0256) |
2.7851 | 13700 | 2.0564 | 2.0724 | 0.5432 (+0.0028) | 0.3597 (+0.0346) | 0.5214 (+0.0207) | 0.4747 (+0.0194) |
2.8054 | 13800 | 2.0577 | 2.0736 | 0.5250 (-0.0154) | 0.3659 (+0.0409) | 0.5433 (+0.0426) | 0.4781 (+0.0227) |
2.8258 | 13900 | 2.0501 | 2.0726 | 0.5376 (-0.0028) | 0.3501 (+0.0251) | 0.5174 (+0.0167) | 0.4684 (+0.0130) |
2.8461 | 14000 | 2.0508 | 2.0698 | 0.5363 (-0.0042) | 0.3528 (+0.0278) | 0.5319 (+0.0313) | 0.4737 (+0.0183) |
2.8664 | 14100 | 2.0414 | 2.0706 | 0.5482 (+0.0077) | 0.3445 (+0.0194) | 0.5027 (+0.0020) | 0.4651 (+0.0097) |
2.8868 | 14200 | 2.0358 | 2.0697 | 0.5317 (-0.0087) | 0.3581 (+0.0331) | 0.5322 (+0.0316) | 0.4740 (+0.0187) |
2.9071 | 14300 | 2.0517 | 2.0730 | 0.5222 (-0.0183) | 0.3528 (+0.0278) | 0.5416 (+0.0410) | 0.4722 (+0.0168) |
2.9274 | 14400 | 2.0539 | 2.0708 | 0.5096 (-0.0308) | 0.3587 (+0.0336) | 0.5381 (+0.0374) | 0.4688 (+0.0134) |
2.9478 | 14500 | 2.0528 | 2.0739 | 0.5205 (-0.0199) | 0.3518 (+0.0268) | 0.5263 (+0.0257) | 0.4662 (+0.0108) |
2.9681 | 14600 | 2.0467 | 2.0716 | 0.5101 (-0.0303) | 0.3392 (+0.0142) | 0.5453 (+0.0447) | 0.4649 (+0.0095) |
2.9884 | 14700 | 2.0504 | 2.0720 | 0.5168 (-0.0236) | 0.3428 (+0.0178) | 0.5155 (+0.0149) | 0.4584 (+0.0030) |
3.0087 | 14800 | 2.0284 | 2.0752 | 0.4940 (-0.0464) | 0.3604 (+0.0354) | 0.5090 (+0.0083) | 0.4544 (-0.0009) |
3.0291 | 14900 | 2.0364 | 2.0757 | 0.4756 (-0.0649) | 0.3552 (+0.0301) | 0.4839 (-0.0167) | 0.4382 (-0.0172) |
3.0494 | 15000 | 2.0331 | 2.0757 | 0.4731 (-0.0673) | 0.3365 (+0.0114) | 0.4868 (-0.0138) | 0.4322 (-0.0232) |
3.0697 | 15100 | 2.029 | 2.0797 | 0.4908 (-0.0496) | 0.3437 (+0.0186) | 0.4940 (-0.0066) | 0.4428 (-0.0125) |
3.0901 | 15200 | 2.038 | 2.0784 | 0.4806 (-0.0598) | 0.3452 (+0.0201) | 0.4385 (-0.0622) | 0.4214 (-0.0340) |
3.1104 | 15300 | 2.0306 | 2.0772 | 0.4830 (-0.0574) | 0.3511 (+0.0261) | 0.4598 (-0.0409) | 0.4313 (-0.0241) |
3.1307 | 15400 | 2.0332 | 2.0782 | 0.4620 (-0.0784) | 0.3417 (+0.0167) | 0.4302 (-0.0705) | 0.4113 (-0.0441) |
3.1510 | 15500 | 2.0151 | 2.0761 | 0.4839 (-0.0566) | 0.3400 (+0.0149) | 0.4543 (-0.0463) | 0.4261 (-0.0293) |
3.1714 | 15600 | 2.0193 | 2.0768 | 0.4594 (-0.0810) | 0.3422 (+0.0172) | 0.4528 (-0.0478) | 0.4182 (-0.0372) |
3.1917 | 15700 | 2.0331 | 2.0794 | 0.4812 (-0.0592) | 0.3474 (+0.0223) | 0.4562 (-0.0444) | 0.4283 (-0.0271) |
3.2120 | 15800 | 2.0313 | 2.0802 | 0.4700 (-0.0704) | 0.3497 (+0.0247) | 0.4750 (-0.0256) | 0.4316 (-0.0238) |
3.2324 | 15900 | 2.0205 | 2.0793 | 0.4746 (-0.0658) | 0.3435 (+0.0185) | 0.4615 (-0.0392) | 0.4265 (-0.0289) |
3.2527 | 16000 | 2.0385 | 2.0773 | 0.4793 (-0.0611) | 0.3434 (+0.0184) | 0.4676 (-0.0330) | 0.4301 (-0.0253) |
3.2730 | 16100 | 2.0364 | 2.0787 | 0.4890 (-0.0514) | 0.3416 (+0.0166) | 0.4497 (-0.0510) | 0.4268 (-0.0286) |
3.2934 | 16200 | 2.0311 | 2.0784 | 0.4889 (-0.0516) | 0.3549 (+0.0299) | 0.4529 (-0.0478) | 0.4322 (-0.0231) |
3.3137 | 16300 | 2.0272 | 2.0782 | 0.4641 (-0.0763) | 0.3476 (+0.0226) | 0.4499 (-0.0508) | 0.4205 (-0.0348) |
3.3340 | 16400 | 2.0313 | 2.0778 | 0.4643 (-0.0761) | 0.3547 (+0.0297) | 0.4710 (-0.0296) | 0.4300 (-0.0253) |
3.3543 | 16500 | 2.0243 | 2.0781 | 0.4717 (-0.0687) | 0.3498 (+0.0247) | 0.4545 (-0.0462) | 0.4253 (-0.0301) |
3.3747 | 16600 | 2.0317 | 2.0789 | 0.4657 (-0.0747) | 0.3376 (+0.0125) | 0.4367 (-0.0639) | 0.4133 (-0.0420) |
3.3950 | 16700 | 2.0263 | 2.0786 | 0.4715 (-0.0689) | 0.3467 (+0.0217) | 0.4460 (-0.0546) | 0.4214 (-0.0340) |
3.4153 | 16800 | 2.0224 | 2.0783 | 0.4734 (-0.0671) | 0.3537 (+0.0286) | 0.4608 (-0.0399) | 0.4293 (-0.0261) |
3.4357 | 16900 | 2.0365 | 2.0772 | 0.4693 (-0.0711) | 0.3484 (+0.0233) | 0.4554 (-0.0452) | 0.4244 (-0.0310) |
3.4560 | 17000 | 2.0337 | 2.0773 | 0.4767 (-0.0637) | 0.3485 (+0.0234) | 0.4454 (-0.0553) | 0.4235 (-0.0318) |
3.4763 | 17100 | 2.0326 | 2.0775 | 0.4850 (-0.0554) | 0.3389 (+0.0138) | 0.4197 (-0.0809) | 0.4146 (-0.0408) |
3.4966 | 17200 | 2.0285 | 2.0811 | 0.4753 (-0.0651) | 0.3410 (+0.0159) | 0.4285 (-0.0722) | 0.4149 (-0.0405) |
3.5170 | 17300 | 2.0367 | 2.0803 | 0.4697 (-0.0707) | 0.3357 (+0.0107) | 0.4277 (-0.0729) | 0.4111 (-0.0443) |
3.5373 | 17400 | 2.0319 | 2.0773 | 0.4779 (-0.0625) | 0.3366 (+0.0116) | 0.4325 (-0.0681) | 0.4157 (-0.0397) |
3.5576 | 17500 | 2.0314 | 2.0769 | 0.4802 (-0.0603) | 0.3437 (+0.0186) | 0.4565 (-0.0442) | 0.4268 (-0.0286) |
3.5780 | 17600 | 2.0383 | 2.0784 | 0.4790 (-0.0614) | 0.3423 (+0.0172) | 0.4416 (-0.0591) | 0.4210 (-0.0344) |
3.5983 | 17700 | 2.0305 | 2.0789 | 0.4860 (-0.0544) | 0.3321 (+0.0071) | 0.4363 (-0.0643) | 0.4181 (-0.0372) |
3.6186 | 17800 | 2.0341 | 2.0771 | 0.4872 (-0.0532) | 0.3420 (+0.0169) | 0.4339 (-0.0668) | 0.4210 (-0.0344) |
3.6390 | 17900 | 2.0348 | 2.0798 | 0.4823 (-0.0581) | 0.3387 (+0.0137) | 0.4746 (-0.0261) | 0.4319 (-0.0235) |
3.6593 | 18000 | 2.0198 | 2.0774 | 0.4788 (-0.0616) | 0.3436 (+0.0186) | 0.4468 (-0.0539) | 0.4231 (-0.0323) |
3.6796 | 18100 | 2.0253 | 2.0786 | 0.4914 (-0.0490) | 0.3507 (+0.0257) | 0.4724 (-0.0283) | 0.4382 (-0.0172) |
3.6999 | 18200 | 2.0392 | 2.0781 | 0.4812 (-0.0592) | 0.3550 (+0.0300) | 0.4548 (-0.0458) | 0.4303 (-0.0250) |
3.7203 | 18300 | 2.0295 | 2.0784 | 0.4681 (-0.0724) | 0.3527 (+0.0277) | 0.4677 (-0.0330) | 0.4295 (-0.0259) |
3.7406 | 18400 | 2.0289 | 2.0781 | 0.4676 (-0.0728) | 0.3504 (+0.0253) | 0.4679 (-0.0327) | 0.4286 (-0.0267) |
3.7609 | 18500 | 2.0257 | 2.0797 | 0.4768 (-0.0637) | 0.3518 (+0.0267) | 0.4806 (-0.0200) | 0.4364 (-0.0190) |
3.7813 | 18600 | 2.0219 | 2.0801 | 0.4885 (-0.0519) | 0.3376 (+0.0125) | 0.4705 (-0.0301) | 0.4322 (-0.0232) |
3.8016 | 18700 | 2.0279 | 2.0796 | 0.4939 (-0.0465) | 0.3440 (+0.0189) | 0.4858 (-0.0149) | 0.4412 (-0.0141) |
3.8219 | 18800 | 2.0289 | 2.0797 | 0.4834 (-0.0570) | 0.3437 (+0.0186) | 0.4955 (-0.0052) | 0.4408 (-0.0145) |
3.8422 | 18900 | 2.0367 | 2.0816 | 0.4913 (-0.0491) | 0.3426 (+0.0175) | 0.4905 (-0.0102) | 0.4415 (-0.0139) |
3.8626 | 19000 | 2.0428 | 2.0797 | 0.4815 (-0.0589) | 0.3444 (+0.0193) | 0.4815 (-0.0191) | 0.4358 (-0.0196) |
3.8829 | 19100 | 2.0341 | 2.0787 | 0.4741 (-0.0664) | 0.3519 (+0.0269) | 0.4891 (-0.0116) | 0.4383 (-0.0170) |
3.9032 | 19200 | 2.0342 | 2.0782 | 0.4766 (-0.0638) | 0.3420 (+0.0170) | 0.4618 (-0.0388) | 0.4268 (-0.0286) |
3.9236 | 19300 | 2.0179 | 2.0772 | 0.4841 (-0.0564) | 0.3491 (+0.0241) | 0.4384 (-0.0622) | 0.4239 (-0.0315) |
3.9439 | 19400 | 2.0412 | 2.0781 | 0.4769 (-0.0635) | 0.3426 (+0.0175) | 0.4551 (-0.0456) | 0.4249 (-0.0305) |
3.9642 | 19500 | 2.0276 | 2.0765 | 0.4827 (-0.0578) | 0.3409 (+0.0159) | 0.4851 (-0.0155) | 0.4362 (-0.0191) |
3.9845 | 19600 | 2.0277 | 2.0770 | 0.4841 (-0.0563) | 0.3416 (+0.0165) | 0.4734 (-0.0273) | 0.4330 (-0.0223) |
4.0049 | 19700 | 2.0313 | 2.0791 | 0.4884 (-0.0521) | 0.3417 (+0.0166) | 0.4562 (-0.0444) | 0.4287 (-0.0266) |
4.0252 | 19800 | 2.0197 | 2.0798 | 0.4585 (-0.0819) | 0.3478 (+0.0228) | 0.4545 (-0.0461) | 0.4203 (-0.0351) |
4.0455 | 19900 | 2.0214 | 2.0826 | 0.4679 (-0.0725) | 0.3414 (+0.0163) | 0.4501 (-0.0506) | 0.4198 (-0.0356) |
4.0659 | 20000 | 2.011 | 2.0833 | 0.4463 (-0.0941) | 0.3408 (+0.0158) | 0.4275 (-0.0731) | 0.4049 (-0.0505) |
4.0862 | 20100 | 2.0139 | 2.0835 | 0.4689 (-0.0715) | 0.3445 (+0.0195) | 0.4111 (-0.0896) | 0.4082 (-0.0472) |
4.1065 | 20200 | 2.0269 | 2.0813 | 0.4422 (-0.0983) | 0.3447 (+0.0197) | 0.3871 (-0.1135) | 0.3913 (-0.0640) |
4.1269 | 20300 | 2.0214 | 2.0826 | 0.4389 (-0.1016) | 0.3396 (+0.0145) | 0.3781 (-0.1226) | 0.3855 (-0.0699) |
4.1472 | 20400 | 2.028 | 2.0838 | 0.4562 (-0.0842) | 0.3411 (+0.0160) | 0.4128 (-0.0878) | 0.4034 (-0.0520) |
4.1675 | 20500 | 2.0165 | 2.0818 | 0.4596 (-0.0808) | 0.3356 (+0.0105) | 0.4241 (-0.0766) | 0.4064 (-0.0490) |
4.1878 | 20600 | 2.0208 | 2.0820 | 0.4744 (-0.0660) | 0.3440 (+0.0190) | 0.3967 (-0.1040) | 0.4050 (-0.0503) |
4.2082 | 20700 | 2.0151 | 2.0831 | 0.4558 (-0.0846) | 0.3404 (+0.0154) | 0.4029 (-0.0977) | 0.3997 (-0.0557) |
4.2285 | 20800 | 2.023 | 2.0844 | 0.4317 (-0.1087) | 0.3368 (+0.0117) | 0.4272 (-0.0734) | 0.3986 (-0.0568) |
4.2488 | 20900 | 2.0162 | 2.0821 | 0.4356 (-0.1048) | 0.3382 (+0.0132) | 0.3950 (-0.1057) | 0.3896 (-0.0658) |
4.2692 | 21000 | 2.0114 | 2.0816 | 0.4418 (-0.0986) | 0.3349 (+0.0098) | 0.3830 (-0.1177) | 0.3865 (-0.0688) |
4.2895 | 21100 | 2.0153 | 2.0823 | 0.4449 (-0.0955) | 0.3339 (+0.0088) | 0.4023 (-0.0983) | 0.3937 (-0.0617) |
4.3098 | 21200 | 2.0159 | 2.0827 | 0.4317 (-0.1087) | 0.3345 (+0.0094) | 0.4147 (-0.0860) | 0.3936 (-0.0618) |
4.3301 | 21300 | 2.0277 | 2.0818 | 0.4354 (-0.1051) | 0.3401 (+0.0151) | 0.3851 (-0.1156) | 0.3868 (-0.0685) |
4.3505 | 21400 | 2.0176 | 2.0819 | 0.4439 (-0.0965) | 0.3434 (+0.0184) | 0.4006 (-0.1000) | 0.3960 (-0.0594) |
4.3708 | 21500 | 2.0242 | 2.0816 | 0.4532 (-0.0872) | 0.3338 (+0.0088) | 0.3988 (-0.1018) | 0.3953 (-0.0601) |
4.3911 | 21600 | 2.0279 | 2.0814 | 0.4509 (-0.0895) | 0.3383 (+0.0133) | 0.4082 (-0.0925) | 0.3991 (-0.0562) |
4.4115 | 21700 | 2.0172 | 2.0818 | 0.4372 (-0.1032) | 0.3360 (+0.0110) | 0.4029 (-0.0977) | 0.3920 (-0.0633) |
4.4318 | 21800 | 2.0188 | 2.0831 | 0.4556 (-0.0848) | 0.3373 (+0.0123) | 0.4097 (-0.0909) | 0.4009 (-0.0545) |
4.4521 | 21900 | 2.0151 | 2.0824 | 0.4455 (-0.0950) | 0.3349 (+0.0098) | 0.3960 (-0.1046) | 0.3921 (-0.0632) |
4.4725 | 22000 | 2.0149 | 2.0824 | 0.4510 (-0.0894) | 0.3328 (+0.0077) | 0.3974 (-0.1032) | 0.3937 (-0.0616) |
4.4928 | 22100 | 2.0147 | 2.0818 | 0.4295 (-0.1109) | 0.3237 (-0.0014) | 0.4090 (-0.0916) | 0.3874 (-0.0680) |
4.5131 | 22200 | 2.0187 | 2.0826 | 0.4459 (-0.0946) | 0.3312 (+0.0062) | 0.4176 (-0.0830) | 0.3982 (-0.0571) |
4.5334 | 22300 | 2.0178 | 2.0828 | 0.4375 (-0.1029) | 0.3325 (+0.0075) | 0.4199 (-0.0808) | 0.3966 (-0.0587) |
4.5538 | 22400 | 2.0213 | 2.0823 | 0.4401 (-0.1003) | 0.3369 (+0.0119) | 0.3903 (-0.1103) | 0.3891 (-0.0662) |
4.5741 | 22500 | 2.0133 | 2.0826 | 0.4406 (-0.0999) | 0.3397 (+0.0147) | 0.4180 (-0.0826) | 0.3994 (-0.0559) |
4.5944 | 22600 | 2.0209 | 2.0819 | 0.4548 (-0.0856) | 0.3305 (+0.0055) | 0.4084 (-0.0922) | 0.3979 (-0.0575) |
4.6148 | 22700 | 2.018 | 2.0823 | 0.4372 (-0.1032) | 0.3353 (+0.0103) | 0.4014 (-0.0993) | 0.3913 (-0.0641) |
4.6351 | 22800 | 2.0183 | 2.0825 | 0.4469 (-0.0936) | 0.3300 (+0.0050) | 0.3933 (-0.1073) | 0.3901 (-0.0653) |
4.6554 | 22900 | 2.0199 | 2.0829 | 0.4368 (-0.1036) | 0.3259 (+0.0009) | 0.3821 (-0.1186) | 0.3816 (-0.0738) |
4.6757 | 23000 | 2.0222 | 2.0822 | 0.4399 (-0.1005) | 0.3385 (+0.0135) | 0.3704 (-0.1303) | 0.3829 (-0.0724) |
4.6961 | 23100 | 2.0284 | 2.0821 | 0.4404 (-0.1000) | 0.3360 (+0.0109) | 0.3984 (-0.1023) | 0.3916 (-0.0638) |
4.7164 | 23200 | 2.0115 | 2.0832 | 0.4354 (-0.1051) | 0.3423 (+0.0173) | 0.3879 (-0.1128) | 0.3885 (-0.0668) |
4.7367 | 23300 | 2.0193 | 2.0823 | 0.4384 (-0.1020) | 0.3402 (+0.0151) | 0.3943 (-0.1064) | 0.3909 (-0.0644) |
4.7571 | 23400 | 2.0157 | 2.0828 | 0.4196 (-0.1208) | 0.3419 (+0.0169) | 0.3944 (-0.1062) | 0.3853 (-0.0701) |
4.7774 | 23500 | 2.0222 | 2.0830 | 0.4369 (-0.1035) | 0.3317 (+0.0067) | 0.4125 (-0.0882) | 0.3937 (-0.0617) |
4.7977 | 23600 | 2.019 | 2.0829 | 0.4358 (-0.1046) | 0.3428 (+0.0178) | 0.4047 (-0.0959) | 0.3944 (-0.0609) |
4.8181 | 23700 | 2.0117 | 2.0827 | 0.4385 (-0.1019) | 0.3398 (+0.0148) | 0.4043 (-0.0963) | 0.3942 (-0.0612) |
4.8384 | 23800 | 2.0257 | 2.0830 | 0.4485 (-0.0919) | 0.3381 (+0.0130) | 0.4013 (-0.0993) | 0.3960 (-0.0594) |
4.8587 | 23900 | 2.0234 | 2.0825 | 0.4425 (-0.0979) | 0.3419 (+0.0169) | 0.3945 (-0.1062) | 0.3930 (-0.0624) |
4.8790 | 24000 | 2.0174 | 2.0832 | 0.4423 (-0.0982) | 0.3383 (+0.0133) | 0.4012 (-0.0994) | 0.3939 (-0.0614) |
4.8994 | 24100 | 2.0215 | 2.0829 | 0.4494 (-0.0911) | 0.3387 (+0.0137) | 0.3903 (-0.1104) | 0.3928 (-0.0626) |
4.9197 | 24200 | 2.015 | 2.0833 | 0.4518 (-0.0886) | 0.3410 (+0.0160) | 0.3894 (-0.1112) | 0.3941 (-0.0613) |
4.9400 | 24300 | 2.016 | 2.0831 | 0.4366 (-0.1038) | 0.3459 (+0.0209) | 0.3908 (-0.1098) | 0.3911 (-0.0643) |
4.9604 | 24400 | 2.0192 | 2.0831 | 0.4423 (-0.0982) | 0.3424 (+0.0174) | 0.3899 (-0.1107) | 0.3915 (-0.0638) |
4.9807 | 24500 | 2.022 | 2.0831 | 0.4491 (-0.0913) | 0.3494 (+0.0243) | 0.3900 (-0.1106) | 0.3962 (-0.0592) |
-1 | -1 | - | - | 0.6038 (+0.0634) | 0.3783 (+0.0532) | 0.6440 (+0.1433) | 0.5420 (+0.0866) |
- The saved checkpoint corresponds to step 6400 (epoch 1.3011), which achieved the best NanoBEIR_R100_mean_ndcg@10 of 0.5420.
Framework Versions
- Python: 3.10.18
- Sentence Transformers: 5.0.0
- Transformers: 4.56.0.dev0
- PyTorch: 2.7.1+cu126
- Accelerate: 1.9.0
- Datasets: 4.0.0
- Tokenizers: 0.21.4
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
ListNetLoss
@inproceedings{cao2007learning,
title={Learning to Rank: From Pairwise Approach to Listwise Approach},
author={Cao, Zhe and Qin, Tao and Liu, Tie-Yan and Tsai, Ming-Feng and Li, Hang},
booktitle={Proceedings of the 24th international conference on Machine learning},
pages={129--136},
year={2007}
}