SentenceTransformer based on intfloat/multilingual-e5-base

This is a sentence-transformers model finetuned from intfloat/multilingual-e5-base on the rozetka_positive_pairs dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: intfloat/multilingual-e5-base
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Number of Parameters: ~278M (safetensors, bfloat16)
  • Similarity Function: Dot Product
  • Training Dataset:
    • rozetka_positive_pairs

Full Model Architecture

RZTKSentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
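
The trailing Normalize() module L2-normalizes the pooled embeddings, so the dot-product similarity used throughout this card coincides with cosine similarity. A minimal check, as a sketch that assumes the model has been downloaded as described in the Usage section below:

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("yklymchuk-rztk/multilingual-e5-base-matryoshka2d-mnr-10")
embedding = model.encode(["query: test"])[0]
print(np.linalg.norm(embedding))  # ~1.0: unit-length vectors, so dot product equals cosine similarity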

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference:

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("yklymchuk-rztk/multilingual-e5-base-matryoshka2d-mnr-10")
# Run inference
sentences = [
    'query: koton женская одежда',
    'passage: Жіночі штани Koton Габарити С Стандарт (до 300x200x250 мм) Кількість вантажних місць 1 Країна реєстрації бренда Туреччина Країна-виробник товару Туреччина Розмір M Стиль Повсякденний (casual) Колір Зелений Моделі Кюлоти Доставка Доставка в магазини ROZETKA',
    'passage: Женские блузы Koton Габариты_old C Стандарт (до 300x200x250 мм) Количество грузовых мест 1 Страна регистрации бренда Турция Страна-производитель товара Турция Размер M Стиль Повседневный (casual) Цвет Бежевый Материал Полиэстер Материал Эластан Доставка Доставка в магазины ROZETKA',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
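
For retrieval-style usage, encode the query and candidate passages separately and rank the passages by their dot-product scores. A minimal sketch; the passage texts are shortened, hypothetical versions of the examples above, and any other "passage: "-prefixed texts work the same way:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("yklymchuk-rztk/multilingual-e5-base-matryoshka2d-mnr-10")

query = "query: koton женская одежда"
passages = [
    "passage: Жіночі штани Koton Розмір M Стиль Повсякденний (casual) Колір Зелений",
    "passage: Термосумка Campingaz Fold'n Cool Classic 10L Dark Blue",
]

query_embedding = model.encode([query])
passage_embeddings = model.encode(passages)

# Dot-product scores between the query and every passage, shape [1, len(passages)]
scores = model.similarity(query_embedding, passage_embeddings)
best = int(scores[0].argmax())
print(best, float(scores[0][best]))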

Evaluation

Metrics

RZTK Information Retrieval

  • Dataset: validation--matryoshka_dim-768--
  • Evaluated with sentence_transformers_training.evaluation.information_retrieval_evaluator.RZTKInformationRetrievalEvaluator
Metric Value
dot_accuracy_10 0.4715
dot_precision_10 0.0816
dot_recall_10 0.3422
dot_ndcg_10 0.2382
dot_mrr_10 0.2343
dot_map_60 0.2037
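
RZTKInformationRetrievalEvaluator is a project-specific evaluator; the public InformationRetrievalEvaluator from Sentence Transformers reports the same family of metrics (accuracy@k, precision@k, recall@k, NDCG@k, MRR@k, MAP@k). A minimal sketch with hypothetical queries, corpus, and relevance judgments:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("yklymchuk-rztk/multilingual-e5-base-matryoshka2d-mnr-10")

queries = {"q1": "query: koton женская одежда"}          # query id -> query text
corpus = {
    "d1": "passage: Жіночі штани Koton",                 # doc id -> passage text
    "d2": "passage: Термосумка Campingaz Fold'n Cool Classic 10L",
}
relevant_docs = {"q1": {"d1"}}                            # query id -> ids of relevant docs

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="example-ir")
results = evaluator(model)  # dict of metrics, e.g. "example-ir_cosine_ndcg@10"
print(results)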

RZTK Information Retrieval

  • Datasets: bm-full, core-uk-title, core-ru-title, core-uk-options, core-ru-options, options-uk-title, options-ru-title, options-uk-options, options-ru-options, rusisms-uk-title, rusisms-ru-title, rusisms-uk-options, rusisms-ru-options, rusisms_corrected-uk-title, rusisms_corrected-ru-title, rusisms_corrected-uk-options, rusisms_corrected-ru-options, core_typos-uk-title, core_typos-ru-title, core_typos-uk-options and core_typos-ru-options
  • Evaluated with sentence_transformers_training.evaluation.information_retrieval_evaluator.RZTKInformationRetrievalEvaluator
Metric bm-full core-uk-title core-ru-title core-uk-options core-ru-options options-uk-title options-ru-title options-uk-options options-ru-options rusisms-uk-title rusisms-ru-title rusisms-uk-options rusisms-ru-options rusisms_corrected-uk-title rusisms_corrected-ru-title rusisms_corrected-uk-options rusisms_corrected-ru-options core_typos-uk-title core_typos-ru-title core_typos-uk-options core_typos-ru-options
dot_accuracy_1 0.6862 0.7887 0.8005 0.6667 0.6759 0.8228 0.8204 0.6893 0.6917 0.8308 0.8538 0.6846 0.7308 0.9231 0.9154 0.8 0.8154 0.7021 0.7178 0.563 0.5656
dot_accuracy_3 0.7965 0.9265 0.9226 0.8451 0.8504 0.9442 0.9369 0.8641 0.8714 0.9077 0.8923 0.7846 0.8231 0.9769 0.9615 0.8846 0.9385 0.853 0.8543 0.7507 0.7507
dot_accuracy_5 0.8526 0.9672 0.9646 0.9029 0.9094 0.9733 0.9757 0.915 0.9175 0.9231 0.9154 0.8308 0.8846 0.9846 0.9615 0.9231 0.9692 0.9094 0.9094 0.811 0.8136
dot_accuracy_10 0.9063 0.9908 0.9908 0.9606 0.9593 0.9927 0.9927 0.9587 0.9587 0.9385 0.9385 0.9077 0.9 0.9923 0.9923 0.9923 0.9923 0.9567 0.9593 0.8832 0.8911
dot_precision_1 0.6862 0.7887 0.8005 0.6667 0.6759 0.8228 0.8204 0.6893 0.6917 0.8308 0.8538 0.6846 0.7308 0.9231 0.9154 0.8 0.8154 0.7021 0.7178 0.563 0.5656
dot_precision_3 0.6771 0.7196 0.724 0.6251 0.6251 0.7484 0.75 0.6319 0.6343 0.7872 0.7872 0.6744 0.6769 0.8359 0.8282 0.7462 0.7769 0.6374 0.643 0.5324 0.5284
dot_precision_5 0.6606 0.6291 0.6339 0.5585 0.5593 0.5893 0.5898 0.5131 0.5121 0.7354 0.7415 0.6415 0.6631 0.7969 0.7815 0.7108 0.7431 0.5667 0.5685 0.4745 0.4732
dot_precision_10 0.6134 0.3899 0.3904 0.3681 0.3656 0.3379 0.3367 0.3097 0.309 0.64 0.6362 0.5746 0.5754 0.6754 0.6738 0.6469 0.6469 0.3597 0.3604 0.3202 0.3209
dot_recall_1 0.0467 0.2405 0.2447 0.1954 0.1986 0.2577 0.2554 0.2108 0.2121 0.1537 0.1673 0.1288 0.1384 0.1929 0.1927 0.1614 0.1617 0.2042 0.2075 0.1624 0.1628
dot_recall_3 0.1351 0.5724 0.5729 0.4826 0.4859 0.6689 0.6682 0.5591 0.5635 0.3589 0.3508 0.3125 0.318 0.3944 0.3826 0.3388 0.3648 0.5013 0.5054 0.4062 0.404
dot_recall_5 0.2131 0.7821 0.7877 0.6773 0.6818 0.8499 0.8496 0.7383 0.7374 0.4859 0.487 0.4256 0.4479 0.5414 0.5193 0.4734 0.5108 0.6996 0.7007 0.5715 0.5723
dot_recall_10 0.366 0.934 0.9335 0.8668 0.8626 0.9645 0.9613 0.879 0.8778 0.7002 0.6997 0.6454 0.6398 0.7549 0.7513 0.7306 0.7309 0.8574 0.8614 0.7553 0.7583
dot_ndcg_10 0.656 0.8576 0.861 0.7688 0.7676 0.8845 0.8832 0.7782 0.7788 0.843 0.8448 0.7481 0.7591 0.9158 0.9094 0.8466 0.8601 0.7737 0.778 0.6584 0.6597
dot_mrr_10 0.7509 0.8627 0.868 0.7697 0.7727 0.8859 0.8836 0.7828 0.7869 0.8716 0.8784 0.7509 0.7872 0.9503 0.9432 0.858 0.8773 0.7895 0.7988 0.6702 0.6718
dot_map_100 0.6072 0.8123 0.8174 0.7161 0.7162 0.8301 0.8311 0.7188 0.7188 0.8333 0.8405 0.7426 0.7545 0.9056 0.8989 0.8358 0.8504 0.7246 0.7278 0.6089 0.6086

RZTK Information Retrieval

  • Datasets: bm-full--matryoshka_dim-768--, bm-full--matryoshka_dim-512--, bm-full--matryoshka_dim-256-- and bm-full--matryoshka_dim-128--
  • Evaluated with sentence_transformers_training.evaluation.information_retrieval_evaluator.RZTKInformationRetrievalEvaluator
Metric bm-full--matryoshka_dim-768-- bm-full--matryoshka_dim-512-- bm-full--matryoshka_dim-256-- bm-full--matryoshka_dim-128--
dot_accuracy_1 0.6862 0.6786 0.6705 0.6562
dot_precision_1 0.6862 0.6786 0.6705 0.6562
dot_recall_1 0.0467 0.0462 0.0454 0.0434
dot_ndcg_1 0.6862 0.6786 0.6705 0.6562
dot_mrr_1 0.6862 0.6786 0.6705 0.6562
dot_map_100 0.6072 0.6023 0.5909 0.5607
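
The Matryoshka results above show that truncated embeddings (512, 256, or 128 dimensions) retain most of the retrieval quality. A minimal sketch of working with shortened embeddings via the truncate_dim option of SentenceTransformer:

from sentence_transformers import SentenceTransformer

# Keep only the first 256 of the 768 output dimensions
model_256 = SentenceTransformer(
    "yklymchuk-rztk/multilingual-e5-base-matryoshka2d-mnr-10",
    truncate_dim=256,
)
embeddings = model_256.encode(["query: koton женская одежда"])
print(embeddings.shape)  # (1, 256)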

Training Details

Training Dataset

rozetka_positive_pairs

  • Dataset: rozetka_positive_pairs
  • Size: 53,499,287 training samples
  • Columns: query and text
  • Approximate statistics based on the first 1000 samples:
    • query: string, min: 7 tokens, mean: 13.2 tokens, max: 40 tokens
    • text: string, min: 6 tokens, mean: 56.49 tokens, max: 512 tokens
  • Samples (query and text pairs):
    • query: campingaz fold n cool classic 10l dark blue | passage: Термосумка Campingaz Fold'n Cool Classic 10L Dark Blue (4823082704729)
    • query: campingaz fold n cool classic 10l dark blue | passage: Термопродукція Campingaz Гарантія 14 днів Вид Термосумки Колір Синій з білим Режим роботи Охолодження Країна реєстрації бренда Франція Країна-виробник товару Китай Тип гарантійного талона Гарантія по чеку Можливість доставки Почтомати Доставка Premium Немає
    • query: campingaz fold n cool classic 10l dark blue | passage: Термосумка Campingaz Fold'n Cool Classic 10L Dark Blue (4823082704729)
  • Loss: sentence_transformers_training.model.matryoshka2d_loss.RZTKMatryoshka2dLoss with these parameters:
    {
        "loss": "RZTKMultipleNegativesRankingLoss",
        "n_layers_per_step": 1,
        "last_layer_weight": 1.0,
        "prior_layers_weight": 1.0,
        "kl_div_weight": 1.0,
        "kl_temperature": 0.3,
        "matryoshka_dims": [
            768,
            512,
            256,
            128
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": 1
    }
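
RZTKMatryoshka2dLoss is a project-specific subclass; the public Matryoshka2dLoss in Sentence Transformers exposes the same parameters shown above. A minimal sketch wrapping MultipleNegativesRankingLoss with the same configuration:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import Matryoshka2dLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("intfloat/multilingual-e5-base")
base_loss = MultipleNegativesRankingLoss(model)
loss = Matryoshka2dLoss(
    model,
    base_loss,
    matryoshka_dims=[768, 512, 256, 128],
    matryoshka_weights=[1, 1, 1, 1],
    n_layers_per_step=1,
    n_dims_per_step=1,
    last_layer_weight=1.0,
    prior_layers_weight=1.0,
    kl_div_weight=1.0,
    kl_temperature=0.3,
)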
    

Evaluation Dataset

rozetka_positive_pairs

  • Dataset: rozetka_positive_pairs
  • Size: 1,369,397 evaluation samples
  • Columns: query and text
  • Approximate statistics based on the first 1000 samples:
    • query: string, min: 8 tokens, mean: 10.41 tokens, max: 14 tokens
    • text: string, min: 11 tokens, mean: 50.29 tokens, max: 512 tokens
  • Samples (query and text pairs):
    • query: ab553446bu | passage: Акумулятори для мобільних телефонів
    • query: ab553446bu | passage: Аккумулятор AB553446BU для Samsung i320 1000 mAh (03649-25)
    • query: ab553446bu | passage: Аккумуляторы для мобильных телефонов
  • Loss: sentence_transformers_training.model.matryoshka2d_loss.RZTKMatryoshka2dLoss with these parameters:
    {
        "loss": "RZTKMultipleNegativesRankingLoss",
        "n_layers_per_step": 1,
        "last_layer_weight": 1.0,
        "prior_layers_weight": 1.0,
        "kl_div_weight": 1.0,
        "kl_temperature": 0.3,
        "matryoshka_dims": [
            768,
            512,
            256,
            128
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": 1
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 88
  • per_device_eval_batch_size: 88
  • learning_rate: 2e-05
  • num_train_epochs: 1.0
  • warmup_ratio: 0.1
  • bf16: True
  • bf16_full_eval: True
  • tf32: True
  • dataloader_num_workers: 4
  • load_best_model_at_end: True
  • optim: adafactor
  • push_to_hub: True
  • hub_model_id: yklymchuk-rztk/multilingual-e5-base-matryoshka2d-mnr-10
  • hub_private_repo: True
  • prompts: {'query': 'query: ', 'text': 'passage: '}
  • batch_sampler: no_duplicates
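
These non-default values map directly onto SentenceTransformerTrainingArguments. A minimal sketch covering the values above except the Hub upload options; output_dir is hypothetical, bf16/tf32 assume an Ampere-class GPU, and prompts as a training argument requires Sentence Transformers >= 3.3:

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output/multilingual-e5-base-matryoshka2d-mnr-10",  # hypothetical path
    eval_strategy="steps",
    per_device_train_batch_size=88,
    per_device_eval_batch_size=88,
    learning_rate=2e-5,
    num_train_epochs=1.0,
    warmup_ratio=0.1,
    bf16=True,
    bf16_full_eval=True,
    tf32=True,
    dataloader_num_workers=4,
    load_best_model_at_end=True,
    optim="adafactor",
    prompts={"query": "query: ", "text": "passage: "},
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)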

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 88
  • per_device_eval_batch_size: 88
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1.0
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: True
  • fp16_full_eval: False
  • tf32: True
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 4
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adafactor
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: True
  • resume_from_checkpoint: None
  • hub_model_id: yklymchuk-rztk/multilingual-e5-base-matryoshka2d-mnr-10
  • hub_strategy: every_save
  • hub_private_repo: True
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: {'query': 'query: ', 'text': 'passage: '}
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
  • ddp_static_graph: False
  • ddp_comm_hook: bf16
  • gradient_as_bucket_view: False
  • num_proc: 30

Training Logs

Epoch Step Training Loss Validation Loss validation--matryoshka_dim-768--_dot_ndcg_10 bm-full_dot_ndcg_10 core-uk-title_dot_ndcg_10 core-ru-title_dot_ndcg_10 core-uk-options_dot_ndcg_10 core-ru-options_dot_ndcg_10 options-uk-title_dot_ndcg_10 options-ru-title_dot_ndcg_10 options-uk-options_dot_ndcg_10 options-ru-options_dot_ndcg_10 rusisms-uk-title_dot_ndcg_10 rusisms-ru-title_dot_ndcg_10 rusisms-uk-options_dot_ndcg_10 rusisms-ru-options_dot_ndcg_10 rusisms_corrected-uk-title_dot_ndcg_10 rusisms_corrected-ru-title_dot_ndcg_10 rusisms_corrected-uk-options_dot_ndcg_10 rusisms_corrected-ru-options_dot_ndcg_10 core_typos-uk-title_dot_ndcg_10 core_typos-ru-title_dot_ndcg_10 core_typos-uk-options_dot_ndcg_10 core_typos-ru-options_dot_ndcg_10 bm-full--matryoshka_dim-768--_dot_ndcg_1 bm-full--matryoshka_dim-512--_dot_ndcg_1 bm-full--matryoshka_dim-256--_dot_ndcg_1 bm-full--matryoshka_dim-128--_dot_ndcg_1
0.0050 760 4.4782 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0100 1520 4.298 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0150 2280 3.7517 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0200 3040 2.9677 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0250 3800 2.1971 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0300 4560 1.8109 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0350 5320 1.6811 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0400 6080 1.5155 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0450 6840 1.4494 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0500 7600 1.3583 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0550 8360 1.2634 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0600 9120 1.1704 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0650 9880 1.1106 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0700 10640 1.0827 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0750 11400 1.0318 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0800 12160 1.0159 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0850 12920 0.9551 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0900 13680 0.908 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.0950 14440 0.9252 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1000 15199 - 0.7099 0.2217 - - - - - - - - - - - - - - - - - - - - - - - - -
0.1000 15200 0.8214 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1050 15960 0.8172 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1100 16720 0.7878 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1150 17480 0.8079 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1200 18240 0.7556 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1250 19000 0.7015 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1300 19760 0.6926 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1350 20520 0.6752 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1400 21280 0.6514 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1450 22040 0.6533 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1500 22800 0.6643 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1550 23560 0.6372 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1600 24320 0.602 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1650 25080 0.5874 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1700 25840 0.5992 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1750 26600 0.5684 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1800 27360 0.5775 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1850 28120 0.5668 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1900 28880 0.539 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.1950 29640 0.5759 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2000 30398 - 0.4521 0.2296 - - - - - - - - - - - - - - - - - - - - - - - - -
0.2000 30400 0.5295 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2050 31160 0.5536 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2100 31920 0.5089 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2150 32680 0.4998 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2200 33440 0.5035 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2250 34200 0.5086 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2300 34960 0.5093 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2350 35720 0.5082 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2400 36480 0.5111 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2450 37240 0.5204 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2500 38000 0.4984 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2550 38760 0.4695 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2600 39520 0.492 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2650 40280 0.4831 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2700 41040 0.4885 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2750 41800 0.4742 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2800 42560 0.4814 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2850 43320 0.4895 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2900 44080 0.4735 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.2950 44840 0.479 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3000 45597 - 0.3801 0.2321 - - - - - - - - - - - - - - - - - - - - - - - - -
0.3000 45600 0.4739 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3050 46360 0.4787 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3100 47120 0.4854 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3150 47880 0.4499 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3200 48640 0.4825 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3250 49400 0.4401 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3300 50160 0.4441 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3350 50920 0.4512 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3400 51680 0.459 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3450 52440 0.4381 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3500 53200 0.4219 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3550 53960 0.4417 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3600 54720 0.4416 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3650 55480 0.4143 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3700 56240 0.4345 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3750 57000 0.4351 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3800 57760 0.445 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3850 58520 0.4296 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3900 59280 0.4487 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.3950 60040 0.4218 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4000 60796 - 0.3541 0.2339 - - - - - - - - - - - - - - - - - - - - - - - - -
0.4000 60800 0.4427 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4050 61560 0.4535 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4100 62320 0.4506 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4150 63080 0.4213 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4200 63840 0.4293 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4250 64600 0.4112 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4300 65360 0.4261 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4350 66120 0.4232 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4400 66880 0.4322 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4450 67640 0.4169 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4500 68400 0.398 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4550 69160 0.426 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4600 69920 0.4083 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4650 70680 0.4139 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4700 71440 0.4305 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4750 72200 0.4146 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4800 72960 0.4228 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4850 73720 0.4149 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4900 74480 0.411 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.4950 75240 0.3896 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5000 75995 - 0.3302 0.2366 - - - - - - - - - - - - - - - - - - - - - - - - -
0.5000 76000 0.3936 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5050 76760 0.3955 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5100 77520 0.3984 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5150 78280 0.4076 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5200 79040 0.4109 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5250 79800 0.428 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5300 80560 0.4064 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5350 81320 0.4113 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5400 82080 0.409 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5451 82840 0.3872 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5501 83600 0.403 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5551 84360 0.3903 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5601 85120 0.4044 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5651 85880 0.401 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5701 86640 0.4059 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5751 87400 0.3946 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5801 88160 0.39 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5851 88920 0.3826 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5901 89680 0.4143 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.5951 90440 0.3974 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6 91194 - 0.329 0.2374 - - - - - - - - - - - - - - - - - - - - - - - - -
0.6001 91200 0.4139 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6051 91960 0.4232 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6101 92720 0.4011 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6151 93480 0.3973 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6201 94240 0.4059 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6251 95000 0.397 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6301 95760 0.4073 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6351 96520 0.365 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6401 97280 0.3963 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6451 98040 0.3938 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6501 98800 0.3894 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6551 99560 0.3977 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6601 100320 0.4 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6651 101080 0.3977 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6701 101840 0.4152 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6751 102600 0.3812 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6801 103360 0.4086 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6851 104120 0.4051 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6901 104880 0.4072 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.6951 105640 0.4022 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7000 106393 - 0.3372 0.2381 - - - - - - - - - - - - - - - - - - - - - - - - -
0.7001 106400 0.3968 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7051 107160 0.3706 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7101 107920 0.4186 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7151 108680 0.4076 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7201 109440 0.3908 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7251 110200 0.4042 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7301 110960 0.3835 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7351 111720 0.3891 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7401 112480 0.4026 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7451 113240 0.4007 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7501 114000 0.3967 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7551 114760 0.3847 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7601 115520 0.3817 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7651 116280 0.3981 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7701 117040 0.3889 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7751 117800 0.4015 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7801 118560 0.391 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7851 119320 0.3887 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7901 120080 0.4005 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.7951 120840 0.3823 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8000 121592 - 0.3333 0.2376 - - - - - - - - - - - - - - - - - - - - - - - - -
0.8001 121600 0.383 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8051 122360 0.4252 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8101 123120 0.396 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8151 123880 0.3882 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8201 124640 0.4026 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8251 125400 0.4042 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8301 126160 0.4047 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8351 126920 0.3832 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8401 127680 0.3977 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8451 128440 0.3842 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8501 129200 0.3679 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8551 129960 0.3889 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8601 130720 0.3985 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8651 131480 0.3843 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8701 132240 0.4125 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8751 133000 0.3934 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8801 133760 0.3835 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8851 134520 0.3852 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8901 135280 0.4017 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.8951 136040 0.4022 - - - - - - - - - - - - - - - - - - - - - - - - - - -
0.9000 136791 - 0.3308 0.2382 0.6560 0.8576 0.8610 0.7688 0.7676 0.8845 0.8832 0.7782 0.7788 0.8430 0.8448 0.7481 0.7591 0.9158 0.9094 0.8466 0.8601 0.7737 0.7780 0.6584 0.6597 0.6862 0.6786 0.6705 0.6562
  • The row at epoch 0.9000 (step 136791), reported with its full evaluation metrics, corresponds to the saved checkpoint.

Framework Versions

  • Python: 3.11.10
  • Sentence Transformers: 3.3.0
  • Transformers: 4.46.3
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.1.1
  • Datasets: 3.1.0
  • Tokenizers: 0.20.3

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}