SentenceTransformer based on BAAI/bge-m3
This is a sentence-transformers model finetuned from BAAI/bge-m3. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: BAAI/bge-m3
- Maximum Sequence Length: 8192 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
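Once loaded, these properties can be checked directly from the standard Sentence Transformers attributes (a quick sketch; the printed values reflect the configuration listed above):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BlackBeenie/bge-m3-msmarco-v3-sbert")
print(model.max_seq_length)                      # 8192
print(model.get_sentence_embedding_dimension())  # 1024
print(model.similarity_fn_name)                  # cosine
```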
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
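For illustration, modules (0) and (1) above amount to encoding the tokens with XLM-RoBERTa and mean-pooling the token embeddings over non-padding positions. The snippet below is a minimal sketch of that pooling step using plain transformers; in practice `model.encode` handles tokenization, pooling, and batching internally:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("BlackBeenie/bge-m3-msmarco-v3-sbert")
encoder = AutoModel.from_pretrained("BlackBeenie/bge-m3-msmarco-v3-sbert")

batch = tokenizer(["who is christopher kyle"], return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state   # (1, seq_len, 1024)

# pooling_mode_mean_tokens=True: average the token embeddings, ignoring padding
mask = batch["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)                              # torch.Size([1, 1024])
```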
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("BlackBeenie/bge-m3-msmarco-v3-sbert")
# Run inference
sentences = [
'who is christopher kyle',
'Chris Kyle American Sniper. Christopher Scott Kyle was born and raised in Texas and was a United States Navy SEAL from 1999 to 2009. He is currently known as the most successful sniper in American military history. According to his book American Sniper, he had 160 confirmed kills (which was from 255 claimed kills).',
"'American Sniper' Chris Kyle's wife thanks audiences for 'watching the hard stuff'. Taya Kyle has told of her gratitude to audiences for supporting the film about her dead husband Chris Kyle, a Navy Seal played by Bradley Cooper.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
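The same embeddings also work for retrieval-style semantic search. Below is a small illustration with a toy corpus (the passages are made up for the example); for large collections you would typically pre-compute and index the corpus embeddings:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BlackBeenie/bge-m3-msmarco-v3-sbert")

# Toy corpus for illustration only
corpus = [
    "Chris Kyle was a United States Navy SEAL and is known as the most successful sniper in American military history.",
    "Disaccharides are formed when two monosaccharides are joined together.",
    "Corporate tax returns are due on the 15th day of the third month after the close of the tax year.",
]
query = "who was chris kyle"

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# Cosine similarity between the query and every corpus passage
scores = model.similarity(query_embedding, corpus_embeddings)  # shape (1, 3)
best = int(scores.argmax())
print(corpus[best], float(scores[0, best]))
```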
Training Details
Training Dataset
Unnamed Dataset
- Size: 498,970 training samples
- Columns: sentence_0, sentence_1, and sentence_2
- Approximate statistics based on the first 1000 samples:

|  | sentence_0 | sentence_1 | sentence_2 |
|---|---|---|---|
| type | string | string | string |
| details | min: 4 tokens, mean: 9.93 tokens, max: 37 tokens | min: 17 tokens, mean: 90.01 tokens, max: 239 tokens | min: 16 tokens, mean: 86.47 tokens, max: 229 tokens |
- Samples:

| sentence_0 | sentence_1 | sentence_2 |
|---|---|---|
| how much does it cost to paint a interior house | Interior House Painting Cost Factors. Generally, it will take a minimum of two gallons of paint to cover a room. At the highest end, paint will cost anywhere between $30 and $60 per gallon and come in three different finishes: flat, semi-gloss or high-gloss.Flat finishes are the least shiny and are best suited for areas requiring frequent cleaning.rovide a few details about your project and receive competitive quotes from local pros. The average national cost to paint a home interior is $1,671, with most homeowners spending between $966 and $2,426. | Question DetailsAsked on 3/12/2014. Guest_... How much does it cost per square foot to paint the interior of a house? We just bought roughly a 1500 sg ft townhouse and want to get the entire house, including ceilings painted (including a roughly 400 sq ft finished basement not included in square footage). |
| when is s corp taxes due | If you form a corporate entity for your small business, regardless of whether it's taxed as a C or S corporation, a tax return must be filed with the Internal Revenue Service on its due date each year. Corporate tax returns are always due on the 15th day of the third month following the close of the tax year. The actual day that the tax return filing deadline falls on, however, isn't the same for every corporation. | But if you haven't, don't panic: the majority of forms aren't due quite yet. Most tax forms have an annual January 31 due date. Your tax forms are considered on time if the form is properly addressed and mailed on or before that date. If the regular due date falls on a Saturday, Sunday, or legal holiday – which is the case in 2015 for both January and February due dates – issuers have until the next business day. |
| what are disaccharides | Disaccharides are formed when two monosaccharides are joined together and a molecule of water is removed, a process known as dehydration reaction. For example; milk sugar (lactose) is made from glucose and galactose whereas the sugar from sugar cane and sugar beets (sucrose) is made from glucose and fructose.altose, another notable disaccharide, is made up of two glucose molecules. The two monosaccharides are bonded via a dehydration reaction (also called a condensation reaction or dehydration synthesis) that leads to the loss of a molecule of water and formation of a glycosidic bond. | Other disaccharides include (diagrams p. 364): Sucrose, common table sugar, has a glycosidic bond linking the anomeric hydroxyls of glucose and fructose. Because the configuration at the anomeric carbon of glucose is a (O points down from the ring), the linkage is designated a(12). |
- Loss: MultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
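As a rough sketch of what this loss computes for (query, positive, hard negative) triplets: every positive and hard-negative passage in a batch serves as a candidate, similarities are scaled cosine scores (scale 20.0), and the objective is cross-entropy toward each query's own positive. The library implementation differs in details; the function and variable names below are illustrative only:

```python
import torch
import torch.nn.functional as F

def mnrl_sketch(anchors, positives, negatives, scale=20.0):
    # Candidates = all positives plus all hard negatives in the batch
    candidates = torch.cat([positives, negatives], dim=0)       # (2B, D)
    # Scaled cosine similarity between each anchor and every candidate
    scores = scale * F.cosine_similarity(
        anchors.unsqueeze(1), candidates.unsqueeze(0), dim=-1   # (B, 2B)
    )
    # The correct candidate for anchor i is positive i
    labels = torch.arange(anchors.size(0))
    return F.cross_entropy(scores, labels)

# Toy batch of random "embeddings"
B, D = 4, 1024
print(mnrl_sketch(torch.randn(B, D), torch.randn(B, D), torch.randn(B, D)))
```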
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- num_train_epochs: 5
- fp16: True
- multi_dataset_batch_sampler: round_robin
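Putting the non-default hyperparameters above together, a comparable run could look roughly like the sketch below. The tiny inline dataset and the output directory name are placeholders; the actual training data is the 498,970-triplet set described above.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Placeholder triplets; the real dataset has columns sentence_0 / sentence_1 / sentence_2
train_dataset = Dataset.from_dict({
    "sentence_0": ["what are disaccharides"],
    "sentence_1": ["Disaccharides are formed when two monosaccharides are joined together ..."],
    "sentence_2": ["Other disaccharides include sucrose, common table sugar ..."],
})

model = SentenceTransformer("BAAI/bge-m3")
loss = MultipleNegativesRankingLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="bge-m3-msmarco-v3-sbert",      # placeholder
    num_train_epochs=5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    fp16=True,                                 # requires a GPU
    multi_dataset_batch_sampler="round_robin",
    # the original run also used eval_strategy="steps", which needs an eval_dataset
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```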
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 5
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin
Training Logs
Epoch | Step | Training Loss |
---|---|---|
0.0321 | 500 | 0.3086 |
0.0641 | 1000 | 0.2339 |
0.0962 | 1500 | 0.2289 |
0.1283 | 2000 | 0.2262 |
0.1603 | 2500 | 0.2213 |
0.1924 | 3000 | 0.2158 |
0.2245 | 3500 | 0.2101 |
0.2565 | 4000 | 0.2082 |
0.2886 | 4500 | 0.2107 |
0.3207 | 5000 | 0.2015 |
0.3527 | 5500 | 0.2023 |
0.3848 | 6000 | 0.201 |
0.4169 | 6500 | 0.1974 |
0.4489 | 7000 | 0.191 |
0.4810 | 7500 | 0.1956 |
0.5131 | 8000 | 0.2 |
0.5451 | 8500 | 0.191 |
0.5772 | 9000 | 0.1888 |
0.6092 | 9500 | 0.1885 |
0.6413 | 10000 | 0.1936 |
0.6734 | 10500 | 0.1944 |
0.7054 | 11000 | 0.1806 |
0.7375 | 11500 | 0.1834 |
0.7696 | 12000 | 0.1853 |
0.8016 | 12500 | 0.1823 |
0.8337 | 13000 | 0.1827 |
0.8658 | 13500 | 0.1821 |
0.8978 | 14000 | 0.1724 |
0.9299 | 14500 | 0.1745 |
0.9620 | 15000 | 0.1776 |
0.9940 | 15500 | 0.1781 |
1.0 | 15593 | - |
1.0261 | 16000 | 0.1133 |
1.0582 | 16500 | 0.0964 |
1.0902 | 17000 | 0.0931 |
1.1223 | 17500 | 0.0947 |
1.1544 | 18000 | 0.097 |
1.1864 | 18500 | 0.0977 |
1.2185 | 19000 | 0.096 |
1.2506 | 19500 | 0.1005 |
1.2826 | 20000 | 0.1008 |
1.3147 | 20500 | 0.0998 |
1.3468 | 21000 | 0.0972 |
1.3788 | 21500 | 0.0992 |
1.4109 | 22000 | 0.0994 |
1.4430 | 22500 | 0.1029 |
1.4750 | 23000 | 0.1008 |
1.5071 | 23500 | 0.0985 |
1.5392 | 24000 | 0.1013 |
1.5712 | 24500 | 0.1027 |
1.6033 | 25000 | 0.0988 |
1.6353 | 25500 | 0.0982 |
1.6674 | 26000 | 0.0994 |
1.6995 | 26500 | 0.0998 |
1.7315 | 27000 | 0.0989 |
1.7636 | 27500 | 0.101 |
1.7957 | 28000 | 0.099 |
1.8277 | 28500 | 0.096 |
1.8598 | 29000 | 0.0989 |
1.8919 | 29500 | 0.1011 |
1.9239 | 30000 | 0.0974 |
1.9560 | 30500 | 0.0999 |
1.9881 | 31000 | 0.0976 |
2.0 | 31186 | - |
2.0201 | 31500 | 0.0681 |
2.0522 | 32000 | 0.0478 |
2.0843 | 32500 | 0.0483 |
2.1163 | 33000 | 0.0485 |
2.1484 | 33500 | 0.0472 |
2.1805 | 34000 | 0.0482 |
2.2125 | 34500 | 0.0491 |
2.2446 | 35000 | 0.0484 |
2.2767 | 35500 | 0.0493 |
2.3087 | 36000 | 0.0484 |
2.3408 | 36500 | 0.0503 |
2.3729 | 37000 | 0.0498 |
2.4049 | 37500 | 0.0507 |
2.4370 | 38000 | 0.0502 |
2.4691 | 38500 | 0.0508 |
2.5011 | 39000 | 0.0483 |
2.5332 | 39500 | 0.0486 |
2.5653 | 40000 | 0.0494 |
2.5973 | 40500 | 0.0511 |
2.6294 | 41000 | 0.0508 |
2.6615 | 41500 | 0.0496 |
2.6935 | 42000 | 0.0487 |
2.7256 | 42500 | 0.0497 |
2.7576 | 43000 | 0.0491 |
2.7897 | 43500 | 0.0486 |
2.8218 | 44000 | 0.0503 |
2.8538 | 44500 | 0.0504 |
2.8859 | 45000 | 0.0499 |
2.9180 | 45500 | 0.048 |
2.9500 | 46000 | 0.047 |
2.9821 | 46500 | 0.0497 |
3.0 | 46779 | - |
3.0142 | 47000 | 0.0395 |
3.0462 | 47500 | 0.0247 |
3.0783 | 48000 | 0.0256 |
3.1104 | 48500 | 0.0254 |
3.1424 | 49000 | 0.0247 |
3.1745 | 49500 | 0.0251 |
3.2066 | 50000 | 0.0253 |
3.2386 | 50500 | 0.0263 |
3.2707 | 51000 | 0.0261 |
3.3028 | 51500 | 0.0259 |
3.3348 | 52000 | 0.0256 |
3.3669 | 52500 | 0.0254 |
3.3990 | 53000 | 0.026 |
3.4310 | 53500 | 0.0255 |
3.4631 | 54000 | 0.0255 |
3.4952 | 54500 | 0.0257 |
3.5272 | 55000 | 0.0249 |
3.5593 | 55500 | 0.0251 |
3.5914 | 56000 | 0.026 |
3.6234 | 56500 | 0.0246 |
3.6555 | 57000 | 0.0258 |
3.6876 | 57500 | 0.0266 |
3.7196 | 58000 | 0.0242 |
3.7517 | 58500 | 0.0251 |
3.7837 | 59000 | 0.0243 |
3.8158 | 59500 | 0.0249 |
3.8479 | 60000 | 0.0252 |
3.8799 | 60500 | 0.0251 |
3.9120 | 61000 | 0.025 |
3.9441 | 61500 | 0.0249 |
3.9761 | 62000 | 0.0254 |
4.0 | 62372 | - |
4.0082 | 62500 | 0.0221 |
4.0403 | 63000 | 0.0146 |
4.0723 | 63500 | 0.0146 |
4.1044 | 64000 | 0.0152 |
4.1365 | 64500 | 0.0153 |
4.1685 | 65000 | 0.0144 |
4.2006 | 65500 | 0.0154 |
4.2327 | 66000 | 0.0137 |
4.2647 | 66500 | 0.0145 |
4.2968 | 67000 | 0.0148 |
4.3289 | 67500 | 0.0148 |
4.3609 | 68000 | 0.0142 |
4.3930 | 68500 | 0.0148 |
4.4251 | 69000 | 0.0155 |
4.4571 | 69500 | 0.0148 |
4.4892 | 70000 | 0.0144 |
4.5213 | 70500 | 0.0144 |
4.5533 | 71000 | 0.0148 |
4.5854 | 71500 | 0.015 |
4.6175 | 72000 | 0.0149 |
4.6495 | 72500 | 0.0135 |
4.6816 | 73000 | 0.0142 |
4.7137 | 73500 | 0.0152 |
4.7457 | 74000 | 0.0144 |
4.7778 | 74500 | 0.0143 |
4.8099 | 75000 | 0.0141 |
4.8419 | 75500 | 0.0146 |
4.8740 | 76000 | 0.0142 |
4.9060 | 76500 | 0.0142 |
4.9381 | 77000 | 0.0147 |
4.9702 | 77500 | 0.0145 |
5.0 | 77965 | - |
Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.4.1
- Transformers: 4.48.3
- PyTorch: 2.5.1+cu124
- Accelerate: 1.3.0
- Datasets: 3.3.2
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}