Paper: Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks (arXiv:1908.10084)
This is a sentence-transformers model finetuned from intfloat/multilingual-e5-large-instruct on the mcqa-rag-finetune dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
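In this stack, module (0) produces contextual token embeddings with XLM-RoBERTa (up to 512 tokens), module (1) mean-pools them into a single 1024-dimensional sentence vector, and module (2) L2-normalizes that vector. The following is a minimal sketch of what the pooling and normalization steps compute, using plain transformers and the same placeholder model id as the usage example below; it is an illustration of the module stack, not the recommended loading path.

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

# Sketch of the (Transformer -> Pooling -> Normalize) stack above.
# "sentence_transformers_model_id" is the placeholder used elsewhere in this card.
tokenizer = AutoTokenizer.from_pretrained("sentence_transformers_model_id")
encoder = AutoModel.from_pretrained("sentence_transformers_model_id")

texts = ["example sentence one", "example sentence two"]
batch = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state        # [batch, seq_len, 1024]

# Mean pooling: average the token embeddings, ignoring padding positions.
mask = batch["attention_mask"].unsqueeze(-1).float()             # [batch, seq_len, 1]
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# Normalize(): L2-normalize so that cosine similarity reduces to a dot product.
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)
print(sentence_embeddings.shape)                                 # [2, 1024]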
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'Hipotalamusi NUK kontrollon sekretimin e hormoneve:\nA. FSH dhe LH\nB. te rritjes(GH)\nC. ACTH\nD. te pankreasit',
'Hipotalamusi është një pjesë e trurit që ndodhet nën talamusin. Ai luan një rol kryesor në lidhjen e sistemit nervor me sistemin endokrin përmes gjëndrës së hipofizës.',
'State laws that regulate matters of legitimate local concern but have an incidental effect on interstate commerce are subject to a less strict balancing test. Under this test, a state law will be upheld unless the burden imposed on interstate commerce is clearly excessive in relation to the putative local benefits.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
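Since the embeddings are unit-normalized, the similarity scores above are cosine similarities, and the same calls can be used for simple semantic search by ranking documents against a query. A minimal sketch (the query and documents below are made-up illustrations, not taken from the training data):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence_transformers_model_id")

query = "Which gland links the nervous system to the endocrine system?"
documents = [
    "The hypothalamus connects the nervous system to the endocrine system via the pituitary gland.",
    "State laws with only incidental effects on interstate commerce are reviewed under a balancing test.",
    "A field is a set with two operations satisfying the usual axioms.",
]

query_embedding = model.encode([query])        # shape: [1, 1024]
doc_embeddings = model.encode(documents)       # shape: [3, 1024]

# Rank documents by cosine similarity to the query and pick the best match.
scores = model.similarity(query_embedding, doc_embeddings)   # shape: [1, 3]
best = int(scores.argmax(dim=1).item())
print(documents[best], float(scores[0, best]))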
Training dataset columns: anchor and positive

| | anchor | positive |
|---|---|---|
| type | string | string |
| details | | |
| anchor | positive |
|---|---|
| Find all c in Z_3 such that Z_3[x]/(x^2 + c) is a field. | The notation Z_3 refers to the finite field with three elements, often denoted as {0, 1, 2}. This field operates under modular arithmetic, specifically modulo 3. Elements in Z_3 can be added and multiplied according to the rules of modulo 3, where any number can wrap around upon reaching 3. |
| Find all c in Z_3 such that Z_3[x]/(x^2 + c) is a field. | A field is a set equipped with two operations, addition and multiplication, satisfying certain properties: associativity, commutativity, distributivity, the existence of additive and multiplicative identities, and the existence of additive inverses and multiplicative inverses (for all elements except the zero element). In order for Z_3[x]/(f(x)) to be a field, the polynomial f(x) must be irreducible over Z_3. |
| Find all c in Z_3 such that Z_3[x]/(x^2 + c) is a field. | The expression Z_3[x] indicates the set of all polynomials with coefficients in Z_3. A polynomial is said to be irreducible over Z_3 if it cannot be factored into the product of two non-constant polynomials with coefficients in Z_3. In the case of quadratic polynomials like x^2 + c, irreducibility depends on whether it has any roots in the field Z_3. |
Loss: MultipleNegativesRankingLoss with these parameters:
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
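MultipleNegativesRankingLoss treats every other positive in a batch as a negative for a given anchor: it computes the cosine similarity between the anchor and all in-batch positives, multiplies the scores by the scale (20.0 here), and minimizes cross-entropy so that the anchor's own positive receives the highest score. A hedged sketch of how this loss configuration maps onto the sentence-transformers API (an illustration, not the exact training script used for this model):

from sentence_transformers import SentenceTransformer, losses, util

# Loss configuration as described above; the base model id comes from this card.
model = SentenceTransformer("intfloat/multilingual-e5-large-instruct")
loss = losses.MultipleNegativesRankingLoss(
    model=model,
    scale=20.0,                   # multiplier applied to similarity scores before the softmax
    similarity_fct=util.cos_sim,  # the "cos_sim" similarity function named above
)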
Evaluation dataset columns: anchor and positive

| | anchor | positive |
|---|---|---|
| type | string | string |
| details | | |
| anchor | positive |
|---|---|
| ക്രൂരകോഷ്ഠം ഉള്ള ഒരാളിൽ കോപിച്ചിരിക്കുന്ന ദോഷം താഴെപ്പറയുന്നവയിൽ ഏതാണ്? | ഓരോ ദോഷത്തിനും അതിന്റേതായ സ്വഭാവങ്ങളും ശരീരത്തിൽ അത് ഉണ്ടാക്കുന്ന ഫലങ്ങളും ഉണ്ട്. |
| Melyik tényező nem befolyásolja a fagylalt keresleti függvényét? | A keresleti függvény negatív meredekségű, ami azt jelenti, hogy az ár növekedésével a keresett mennyiség csökken (csökkenő kereslet törvénye). |
| In contrast to _______, _______ aim to reward favourable behaviour by companies. The success of such campaigns have been heightened through the use of ___________, which allow campaigns to facilitate the company in achieving _________ . | Consumer Activism: This term refers to the actions taken by consumers to promote social, political, or environmental causes. These actions can include boycotting certain companies or buycotting others, influencing market dynamics based on ethical considerations. The effectiveness of consumer activism can vary but has gained prominence in recent years with increased visibility through social media. |
Loss: MultipleNegativesRankingLoss with these parameters:
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
Non-default hyperparameters:

- eval_strategy: steps
- per_device_train_batch_size: 12
- per_device_eval_batch_size: 12
- learning_rate: 3e-05
- num_train_epochs: 1
- warmup_steps: 5000
- fp16: True
- load_best_model_at_end: True

All hyperparameters:

- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 12
- per_device_eval_batch_size: 12
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 3e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 5000
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional

| Epoch | Step | Training Loss | Validation Loss |
|---|---|---|---|
| 0.05 | 2476 | 0.1209 | 0.0347 |
| 0.1000 | 4952 | 0.0737 | 0.0459 |
| 0.1501 | 7428 | 0.087 | 0.0732 |
| 0.2001 | 9904 | 0.0825 | 0.1209 |
| 0.2501 | 12380 | 0.0783 | 0.0934 |
| 0.3001 | 14856 | 0.071 | 0.0793 |
| 0.3501 | 17332 | 0.0661 | 0.0855 |
| 0.4001 | 19808 | 0.0652 | 0.0964 |
| 0.4502 | 22284 | 0.063 | 0.0892 |
| 0.5002 | 24760 | 0.056 | 0.0923 |
| 0.5502 | 27236 | 0.0509 | 0.1016 |
| 0.6002 | 29712 | 0.045 | 0.0918 |
| 0.6502 | 32188 | 0.0472 | 0.0896 |
| 0.7002 | 34664 | 0.0396 | 0.0959 |
| 0.7503 | 37140 | 0.0371 | 0.0819 |
| 0.8003 | 39616 | 0.0341 | 0.0845 |
| 0.8503 | 42092 | 0.0344 | 0.0790 |
| 0.9003 | 44568 | 0.0288 | 0.0863 |
| 0.9503 | 47044 | 0.03 | 0.0767 |
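For reference, the non-default hyperparameters listed above map onto SentenceTransformerTrainingArguments roughly as follows. This is a sketch under assumptions: output_dir, eval_steps, and save_steps are not stated in this card and are filled in only to make the example self-consistent with the roughly 2476-step evaluation interval visible in the training logs.

from sentence_transformers import SentenceTransformerTrainingArguments

# Hedged reconstruction of the non-default hyperparameters above.
# output_dir, eval_steps, and save_steps are assumptions, not values from the card.
args = SentenceTransformerTrainingArguments(
    output_dir="output",
    num_train_epochs=1,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    learning_rate=3e-5,
    warmup_steps=5000,
    fp16=True,
    eval_strategy="steps",
    eval_steps=2476,              # assumed from the step interval in the table above
    save_steps=2476,              # assumed so that load_best_model_at_end has matching checkpoints
    load_best_model_at_end=True,
)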
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
Base model: intfloat/multilingual-e5-large-instruct