---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:99231
- loss:CachedMultipleNegativesRankingLoss
widget:
- source_sentence: who ordered the charge of the light brigade
sentences:
- >-
Charge of the Light Brigade The Charge of the Light Brigade was a charge
of British light cavalry led by Lord Cardigan against Russian forces
during the Battle of Balaclava on 25 October 1854 in the Crimean War.
Lord Raglan, overall commander of the British forces, had intended to
send the Light Brigade to prevent the Russians from removing captured
guns from overrun Turkish positions, a task well-suited to light
cavalry.
- >-
UNICEF The United Nations International Children's Emergency Fund was
created by the United Nations General Assembly on 11 December 1946, to
provide emergency food and healthcare to children in countries that had
been devastated by World War II. The Polish physician Ludwik Rajchman is
widely regarded as the founder of UNICEF and served as its first
chairman from 1946. On Rajchman's suggestion, the American Maurice Pate
was appointed its first executive director, serving from 1947 until his
death in 1965.[5][6] In 1950, UNICEF's mandate was extended to address
the long-term needs of children and women in developing countries
everywhere. In 1953 it became a permanent part of the United Nations
System, and the words "international" and "emergency" were dropped from
the organization's name, making it simply the United Nations Children's
Fund, retaining the original acronym, "UNICEF".[3]
- >-
Marcus Jordan Marcus James Jordan (born December 24, 1990) is an
American former college basketball player who played for the UCF Knights
men's basketball team of Conference USA.[1] He is the son of retired
Hall of Fame basketball player Michael Jordan.
- source_sentence: what part of the cow is the rib roast
sentences:
- >-
Standing rib roast A standing rib roast, also known as prime rib, is a
cut of beef from the primal rib, one of the nine primal cuts of beef.
While the entire rib section comprises ribs six through 12, a standing
rib roast may contain anywhere from two to seven ribs.
- >-
Blaine Anderson Kurt begins to mend their relationship in
"Thanksgiving", just before New Directions loses at Sectionals to the
Warblers, and they spend Christmas together in New York City.[29][30]
Though he and Kurt continue to be on good terms, Blaine finds himself
developing a crush on his best friend, Sam, which he knows will come to
nothing as he knows Sam is not gay; the two of them team up to find
evidence that the Warblers cheated at Sectionals, which means New
Directions will be competing at Regionals. He ends up going to the Sadie
Hawkins dance with Tina Cohen-Chang (Jenna Ushkowitz), who has developed
a crush on him, but as friends only.[31] When Kurt comes to Lima for the
wedding of glee club director Will (Matthew Morrison) and Emma (Jayma
Mays)—which Emma flees—he and Blaine make out beforehand, and sleep
together afterward, though they do not resume a permanent
relationship.[32]
- "Soviet Union The Soviet Union (Russian: Сове́тский Сою́з, tr. Sovétsky Soyúz, IPA:\_[sɐˈvʲɛt͡skʲɪj sɐˈjus]\_(\_listen)), officially the Union of Soviet Socialist Republics (Russian: Сою́з Сове́тских Социалисти́ческих Респу́блик, tr. Soyúz Sovétskikh Sotsialistícheskikh Respúblik, IPA:\_[sɐˈjus sɐˈvʲɛtskʲɪx sətsɨəlʲɪsˈtʲitɕɪskʲɪx rʲɪˈspublʲɪk]\_(\_listen)), abbreviated as the USSR (Russian: СССР, tr. SSSR), was a socialist state in Eurasia that existed from 1922 to 1991. Nominally a union of multiple national Soviet republics,[a] its government and economy were highly centralized. The country was a one-party state, governed by the Communist Party with Moscow as its capital in its largest republic, the Russian Soviet Federative Socialist Republic. The Russian nation had constitutionally equal status among the many nations of the union but exerted de facto dominance in various respects.[7] Other major urban centres were Leningrad, Kiev, Minsk, Alma-Ata and Novosibirsk. The Soviet Union was one of the five recognized nuclear weapons states and possessed the largest stockpile of weapons of mass destruction.[8] It was a founding permanent member of the United Nations Security Council, as well as a member of the Organization for Security and Co-operation in Europe (OSCE) and the leading member of the Council for Mutual Economic Assistance (CMEA) and the Warsaw Pact."
- source_sentence: what is the current big bang theory season
sentences:
- >-
Byzantine army From the seventh to the 12th centuries, the Byzantine
army was among the most powerful and effective military forces in the
world – neither Middle Ages Europe nor (following its early successes)
the fracturing Caliphate could match the strategies and the efficiency
of the Byzantine army. Restricted to a largely defensive role in the 7th
to mid-9th centuries, the Byzantines developed the theme-system to
counter the more powerful Caliphate. From the mid-9th century, however,
they gradually went on the offensive, culminating in the great conquests
of the 10th century under a series of soldier-emperors such as
Nikephoros II Phokas, John Tzimiskes and Basil II. The army they led was
less reliant on the militia of the themes; it was by now a largely
professional force, with a strong and well-drilled infantry at its core
and augmented by a revived heavy cavalry arm. With one of the most
powerful economies in the world at the time, the Empire had the
resources to put to the field a powerful host when needed, in order to
reclaim its long-lost territories.
- >-
The Big Bang Theory The Big Bang Theory is an American television sitcom
created by Chuck Lorre and Bill Prady, both of whom serve as executive
producers on the series, along with Steven Molaro. All three also serve
as head writers. The show premiered on CBS on September 24, 2007.[3] The
series' tenth season premiered on September 19, 2016.[4] In March 2017,
the series was renewed for two additional seasons, bringing its total to
twelve, and running through the 2018–19 television season. The
eleventh season is set to premiere on September 25, 2017.[5]
- >-
2016 NCAA Division I Softball Tournament The 2016 NCAA Division I
Softball Tournament was held from May 20 through June 8, 2016 as the
final part of the 2016 NCAA Division I softball season. The 64 NCAA
Division I college softball teams were to be selected out of an eligible
293 teams on May 15, 2016. Thirty-two teams were awarded an automatic
bid as champions of their conference, and thirty-two teams were selected
at-large by the NCAA Division I softball selection committee. The
tournament culminated with eight teams playing in the 2016 Women's
College World Series at ASA Hall of Fame Stadium in Oklahoma City in
which the Oklahoma Sooners were crowned the champions.
- source_sentence: what happened to tates mom on days of our lives
sentences:
- >-
Paige O'Hara Donna Paige Helmintoller, better known as Paige O'Hara
(born May 10, 1956),[1] is an American actress, voice actress, singer
and painter. O'Hara began her career as a Broadway actress in 1983 when
she portrayed Ellie May Chipley in the musical Showboat. In 1991, she
made her motion picture debut in Disney's Beauty and the Beast, in which
she voiced the film's heroine, Belle. Following the critical and
commercial success of Beauty and the Beast, O'Hara reprised her role as
Belle in the film's two direct-to-video follow-ups, Beauty and the
Beast: The Enchanted Christmas and Belle's Magical World.
- >-
M. Shadows Matthew Charles Sanders (born July 31, 1981), better known as
M. Shadows, is an American singer, songwriter, and musician. He is best
known as the lead vocalist, songwriter, and a founding member of the
American heavy metal band Avenged Sevenfold. In 2017, he was voted 3rd
in the list of Top 25 Greatest Modern Frontmen by Ultimate Guitar.[1]
- >-
Theresa Donovan In July 2013, Jeannie returns to Salem, this time going
by her middle name, Theresa. Initially, she strikes up a connection with
resident bad boy JJ Deveraux (Casey Moss) while trying to secure some
pot.[28] During a confrontation with JJ and his mother Jennifer Horton
(Melissa Reeves) in her office, her aunt Kayla confirms that Theresa is
in fact Jeannie and that Jen promised to hire her as her assistant, a
promise she reluctantly agrees to. Kayla reminds Theresa it is her last
chance at a fresh start.[29] Theresa also strikes up a bad first
impression with Jennifer's daughter Abigail Deveraux (Kate Mansi) when
Abigail smells pot on Theresa in her mother's office.[30] To continue to
battle against Jennifer, she teams up with Anne Milbauer (Meredith Scott
Lynn) in hopes of exacting her perfect revenge. In a ploy, Theresa
reveals her intentions to hopefully woo Dr. Daniel Jonas (Shawn
Christian). After sleeping with JJ, Theresa overdoses on marijuana and
GHB. Upon hearing of their daughter's overdose and continuing problems,
Shane and Kimberly return to town in the hopes of handling their
daughter's problem, together. After believing that Theresa has a handle
on her addictions, Shane and Kimberly leave town together. Theresa then
teams up with hospital co-worker Anne Milbauer (Meredith Scott Lynn) to
conspire against Jennifer, using Daniel as a way to hurt their
relationship. In early 2014, following a Narcotics Anonymous (NA)
meeting, she begins a sexual and drugged-fused relationship with Brady
Black (Eric Martsolf). In 2015, after it is found that Kristen DiMera
(Eileen Davidson) stole Theresa's embryo and carried it to term, Brady
and Melanie Jonas return her son, Christopher, to her and Brady, and the
pair rename him Tate. When Theresa moves into the Kiriakis mansion,
tensions arise between her and Victor. She eventually expresses her
interest in purchasing Basic Black and running it as her own fashion
company, with financial backing from Maggie Horton (Suzanne Rogers). In
the hopes of finding the right partner, she teams up with Kate Roberts
(Lauren Koslow) and Nicole Walker (Arianne Zucker) to achieve the goal
of purchasing Basic Black, with Kate and Nicole's business background
and her own interest in fashion design. As she and Brady share several
instances of rekindling their romance, she is kicked out of the mansion
by Victor; as a result, Brady quits Titan and moves in with Theresa and
Tate, in their own penthouse.
- source_sentence: where does the last name francisco come from
sentences:
- >-
Francisco Francisco is the Spanish and Portuguese form of the masculine
given name Franciscus (corresponding to English Francis).
- >-
Book of Esther The Book of Esther, also known in Hebrew as "the Scroll"
(Megillah), is a book in the third section (Ketuvim, "Writings") of the
Jewish Tanakh (the Hebrew Bible) and in the Christian Old Testament. It
is one of the five Scrolls (Megillot) in the Hebrew Bible. It relates
the story of a Hebrew woman in Persia, born as Hadassah but known as
Esther, who becomes queen of Persia and thwarts a genocide of her
people. The story forms the core of the Jewish festival of Purim, during
which it is read aloud twice: once in the evening and again the
following morning. The books of Esther and Song of Songs are the only
books in the Hebrew Bible that do not explicitly mention God.[2]
- >-
Times Square Times Square is a major commercial intersection, tourist
destination, entertainment center and neighborhood in the Midtown
Manhattan section of New York City at the junction of Broadway and
Seventh Avenue. It stretches from West 42nd to West 47th Streets.[1]
Brightly adorned with billboards and advertisements, Times Square is
sometimes referred to as "The Crossroads of the World",[2] "The Center
of the Universe",[3] "the heart of The Great White Way",[4][5][6] and
the "heart of the world".[7] One of the world's busiest pedestrian
areas,[8] it is also the hub of the Broadway Theater District[9] and a
major center of the world's entertainment industry.[10] Times Square is
one of the world's most visited tourist attractions, drawing an
estimated 50 million visitors annually.[11] Approximately 330,000 people
pass through Times Square daily,[12] many of them tourists,[13] while
over 460,000 pedestrians walk through Times Square on its busiest
days.[7]
datasets:
- sentence-transformers/natural-questions
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
co2_eq_emissions:
emissions: 405.46378000747745
energy_consumed: 1.043122443433472
source: codecarbon
training_type: fine-tuning
on_cloud: false
cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
ram_total_size: 31.777088165283203
hours_used: 3.425
hardware_used: 1 x NVIDIA GeForce RTX 3090
model-index:
- name: LiquidAI/LFM2-350M trained on Natural Questions pairs
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoMSMARCO
type: NanoMSMARCO
metrics:
- type: cosine_accuracy@1
value: 0.28
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.46
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.64
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.74
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.28
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.15333333333333332
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.128
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.07400000000000001
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.28
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.46
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.64
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.74
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.4909415698599729
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.4130714285714285
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.42354966538209315
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoNFCorpus
type: NanoNFCorpus
metrics:
- type: cosine_accuracy@1
value: 0.4
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.58
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.68
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.4
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.35999999999999993
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.324
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.26599999999999996
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.02298357366763854
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.061632366484571384
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.09750915762412557
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.13301219077618073
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.32361002047039217
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.47583333333333333
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.12539829347446158
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoNQ
type: NanoNQ
metrics:
- type: cosine_accuracy@1
value: 0.48
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.68
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.78
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.82
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.48
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.22666666666666666
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.15600000000000003
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.086
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.47
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.64
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.72
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.78
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.632163202477609
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5983571428571428
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5837963147038205
name: Cosine Map@100
- task:
type: nano-beir
name: Nano BEIR
dataset:
name: NanoBEIR mean
type: NanoBEIR_mean
metrics:
- type: cosine_accuracy@1
value: 0.3866666666666667
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5466666666666667
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6666666666666666
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7466666666666666
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3866666666666667
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.24666666666666662
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.2026666666666667
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.142
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.25766119122254616
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.38721078882819054
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.48583638587470857
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.5510040635920602
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.48223826426932465
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.4957539682539682
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.37758142452012505
name: Cosine Map@100
base_model:
- LiquidAI/LFM2-350M
---
LiquidAI/LFM2-350M trained on Natural Questions pairs
This is a sentence-transformers model finetuned from LiquidAI/LFM2-350M on the natural-questions dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: LiquidAI/LFM2-350M
- Maximum Sequence Length: 128000 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: natural-questions
- Language: en
- License: apache-2.0
Model Sources
- Documentation: Sentence Transformers Documentation (https://sbert.net)
- Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
- Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)
Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128000, 'do_lower_case': False, 'architecture': 'LFM2Model'})
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': True, 'include_prompt': True})
)
```
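For reference, this two-module stack can also be assembled by hand with the sentence-transformers models API. The following is a minimal sketch assuming the standard Transformer and Pooling modules; loading the published checkpoint as shown under Usage is the simpler route.

```python
from sentence_transformers import SentenceTransformer, models

# Wrap LFM2-350M as the word-embedding module (module 0 above),
# with its 128k-token maximum sequence length.
word_embedding = models.Transformer("LiquidAI/LFM2-350M", max_seq_length=128000)

# Last-token pooling (module 1 above): the hidden state of the final token
# of each sequence becomes the 1024-dimensional sentence embedding.
pooling = models.Pooling(
    word_embedding.get_word_embedding_dimension(),
    pooling_mode="lasttoken",
)

model = SentenceTransformer(modules=[word_embedding, pooling])
```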
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tomaarsen/LFM2-350M-nq-prompts")
# Run inference
queries = [
    "where does the last name francisco come from",
]
documents = [
    'Francisco Francisco is the Spanish and Portuguese form of the masculine given name Franciscus (corresponding to English Francis).',
    'Book of Esther The Book of Esther, also known in Hebrew as "the Scroll" (Megillah), is a book in the third section (Ketuvim, "Writings") of the Jewish Tanakh (the Hebrew Bible) and in the Christian Old Testament. It is one of the five Scrolls (Megillot) in the Hebrew Bible. It relates the story of a Hebrew woman in Persia, born as Hadassah but known as Esther, who becomes queen of Persia and thwarts a genocide of her people. The story forms the core of the Jewish festival of Purim, during which it is read aloud twice: once in the evening and again the following morning. The books of Esther and Song of Songs are the only books in the Hebrew Bible that do not explicitly mention God.[2]',
    'Times Square Times Square is a major commercial intersection, tourist destination, entertainment center and neighborhood in the Midtown Manhattan section of New York City at the junction of Broadway and Seventh Avenue. It stretches from West 42nd to West 47th Streets.[1] Brightly adorned with billboards and advertisements, Times Square is sometimes referred to as "The Crossroads of the World",[2] "The Center of the Universe",[3] "the heart of The Great White Way",[4][5][6] and the "heart of the world".[7] One of the world\'s busiest pedestrian areas,[8] it is also the hub of the Broadway Theater District[9] and a major center of the world\'s entertainment industry.[10] Times Square is one of the world\'s most visited tourist attractions, drawing an estimated 50 million visitors annually.[11] Approximately 330,000 people pass through Times Square daily,[12] many of them tourists,[13] while over 460,000 pedestrians walk through Times Square on its busiest days.[7]',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 1024] [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[ 0.7825, -0.0811, -0.0414]])
```
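Note that encode_query and encode_document apply the prompts stored with the model. Assuming the "query: " and "document: " prompts listed under Training Hyperparameters below, a roughly equivalent explicit call would be:

```python
# Equivalent encoding with explicit prompts (an assumption based on the
# prompts this model was trained with; encode_query/encode_document
# normally handle this automatically).
query_embeddings = model.encode(queries, prompt="query: ")
document_embeddings = model.encode(documents, prompt="document: ")
```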
Evaluation
Metrics
Information Retrieval
- Datasets: NanoMSMARCO, NanoNFCorpus and NanoNQ
- Evaluated with InformationRetrievalEvaluator with these parameters: { "query_prompt": "query: ", "corpus_prompt": "document: " }
Metric | NanoMSMARCO | NanoNFCorpus | NanoNQ |
---|---|---|---|
cosine_accuracy@1 | 0.28 | 0.4 | 0.48 |
cosine_accuracy@3 | 0.46 | 0.5 | 0.68 |
cosine_accuracy@5 | 0.64 | 0.58 | 0.78 |
cosine_accuracy@10 | 0.74 | 0.68 | 0.82 |
cosine_precision@1 | 0.28 | 0.4 | 0.48 |
cosine_precision@3 | 0.1533 | 0.36 | 0.2267 |
cosine_precision@5 | 0.128 | 0.324 | 0.156 |
cosine_precision@10 | 0.074 | 0.266 | 0.086 |
cosine_recall@1 | 0.28 | 0.023 | 0.47 |
cosine_recall@3 | 0.46 | 0.0616 | 0.64 |
cosine_recall@5 | 0.64 | 0.0975 | 0.72 |
cosine_recall@10 | 0.74 | 0.133 | 0.78 |
cosine_ndcg@10 | 0.4909 | 0.3236 | 0.6322 |
cosine_mrr@10 | 0.4131 | 0.4758 | 0.5984 |
cosine_map@100 | 0.4235 | 0.1254 | 0.5838 |
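Each per-dataset run above corresponds to an InformationRetrievalEvaluator configured with the query and corpus prompts shown. A minimal sketch, with placeholder data standing in for the actual NanoMSMARCO queries, corpus, and relevance judgments:

```python
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Placeholder data; the real evaluation uses the NanoMSMARCO collection.
queries = {"q1": "who ordered the charge of the light brigade"}
corpus = {
    "d1": "Charge of the Light Brigade The Charge of the Light Brigade was a charge of British light cavalry ...",
    "d2": "Times Square Times Square is a major commercial intersection ...",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    query_prompt="query: ",
    corpus_prompt="document: ",
    name="NanoMSMARCO",
)
metrics = evaluator(model)  # accuracy@k, precision@k, recall@k, NDCG@10, MRR@10, MAP@100
```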
Nano BEIR
- Dataset: NanoBEIR_mean
- Evaluated with NanoBEIREvaluator with these parameters: { "dataset_names": [ "msmarco", "nfcorpus", "nq" ], "query_prompts": { "msmarco": "query: ", "nfcorpus": "query: ", "nq": "query: " }, "corpus_prompts": { "msmarco": "document: ", "nfcorpus": "document: ", "nq": "document: " } }
Metric | Value |
---|---|
cosine_accuracy@1 | 0.3867 |
cosine_accuracy@3 | 0.5467 |
cosine_accuracy@5 | 0.6667 |
cosine_accuracy@10 | 0.7467 |
cosine_precision@1 | 0.3867 |
cosine_precision@3 | 0.2467 |
cosine_precision@5 | 0.2027 |
cosine_precision@10 | 0.142 |
cosine_recall@1 | 0.2577 |
cosine_recall@3 | 0.3872 |
cosine_recall@5 | 0.4858 |
cosine_recall@10 | 0.551 |
cosine_ndcg@10 | 0.4822 |
cosine_mrr@10 | 0.4958 |
cosine_map@100 | 0.3776 |
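The aggregate row is produced by NanoBEIREvaluator, which runs the three Nano datasets and averages their metrics. A sketch matching the parameters listed above:

```python
from sentence_transformers.evaluation import NanoBEIREvaluator

dataset_names = ["msmarco", "nfcorpus", "nq"]
evaluator = NanoBEIREvaluator(
    dataset_names=dataset_names,
    query_prompts={name: "query: " for name in dataset_names},
    corpus_prompts={name: "document: " for name in dataset_names},
)
metrics = evaluator(model)  # per-dataset metrics plus the NanoBEIR_mean aggregate
```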
Training Details
Training Dataset
natural-questions
- Dataset: natural-questions at f9e894e
- Size: 99,231 training samples
- Columns: query and answer
- Approximate statistics based on the first 1000 samples:

| | query | answer |
|---|---|---|
| type | string | string |
| details | min: 9 tokens, mean: 11.59 tokens, max: 26 tokens | min: 16 tokens, mean: 142.35 tokens, max: 559 tokens |
- Samples:

| query | answer |
|---|---|
| who is required to report according to the hmda | Home Mortgage Disclosure Act US financial institutions must report HMDA data to their regulator if they meet certain criteria, such as having assets above a specific threshold. The criteria is different for depository and non-depository institutions and are available on the FFIEC website.[4] In 2012, there were 7,400 institutions that reported a total of 18.7 million HMDA records.[5] |
| what is the definition of endoplasmic reticulum in biology | Endoplasmic reticulum The endoplasmic reticulum (ER) is a type of organelle in eukaryotic cells that forms an interconnected network of flattened, membrane-enclosed sacs or tube-like structures known as cisternae. The membranes of the ER are continuous with the outer nuclear membrane. The endoplasmic reticulum occurs in most types of eukaryotic cells, but is absent from red blood cells and spermatozoa. There are two types of endoplasmic reticulum: rough and smooth. The outer (cytosolic) face of the rough endoplasmic reticulum is studded with ribosomes that are the sites of protein synthesis. The rough endoplasmic reticulum is especially prominent in cells such as hepatocytes. The smooth endoplasmic reticulum lacks ribosomes and functions in lipid manufacture and metabolism, the production of steroid hormones, and detoxification.[1] The smooth ER is especially abundant in mammalian liver and gonad cells. The lacy membranes of the endoplasmic reticulum were first seen in 1945 using elect... |
| what does the ski mean in polish names | Polish name Since the High Middle Ages, Polish-sounding surnames ending with the masculine -ski suffix, including -cki and -dzki, and the corresponding feminine suffix -ska/-cka/-dzka were associated with the nobility (Polish szlachta), which alone, in the early years, had such suffix distinctions.[1] They are widely popular today. |
- Loss: CachedMultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim", "mini_batch_size": 4 }
Evaluation Dataset
natural-questions
- Dataset: natural-questions at f9e894e
- Size: 1,000 evaluation samples
- Columns: query and answer
- Approximate statistics based on the first 1000 samples:

| | query | answer |
|---|---|---|
| type | string | string |
| details | min: 9 tokens, mean: 11.62 tokens, max: 23 tokens | min: 9 tokens, mean: 141.66 tokens, max: 664 tokens |
- Samples:

| query | answer |
|---|---|
| difference between russian blue and british blue cat | Russian Blue The coat is known as a "double coat", with the undercoat being soft, downy and equal in length to the guard hairs, which are an even blue with silver tips. However, the tail may have a few very dull, almost unnoticeable stripes. The coat is described as thick, plush and soft to the touch. The feeling is softer than the softest silk. The silver tips give the coat a shimmering appearance. Its eyes are almost always a dark and vivid green. Any white patches of fur or yellow eyes in adulthood are seen as flaws in show cats.[3] Russian Blues should not be confused with British Blues (which are not a distinct breed, but rather a British Shorthair with a blue coat as the British Shorthair breed itself comes in a wide variety of colors and patterns), nor the Chartreux or Korat which are two other naturally occurring breeds of blue cats, although they have similar traits. |
| who played the little girl on mrs doubtfire | Mara Wilson Mara Elizabeth Wilson[2] (born July 24, 1987) is an American writer and former child actress. She is known for playing Natalie Hillard in Mrs. Doubtfire (1993), Susan Walker in Miracle on 34th Street (1994), Matilda Wormwood in Matilda (1996) and Lily Stone in Thomas and the Magic Railroad (2000). Since retiring from film acting, Wilson has focused on writing. |
| what year did the movie the sound of music come out | The Sound of Music (film) The film was released on March 2, 1965 in the United States, initially as a limited roadshow theatrical release. Although critical response to the film was widely mixed, the film was a major commercial success, becoming the number one box office movie after four weeks, and the highest-grossing film of 1965. By November 1966, The Sound of Music had become the highest-grossing film of all-time—surpassing Gone with the Wind—and held that distinction for five years. The film was just as popular throughout the world, breaking previous box-office records in twenty-nine countries. Following an initial theatrical release that lasted four and a half years, and two successful re-releases, the film sold 283 million admissions worldwide and earned a total worldwide gross of $286,000,000. |
- Loss: CachedMultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim", "mini_batch_size": 4 }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 256
- per_device_eval_batch_size: 256
- learning_rate: 2e-05
- num_train_epochs: 1
- warmup_ratio: 0.1
- seed: 12
- bf16: True
- prompts: {'query': 'query: ', 'answer': 'document: '}
- batch_sampler: no_duplicates
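Put together, the non-default hyperparameters above translate into roughly the following training script. This is a sketch, not the exact script used for this model: the output directory and the seed of the train/test split are assumptions, while the remaining values are taken from this card.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("LiquidAI/LFM2-350M")

# natural-questions ships a single train split; hold out 1,000 pairs for
# evaluation, matching the dataset sizes reported above (split seed assumed).
dataset = load_dataset("sentence-transformers/natural-questions", split="train")
dataset = dataset.train_test_split(test_size=1_000, seed=12)

loss = CachedMultipleNegativesRankingLoss(model, scale=20.0, mini_batch_size=4)

args = SentenceTransformerTrainingArguments(
    output_dir="models/LFM2-350M-nq",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    seed=12,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate texts in a batch
    prompts={"query": "query: ", "answer": "document: "},
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    loss=loss,
)
trainer.train()
```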
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 256
- per_device_eval_batch_size: 256
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 12
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- hub_revision: None
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- liger_kernel_config: None
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: {'query': 'query: ', 'answer': 'document: '}
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}
Training Logs
Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNFCorpus_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
---|---|---|---|---|---|---|---|
-1 | -1 | - | - | 0.0086 | 0.0233 | 0.0063 | 0.0128 |
0.0026 | 1 | 4.6189 | - | - | - | - | - |
0.0129 | 5 | 4.1284 | - | - | - | - | - |
0.0258 | 10 | 3.6638 | - | - | - | - | - |
0.0387 | 15 | 2.3118 | - | - | - | - | - |
0.0515 | 20 | 1.0986 | - | - | - | - | - |
0.0644 | 25 | 0.5063 | - | - | - | - | - |
0.0773 | 30 | 0.2891 | - | - | - | - | - |
0.0902 | 35 | 0.2138 | - | - | - | - | - |
0.1031 | 40 | 0.1967 | - | - | - | - | - |
0.1160 | 45 | 0.1745 | - | - | - | - | - |
0.1289 | 50 | 0.1479 | 0.1425 | 0.4927 | 0.3162 | 0.5375 | 0.4488 |
0.1418 | 55 | 0.1257 | - | - | - | - | - |
0.1546 | 60 | 0.1215 | - | - | - | - | - |
0.1675 | 65 | 0.1475 | - | - | - | - | - |
0.1804 | 70 | 0.1066 | - | - | - | - | - |
0.1933 | 75 | 0.1056 | - | - | - | - | - |
0.2062 | 80 | 0.1181 | - | - | - | - | - |
0.2191 | 85 | 0.118 | - | - | - | - | - |
0.2320 | 90 | 0.1031 | - | - | - | - | - |
0.2448 | 95 | 0.0775 | - | - | - | - | - |
0.2577 | 100 | 0.0906 | 0.1009 | 0.4791 | 0.3151 | 0.6007 | 0.4650 |
0.2706 | 105 | 0.0921 | - | - | - | - | - |
0.2835 | 110 | 0.1105 | - | - | - | - | - |
0.2964 | 115 | 0.0906 | - | - | - | - | - |
0.3093 | 120 | 0.1002 | - | - | - | - | - |
0.3222 | 125 | 0.0952 | - | - | - | - | - |
0.3351 | 130 | 0.0652 | - | - | - | - | - |
0.3479 | 135 | 0.079 | - | - | - | - | - |
0.3608 | 140 | 0.0951 | - | - | - | - | - |
0.3737 | 145 | 0.0918 | - | - | - | - | - |
0.3866 | 150 | 0.065 | 0.0772 | 0.5115 | 0.3070 | 0.6105 | 0.4763 |
0.3995 | 155 | 0.1065 | - | - | - | - | - |
0.4124 | 160 | 0.0871 | - | - | - | - | - |
0.4253 | 165 | 0.0623 | - | - | - | - | - |
0.4381 | 170 | 0.0771 | - | - | - | - | - |
0.4510 | 175 | 0.0795 | - | - | - | - | - |
0.4639 | 180 | 0.0814 | - | - | - | - | - |
0.4768 | 185 | 0.0794 | - | - | - | - | - |
0.4897 | 190 | 0.0744 | - | - | - | - | - |
0.5026 | 195 | 0.0612 | - | - | - | - | - |
0.5155 | 200 | 0.0684 | 0.0692 | 0.4818 | 0.3173 | 0.6161 | 0.4717 |
0.5284 | 205 | 0.0635 | - | - | - | - | - |
0.5412 | 210 | 0.0768 | - | - | - | - | - |
0.5541 | 215 | 0.0544 | - | - | - | - | - |
0.5670 | 220 | 0.0654 | - | - | - | - | - |
0.5799 | 225 | 0.0729 | - | - | - | - | - |
0.5928 | 230 | 0.0923 | - | - | - | - | - |
0.6057 | 235 | 0.0763 | - | - | - | - | - |
0.6186 | 240 | 0.0687 | - | - | - | - | - |
0.6314 | 245 | 0.0657 | - | - | - | - | - |
0.6443 | 250 | 0.0708 | 0.0643 | 0.4843 | 0.3152 | 0.6023 | 0.4673 |
0.6572 | 255 | 0.0555 | - | - | - | - | - |
0.6701 | 260 | 0.0792 | - | - | - | - | - |
0.6830 | 265 | 0.0681 | - | - | - | - | - |
0.6959 | 270 | 0.0855 | - | - | - | - | - |
0.7088 | 275 | 0.0788 | - | - | - | - | - |
0.7216 | 280 | 0.0631 | - | - | - | - | - |
0.7345 | 285 | 0.0676 | - | - | - | - | - |
0.7474 | 290 | 0.0536 | - | - | - | - | - |
0.7603 | 295 | 0.0814 | - | - | - | - | - |
0.7732 | 300 | 0.062 | 0.0606 | 0.4630 | 0.3235 | 0.6256 | 0.4707 |
0.7861 | 305 | 0.0777 | - | - | - | - | - |
0.7990 | 310 | 0.0801 | - | - | - | - | - |
0.8119 | 315 | 0.0566 | - | - | - | - | - |
0.8247 | 320 | 0.0711 | - | - | - | - | - |
0.8376 | 325 | 0.0643 | - | - | - | - | - |
0.8505 | 330 | 0.0422 | - | - | - | - | - |
0.8634 | 335 | 0.0614 | - | - | - | - | - |
0.8763 | 340 | 0.06 | - | - | - | - | - |
0.8892 | 345 | 0.0584 | - | - | - | - | - |
0.9021 | 350 | 0.0457 | 0.0583 | 0.4952 | 0.3214 | 0.6268 | 0.4811 |
0.9149 | 355 | 0.0838 | - | - | - | - | - |
0.9278 | 360 | 0.0657 | - | - | - | - | - |
0.9407 | 365 | 0.0658 | - | - | - | - | - |
0.9536 | 370 | 0.0757 | - | - | - | - | - |
0.9665 | 375 | 0.0603 | - | - | - | - | - |
0.9794 | 380 | 0.0647 | - | - | - | - | - |
0.9923 | 385 | 0.0575 | - | - | - | - | - |
-1 | -1 | - | - | 0.4909 | 0.3236 | 0.6322 | 0.4822 |
Environmental Impact
Carbon emissions were measured using CodeCarbon.
- Energy Consumed: 1.043 kWh
- Carbon Emitted: 0.405 kg of CO2
- Hours Used: 3.425 hours
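CodeCarbon samples the energy draw of the CPU, GPU, and RAM during a run and converts it to CO2-equivalent emissions. A minimal sketch of how such a measurement is typically taken:

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()
tracker.start()
# ... training run goes here ...
emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent
print(f"Estimated emissions: {emissions_kg:.3f} kg CO2eq")
```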
Training Hardware
- On Cloud: No
- GPU Model: 1 x NVIDIA GeForce RTX 3090
- CPU Model: 13th Gen Intel(R) Core(TM) i7-13700K
- RAM Size: 31.78 GB
Framework Versions
- Python: 3.11.6
- Sentence Transformers: 5.1.0.dev0
- Transformers: 4.53.0
- PyTorch: 2.7.1+cu126
- Accelerate: 1.5.1
- Datasets: 2.21.0
- Tokenizers: 0.21.1
Citation
BibTeX
Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
CachedMultipleNegativesRankingLoss
```bibtex
@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```