Muennighoff committed (verified) commit 5229306, 1 parent: c11b9ef

Scheduled Commit
data/retrieval_battle-76c7b5f4-5fbc-4311-bf05-8defcdc1ba4e.jsonl CHANGED
@@ -78,3 +78,4 @@
  {"tstamp": 1723026035.9816, "task_type": "retrieval", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "9222e7c7f6d04350b94e1a85e239c11e", "0_model_name": "sentence-transformers/all-MiniLM-L6-v2", "0_prompt": "llm survey", "0_output": [["llm survey", "Title: Polymer dynamics of isotropic universe in Ashtekar and in volume variables\n\nAbstract: We analyze the semiclassical and quantum polymer dynamics of the isotropic Universe in terms of both the standard Ashtekar-Barbero-Immirzi connection and its conjugate momentum and also of the new generalized coordinate conjugate to the Universe volume. We study the properties of the resulting bouncing cosmology that emerges in both the representations and we show that the Big Bounce is an intrinsic cut-off on the cosmological dynamics only when the volume variable is implemented, while in terms of the standard connection the Universe Bounce energy density is fixed by the initial conditions on the prepared wavepacket. As a phenomenological implication, we introduce particle creation as a dissipative term and study the production of entropy in the two formulations. Then, we compare the obtained dynamics with what emerges in Loop Quantum Cosmology, where the same difference in the nature of the Big Bounce is associated to fixing a minimum area eigenvalue in a comoving or in a physical representation. We conclude that the privileged character of the Ashtekar-Barbero-Immirzi connection suggests that the natural scenario in the polymer framework is a Big Bounce that is not a Universal cut-off. However, by a parallelism between the polymer and Loop Quantum Cosmology properties of the basic operators, we also develop some considerations in favour of the viability of the $\\bar{\\mu}$ scheme of Loop Quantum Cosmology on a semiclassical level."]], "0_corpus": "arxiv", "1_conv_id": "6493498bf5b24f528971f158d1ab61a6", "1_model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "1_prompt": "llm survey", "1_output": [["llm survey", "Title: Large Language Models: A Survey\n\nAbstract: Large Language Models (LLMs) have drawn a lot of attention due to their strong performance on a wide range of natural language tasks, since the release of ChatGPT in November 2022. LLMs' ability of general-purpose language understanding and generation is acquired by training billions of model's parameters on massive amounts of text data, as predicted by scaling laws \\cite{kaplan2020scaling,hoffmann2022training}. The research area of LLMs, while very recent, is evolving rapidly in many different ways. In this paper, we review some of the most prominent LLMs, including three popular LLM families (GPT, LLaMA, PaLM), and discuss their characteristics, contributions and limitations. We also give an overview of techniques developed to build, and augment LLMs. We then survey popular datasets prepared for LLM training, fine-tuning, and evaluation, review widely used LLM evaluation metrics, and compare the performance of several popular LLMs on a set of representative benchmarks. Finally, we conclude the paper by discussing open challenges and future research directions."]], "1_corpus": "arxiv"}
  {"tstamp": 1723030790.7052, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "b245aa463b1046acb891aaf6f32bfc24", "0_model_name": "nomic-ai/nomic-embed-text-v1.5", "0_prompt": "Why did the machines in The Matrix keep humans around?", "0_output": [["Why did the machines in The Matrix keep humans around?", "Scifi Stackexchange\n\nQ: Why use full brained humans as batteries? \nPossible Duplicate:\nIs the basic premise of humans as a power source in The Matrix reasonable?\nWhy Did the Machines Even Bother With a Matrix? \n\nThe machines in The Matrix series use humans as batteries. That's OK. But why did the machines not alter their livestock such that the matrix itself would be unnecessary? \nConsider the recently proposed headless chicken farm. Something similar for the human batteries would seem to be an ideal solution for logical machines. No longer would they need to spend resources maintaining a virtual reality for their batteries. The energy, physical hardware and programs running the matrix could then be put to who knows what other uses.\n\nA: The explanation for the use of human brains would seem to be the fact that our complex thoughts lead to higher level of brain activity, thus resulting in exponentially more synapses firing (i.e. more electrical activity to harvest).\nThat makes human brains far more efficient than the brains of animals, even with the overhead of maintaining the matrix. Plus, that movie would have really sucked... dumb cows and chickens revolt against the machines! ;)\n"]], "0_corpus": "stackexchange", "1_conv_id": "16dbd4f57fe044bfa3bb7f6b12f24f38", "1_model_name": "GritLM/GritLM-7B", "1_prompt": "Why did the machines in The Matrix keep humans around?", "1_output": [["Why did the machines in The Matrix keep humans around?", "Scifi Stackexchange\n\nQ: Why use full brained humans as batteries? \nPossible Duplicate:\nIs the basic premise of humans as a power source in The Matrix reasonable?\nWhy Did the Machines Even Bother With a Matrix? \n\nThe machines in The Matrix series use humans as batteries. That's OK. But why did the machines not alter their livestock such that the matrix itself would be unnecessary? \nConsider the recently proposed headless chicken farm. Something similar for the human batteries would seem to be an ideal solution for logical machines. No longer would they need to spend resources maintaining a virtual reality for their batteries. The energy, physical hardware and programs running the matrix could then be put to who knows what other uses.\n\nA: The explanation for the use of human brains would seem to be the fact that our complex thoughts lead to higher level of brain activity, thus resulting in exponentially more synapses firing (i.e. more electrical activity to harvest).\nThat makes human brains far more efficient than the brains of animals, even with the overhead of maintaining the matrix. Plus, that movie would have really sucked... dumb cows and chickens revolt against the machines! ;)\n"]], "1_corpus": "stackexchange"}
  {"tstamp": 1723038132.4843, "task_type": "retrieval", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "bc8a255d2b3d4d2392e1b38a5fd03e89", "0_model_name": "embed-english-v3.0", "0_prompt": "J'aimerai trier mon jeu de données", "0_output": [["J'aimerai trier mon jeu de données", "Joachim Trier\n\nJoachim Trier () (born 1 March 1974) is a Danish-born Norwegian filmmaker. His films have been described as \"melancholy meditations concerned with existential questions of love, ambition, memory, and identity.\" He has received numerous nominations including for a Academy Award, a BAFTA Award, two Cesar Awards, and three Cannes Film Festival Awards.\nHe is best known for his Oslo trilogy which comprises the films Reprise (2006), Oslo, August 31st (2011), The Worst Person in the World (2021). For the last film, he was nominated for the Best Original Screenplay at the 94th Academy Awards, with the film also nominated for Best International Feature. He is also known for directing Louder Than Bombs (2015), Thelma (2017), and the documentary The Other Munch (2018).\nEarly life and education\nTrier was born in Denmark to Norwegian parents and raised in Oslo, Norway. His father, Jacob Trier, was the sound technician of The Pinchcliffe Grand Prix, a notable film produced in Norway in 1975. His grandfather was Erik Løchen, artistic director of Norsk Film from 1981 to 1983 and also a filmmaker and screenwriter known for such experimental work as his 1972 film Remonstrance, which was constructed so that its five reels could be shown in any order, rendering 120 possible versions of its radical story of a film crew trying to make a political film."]], "0_corpus": "wikipedia", "1_conv_id": "a28ff8878f6b453284b085f6b665bb39", "1_model_name": "GritLM/GritLM-7B", "1_prompt": "J'aimerai trier mon jeu de données", "1_output": [["J'aimerai trier mon jeu de données", "Sorting\n\nSorting refers to ordering data in an increasing or decreasing manner according to some linear relationship among the data items.\nordering: arranging items in a sequence ordered by some criterion;"]], "1_corpus": "wikipedia"}
+ {"tstamp": 1723039339.1165, "task_type": "retrieval", "type": "leftvote", "models": ["", ""], "ip": "", "0_conv_id": "80295bed659643e2bf77f8da311dd884", "0_model_name": "BAAI/bge-large-en-v1.5", "0_prompt": "Which test was devised to determine whether robots can think?", "0_output": [["Which test was devised to determine whether robots can think?", "Turing test\n\nThe Turing test, originally called the imitation game by Alan Turing in 1950, is a test of a machine's ability to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human. Turing proposed that a human evaluator would judge natural language conversations between a human and a machine designed to generate human-like responses. The evaluator would be aware that one of the two partners in conversation was a machine, and all participants would be separated from one another. The conversation would be limited to a text-only channel, such as a computer keyboard and screen, so the result would not depend on the machine's ability to render words as speech. If the evaluator could not reliably tell the machine from the human, the machine would be said to have passed the test. The test results would not depend on the machine's ability to give correct answers to questions, only on how closely its answers resembled those a human would give. Since the Turing test is a test of indistinguishability in performance capacity, the verbal version generalizes naturally to all of human performance capacity, verbal as well as nonverbal (robotic)."]], "0_corpus": "wikipedia", "1_conv_id": "85eae3c1152f4540800ece9fcdabfb21", "1_model_name": "BM25", "1_prompt": "Which test was devised to determine whether robots can think?", "1_output": [["Which test was devised to determine whether robots can think?", "Artificial intelligence\n\nIn the late teens and early 2020s, AGI companies began to deliver programs that created enormous interest. In 2015, AlphaGo, developed by DeepMind, beat the world champion Go player. The program was taught only the rules of the game and developed strategy by itself. GPT-3 is a large language model that was released in 2020 by OpenAI and is capable of generating high-quality human-like text. These programs, and others, inspired an aggressive AI boom, where large companies began investing billions in AI research. According to AI Impacts, about $50 billion annually was invested in \"AI\" around 2022 in the U.S. alone and about 20% of the new U.S. Computer Science PhD graduates have specialized in \"AI\".\nAbout 800,000 \"AI\"-related U.S. job openings existed in 2022.\nPhilosophy\nDefining artificial intelligence\nAlan Turing wrote in 1950 \"I propose to consider the question 'can machines think'?\" He advised changing the question from whether a machine \"thinks\", to \"whether or not it is possible for machinery to show intelligent behaviour\". He devised the Turing test, which measures the ability of a machine to simulate human conversation. Since we can only observe the behavior of the machine, it does not matter if it is \"actually\" thinking or literally has a \"mind\". Turing notes that we can not determine these things about other people but \"it is usual to have a polite convention that everyone thinks.\""]], "1_corpus": "wikipedia"}
data/sts_battle-76c7b5f4-5fbc-4311-bf05-8defcdc1ba4e.jsonl CHANGED
@@ -2,3 +2,4 @@
  {"tstamp": 1722928188.3088, "task_type": "sts", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "6896004aa77340929aa5bb399b2df7ce", "0_model_name": "intfloat/multilingual-e5-large-instruct", "0_txt0": "Boy in red and black shirt walks through grass.", "0_txt1": "The boy is walking.", "0_txt2": "The boy has a green and purple shirt.", "0_output": "", "1_conv_id": "8691d46a0fd84d26b74a6d9ba77f2613", "1_model_name": "mixedbread-ai/mxbai-embed-large-v1", "1_txt0": "Boy in red and black shirt walks through grass.", "1_txt1": "The boy is walking.", "1_txt2": "The boy has a green and purple shirt.", "1_output": ""}
  {"tstamp": 1722928299.6279, "task_type": "sts", "type": "bothbadvote", "models": ["", ""], "ip": "", "0_conv_id": "1d65eb3efbfa45a4a825969ba93ca736", "0_model_name": "GritLM/GritLM-7B", "0_txt0": " a man with a dog", "0_txt1": "a man without a dog", "0_txt2": "a man with a cat", "0_output": "", "1_conv_id": "cde1bc0fff494889836f584349d4194b", "1_model_name": "text-embedding-3-large", "1_txt0": " a man with a dog", "1_txt1": "a man without a dog", "1_txt2": "a man with a cat", "1_output": ""}
  {"tstamp": 1723026363.4614, "task_type": "sts", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "b974e4bf41a8442ab4936caafba60ec2", "0_model_name": "mixedbread-ai/mxbai-embed-large-v1", "0_txt0": "People are shopping.", "0_txt1": "Numerous customers browsing for produce in a market", "0_txt2": "People are showering.", "0_output": "", "1_conv_id": "9b0a641ffecf44609775427308d159b2", "1_model_name": "intfloat/multilingual-e5-large-instruct", "1_txt0": "People are shopping.", "1_txt1": "Numerous customers browsing for produce in a market", "1_txt2": "People are showering.", "1_output": ""}
+ {"tstamp": 1723039416.6593, "task_type": "sts", "type": "leftvote", "models": ["", ""], "ip": "", "0_conv_id": "500fff3527cc4f419d8c9a7e84a3d392", "0_model_name": "intfloat/e5-mistral-7b-instruct", "0_txt0": "People are shopping.", "0_txt1": "Numerous customers browsing for produce in a market", "0_txt2": "People are showering.", "0_output": "", "1_conv_id": "46ca44fba5ff4cc3bce2bfbce60322c7", "1_model_name": "mixedbread-ai/mxbai-embed-large-v1", "1_txt0": "People are shopping.", "1_txt1": "Numerous customers browsing for produce in a market", "1_txt2": "People are showering.", "1_output": ""}
data/sts_individual-76c7b5f4-5fbc-4311-bf05-8defcdc1ba4e.jsonl CHANGED
@@ -24,3 +24,5 @@
  {"tstamp": 1723026375.9478, "task_type": "sts", "type": "chat", "model": "text-embedding-004", "gen_params": {}, "start": 1723026375.6811, "finish": 1723026375.9478, "ip": "", "conv_id": "e550f2f7fb194c02a1ea9112ab19f9bd", "model_name": "text-embedding-004", "txt0": "People are shopping.", "txt1": "Numerous customers browsing for produce in a market", "txt2": "People are showering.", "output": ""}
  {"tstamp": 1723038975.6937, "task_type": "sts", "type": "chat", "model": "Salesforce/SFR-Embedding-2_R", "gen_params": {}, "start": 1723038975.3837, "finish": 1723038975.6937, "ip": "", "conv_id": "00cf43fe5f264f2e8ef748d3f23298d5", "model_name": "Salesforce/SFR-Embedding-2_R", "txt0": "hello", "txt1": "good morning", "txt2": "早上好", "output": ""}
  {"tstamp": 1723038975.6937, "task_type": "sts", "type": "chat", "model": "voyage-multilingual-2", "gen_params": {}, "start": 1723038975.3837, "finish": 1723038975.6937, "ip": "", "conv_id": "10e91fc794964368a5d4776c2ba73f41", "model_name": "voyage-multilingual-2", "txt0": "hello", "txt1": "good morning", "txt2": "早上好", "output": ""}
+ {"tstamp": 1723039374.8312, "task_type": "sts", "type": "chat", "model": "intfloat/e5-mistral-7b-instruct", "gen_params": {}, "start": 1723039374.7759, "finish": 1723039374.8312, "ip": "", "conv_id": "500fff3527cc4f419d8c9a7e84a3d392", "model_name": "intfloat/e5-mistral-7b-instruct", "txt0": "People are shopping.", "txt1": "Numerous customers browsing for produce in a market", "txt2": "People are showering.", "output": ""}
+ {"tstamp": 1723039374.8312, "task_type": "sts", "type": "chat", "model": "mixedbread-ai/mxbai-embed-large-v1", "gen_params": {}, "start": 1723039374.7759, "finish": 1723039374.8312, "ip": "", "conv_id": "46ca44fba5ff4cc3bce2bfbce60322c7", "model_name": "mixedbread-ai/mxbai-embed-large-v1", "txt0": "People are shopping.", "txt1": "Numerous customers browsing for produce in a market", "txt2": "People are showering.", "output": ""}
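Each changed file above is JSONL: one vote record per line, with fields such as `tstamp`, `task_type`, `type`, and the model names involved. A minimal sketch for loading such a file and tallying vote outcomes (the helper names `load_battles` and `tally_votes` are illustrative, not part of any dataset tooling):

```python
import json

def load_battles(path):
    """Load vote records from a JSONL file: one JSON object per non-empty line."""
    records = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                records.append(json.loads(line))
    return records

def tally_votes(records):
    """Count vote outcomes (e.g. leftvote, rightvote, tievote, bothbadvote) per task type."""
    counts = {}
    for rec in records:
        key = (rec.get("task_type"), rec.get("type"))
        counts[key] = counts.get(key, 0) + 1
    return counts
```

For example, `tally_votes(load_battles("data/sts_battle-76c7b5f4-5fbc-4311-bf05-8defcdc1ba4e.jsonl"))` would return a dict keyed by `(task_type, vote_type)` pairs.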