Scheduled Commit
data/retrieval_battle-f36c9f97-e795-4522-8eb7-7dee254b42c7.jsonl
CHANGED
@@ -84,3 +84,5 @@
84
{"tstamp": 1723717950.2537, "task_type": "retrieval", "type": "bothbadvote", "models": ["", ""], "ip": "", "0_conv_id": "e119c357e0d94d9cbabf8c1b0729176e", "0_model_name": "BM25", "0_prompt": "how do I unnest a list in python?", "0_output": [["how do I unnest a list in python?", "Scope (computer science)\n\nThe default name resolution rules can be overridden with the global or nonlocal (in Python 3) keywords. In the below code, the global x declaration in g means that x resolves to the global variable. It thus can be accessed (as it has already been defined), and assignment assigns to the global variable, rather than declaring a new local variable. Note that no global declaration is needed in f—since it does not assign to the variable, it defaults to resolving to the global variable.\n>>> def f():\n... print(x)\n...\n>>> def g():\n... global x\n... print(x)\n... x = \"g\"\n...\n>>> x = \"global\"\n>>> f()\nglobal\n>>> g()\nglobal\n>>> f()\ng\nglobal can also be used for nested functions. In addition to allowing assignment to a global variable, as in an unnested function, this can also be used to access the global variable in the presence of a nonlocal variable:\n>>> def f():\n... def g():\n... global x\n... print(x)\n... x = \"f\"\n... g()\n...\n>>> x = \"global\"\n>>> f()\nglobal\nFor nested functions, there is also the nonlocal declaration, for assigning to a nonlocal variable, similar to using global in an unnested function:\n>>> def f():\n... def g():\n... nonlocal x # Python 3 only\n... x = \"g\"\n... x = \"f\"\n... g()\n... 
print(x)\n...\n>>> x = \"global\"\n>>> f()\ng\n>>> print(x)\nglobal"]], "0_corpus": "wikipedia", "1_conv_id": "34c26a7f9cd44667ac70f5e77587434c", "1_model_name": "intfloat/e5-mistral-7b-instruct", "1_prompt": "how do I unnest a list in python?", "1_output": [["how do I unnest a list in python?", "List comprehension\n\nA list comprehension is a syntactic construct available in some programming languages for creating a list based on existing lists. It follows the form of the mathematical set-builder notation (set comprehension) as distinct from the use of map and filter functions.\nOverview\nConsider the following example in mathematical set-builder notation.\nor often\nThis can be read, \" is the set of all numbers \"2 times \" SUCH THAT is an ELEMENT or MEMBER of the set of natural numbers (), AND squared is greater than .\"\nThe smallest natural number, x = 1, fails to satisfy the condition x2>3 (the condition 12>3 is false) so 2 ·1 is not included in S. The next natural number, 2, does satisfy the condition (22>3) as does every other natural number. Thus x consists of 2, 3, 4, 5... Since the set consists of all numbers \"2 times x\" it is given by S = {4, 6, 8, 10,...}. S is, in other words, the set of all even numbers greater than 2.\nIn this annotated version of the example:"]], "1_corpus": "wikipedia"}
85
{"tstamp": 1723718048.178, "task_type": "retrieval", "type": "leftvote", "models": ["", ""], "ip": "", "0_conv_id": "308bc65381f84595b0caefcdbbaaafb7", "0_model_name": "voyage-multilingual-2", "0_prompt": "who started the war in Iraq?", "0_output": [["who started the war in Iraq?", "Preventive war\n\nIraq War (2003–2011) The 2003 invasion of Iraq was framed primarily as a preemptive war by the George W. Bush administration, although President Bush also argued it was supported by Security Council Resolutions: \"Under Resolutions 678 and 687--both still in effect--the United States and our allies are authorized to use force in ridding Iraq of weapons of mass destruction.\" At the time, the US public and its allies were led to believe that Ba'athist Iraq might have restarted its nuclear weapons program or been \"cheating\" on its obligations to dispose of its large stockpile of chemical weapons dating from the Iran–Iraq War. Supporters of the war have argued it to be justified, as Iraq both harbored Islamic terrorist groups sharing a common hatred of Western countries and was suspected to be developing weapons of mass destruction (WMD). Iraq's history of noncompliance of international security matters and its history of both developing and using such weapons were factors in the public perception of Iraq's having weapons of mass destruction.\nIn support of an attack on Iraq, US President George W. Bush stated in an address to the UN General Assembly on September 12, 2002 that the Iraqi \"regime is a grave and gathering danger.\" However, despite extensive searches during the several years of occupation, the suspected weapons of mass destruction or weapons program infrastructure alleged by the Bush administration were not found to be functional or even known to most Iraqi leaders. Coalition forces instead found dispersed and sometimes-buried and partially dismantled stockpiles of abandoned and functionally expired chemical weapons. 
Some of the caches had been dangerously stored and were leaking, and many were then disposed of hastily and in secret, leading to secondary exposure from improper handling. Allegations of mismanagement and information suppression followed.\nCase for preventive nuclear war"]], "0_corpus": "wikipedia", "1_conv_id": "e5b83c7ca70d4a788eb52ea3307922f4", "1_model_name": "sentence-transformers/all-MiniLM-L6-v2", "1_prompt": "who started the war in Iraq?", "1_output": [["who started the war in Iraq?", "First Iraqi–Kurdish War\n\nThe First Iraqi–Kurdish War (Arabic: الحرب العراقية الكردية الأولى) also known as Aylul revolts () was a major event of the Iraqi–Kurdish conflict, lasting from 1961 until 1970. The struggle was led by Mustafa Barzani, in an attempt to establish an independent Kurdistan. Throughout the 1960s, the uprising escalated into a long war, which failed to resolve despite internal power changes in Iraq. During the war, 80% of the Iraqi army was engaged in combat with the Kurds. The war ended with a stalemate in 1970, resulting in between 75,000 to 105,000 casualties. A series of Iraqi–Kurdish negotiations followed the war in an attempt to resolve the conflict. The negotiations led to the Iraqi–Kurdish Autonomy Agreement of 1970.\nBackground\nAfter the military coup by Abdul Karim Qasim in 1958, Barzani was invited by Qasim to return from exile. As part of a deal arranged by Qasim and Barzani, Qasim promised to give the Kurds regional autonomy in return for Barzani's support for his policies. Meanwhile, during 1959–1960, Barzani became the head of the Kurdistan Democratic Party (KDP), which was granted legal status in 1960.\nWarfare\nBy early 1960, it became apparent that Qasim would not follow through with his promise of regional autonomy. As a result, the KDP began to agitate for regional autonomy. 
In the face of growing Kurdish dissent, as well as Barzani's personal power, Qasim began to incite the Barzanis historical enemies, the Bradost and Zebari tribes, which led to intertribal warfare throughout 1960 and early 1961."]], "1_corpus": "wikipedia"}
86
{"tstamp": 1723718100.6415, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "58ec6aebb8f145edae728ea5dff27722", "0_model_name": "intfloat/multilingual-e5-large-instruct", "0_prompt": "who came up with the theory of relativity?", "0_output": [["who came up with the theory of relativity?", "Theory of relativity\n\nThe theory transformed theoretical physics and astronomy during the 20th century, superseding a 200-year-old theory of mechanics created primarily by Isaac Newton. It introduced concepts including 4-dimensional spacetime as a unified entity of space and time, relativity of simultaneity, kinematic and gravitational time dilation, and length contraction. In the field of physics, relativity improved the science of elementary particles and their fundamental interactions, along with ushering in the nuclear age. With relativity, cosmology and astrophysics predicted extraordinary astronomical phenomena such as neutron stars, black holes, and gravitational waves.\nDevelopment and acceptance\nAlbert Einstein published the theory of special relativity in 1905, building on many theoretical results and empirical findings obtained by Albert A. Michelson, Hendrik Lorentz, Henri Poincaré and others. Max Planck, Hermann Minkowski and others did subsequent work.\nEinstein developed general relativity between 1907 and 1915, with contributions by many others after 1915. The final form of general relativity was published in 1916.\nThe term \"theory of relativity\" was based on the expression \"relative theory\" () used in 1906 by Planck, who emphasized how the theory uses the principle of relativity. 
In the discussion section of the same paper, Alfred Bucherer used for the first time the expression \"theory of relativity\" ()."]], "0_corpus": "wikipedia", "1_conv_id": "010cddc19e4d427a834fa0230878e2af", "1_model_name": "embed-english-v3.0", "1_prompt": "who came up with the theory of relativity?", "1_output": [["who came up with the theory of relativity?", "Theory of relativity\n\nThe theory transformed theoretical physics and astronomy during the 20th century, superseding a 200-year-old theory of mechanics created primarily by Isaac Newton. It introduced concepts including 4-dimensional spacetime as a unified entity of space and time, relativity of simultaneity, kinematic and gravitational time dilation, and length contraction. In the field of physics, relativity improved the science of elementary particles and their fundamental interactions, along with ushering in the nuclear age. With relativity, cosmology and astrophysics predicted extraordinary astronomical phenomena such as neutron stars, black holes, and gravitational waves.\nDevelopment and acceptance\nAlbert Einstein published the theory of special relativity in 1905, building on many theoretical results and empirical findings obtained by Albert A. Michelson, Hendrik Lorentz, Henri Poincaré and others. Max Planck, Hermann Minkowski and others did subsequent work.\nEinstein developed general relativity between 1907 and 1915, with contributions by many others after 1915. The final form of general relativity was published in 1916.\nThe term \"theory of relativity\" was based on the expression \"relative theory\" () used in 1906 by Planck, who emphasized how the theory uses the principle of relativity. In the discussion section of the same paper, Alfred Bucherer used for the first time the expression \"theory of relativity\" ()."]], "1_corpus": "wikipedia"}
87
+
{"tstamp": 1723725557.1694, "task_type": "retrieval", "type": "bothbadvote", "models": ["", ""], "ip": "", "0_conv_id": "c143db0a911448b39011cf8639cdeaa0", "0_model_name": "intfloat/e5-mistral-7b-instruct", "0_prompt": "what is cancer lymphodema", "0_output": [["what is cancer lymphodema", "Title: Microwave lymphedema assessment using deep learning with contour assisted backprojection\n\nAbstract: We present a method for early detection of lymphatic fluid accumulation in lymphedema patients based on microwave imaging of the limb volume across an air gap. The proposed algorithm uses contour information of the imaged limb surface to approximate the wave propagation velocity locally to solve the eikonal equation for implementing the adjoint imaging operator. This modified backprojection algorithm results in focused imagery close to the limb surface where lymphatic fluid accumulation presents itself. Next, a deep neural network based on U-Net architecture is employed to identify the location and extent of the lymphatic fluid. Simulation studies with various upper and lower arm profiles compare the focusing performance of the proposed contour assisted backprojection imaging with the baseline imaging approach that assumes homogeneous media. The empirical results of the simulation experiments show that the proposed imaging method significantly improves the ability of the deepnet model to identify the location and the volume of the excess fluid in the limb."]], "0_corpus": "arxiv", "1_conv_id": "34a8dc28e83e43fda8b5b8bd6fcbd97b", "1_model_name": "jinaai/jina-embeddings-v2-base-en", "1_prompt": "what is cancer lymphodema", "1_output": [["what is cancer lymphodema", "Title: Literal Movement Grammars\n\nAbstract: Literal movement grammars (LMGs) provide a general account of extraposition phenomena through an attribute mechanism allowing top-down displacement of syntactical information. 
LMGs provide a simple and efficient treatment of complex linguistic phenomena such as cross-serial dependencies in German and Dutch---separating the treatment of natural language into a parsing phase closely resembling traditional context-free treatment, and a disambiguation phase which can be carried out using matching, as opposed to full unification employed in most current grammar formalisms of linguistical relevance."]], "1_corpus": "arxiv"}
88
+
{"tstamp": 1723725611.3436, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "15df5739a8a94a79b276fd0c742b31c8", "0_model_name": "embed-english-v3.0", "0_prompt": "I am looking for a paper to help me deal with data constraints as I scale up my language model.", "0_output": [["I am looking for a paper to help me deal with data constraints as I scale up my language model.", "Title: Scaling Data-Constrained Language Models\n\nAbstract: The current trend of scaling language models involves increasing both parameter count and training dataset size. Extrapolating this trend suggests that training dataset size may soon be limited by the amount of text data available on the internet. Motivated by this limit, we investigate scaling language models in data-constrained regimes. Specifically, we run a large set of experiments varying the extent of data repetition and compute budget, ranging up to 900 billion training tokens and 9 billion parameter models. We find that with constrained data for a fixed compute budget, training with up to 4 epochs of repeated data yields negligible changes to loss compared to having unique data. However, with more repetition, the value of adding compute eventually decays to zero. We propose and empirically validate a scaling law for compute optimality that accounts for the decreasing value of repeated tokens and excess parameters. Finally, we experiment with approaches mitigating data scarcity, including augmenting the training dataset with code data or removing commonly used filters. 
Models and datasets from our 400 training runs are freely available at https://github.com/huggingface/datablations."]], "0_corpus": "arxiv", "1_conv_id": "d67244c71dff47dd8578f0ae183983d6", "1_model_name": "intfloat/multilingual-e5-large-instruct", "1_prompt": "I am looking for a paper to help me deal with data constraints as I scale up my language model.", "1_output": [["I am looking for a paper to help me deal with data constraints as I scale up my language model.", "Title: Scaling Data-Constrained Language Models\n\nAbstract: The current trend of scaling language models involves increasing both parameter count and training dataset size. Extrapolating this trend suggests that training dataset size may soon be limited by the amount of text data available on the internet. Motivated by this limit, we investigate scaling language models in data-constrained regimes. Specifically, we run a large set of experiments varying the extent of data repetition and compute budget, ranging up to 900 billion training tokens and 9 billion parameter models. We find that with constrained data for a fixed compute budget, training with up to 4 epochs of repeated data yields negligible changes to loss compared to having unique data. However, with more repetition, the value of adding compute eventually decays to zero. We propose and empirically validate a scaling law for compute optimality that accounts for the decreasing value of repeated tokens and excess parameters. Finally, we experiment with approaches mitigating data scarcity, including augmenting the training dataset with code data or removing commonly used filters. Models and datasets from our 400 training runs are freely available at https://github.com/huggingface/datablations."]], "1_corpus": "arxiv"}
data/retrieval_individual-f36c9f97-e795-4522-8eb7-7dee254b42c7.jsonl
CHANGED
@@ -393,3 +393,5 @@
393
{"tstamp": 1723718134.2953, "task_type": "retrieval", "type": "chat", "model": "BAAI/bge-large-en-v1.5", "gen_params": {}, "start": 1723718132.0582, "finish": 1723718134.2953, "ip": "", "conv_id": "fe0931a78c79424d90bd46ab97a599e0", "model_name": "BAAI/bge-large-en-v1.5", "prompt": "who is angela merkel?", "output": [["who is angela merkel?", "Angela Merkel\n\nAngela Dorothea Merkel (; ; born 17 July 1954) is a retired German politician who served as Chancellor of Germany from 2005 to 2021 and was the first woman to hold that office. She previously served as Leader of the Opposition from 2002 to 2005 and as Leader of the Christian Democratic Union from 2000 to 2018. During her chancellorship, Merkel was frequently referred to as the de facto leader of the European Union (EU) and the most powerful woman in the world. Beginning in 2016, she was often described as the leader of the free world.\nMerkel was born in Hamburg in West Germany. Her family moved to East Germany when she was an infant. Merkel obtained a doctorate in quantum chemistry in 1986 and worked as a research scientist until 1989. She then entered politics in the wake of the Revolutions of 1989, briefly serving as deputy spokeswoman for the first democratically elected government of East Germany led by Lothar de Maizière. Following German reunification in 1990, Merkel was elected to the Bundestag for the state of Mecklenburg-Vorpommern. As the protégée of chancellor Helmut Kohl, Merkel was appointed as Minister for Women and Youth in 1991, later becoming Minister for the Environment, Nature Conservation and Nuclear Safety in 1994. After the CDU lost the 1998 federal election, Merkel was elected general secretary of the party. She then became the party's first female leader, and the first female Leader of the Opposition, two years later."]], "corpus": "wikipedia"}
394
{"tstamp": 1723725495.6183, "task_type": "retrieval", "type": "chat", "model": "intfloat/e5-mistral-7b-instruct", "gen_params": {}, "start": 1723725495.3917, "finish": 1723725495.6183, "ip": "", "conv_id": "c143db0a911448b39011cf8639cdeaa0", "model_name": "intfloat/e5-mistral-7b-instruct", "prompt": "what is cancer lymphodema", "output": [["what is cancer lymphodema", "Title: Microwave lymphedema assessment using deep learning with contour assisted backprojection\n\nAbstract: We present a method for early detection of lymphatic fluid accumulation in lymphedema patients based on microwave imaging of the limb volume across an air gap. The proposed algorithm uses contour information of the imaged limb surface to approximate the wave propagation velocity locally to solve the eikonal equation for implementing the adjoint imaging operator. This modified backprojection algorithm results in focused imagery close to the limb surface where lymphatic fluid accumulation presents itself. Next, a deep neural network based on U-Net architecture is employed to identify the location and extent of the lymphatic fluid. Simulation studies with various upper and lower arm profiles compare the focusing performance of the proposed contour assisted backprojection imaging with the baseline imaging approach that assumes homogeneous media. The empirical results of the simulation experiments show that the proposed imaging method significantly improves the ability of the deepnet model to identify the location and the volume of the excess fluid in the limb."]], "corpus": "arxiv"}
395
{"tstamp": 1723725495.6183, "task_type": "retrieval", "type": "chat", "model": "jinaai/jina-embeddings-v2-base-en", "gen_params": {}, "start": 1723725495.3917, "finish": 1723725495.6183, "ip": "", "conv_id": "34a8dc28e83e43fda8b5b8bd6fcbd97b", "model_name": "jinaai/jina-embeddings-v2-base-en", "prompt": "what is cancer lymphodema", "output": [["what is cancer lymphodema", "Title: Literal Movement Grammars\n\nAbstract: Literal movement grammars (LMGs) provide a general account of extraposition phenomena through an attribute mechanism allowing top-down displacement of syntactical information. LMGs provide a simple and efficient treatment of complex linguistic phenomena such as cross-serial dependencies in German and Dutch---separating the treatment of natural language into a parsing phase closely resembling traditional context-free treatment, and a disambiguation phase which can be carried out using matching, as opposed to full unification employed in most current grammar formalisms of linguistical relevance."]], "corpus": "arxiv"}
|
|
|
|
|
|
393 |
{"tstamp": 1723718134.2953, "task_type": "retrieval", "type": "chat", "model": "BAAI/bge-large-en-v1.5", "gen_params": {}, "start": 1723718132.0582, "finish": 1723718134.2953, "ip": "", "conv_id": "fe0931a78c79424d90bd46ab97a599e0", "model_name": "BAAI/bge-large-en-v1.5", "prompt": "who is angela merkel?", "output": [["who is angela merkel?", "Angela Merkel\n\nAngela Dorothea Merkel (; ; born 17 July 1954) is a retired German politician who served as Chancellor of Germany from 2005 to 2021 and was the first woman to hold that office. She previously served as Leader of the Opposition from 2002 to 2005 and as Leader of the Christian Democratic Union from 2000 to 2018. During her chancellorship, Merkel was frequently referred to as the de facto leader of the European Union (EU) and the most powerful woman in the world. Beginning in 2016, she was often described as the leader of the free world.\nMerkel was born in Hamburg in West Germany. Her family moved to East Germany when she was an infant. Merkel obtained a doctorate in quantum chemistry in 1986 and worked as a research scientist until 1989. She then entered politics in the wake of the Revolutions of 1989, briefly serving as deputy spokeswoman for the first democratically elected government of East Germany led by Lothar de Maizière. Following German reunification in 1990, Merkel was elected to the Bundestag for the state of Mecklenburg-Vorpommern. As the protégée of chancellor Helmut Kohl, Merkel was appointed as Minister for Women and Youth in 1991, later becoming Minister for the Environment, Nature Conservation and Nuclear Safety in 1994. After the CDU lost the 1998 federal election, Merkel was elected general secretary of the party. She then became the party's first female leader, and the first female Leader of the Opposition, two years later."]], "corpus": "wikipedia"}
{"tstamp": 1723725495.6183, "task_type": "retrieval", "type": "chat", "model": "intfloat/e5-mistral-7b-instruct", "gen_params": {}, "start": 1723725495.3917, "finish": 1723725495.6183, "ip": "", "conv_id": "c143db0a911448b39011cf8639cdeaa0", "model_name": "intfloat/e5-mistral-7b-instruct", "prompt": "what is cancer lymphodema", "output": [["what is cancer lymphodema", "Title: Microwave lymphedema assessment using deep learning with contour assisted backprojection\n\nAbstract: We present a method for early detection of lymphatic fluid accumulation in lymphedema patients based on microwave imaging of the limb volume across an air gap. The proposed algorithm uses contour information of the imaged limb surface to approximate the wave propagation velocity locally to solve the eikonal equation for implementing the adjoint imaging operator. This modified backprojection algorithm results in focused imagery close to the limb surface where lymphatic fluid accumulation presents itself. Next, a deep neural network based on U-Net architecture is employed to identify the location and extent of the lymphatic fluid. Simulation studies with various upper and lower arm profiles compare the focusing performance of the proposed contour assisted backprojection imaging with the baseline imaging approach that assumes homogeneous media. The empirical results of the simulation experiments show that the proposed imaging method significantly improves the ability of the deepnet model to identify the location and the volume of the excess fluid in the limb."]], "corpus": "arxiv"}
{"tstamp": 1723725495.6183, "task_type": "retrieval", "type": "chat", "model": "jinaai/jina-embeddings-v2-base-en", "gen_params": {}, "start": 1723725495.3917, "finish": 1723725495.6183, "ip": "", "conv_id": "34a8dc28e83e43fda8b5b8bd6fcbd97b", "model_name": "jinaai/jina-embeddings-v2-base-en", "prompt": "what is cancer lymphodema", "output": [["what is cancer lymphodema", "Title: Literal Movement Grammars\n\nAbstract: Literal movement grammars (LMGs) provide a general account of extraposition phenomena through an attribute mechanism allowing top-down displacement of syntactical information. LMGs provide a simple and efficient treatment of complex linguistic phenomena such as cross-serial dependencies in German and Dutch---separating the treatment of natural language into a parsing phase closely resembling traditional context-free treatment, and a disambiguation phase which can be carried out using matching, as opposed to full unification employed in most current grammar formalisms of linguistical relevance."]], "corpus": "arxiv"}
{"tstamp": 1723725599.3534, "task_type": "retrieval", "type": "chat", "model": "embed-english-v3.0", "gen_params": {}, "start": 1723725597.1153, "finish": 1723725599.3534, "ip": "", "conv_id": "15df5739a8a94a79b276fd0c742b31c8", "model_name": "embed-english-v3.0", "prompt": "I am looking for a paper to help me deal with data constraints as I scale up my language model.", "output": [["I am looking for a paper to help me deal with data constraints as I scale up my language model.", "Title: Scaling Data-Constrained Language Models\n\nAbstract: The current trend of scaling language models involves increasing both parameter count and training dataset size. Extrapolating this trend suggests that training dataset size may soon be limited by the amount of text data available on the internet. Motivated by this limit, we investigate scaling language models in data-constrained regimes. Specifically, we run a large set of experiments varying the extent of data repetition and compute budget, ranging up to 900 billion training tokens and 9 billion parameter models. We find that with constrained data for a fixed compute budget, training with up to 4 epochs of repeated data yields negligible changes to loss compared to having unique data. However, with more repetition, the value of adding compute eventually decays to zero. We propose and empirically validate a scaling law for compute optimality that accounts for the decreasing value of repeated tokens and excess parameters. Finally, we experiment with approaches mitigating data scarcity, including augmenting the training dataset with code data or removing commonly used filters. Models and datasets from our 400 training runs are freely available at https://github.com/huggingface/datablations."]], "corpus": "arxiv"}
{"tstamp": 1723725599.3534, "task_type": "retrieval", "type": "chat", "model": "intfloat/multilingual-e5-large-instruct", "gen_params": {}, "start": 1723725597.1153, "finish": 1723725599.3534, "ip": "", "conv_id": "d67244c71dff47dd8578f0ae183983d6", "model_name": "intfloat/multilingual-e5-large-instruct", "prompt": "I am looking for a paper to help me deal with data constraints as I scale up my language model.", "output": [["I am looking for a paper to help me deal with data constraints as I scale up my language model.", "Title: Scaling Data-Constrained Language Models\n\nAbstract: The current trend of scaling language models involves increasing both parameter count and training dataset size. Extrapolating this trend suggests that training dataset size may soon be limited by the amount of text data available on the internet. Motivated by this limit, we investigate scaling language models in data-constrained regimes. Specifically, we run a large set of experiments varying the extent of data repetition and compute budget, ranging up to 900 billion training tokens and 9 billion parameter models. We find that with constrained data for a fixed compute budget, training with up to 4 epochs of repeated data yields negligible changes to loss compared to having unique data. However, with more repetition, the value of adding compute eventually decays to zero. We propose and empirically validate a scaling law for compute optimality that accounts for the decreasing value of repeated tokens and excess parameters. Finally, we experiment with approaches mitigating data scarcity, including augmenting the training dataset with code data or removing commonly used filters. Models and datasets from our 400 training runs are freely available at https://github.com/huggingface/datablations."]], "corpus": "arxiv"}