Scheduled Commit
data/retrieval_individual-839a5845-da8b-404c-8647-ecbee5eb2a3c.jsonl
CHANGED
@@ -88,3 +88,5 @@
88 |
{"tstamp": 1725340436.6889, "task_type": "retrieval", "type": "chat", "model": "nomic-ai/nomic-embed-text-v1.5", "gen_params": {}, "start": 1725340436.49, "finish": 1725340436.6889, "ip": "", "conv_id": "d78f60ab585540fa90402b3cc668dfb8", "model_name": "nomic-ai/nomic-embed-text-v1.5", "prompt": "Pizza debate, which is better", "output": [["Pizza debate, which is better", "Pizza\n\nThe term pizza was first recorded in the year 997 AD, in a Latin manuscript from the southern Italian town of Gaeta, in Lazio, on the border with Campania. Raffaele Esposito is often credited for creating modern pizza in Naples. In 2009, Neapolitan pizza was registered with the European Union as a traditional speciality guaranteed (TSG) dish. In 2017, the art of making Neapolitan pizza was added to UNESCO's list of intangible cultural heritage.\nPizza and its variants are among the most popular foods in the world. Pizza is sold at a variety of restaurants, including pizzerias (pizza specialty restaurants), Mediterranean restaurants, via delivery, and as street food. In Italy, pizza served in a restaurant is presented unsliced, and is eaten with the use of a knife and fork. In casual settings, however, it is typically cut into slices to be eaten while held in the hand. Pizza is also sold in grocery stores in a variety of forms, including frozen or as kits for self-assembly. They are then cooked using a home oven.\nIn 2017, the world pizza market was US$128 billion, and in the US it was $44 billion spread over 76,000 pizzerias. Overall, 13% of the U.S. population aged two years and over consumed pizza on any given day.\nEtymology\nThe oldest recorded usage of the word pizza is from a Latin text from the town of Gaeta, then still part of the Byzantine Empire, in 997 AD; the text states that a tenant of certain property is to give the bishop of Gaeta duodecim pizze () every Christmas Day, and another twelve every Easter Sunday."]], "corpus": "wikipedia"}
89 |
{"tstamp": 1725340488.219, "task_type": "retrieval", "type": "chat", "model": "Alibaba-NLP/gte-Qwen2-7B-instruct", "gen_params": {}, "start": 1725340488.0202, "finish": 1725340488.219, "ip": "", "conv_id": "b1959c714e0a40c0a940f2ce9bf4e7c8", "model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "prompt": "Who was the founder of neural networks?", "output": [["Who was the founder of neural networks?", "Neural network (machine learning)\n\nDuring the AI Winter era, however, research outside the United States continued, especially in Eastern Europe. By the time Minsky and Papert's book on Perceptrons came out, methods for training multilayer perceptrons (MLPs) were already known. The first deep learning MLP was published by Alexey Grigorevich Ivakhnenko and Valentin Lapa in 1965, as the Group Method of Data Handling. The first deep learning MLP trained by stochastic gradient descent was published in 1967 by Shun'ichi Amari. In computer experiments conducted by Amari's student Saito, a five layer MLP with two modifiable layers learned useful internal representations to classify non-linearily separable pattern classes.\nSelf-organizing maps (SOMs) were described by Teuvo Kohonen in 1982. SOMs are neurophysiologically inspired neural networks that learn low-dimensional representations of high-dimensional data while preserving the topological structure of the data. They are trained using competitive learning.\nThe convolutional neural network (CNN) architecture with convolutional layers and downsampling layers was introduced by Kunihiko Fukushima in 1980. He called it the neocognitron. In 1969, he also introduced the ReLU (rectified linear unit) activation function. The rectifier has become the most popular activation function for CNNs and deep neural networks in general. CNNs have become an essential tool for computer vision.\nA key in later advances in artificial neural network research was the backpropagation algorithm, an efficient application of the Leibniz chain rule (1673) to networks of differentiable nodes. It is also known as"]], "corpus": "wikipedia"}
90 |
{"tstamp": 1725340488.219, "task_type": "retrieval", "type": "chat", "model": "nomic-ai/nomic-embed-text-v1.5", "gen_params": {}, "start": 1725340488.0202, "finish": 1725340488.219, "ip": "", "conv_id": "dbc902ebe84f4d6d9dad5d3a063e60d7", "model_name": "nomic-ai/nomic-embed-text-v1.5", "prompt": "Who was the founder of neural networks?", "output": [["Who was the founder of neural networks?", "Neural network (machine learning)\n\nIn the late 1940s, D. O. Hebb created a learning hypothesis based on the mechanism of neural plasticity that became known as Hebbian learning. Hebbian learning is considered to be a 'typical' unsupervised learning rule and its later variants were early models for long term potentiation. These ideas started being applied to computational models in 1948 with Turing's \"unorganized machines\". Farley and Wesley A. Clark were the first to simulate a Hebbian network in 1954 at MIT. They used computational machines, then called \"calculators\". Other neural network computational machines were created by Rochester, Holland, Habit, and Duda in 1956. In 1958, psychologist Frank Rosenblatt invented the perceptron, the first implemented artificial neural network, funded by the United States Office of Naval Research.\nThe invention of the perceptron raised public excitement for research in Artificial Neural Networks, causing the US government to drastically increase funding into deep learning research. This led to \"the golden age of AI\" fueled by the optimistic claims made by computer scientists regarding the ability of perceptrons to emulate human intelligence. For example, in 1957 Herbert Simon famously said:However, this wasn't the case as research stagnated in the United States following the work of Minsky and Papert (1969), who discovered that basic perceptrons were incapable of processing the exclusive-or circuit and that computers lacked sufficient power to train useful neural networks. This, along with other factors such as the 1973 Lighthill report by James Lighthill stating that research in Artificial Intelligence has not \"produced the major impact that was then promised,\" shutting funding in research into the field of AI in all but two universities in the UK and in many major institutions across the world. This ushered an era called the AI Winter with reduced research into connectionism due to a decrease in government funding and an increased stress on symbolic artificial intelligence in the United States and other Western countries."]], "corpus": "wikipedia"}
91 | +
{"tstamp": 1725340588.9232, "task_type": "retrieval", "type": "chat", "model": "Alibaba-NLP/gte-Qwen2-7B-instruct", "gen_params": {}, "start": 1725340588.736, "finish": 1725340588.9232, "ip": "", "conv_id": "5043186b68a048379d14d1ce35cdd94e", "model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "prompt": "Who invented GIT for code versioning", "output": [["Who invented GIT for code versioning", "Git\n\nThe development of Git began on 3 April 2005. Torvalds announced the project on 6 April and became self-hosting the next day. The first merge of multiple branches took place on 18 April. Torvalds achieved his performance goals; on 29 April, the nascent Git was benchmarked recording patches to the Linux kernel tree at a rate of 6.7 patches per second. On 16 June, Git managed the kernel 2.6.12 release.\nTorvalds turned over maintenance on 26 July 2005 to Junio Hamano, a major contributor to the project. Hamano was responsible for the 1.0 release on 21 December 2005.\nNaming\nTorvalds sarcastically quipped about the name git (which means \"unpleasant person\" in British English slang): \"I'm an egotistical bastard, and I name all my projects after myself. First 'Linux', now 'git'.\" The man page describes Git as \"the stupid content tracker\".\nThe read-me file of the source code elaborates further:\nThe source code for Git refers to the program as \"the information manager from hell\".\nCharacteristics\nDesign\nGit's design is a synthesis of Torvalds's experience with Linux in maintaining a large distributed development project, along with his intimate knowledge of file-system performance gained from the same project and the urgent need to produce a working system in short order. These influences led to the following implementation choices:"]], "corpus": "wikipedia"}
92 | +
{"tstamp": 1725340588.9232, "task_type": "retrieval", "type": "chat", "model": "nomic-ai/nomic-embed-text-v1.5", "gen_params": {}, "start": 1725340588.736, "finish": 1725340588.9232, "ip": "", "conv_id": "f641986d2a7740eb886e297ce6ec14e1", "model_name": "nomic-ai/nomic-embed-text-v1.5", "prompt": "Who invented GIT for code versioning", "output": [["Who invented GIT for code versioning", "Git\n\nGit () is a distributed version control system that tracks versions of files. It is often used to control source code by programmers collaboratively developing software.\nDesign goals of Git include speed, data integrity, and support for distributed, non-linear workflows thousands of parallel branches running on different computers.\nGit was created for use in the development of the Linux kernel by Linus Torvalds and others developing the kernel."]], "corpus": "wikipedia"}
data/retrieval_side_by_side-839a5845-da8b-404c-8647-ecbee5eb2a3c.jsonl
CHANGED
@@ -28,3 +28,4 @@
28 |
{"tstamp": 1725340411.8871, "task_type": "retrieval", "type": "leftvote", "models": ["Alibaba-NLP/gte-Qwen2-7B-instruct", "nomic-ai/nomic-embed-text-v1.5"], "ip": "", "0_conv_id": "41d09b6bb9994589a19cfe4258581aa4", "0_model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "0_prompt": "Story about a news paper journalist who is hearing about a vampires life", "0_output": [["Story about a news paper journalist who is hearing about a vampires life", "Vampire film\n\nThere is, however, a very small subgenre, pioneered in Murnau's seminal Nosferatu (1922) in which the portrayal of the vampire is similar to the hideous creature of European folklore. Max Schreck's portrayal of this role in Murnau's film was copied by Klaus Kinski in Werner Herzog's remake Nosferatu the Vampyre (1979). In Shadow of the Vampire (2000) (directed by E. Elias Merhige) Willem Dafoe plays Max Schreck, himself, though portrayed here as an actual vampire. Stephen King's Salem's Lot (1979) notably depicts vampires as terrifying, simple-minded creatures, without eroticism, and with the only desire to feed on the blood of others. The main vampire in the Subspecies films, Radu, also exhibits similar aesthetic influences, such as long fingers and nails and generally grotesque facial features. This type of vampire is also featured in the film 30 Days of Night. The 2011 remake of Fright Night is notable for such a hideous depiction of the vampire when manifesting.\nA major character in most vampire films is the vampire hunter, of which Stoker's Abraham Van Helsing is a prototype. Peter Vincent (Roddy McDowell) in Fright Night (1985) and the Frog brothers in The Lost Boys (1987) were all vampire hunters. However, killing vampires has changed. Where Van Helsing relied on a stake through the heart, in Vampires (1998), directed by John Carpenter, Jack Crow (James Woods) has a heavily armed squad of vampire hunters and in Buffy the Vampire Slayer (1992, directed by Fran Rubel Kuzui), writer Joss Whedon (who created TV's Buffy the Vampire Slayer and spin-off Angel) attached the Slayer, Buffy Summers (Kristy Swanson in the film, Sarah Michelle Gellar in the TV series), to a network of Watchers and mystically endowed her with superhuman powers.\nDracula in films and his legacy"]], "0_corpus": "wikipedia", "1_conv_id": "62e0b758c03243f79366a4152606e3e1", "1_model_name": "nomic-ai/nomic-embed-text-v1.5", "1_prompt": "Story about a news paper journalist who is hearing about a vampires life", "1_output": [["Story about a news paper journalist who is hearing about a vampires life", "Vampire hunting\n\nVampire hunter or vampire slayer is a fictional occupation in folklore and fiction which specializes in finding vampires, and sometimes other supernatural creatures. A vampire hunter is usually described as having extensive knowledge of vampires and other monstrous or undead creatures, including their powers and weaknesses, and uses this knowledge to effectively combat them.\nVampire hunters range in characterization in fiction from sages with more than average knowledge about the occult, to athletes with the skill and ability to confront vampires with traditional or holy weapons often including wooden stakes and holy water, to supernatural beings themselves who fight vampires with mystical powers. 
Many characterizations draw from the history and folklore of the Balkans.\nA well known and influential archetypal vampire hunter is Professor Abraham Van Helsing, a character in Bram Stoker's 1897 horror novel, Dracula, a foundational work in the genre."]], "1_corpus": "wikipedia"}
29 |
{"tstamp": 1725340457.3679, "task_type": "retrieval", "type": "leftvote", "models": ["Alibaba-NLP/gte-Qwen2-7B-instruct", "nomic-ai/nomic-embed-text-v1.5"], "ip": "", "0_conv_id": "d98739ff69704a5faa60910f9913b4c3", "0_model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "0_prompt": "Pizza debate, which is better", "0_output": [["Pizza debate, which is better", "Pizza\n\nA popular variant of pizza in Italy is Sicilian pizza (locally called sfincione or sfinciuni), a thick-crust or deep-dish pizza originating during the 17th century in Sicily: it is essentially a focaccia that is typically topped with tomato sauce and other ingredients. Until the 1860s, sfincione was the type of pizza usually consumed in Sicily, especially in the Western portion of the island. Other variations of pizzas are also found in other regions of Italy, for example pizza al padellino or pizza al tegamino, a small-sized, thick-crust, deep-dish pizza typically served in Turin, Piedmont.\nUnited States\nThe first pizzeria in the U.S. was opened in New York City's Little Italy in 1905. Common toppings for pizza in the United States include anchovies, ground beef, chicken, ham, mushrooms, olives, onions, peppers, pepperoni, salami, sausage, spinach, steak, and tomatoes. Distinct regional types developed in the 20th century, including Buffalo, California, Chicago, Detroit, Greek, New Haven, New York, and St. Louis styles. These regional variations include deep-dish, stuffed, pockets, turnovers, rolled, and pizza-on-a-stick, each with seemingly limitless combinations of sauce and toppings.\nThirteen percent of the United States population consumes pizza on any given day. Pizza chains such as Domino's Pizza, Pizza Hut, and Papa John's, pizzas from take and bake pizzerias, and chilled or frozen pizzas from supermarkets make pizza readily available nationwide."]], "0_corpus": "wikipedia", "1_conv_id": "d78f60ab585540fa90402b3cc668dfb8", "1_model_name": "nomic-ai/nomic-embed-text-v1.5", "1_prompt": "Pizza debate, which is better", "1_output": [["Pizza debate, which is better", "Pizza\n\nThe term pizza was first recorded in the year 997 AD, in a Latin manuscript from the southern Italian town of Gaeta, in Lazio, on the border with Campania. Raffaele Esposito is often credited for creating modern pizza in Naples. In 2009, Neapolitan pizza was registered with the European Union as a traditional speciality guaranteed (TSG) dish. In 2017, the art of making Neapolitan pizza was added to UNESCO's list of intangible cultural heritage.\nPizza and its variants are among the most popular foods in the world. Pizza is sold at a variety of restaurants, including pizzerias (pizza specialty restaurants), Mediterranean restaurants, via delivery, and as street food. In Italy, pizza served in a restaurant is presented unsliced, and is eaten with the use of a knife and fork. In casual settings, however, it is typically cut into slices to be eaten while held in the hand. Pizza is also sold in grocery stores in a variety of forms, including frozen or as kits for self-assembly. They are then cooked using a home oven.\nIn 2017, the world pizza market was US$128 billion, and in the US it was $44 billion spread over 76,000 pizzerias. Overall, 13% of the U.S. 
population aged two years and over consumed pizza on any given day.\nEtymology\nThe oldest recorded usage of the word pizza is from a Latin text from the town of Gaeta, then still part of the Byzantine Empire, in 997 AD; the text states that a tenant of certain property is to give the bishop of Gaeta duodecim pizze () every Christmas Day, and another twelve every Easter Sunday."]], "1_corpus": "wikipedia"}
30 |
{"tstamp": 1725340562.8358, "task_type": "retrieval", "type": "rightvote", "models": ["Alibaba-NLP/gte-Qwen2-7B-instruct", "nomic-ai/nomic-embed-text-v1.5"], "ip": "", "0_conv_id": "b1959c714e0a40c0a940f2ce9bf4e7c8", "0_model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "0_prompt": "Who was the founder of neural networks?", "0_output": [["Who was the founder of neural networks?", "Neural network (machine learning)\n\nDuring the AI Winter era, however, research outside the United States continued, especially in Eastern Europe. By the time Minsky and Papert's book on Perceptrons came out, methods for training multilayer perceptrons (MLPs) were already known. The first deep learning MLP was published by Alexey Grigorevich Ivakhnenko and Valentin Lapa in 1965, as the Group Method of Data Handling. The first deep learning MLP trained by stochastic gradient descent was published in 1967 by Shun'ichi Amari. In computer experiments conducted by Amari's student Saito, a five layer MLP with two modifiable layers learned useful internal representations to classify non-linearily separable pattern classes.\nSelf-organizing maps (SOMs) were described by Teuvo Kohonen in 1982. SOMs are neurophysiologically inspired neural networks that learn low-dimensional representations of high-dimensional data while preserving the topological structure of the data. They are trained using competitive learning.\nThe convolutional neural network (CNN) architecture with convolutional layers and downsampling layers was introduced by Kunihiko Fukushima in 1980. He called it the neocognitron. In 1969, he also introduced the ReLU (rectified linear unit) activation function. The rectifier has become the most popular activation function for CNNs and deep neural networks in general. CNNs have become an essential tool for computer vision.\nA key in later advances in artificial neural network research was the backpropagation algorithm, an efficient application of the Leibniz chain rule (1673) to networks of differentiable nodes. It is also known as"]], "0_corpus": "wikipedia", "1_conv_id": "dbc902ebe84f4d6d9dad5d3a063e60d7", "1_model_name": "nomic-ai/nomic-embed-text-v1.5", "1_prompt": "Who was the founder of neural networks?", "1_output": [["Who was the founder of neural networks?", "Neural network (machine learning)\n\nIn the late 1940s, D. O. Hebb created a learning hypothesis based on the mechanism of neural plasticity that became known as Hebbian learning. Hebbian learning is considered to be a 'typical' unsupervised learning rule and its later variants were early models for long term potentiation. These ideas started being applied to computational models in 1948 with Turing's \"unorganized machines\". Farley and Wesley A. Clark were the first to simulate a Hebbian network in 1954 at MIT. They used computational machines, then called \"calculators\". Other neural network computational machines were created by Rochester, Holland, Habit, and Duda in 1956. In 1958, psychologist Frank Rosenblatt invented the perceptron, the first implemented artificial neural network, funded by the United States Office of Naval Research.\nThe invention of the perceptron raised public excitement for research in Artificial Neural Networks, causing the US government to drastically increase funding into deep learning research. This led to \"the golden age of AI\" fueled by the optimistic claims made by computer scientists regarding the ability of perceptrons to emulate human intelligence. 
For example, in 1957 Herbert Simon famously said:However, this wasn't the case as research stagnated in the United States following the work of Minsky and Papert (1969), who discovered that basic perceptrons were incapable of processing the exclusive-or circuit and that computers lacked sufficient power to train useful neural networks. This, along with other factors such as the 1973 Lighthill report by James Lighthill stating that research in Artificial Intelligence has not \"produced the major impact that was then promised,\" shutting funding in research into the field of AI in all but two universities in the UK and in many major institutions across the world. This ushered an era called the AI Winter with reduced research into connectionism due to a decrease in government funding and an increased stress on symbolic artificial intelligence in the United States and other Western countries."]], "1_corpus": "wikipedia"}
31 | +
{"tstamp": 1725340622.9094, "task_type": "retrieval", "type": "tievote", "models": ["Alibaba-NLP/gte-Qwen2-7B-instruct", "nomic-ai/nomic-embed-text-v1.5"], "ip": "", "0_conv_id": "5043186b68a048379d14d1ce35cdd94e", "0_model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "0_prompt": "Who invented GIT for code versioning", "0_output": [["Who invented GIT for code versioning", "Git\n\nThe development of Git began on 3 April 2005. Torvalds announced the project on 6 April and became self-hosting the next day. The first merge of multiple branches took place on 18 April. Torvalds achieved his performance goals; on 29 April, the nascent Git was benchmarked recording patches to the Linux kernel tree at a rate of 6.7 patches per second. On 16 June, Git managed the kernel 2.6.12 release.\nTorvalds turned over maintenance on 26 July 2005 to Junio Hamano, a major contributor to the project. Hamano was responsible for the 1.0 release on 21 December 2005.\nNaming\nTorvalds sarcastically quipped about the name git (which means \"unpleasant person\" in British English slang): \"I'm an egotistical bastard, and I name all my projects after myself. First 'Linux', now 'git'.\" The man page describes Git as \"the stupid content tracker\".\nThe read-me file of the source code elaborates further:\nThe source code for Git refers to the program as \"the information manager from hell\".\nCharacteristics\nDesign\nGit's design is a synthesis of Torvalds's experience with Linux in maintaining a large distributed development project, along with his intimate knowledge of file-system performance gained from the same project and the urgent need to produce a working system in short order. These influences led to the following implementation choices:"]], "0_corpus": "wikipedia", "1_conv_id": "f641986d2a7740eb886e297ce6ec14e1", "1_model_name": "nomic-ai/nomic-embed-text-v1.5", "1_prompt": "Who invented GIT for code versioning", "1_output": [["Who invented GIT for code versioning", "Git\n\nGit () is a distributed version control system that tracks versions of files. It is often used to control source code by programmers collaboratively developing software.\nDesign goals of Git include speed, data integrity, and support for distributed, non-linear workflows thousands of parallel branches running on different computers.\nGit was created for use in the development of the Linux kernel by Linus Torvalds and others developing the kernel."]], "1_corpus": "wikipedia"}