Scheduled Commit
data/retrieval_individual-fc2e4671-7e20-44ad-a373-899a077f1401.jsonl
CHANGED
@@ -135,3 +135,4 @@
135 |
{"tstamp": 1726530610.2837, "task_type": "retrieval", "type": "chat", "model": "voyage-multilingual-2", "gen_params": {}, "start": 1726530609.804, "finish": 1726530610.2837, "ip": "", "conv_id": "7342e55f4864430084a5aa6712dc4379", "model_name": "voyage-multilingual-2", "prompt": "who created GPT?", "output": [["who created GPT?", "GPT-1\n\nGenerative Pre-trained Transformer 1 (GPT-1) was the first of OpenAI's large language models following Google's invention of the transformer architecture in 2017. In June 2018, OpenAI released a paper entitled \"Improving Language Understanding by Generative Pre-Training\", in which they introduced that initial model along with the general concept of a generative pre-trained transformer."]], "corpus": "wikipedia"}
136 |
{"tstamp": 1726530610.2837, "task_type": "retrieval", "type": "chat", "model": "intfloat/e5-mistral-7b-instruct", "gen_params": {}, "start": 1726530609.804, "finish": 1726530610.2837, "ip": "", "conv_id": "8e96268f9bb248ce9b5acd84206204f2", "model_name": "intfloat/e5-mistral-7b-instruct", "prompt": "who created GPT?", "output": [["who created GPT?", "Generative pre-trained transformer\n\nGenerative pre-trained transformers (GPT) are a type of large language model (LLM) and a prominent framework for generative artificial intelligence. They are artificial neural networks that are used in natural language processing tasks. GPTs are based on the transformer architecture, pre-trained on large data sets of unlabelled text, and able to generate novel human-like content. As of 2023, most LLMs have these characteristics and are sometimes referred to broadly as GPTs.\nThe first GPT was introduced in 2018 by OpenAI. OpenAI has released very influential GPT foundation models that have been sequentially numbered, to comprise its \"GPT-n\" series. Each of these was significantly more capable than the previous, due to increased size (number of trainable parameters) and training. The most recent of these, GPT-4, was released in March 2023. Such models have been the basis for their more task-specific GPT systems, including models fine-tuned for instruction following, which in turn power the ChatGPT chatbot service."]], "corpus": "wikipedia"}
137 |
{"tstamp": 1726535276.1287, "task_type": "retrieval", "type": "chat", "model": "text-embedding-3-large", "gen_params": {}, "start": 1726535273.1231, "finish": 1726535276.1287, "ip": "", "conv_id": "ccff8e48a5304f6f883b16eded20e3b6", "model_name": "text-embedding-3-large", "prompt": "hello there", "output": [["hello there", "Title: The distributed Language Hello White Paper\n\nAbstract: Hello is a general-purpose, object-oriented, protocol-agnostic distributed programming language. This paper explains the ideas that guided design of Hello. It shows the spirit of Hello using two brief expressive programs and provides a summary of language features. In addition, it explores historical parallels between the binary programming of early computers and the distributed programming of modern networks."]], "corpus": "arxiv"}
138 |
+ {"tstamp": 1726535319.6188, "task_type": "retrieval", "type": "chat", "model": "text-embedding-3-large", "gen_params": {}, "start": 1726535316.801, "finish": 1726535319.6188, "ip": "", "conv_id": "966cda5dbd6a4d26aa82c2244682a8fd", "model_name": "text-embedding-3-large", "prompt": "Which test was devised to determine whether robots can think?", "output": [["Which test was devised to determine whether robots can think?", "Turing test\n\nThe test was introduced by Turing in his 1950 paper \"Computing Machinery and Intelligence\" while working at the University of Manchester. It opens with the words: \"I propose to consider the question, 'Can machines think?'\" Because \"thinking\" is difficult to define, Turing chooses to \"replace the question by another, which is closely related to it and is expressed in relatively unambiguous words.\" Turing describes the new form of the problem in terms of a three-person game called the \"imitation game\", in which an interrogator asks questions of a man and a woman in another room in order to determine the correct sex of the two players. Turing's new question is: \"Are there imaginable digital computers which would do well in the imitation game?\" This question, Turing believed, was one that could actually be answered. In the remainder of the paper, he argued against all the major objections to the proposition that \"machines can think\".\nSince Turing introduced his test, it has been both highly influential and widely criticized, and has become an important concept in the philosophy of artificial intelligence. Philosopher John Searle would comment on the Turing test in his Chinese room argument, a thought experiment that stipulates that a machine cannot have a \"mind\", \"understanding\", or \"consciousness\", regardless of how intelligently or human-like the program may make the computer behave. Searle criticizes Turing's test and claims it is insufficient to detect the presence of consciousness.\nHistory"]], "corpus": "wikipedia"}
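Each line appended to data/retrieval_individual-fc2e4671-7e20-44ad-a373-899a077f1401.jsonl is a self-contained JSON object. As a minimal sketch (using only field names visible in the records above; the inlined sample line is abridged for illustration), one record can be parsed like this:

```python
import json

# Abridged sample of one JSONL record from this log; "output" holds
# [query, retrieved_passage] pairs as shown in the diff above.
line = (
    '{"tstamp": 1726535276.1287, "task_type": "retrieval", "type": "chat", '
    '"model": "text-embedding-3-large", "prompt": "hello there", '
    '"output": [["hello there", "Title: The distributed Language Hello White Paper"]], '
    '"corpus": "arxiv"}'
)

record = json.loads(line)          # one JSON object per line
query, passage = record["output"][0]  # first (query, passage) pair
print(record["model"], record["corpus"], query)
```

In practice one would iterate over the file line by line, calling `json.loads` on each non-empty line.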