Muennighoff committed

Commit d38586c · verified · parent: cde47d6

Scheduled Commit
data/retrieval_individual-dc61b353-c2a4-4719-875a-064e75e0f6f7.jsonl CHANGED
@@ -85,3 +85,5 @@
85
  {"tstamp": 1736937299.5154, "task_type": "retrieval", "type": "chat", "model": "text-embedding-004", "gen_params": {}, "start": 1736937299.1113, "finish": 1736937299.5154, "ip": "", "conv_id": "f5e65b9241ed4559a1650dbdd28b7322", "model_name": "text-embedding-004", "prompt": "Amount of protein per KG of bodyweight for optimal muscle growth", "output": [["Amount of protein per KG of bodyweight for optimal muscle growth", "Muscle hypertrophy\n\nIt is not uncommon for bodybuilders to advise a protein intake as high as 2–4 g per kilogram of bodyweight per day. However, scientific literature has suggested this is higher than necessary, as protein intakes greater than 1.8 g per kilogram of body weight showed to have no greater effect on muscle hypertrophy. A study carried out by American College of Sports Medicine (2002) put the recommended daily protein intake for athletes at 1.2–1.8 g per kilogram of body weight. Conversely, Di Pasquale (2008), citing recent studies, recommends a minimum protein intake of 2.2 g/kg \"for anyone involved in competitive or intense recreational sports who wants to maximize lean body mass but does not wish to gain weight. However athletes involved in strength events (..) may need even more to maximize body composition and athletic performance. In those attempting to minimize body fat and thus maximize body composition, for example in sports with weight classes and in bodybuilding, it's possible that protein may well make up over 50% of their daily caloric intake.\"\nMicrotrauma\nMicrotrauma is tiny damage to the muscle fibers. The precise relation between microtrauma and muscle growth is not entirely understood yet.\nOne theory is that microtrauma plays a significant role in muscle growth. When microtrauma occurs (from weight training or other strenuous activities), the body responds by overcompensating, replacing the damaged tissue and adding more, so that the risk of repeat damage is reduced. Damage to these fibers has been theorized as the possible cause for the symptoms of delayed onset muscle soreness (DOMS), and is why progressive overload is essential to continued improvement, as the body adapts and becomes more resistant to stress."]], "corpus": "wikipedia"}
86
  {"tstamp": 1736940116.8687, "task_type": "retrieval", "type": "chat", "model": "BAAI/bge-large-en-v1.5", "gen_params": {}, "start": 1736940116.6571, "finish": 1736940116.8687, "ip": "", "conv_id": "bc058c3d9ae146539e2b9eefe9283604", "model_name": "BAAI/bge-large-en-v1.5", "prompt": "I am looking for a paper to help me deal with data constraints as I scale up my language model.", "output": [["I am looking for a paper to help me deal with data constraints as I scale up my language model.", "Title: Scaling Data-Constrained Language Models\n\nAbstract: The current trend of scaling language models involves increasing both parameter count and training dataset size. Extrapolating this trend suggests that training dataset size may soon be limited by the amount of text data available on the internet. Motivated by this limit, we investigate scaling language models in data-constrained regimes. Specifically, we run a large set of experiments varying the extent of data repetition and compute budget, ranging up to 900 billion training tokens and 9 billion parameter models. We find that with constrained data for a fixed compute budget, training with up to 4 epochs of repeated data yields negligible changes to loss compared to having unique data. However, with more repetition, the value of adding compute eventually decays to zero. We propose and empirically validate a scaling law for compute optimality that accounts for the decreasing value of repeated tokens and excess parameters. Finally, we experiment with approaches mitigating data scarcity, including augmenting the training dataset with code data or removing commonly used filters. Models and datasets from our 400 training runs are freely available at https://github.com/huggingface/datablations."]], "corpus": "arxiv"}
87
  {"tstamp": 1736940116.8687, "task_type": "retrieval", "type": "chat", "model": "Alibaba-NLP/gte-Qwen2-7B-instruct", "gen_params": {}, "start": 1736940116.6571, "finish": 1736940116.8687, "ip": "", "conv_id": "f4976e8c211f419680d6ce7d30b2c58a", "model_name": "Alibaba-NLP/gte-Qwen2-7B-instruct", "prompt": "I am looking for a paper to help me deal with data constraints as I scale up my language model.", "output": [["I am looking for a paper to help me deal with data constraints as I scale up my language model.", "Title: Scaling Laws for Neural Language Models\n\nAbstract: We study empirical scaling laws for language model performance on the cross-entropy loss. The loss scales as a power-law with model size, dataset size, and the amount of compute used for training, with some trends spanning more than seven orders of magnitude. Other architectural details such as network width or depth have minimal effects within a wide range. Simple equations govern the dependence of overfitting on model/dataset size and the dependence of training speed on model size. These relationships allow us to determine the optimal allocation of a fixed compute budget. Larger models are significantly more sample-efficient, such that optimally compute-efficient training involves training very large models on a relatively modest amount of data and stopping significantly before convergence."]], "corpus": "arxiv"}
88
+ {"tstamp": 1736947902.9721, "task_type": "retrieval", "type": "chat", "model": "text-embedding-004", "gen_params": {}, "start": 1736947900.7636, "finish": 1736947902.9721, "ip": "", "conv_id": "71b42fe1fb644bc69df39e918e8622b5", "model_name": "text-embedding-004", "prompt": "welches land hat die meisten einwohner", "output": [["welches land hat die meisten einwohner", "Geography of Germany\n\nDemographics With an estimated 83.2 million inhabitants in December 2020, Germany is the most populous country in the European Union and ranks as the 19th largest country in the world in terms of population. Its population density stands at .\nThe United Nations Population Fund lists Germany as host to the second-highest number of international migrants worldwide, around 23% of Germany's population do not hold a German passport or are direct descendants of immigrants. The number of third and subsequent generations of immigrants are not statistically recorded.\nAdministrative divisions\nGermany comprises sixteen states that are collectively referred to as Länder.\nEach state has its own state constitution and is largely autonomous in regard to its internal organisation. Due to differences in size and population the subdivision of these states varies, especially between city states (Stadtstaaten) and states with larger territories (Flächenländer). For regional administrative purposes five states, namely Baden-Württemberg, Bavaria, Hesse, North Rhine-Westphalia and Saxony, consist of a total of 22 Government Districts (Regierungsbezirke). As of July 2021 Germany is divided into 400 districts (Kreise) on municipal level, these consist of 294 rural districts and 106 urban districts.\nUrbanization\nGermany has a number of large cities; the most populous are: Berlin, Hamburg, Munich, Cologne, Frankfurt, and Stuttgart. The largest conurbation is the Rhine-Ruhr region (12 million), including Düsseldorf (the capital of North Rhine-Westphalia), Cologne, Essen, Dortmund, Duisburg, and Bochum."]], "corpus": "wikipedia"}
89
+ {"tstamp": 1736947902.9721, "task_type": "retrieval", "type": "chat", "model": "sentence-transformers/all-MiniLM-L6-v2", "gen_params": {}, "start": 1736947900.7636, "finish": 1736947902.9721, "ip": "", "conv_id": "0f3236bf959146949e7de42d4834e851", "model_name": "sentence-transformers/all-MiniLM-L6-v2", "prompt": "welches land hat die meisten einwohner", "output": [["welches land hat die meisten einwohner", "Karl Rahner\n\nBiography Karl Rahner's parents, Karl and Luise (née Trescher) Rahner, had seven children, of whom Karl was the fourth. His father was a professor in a local college and his mother had a profound religious personality, which influenced the home atmosphere. Karl attended primary and secondary school in Freiburg, entering the Society of Jesus upon graduation; he began his Jesuit formation in the North German Province of the Jesuits in 1922, four years after his older brother Hugo entered the same order. Deeply affected by the spirituality of Ignatius of Loyola during the initial phase of his formation (1922–24), he concentrated the next phase of his formation (1924–7) on Catholic scholastic philosophy and the modern German philosophers: he seems to have been particularly interested in Immanuel Kant and two contemporary Thomists, the Belgian Jesuit Joseph Maréchal and the French Jesuit Pierre Rousselot, who were to influence Rahner's understanding of Thomas Aquinas in his later writings.\nAs a part of his Jesuit training, Rahner taught Latin to novices at Feldkirch (1927–29), then began his theological studies at the Jesuit theologate in Valkenburg aan de Geul in 1929. This allowed him to develop a thorough understanding of patristic theology, also developing interests in spiritual theology, mysticism, and the history of piety. Rahner was ordained a priest on 26 July 1932, and then made his final year of tertianship, the study and taking of Ignatius' Spiritual Exercises, at St. Andrä in Austria's Lavanttal Valley."]], "corpus": "wikipedia"}
data/retrieval_side_by_side-dc61b353-c2a4-4719-875a-064e75e0f6f7.jsonl CHANGED
@@ -1 +1,2 @@
1
  {"tstamp": 1736766890.3859, "task_type": "retrieval", "type": "bothbadvote", "models": ["GritLM/GritLM-7B", "BAAI/bge-large-en-v1.5"], "ip": "", "0_conv_id": "5eddafa9d53c4386a7814d9f00e85911", "0_model_name": "GritLM/GritLM-7B", "0_prompt": "напиши стих пушкина", "0_output": [["напиши стих пушкина", "Alla Pugacheva\n\n«Арлекино» (Harlequin) «Женщина, которая поёт» (The woman who sings)\n«Всё могут короли» (Kings can do anything)\n«Миллион алых роз» (Millions of scarlet Roses)\n«До свидания, лето» (Goodbye, summer)\n«Сонет Шекспира» (Shakespeare's Sonnet)\n«Балет» (Ballet)\n«Этот мир» (This world)\n«А знаешь, все ещё будет» (You know, it's still going to be)\n«Две звезды» (Two stars)\n«Я тебя поцеловала» (I kissed you)\n«Бумажный змей» (The Paper Kite)\n«Осенний поцелуй» (Autumn Kiss)\n«Песенка про себя (Также как все)» (A song about myself (Just like everyone else))\n«На Тихорецкую состав отправится» (The train will go to Tikhoretskaya)\n«100 друзей» (100 friends)\n«Айсберг» (Iceberg)\n«Паромщик» (The ferryman)\n«Волшебник-недоучка» (The half-educated Wizard)\n«Старинные часы» (Vintage watch)\n«Маэстро» (Maestro)\n«Цыганский хор» (Gypsy Choir)\n«Белая дверь» (The white door)\n«Надо же» (Wow)\n«Найти меня» (Find Me)\n«Алло» (Hello)\n«Superman»\n«Every night and every day»\n«Moscow Rock»\n«Любовь, похожая на сон» (Love, like a dream)\n«Я тебя никому не отдам» (I won't give you up to anyone)\n«Осенние листья» (Autumn leaves)\n«Позови меня с собой» (Call me with)\n«Свеча горела на столе» (A candle was burning on the table)\n«В воду войду» (I'll go into the water)\n«Осторожно листопад» (Careful leaf fall)\n«Речной трамвайчик» (River tram)\n«Голубка» (Dove)\n«Мадам Брошкина» (Madame Broshkina)\n«Тысяча лет» (A thousand years)"]], "0_corpus": "wikipedia", "1_conv_id": "26557b726c8f415cb01f3e7b54285f6d", "1_model_name": "BAAI/bge-large-en-v1.5", "1_prompt": "напиши стих пушкина", "1_output": [["напиши стих пушкина", "Una furtiva lagrima\n\nShe loves me! Yes, she loves me,\nI see it, I see it.\nFor a single instant the beats\nof her beautiful heart to hear!\nMy sighs to blend\nfor a while with her sighs!\nHer heartbeats, her heartbeats to hear,\nmy sighs with hers to merge.\nHeavens! One could die!\nMore I cannot ask, I cannot ask.\nOh, heavens! One could, one could die!\nMore I cannot ask, I cannot ask.\nOne could die! One could die of love!\nPoetic translation\nSoftly a furtive teardrop fell,\nshadowed her sparkling eyes;\nSeeing the others follow me\nhas caused her jealous sighs.\nWhat is there more to prize?\nWhat more than this could I prize?\nSighing, she loves me,\nI saw that she loves me.\nCould I but feel her heart on mine,\nbreathing that tender sigh!\nCould my own sighing comfort her,\nand whisper in sweet reply!\nHer heart on mine, as heart to heart we sigh.\nSo tenderly we'd share a sweet reply!\nHeaven, I then could die;\nno more I'd ask you, I'd ask you,\nah! heaven, I, then, I then could die;\nno more I'd ask you, I'd ask you.\nI then could die, I then could die of love."]], "1_corpus": "wikipedia"}
2
+ {"tstamp": 1736947955.8113, "task_type": "retrieval", "type": "leftvote", "models": ["text-embedding-004", "sentence-transformers/all-MiniLM-L6-v2"], "ip": "", "0_conv_id": "71b42fe1fb644bc69df39e918e8622b5", "0_model_name": "text-embedding-004", "0_prompt": "welches land hat die meisten einwohner", "0_output": [["welches land hat die meisten einwohner", "Geography of Germany\n\nDemographics With an estimated 83.2 million inhabitants in December 2020, Germany is the most populous country in the European Union and ranks as the 19th largest country in the world in terms of population. Its population density stands at .\nThe United Nations Population Fund lists Germany as host to the second-highest number of international migrants worldwide, around 23% of Germany's population do not hold a German passport or are direct descendants of immigrants. The number of third and subsequent generations of immigrants are not statistically recorded.\nAdministrative divisions\nGermany comprises sixteen states that are collectively referred to as Länder.\nEach state has its own state constitution and is largely autonomous in regard to its internal organisation. Due to differences in size and population the subdivision of these states varies, especially between city states (Stadtstaaten) and states with larger territories (Flächenländer). For regional administrative purposes five states, namely Baden-Württemberg, Bavaria, Hesse, North Rhine-Westphalia and Saxony, consist of a total of 22 Government Districts (Regierungsbezirke). As of July 2021 Germany is divided into 400 districts (Kreise) on municipal level, these consist of 294 rural districts and 106 urban districts.\nUrbanization\nGermany has a number of large cities; the most populous are: Berlin, Hamburg, Munich, Cologne, Frankfurt, and Stuttgart. The largest conurbation is the Rhine-Ruhr region (12 million), including Düsseldorf (the capital of North Rhine-Westphalia), Cologne, Essen, Dortmund, Duisburg, and Bochum."]], "0_corpus": "wikipedia", "1_conv_id": "0f3236bf959146949e7de42d4834e851", "1_model_name": "sentence-transformers/all-MiniLM-L6-v2", "1_prompt": "welches land hat die meisten einwohner", "1_output": [["welches land hat die meisten einwohner", "Karl Rahner\n\nBiography Karl Rahner's parents, Karl and Luise (née Trescher) Rahner, had seven children, of whom Karl was the fourth. His father was a professor in a local college and his mother had a profound religious personality, which influenced the home atmosphere. Karl attended primary and secondary school in Freiburg, entering the Society of Jesus upon graduation; he began his Jesuit formation in the North German Province of the Jesuits in 1922, four years after his older brother Hugo entered the same order. Deeply affected by the spirituality of Ignatius of Loyola during the initial phase of his formation (1922–24), he concentrated the next phase of his formation (1924–7) on Catholic scholastic philosophy and the modern German philosophers: he seems to have been particularly interested in Immanuel Kant and two contemporary Thomists, the Belgian Jesuit Joseph Maréchal and the French Jesuit Pierre Rousselot, who were to influence Rahner's understanding of Thomas Aquinas in his later writings.\nAs a part of his Jesuit training, Rahner taught Latin to novices at Feldkirch (1927–29), then began his theological studies at the Jesuit theologate in Valkenburg aan de Geul in 1929. This allowed him to develop a thorough understanding of patristic theology, also developing interests in spiritual theology, mysticism, and the history of piety. Rahner was ordained a priest on 26 July 1932, and then made his final year of tertianship, the study and taking of Ignatius' Spiritual Exercises, at St. Andrä in Austria's Lavanttal Valley."]], "1_corpus": "wikipedia"}
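The side-by-side file pairs two models per record: `models` holds [left, right], the per-side fields carry `0_`/`1_` prefixes, and `type` records the vote (`leftvote` and `bothbadvote` appear in this diff; any other vote values are not shown here). A hedged sketch for tallying those votes — again illustrative, not a script that belongs to this dataset:

```python
import json
from collections import Counter

# Illustrative tally (an assumption, not an official script): count vote
# outcomes per model pair in the side-by-side log. "type" carries the vote
# string and "models" the [left_model, right_model] pair, as in the records above.
def tally_votes(path):
    counts = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            rec = json.loads(line)
            left, right = rec["models"]
            counts[(left, right, rec["type"])] += 1
    return counts

votes = tally_votes(
    "data/retrieval_side_by_side-dc61b353-c2a4-4719-875a-064e75e0f6f7.jsonl"
)
for (left, right, vote), n in votes.items():
    print(f"{left} vs {right}: {vote} x{n}")
```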