Prior state-of-the-art (SOTA) models require at least tens of thousands of training examples, and their reasoning capabilities remain limited, especially on complex human-written queries.