https://arxiv.org/pdf/2404.07143.pdf
Google has dropped possibly THE most important and future-defining AI paper in under 12 pages. Models can now have infinite context.
Parampreet Singh
Python Developer 💻 ... • 1m
3B LLM outperforms 405B LLM 🤯
Similarly, a 7B LLM outperforms OpenAI o1 & DeepSeek-R1 🤯 🤯
LLM: Llama 3
Datasets: MATH-500 & AIME-2024
This was shown in research on compute-optimal Test-Time Scaling (TTS).
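The core idea behind TTS is spending extra inference compute per question, for example by sampling many candidate answers and keeping the one a verifier scores highest (best-of-N). A minimal sketch, where `generate` and `score` are hypothetical toy stand-ins for the LLM and the reward model:

```python
import random

def generate(prompt, seed):
    # Toy stand-in for sampling one answer from an LLM
    random.seed(seed)
    return f"answer-{random.randint(0, 9)}"

def score(prompt, answer):
    # Toy stand-in for a verifier / reward model scoring a candidate
    return int(answer.split("-")[1])

def best_of_n(prompt, n=8):
    # Test-time scaling: sample n candidates, return the best-scored one.
    # More compute (larger n) -> better expected answer quality.
    candidates = [generate(prompt, seed=i) for i in range(n)]
    return max(candidates, key=lambda a: score(prompt, a))
```

In the actual paper's setup the verifier is a process reward model and the sampling budget is allocated per problem difficulty; the sketch above only illustrates the best-of-N principle.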
Recently, OpenAI o1 shows that Test-