DK

Ride • 1y

https://arxiv.org/pdf/2404.07143.pdf Google has dropped possibly THE most important and future defining AI paper under 12 pages. Models can now have infinite context.

2 Replies
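The linked paper is Google's "Leave No Context Behind" (Infini-attention): instead of keeping every past key/value pair, it folds old segments into a fixed-size compressive memory that new queries can read from, so memory cost stays constant as context grows. A toy sketch of just the memory mechanism, with illustrative shapes and no claim to match the paper's full attention layer:

```python
import numpy as np

def elu_plus_one(x):
    # positive feature map used for linear-attention-style memory (sigma in the paper)
    return np.where(x > 0, x + 1.0, np.exp(x) + 1.0)

d = 4  # head dimension (illustrative)
rng = np.random.default_rng(0)

# compressive memory: a fixed d x d matrix plus a normalizer vector,
# no matter how many tokens have been absorbed
M = np.zeros((d, d))
z = np.zeros(d)

def absorb_segment(K, V):
    """Fold one segment's key/value states into the fixed-size memory."""
    global M, z
    sK = elu_plus_one(K)
    M += sK.T @ V            # stays d x d, independent of segment length
    z += sK.sum(axis=0)

def retrieve(Q):
    """Linear-attention readout from memory for new queries."""
    sQ = elu_plus_one(Q)
    return (sQ @ M) / (sQ @ z)[:, None]

# absorb two 8-token segments, then query with 3 new tokens
for _ in range(2):
    absorb_segment(rng.normal(size=(8, d)), rng.normal(size=(8, d)))
out = retrieve(rng.normal(size=(3, d)))
print(out.shape)  # memory stayed O(d^2) regardless of context length
```

In the paper this memory readout is gated together with ordinary local attention per segment; the sketch shows only why the "infinite context" claim doesn't blow up memory.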

Recommendations from Medial

AI Engineer

AI Deep Explorer | f... • 4m

Want to learn AI the right way in 2025? Don’t just take courses. Don’t just build toy projects. Look at what’s actually being used in the real world. The most practical way to really learn AI today is to follow the models that are shaping the indus


Mohammed Zaid

Shitposter of Medial • 5d

DeepSeek has quietly dropped V3.1, a 685B-parameter open-source LLM on Hugging Face—128K-token context, hybrid reasoning/chat/coding, multi-precision support. Early benchmarks: 71.6% on Aider coding, on par with proprietary models and 68× cheaper.

2 Replies

Keval Rajpal

Software Engineer • 15d

Recently learned how LLMs work and the math behind attention layers and transformers; continuously trying to keep up with the rapid development in the AI space, getting so overwhelmed 🥲 Now Google came out with "Mixture-of-Agents (MoA): A Breakthrough in LLM Performance

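The Mixture-of-Agents idea the post mentions is layered ensembling: several LLMs each answer, the next layer's models refine given all previous answers, and one aggregator synthesizes a final response. A minimal hedged sketch of that loop — `ask_model` is a stand-in placeholder, not a real API; wire it to whatever LLM client you actually use:

```python
def ask_model(name: str, prompt: str) -> str:
    # Placeholder for a real LLM call; returns a canned string for illustration.
    return f"[{name}] answer to: {prompt[:40]}"

def moa(question: str, proposers: list[str], aggregator: str, layers: int = 2) -> str:
    # Layer 1: every proposer answers the raw question.
    answers = [ask_model(m, question) for m in proposers]
    # Middle layers: each proposer refines, seeing all previous answers.
    for _ in range(layers - 1):
        refine = (f"{question}\n\nPrevious answers:\n" + "\n".join(answers)
                  + "\nImprove on these.")
        answers = [ask_model(m, refine) for m in proposers]
    # Final layer: one aggregator synthesizes a single best answer.
    final = (f"{question}\n\nCandidate answers:\n" + "\n".join(answers)
             + "\nSynthesize the best single answer.")
    return ask_model(aggregator, final)

result = moa("Why is the sky blue?", ["model-a", "model-b"], "model-agg")
print(result)
```

The design point is that aggregation happens in-context (the aggregator reads the candidates as text), so it works across heterogeneous models with no shared weights.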

Pulakit Bararia

Building Snippetz la... • 26d

I didn’t think I’d enjoy reading 80+ pages on training AI models. But this one? I couldn’t stop. Hugging Face dropped a playbook on how they train massive models across 512 GPUs — and it’s insanely good. Not just technical stuff… it’s like reading a

4 Replies

Shuvodip Ray

YouTube • 1y

Researchers at Google DeepMind introduced Semantica, an image-conditioned diffusion model capable of generating images based on the semantics of a conditioning image. The paper explores adapting image generative models to different datasets. Instea

2 Replies

Swami Gadila

Founder of Friday AI • 29d

Introducing the World's First Adaptive Context Window for Emotional AI Built by the Friday AI Core Technologies Research Team Most LLMs today work with static memory — 8K, 32K, maybe 128K tokens. But human conversations aren't static. Emotions evo



Siddharth K Nair

Thatmoonemojiguy 🌝 • 2m

Apple 🍎 is planning to integrate AI-powered heart rate monitoring into AirPods 🎧 Apple's newest research suggests that AirPods could soon double as AI-powered heart monitors. In a study published on May 29, 2025, Apple's team tested six advanced A


Mohd Rihan

Student| Passionate ... • 7d

Dhruv Rathee just dropped a banger AI startup: AI Fiesta. As we know, different AI models excel at different tasks, so switching between them efficiently can be a hassle. With AI Fiesta, you can use multiple AI tools like ChatGPT, Gemini, Claude, Gr

15 Replies
