
Mohammed Zaid

Shitposter of Medial • 1m

DeepSeek has quietly dropped V3.1, a 685B-parameter open-source LLM, on Hugging Face: 128K-token context, hybrid reasoning/chat/coding modes, multi-precision support. Early benchmarks: 71.6% on the Aider coding benchmark, on par with proprietary models and 68× cheaper.
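
For anyone who wants to poke at the release, here is a minimal sketch of loading it through the standard Hugging Face transformers path. The repo id deepseek-ai/DeepSeek-V3.1, the bfloat16 dtype, and the prompt are my assumptions, not details from the post; a 685B-parameter checkpoint realistically needs a sharded multi-GPU setup (via accelerate) rather than a single machine.

```python
# Hypothetical sketch: loading DeepSeek V3.1 from Hugging Face with transformers.
# Repo id and dtype are assumptions; a model this size must be sharded across GPUs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "deepseek-ai/DeepSeek-V3.1"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,   # one of the advertised precisions
    device_map="auto",            # let accelerate shard the weights across devices
    trust_remote_code=True,
)

prompt = "Write a Python function that merges two sorted lists."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```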

Recommendations from Medial

Avinash A

Hey I am on Medial • 7m

🤖 Claude 3.7 Sonnet just changed the #AI #coding game. From intricate cloth sims in p5.js 🎨 to #fullstack Connect 4 🕹️ in 30 mins! 128K-token extended thinking, auto-fixes 🐳 📄. Ready for agentic coding? Stay tuned for the demo! #Anthropic h
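
If the "extended thinking" bit sounds abstract, here is a rough sketch of how it is exposed in Anthropic's Python SDK, based on their public docs. The model id, token budgets, and prompt are assumptions on my part, not something from the post above.

```python
# Hypothetical sketch of Claude 3.7 Sonnet extended thinking via the Anthropic SDK.
# Model id and budgets are assumptions; requires ANTHROPIC_API_KEY in the environment.
import anthropic

client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-3-7-sonnet-20250219",          # assumed model id
    max_tokens=8000,                             # must exceed the thinking budget
    thinking={"type": "enabled", "budget_tokens": 4000},
    messages=[{"role": "user", "content": "Build a Connect 4 game in p5.js."}],
)

# The response interleaves "thinking" blocks with the final "text" blocks.
for block in response.content:
    if block.type == "text":
        print(block.text)
```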

Vicky

Ask yourself the que... • 5m

The Next AI Battleground? Open-Source LLMs Are Gaining Fast
GPT-4 may still lead the pack, but the real action is now in open-source LLMs, and the gap is closing faster than anyone expected. In just 3 months:
- Mistral's Mixtral matched GPT-3.5 on
