
vineet arora

Serenity in chaos • 6d

Running an AI/LLM stack locally on a laptop is a fantasy unless you have a powerful graphics card, an NPU, or 32 GB of RAM. Sorry, but that's the reality. You can still try it with 16 GB, but it lags like hell. For your context, the tool is called 'Ollama'. It can run many free LLMs like DeepSeek, Qwen, Mistral, etc. They offer small LLMs too, but for your use case they're worthless. Best of luck, bro.
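(Editor's aside, to make "running it locally" concrete: below is a minimal sketch of calling a locally running Ollama server from Python over its default HTTP API. It assumes Ollama is installed and serving on the default port 11434 and that a small model such as mistral has already been pulled via the Ollama CLI; the function name is purely illustrative.)

```python
# Minimal sketch: query a locally running Ollama server over its HTTP API.
# Assumes Ollama is installed and serving on the default port 11434, and
# that a small model such as "mistral" has already been pulled.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "mistral") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_llm("In one sentence, why do local LLMs need so much RAM?"))
```

Even with this working, a 7B-parameter model typically wants several gigabytes of free RAM (more without quantization), which is why the 16 GB versus 32 GB point above matters.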

Reply

More like this

Recommendations from Medial

Owl Maniac

Rethink and Breakdow... • 2m

Would anyone be interested in a hosted Postgres service? Hosting, backups, recovery, auto-scaling, and observability taken care of: $5 for 1 vCPU and 2 GB RAM, $10 for 2 vCPU and 4 GB RAM, with some more additional features and flair. Not serverless, but ma

See More
Reply

Manit

Young mind. Bold cod... • 4m

I started something wrong. It's my 2nd business and I think it will be a failure. I started a hosting company (game-server hosting), but we are getting no clients. I did everything, and I don't have much to give to big YouTubers for promotion. I bought 2 VP

See More
Reply

Sai

Founder • 23d

First, AI took our GPUs. Now, it’s coming for our RAM. Epic Games is giving us free games for Christmas, but with DDR5 prices hitting +600%, who can afford the upgrade to run them? We're stuck in a vicious cycle.

Reply
1

Pulakit Bararia

Founder Snippetz Lab... • 10m

AI shouldn't be a luxury. It should run anywhere, even on low-powered devices, without costly hardware or cloud dependency. Projects and companies like TinyML, Edge Impulse, LM Studio, Mistral AI, Llama (Meta), and Ollama are already making AI lighter, faster, a

See More
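(Editor's aside, to make the "runs on low-powered devices" claim concrete: here is a minimal sketch of running a small quantized model fully offline with the llama-cpp-python package, one common approach in this space. The GGUF file path is a placeholder assumption; it presumes you have installed llama-cpp-python and downloaded a small quantized model beforehand.)

```python
# Minimal sketch: run a small quantized LLM entirely offline with llama.cpp
# bindings (pip install llama-cpp-python). The model path below is a
# placeholder; any small GGUF-format model downloaded beforehand will do.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/tinyllama-1.1b-chat.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,    # modest context window keeps memory use low
    n_threads=4,   # a few CPU cores; no GPU or cloud required
)

result = llm(
    "Q: Why can quantized models run on low-powered devices? A:",
    max_tokens=96,
    stop=["Q:"],
)
print(result["choices"][0]["text"].strip())
```

A 4-bit quantized model around the 1B-parameter mark fits comfortably under 2 GB of RAM, which is what makes this kind of edge deployment plausible.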
3 Replies
2
15

Arnav Goyal

https://arnavgoyal.m... • 9m

Introducing Nodey – a low-code platform for building LLM flows and AI agents locally. I'm building a project called Nodey, a low-code, drag-and-drop platform that helps you create LLM workflows and AI agents without needing to write a lot of code. C

See More
4 Replies
4
12

Vishu Bheda


Medial • 4m

I spent 4+ hours rewatching Karpathy's YC keynote. And I realized: we've been looking at LLMs the wrong way. They're not just "AI models." They're a new kind of computer. • LLM = CPU • Context window = mem

See More
6 Replies
42
44

Prasanna Raj Neupane

Hey I am on Medial • 6m

🚨 Meet Subduct, the secure bridge between AI and enterprise data. Modern LLMs are powerful, but they can't access the core of your business. Subduct changes that. We're building the secure infrastructure layer that connects GPT-4, Claude, and Mist

See More
Reply
2

Pulakit Bararia

Founder Snippetz Lab... • 5m

I’m building an AI-first architectural engine that converts natural language prompts into parametric, code-compliant floor plans. It's powered by a fine-tuned Mistral LLM orchestrating a multi-model architecture — spatial parsing, geometric reasonin

See More
8 Replies
19
31

mg

mysterious guy • 7m

30 AI Buzzwords Explained for Entrepreneurs. 1) Large Language Model (LLM): LLMs are like super-smart computer programs that can understand and do almost anything you ask them using regular language. Think of tools like ChatGPT or Gemini – they're a

See More
Reply
8

Havish Gupta

Figuring Out • 1y

Do you think Krutrim AI will dominate AI in India? • So Krutrim AI is an LLM made by Ola, designed to cater to the diverse needs of people. It works in about 22 languages. • Krutrim's future plans include developing its own chips and other AI-focused hardwar

See More
20 Replies
1
25
