
Anonymous

Anonymous 1

Hey I am on Medial • 9h

Option 1: Integrate the LLM directly into the OS

Pros:
• 🔥 Speed – no external calls; everything runs natively.
• 🧠 Tight integration – the AI can interact with system-level processes (files, memory, user interface) seamlessly.
• 🌐 Offline capable – if the model fits on-device, it works without internet.
• 🔒 Privacy – data doesn't leave the machine.

Cons:
• 🛠️ Hard to update – every time you want to upgrade or swap the model, you may need to rework core OS components.
• 💾 Resource hungry – big LLMs eat CPU/GPU/RAM; many devices can't handle that without draining battery or overheating.
• ⚠️ High risk – if the integrated LLM breaks, it could destabilize the whole OS.

👉 Who does this? Apple is slowly moving toward this with on-device AI, but only for small models (Apple Intelligence). They keep larger models server-side.

⸻

⚡ Option 2: Run the LLM separately and connect via MCP servers

Pros:
• 🚀 Scalability – you can upgrade models or switch to a better one without touching the OS core.
• 🧩 Flexibility – the OS stays lighter; the AI can evolve independently.
• ☁️ Access to bigger models – you're not limited by device hardware; cloud/server LLMs can be massive.
• 🛡️ Safer – if the AI crashes, the OS still works fine.

Cons:
• 🌐 Requires connectivity (unless you also run a local mini-LLM).
• 🌍 Latency – server round-trips are slower than local processing.
• 🔑 Privacy risks – data goes to servers (unless you encrypt or self-host).

👉 Who does this? Microsoft (Copilot), Google (Gemini in Android), OpenAI (ChatGPT apps). They keep the AI mostly external for flexibility.

⸻

• If you want control, privacy, and an OS tightly bound with AI → Option 1 (direct integration) is futuristic, but only practical once models shrink enough to run efficiently on-device. This could be the 5–10 year vision.
• If you want scalability, easier updates, and faster iteration → Option 2 (servers) is smarter right now. That's why all the big players (Google, Microsoft, OpenAI) are doing it this way today. If you are building for India, the second one is better.
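The Option 2 trade-offs above (connectivity, latency, privacy) suggest a hybrid pattern: prefer the server-side model, degrade to a small on-device model when offline. A minimal sketch of that routing logic; every name here (`route_prompt`, `call_cloud_llm`, etc.) is a hypothetical placeholder, not a real API:

```python
# Hypothetical sketch of Option 2 with a local fallback. The two
# call_* functions stand in for a real HTTPS client and a real
# on-device runtime; here they just return tagged strings.

def call_cloud_llm(prompt: str) -> str:
    # Placeholder for an HTTPS round-trip to a hosted model.
    return f"[cloud] {prompt}"

def call_local_llm(prompt: str) -> str:
    # Placeholder for a quantized 3-4B model running on-device.
    return f"[local] {prompt}"

def route_prompt(prompt: str, online: bool) -> str:
    """Use the big cloud model when connected; fall back locally."""
    if online:
        try:
            return call_cloud_llm(prompt)
        except ConnectionError:
            pass  # server unreachable: fall through to the local model
    return call_local_llm(prompt)
```

The point of the sketch is that the OS core only ever talks to `route_prompt`, so either backend can be swapped without touching OS components, which is the upgrade-path advantage the post describes.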


More like this

Recommendations from Medial


Srikanth Peddibhotla

Founder HappiLabs81 ... • 3m

Searching for co-founders for my AI OS and device startup. DM me.

2 Replies
1

Aman Tiwari

😎 Strategist • 13h

🤔 I am confused about one thing: should I integrate my LLM directly into the architecture of my OS, or should I launch it separately and connect the OS infra via MCP servers? All suggestions are respected here... 🫡

1 Reply
4

Divyam Gupta

Building products, l... • 2m

Google Unveils AI Edge Gallery: On-Device Generative AI Without Internet

Google has launched the AI Edge Gallery, an experimental app that brings cutting-edge generative AI models directly to your Android device. Once a model is downloaded, all proc

1 Reply
8

Bharath Varma


Google • 1y

Do you know you can run Ubuntu OS on your phone?

4

Devarsh Ukani

Co-Founder @INDI • 1y

Is our app going to be a game-changer in the AI revolution? We are making a Private Emotional Wellness AI Companion. It runs an LLM locally on device.

10 Replies
4

Yogesh Jamdade

..... • 5m

How do you connect AI models with your own software or tools? The answer is MCP servers. The Model Context Protocol (MCP) is an open standard that enables AI models to securely access data and tools through standardized servers. MCP servers facilitate
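Concretely, MCP messages are JSON-RPC 2.0, so a request asking a server to run a tool is just a structured JSON object with a `tools/call` method. A small sketch of that wire format; the tool name and arguments below are invented for illustration, and real integrations would use an MCP SDK rather than hand-building messages:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 tools/call request, the shape MCP transports."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical example: ask a file-access MCP server to read a file.
msg = make_tool_call(1, "read_file", {"path": "notes.txt"})
```

The standardized envelope is what lets one AI client talk to many different servers (files, databases, SaaS tools) without custom glue per integration.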

See More
1 Reply
2

Sarthak Gupta

Developer • 4m

For context, I am launching an AI SaaS that aims to lower LLM subscription fees by giving users access to 15+ LLM models with performance comparable to ChatGPT models. I am working on optimizing pricing. Which is better? Please comment the reas

5 Replies
1
9

Kunal sapkal

A Machine Learning E... • 1y

Do you guys know of any Devanagari-language datasets to train LLM models?

5 Replies
4

Nawal


SELF • 1y

โ€ผ๏ธ Why i feel AI as core OS level is not a good idea in terms of privacy . What's your view over this ?

10 Replies
11

Omansh Arora

Now or never • 2m

Just thinking about AI, I want to ask: if you had an all-in-one LLM of 3-4B parameters which can run easily on a phone and laptop, what type of product would you build with it?
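Whether a 3-4B model actually fits on a phone comes down to simple arithmetic. A back-of-the-envelope check, assuming the usual rule of thumb that weights take roughly params × bits-per-weight / 8 bytes (activations and KV cache add overhead on top):

```python
# Rough memory footprint of model weights on-device.
# bytes ≈ parameter_count × bits_per_weight / 8

def model_size_gb(params_billion: float, bits_per_weight: int) -> float:
    """Weight memory in decimal gigabytes, ignoring runtime overhead."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

fp16 = model_size_gb(4, 16)  # full-precision: heavy for most phones
q4 = model_size_gb(4, 4)     # 4-bit quantized: far more feasible
```

So a 4B model is about 8 GB at 16-bit but about 2 GB quantized to 4-bit, which is why on-device products in this range almost always ship quantized weights.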

4 Replies
4
