There’s a common belief online that ChatGPT’s ₹1,600/month (about $20) plan isn’t sustainable. People assume it’s propped up by VC money and that OpenAI is burning cash to keep it running. But that’s not really the case.

Each GPT-4o query costs less than a cent to run and uses around 0.34 Wh of energy per prompt. Even at a billion queries a day, with massive GPU fleets, the serving cost per subscriber stays a small fraction of the $20 fee, so there’s still a solid margin. Training can cost hundreds of millions, but that’s largely a one-time effort spread across every user.

At $20/month, the plan is comfortably profitable. It’s a textbook example of good unit economics, and a reminder that sustainable models can still scale fast without setting money on fire.
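To make the math concrete, here is a rough back-of-envelope sketch in Python. The per-query cost and queries-per-day figures are illustrative assumptions (only the 0.34 Wh number comes from the post above), not OpenAI’s actual internals.

```python
# Back-of-envelope unit economics for a ChatGPT Plus subscriber.
# All inputs are assumptions for illustration, not OpenAI's real figures.

cost_per_query_usd = 0.005      # assumed: "less than a cent" per GPT-4o query
energy_per_query_wh = 0.34      # figure cited above (Wh per prompt)
queries_per_user_per_day = 30   # assumed: a fairly heavy subscriber
subscription_usd = 20.0         # ChatGPT Plus price per month
days_per_month = 30

monthly_cost = cost_per_query_usd * queries_per_user_per_day * days_per_month
monthly_margin = subscription_usd - monthly_cost
margin_pct = 100 * monthly_margin / subscription_usd
monthly_energy_kwh = (energy_per_query_wh * queries_per_user_per_day
                      * days_per_month) / 1000

print(f"Inference cost per subscriber: ${monthly_cost:.2f}/month")
print(f"Gross margin per subscriber:   ${monthly_margin:.2f}/month ({margin_pct:.0f}%)")
print(f"Energy per subscriber:         {monthly_energy_kwh:.2f} kWh/month")
```

Under these assumptions, serving a heavy subscriber costs about $4.50 a month against the $20 price, leaving roughly a 78% gross margin on inference alone; the real numbers will differ, but the shape of the argument holds.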