Data Enthusiast • 2m
Hi, I actually have an AI application and I use Gemini as the AI model. As I'm scaling the application I have to call Gemini more, so what I did was create multiple projects, and in each project I created a single API key so that the rate limits only apply per project.
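A minimal sketch of what that per-project key setup can look like in code: calls rotate round-robin across the keys, and a key that returns HTTP 429 is skipped in favour of the next one. The environment-variable names and the model ID are placeholders, and this is just one way to spread the quota, not a definitive implementation.

```typescript
// Round-robin over several Gemini API keys (one key per Google Cloud project),
// falling through to the next key when one project's quota is exhausted.
const GEMINI_KEYS = [
  process.env.GEMINI_KEY_PROJECT_A!, // placeholder names
  process.env.GEMINI_KEY_PROJECT_B!,
  process.env.GEMINI_KEY_PROJECT_C!,
];

let cursor = 0;

async function generateWithRotation(prompt: string): Promise<string> {
  // Try each key at most once per call, starting where the last call left off.
  for (let attempt = 0; attempt < GEMINI_KEYS.length; attempt++) {
    const key = GEMINI_KEYS[cursor];
    cursor = (cursor + 1) % GEMINI_KEYS.length;

    const res = await fetch(
      `https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=${key}`,
      {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ contents: [{ parts: [{ text: prompt }] }] }),
      }
    );

    if (res.status === 429) continue; // this project is rate-limited, try the next key
    if (!res.ok) throw new Error(`Gemini error ${res.status}: ${await res.text()}`);

    const data = await res.json();
    return data.candidates?.[0]?.content?.parts?.[0]?.text ?? "";
  }
  throw new Error("All Gemini keys are rate-limited right now.");
}
```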
Mould Innovation • 2m
I would suggest using a proxy for AI API requests, although a proxy has its own problems. I also created a package on npm called llmpool that can load-balance multiple LLMs, so you can use it for each of your features, and for every feature use Groq, OpenAI, or whatever provider you like.
yepp thank you
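A rough sketch of the per-feature load-balancing idea described above. This is an illustration of the concept, not the actual llmpool API: each feature gets its own pool of OpenAI-compatible providers, requests rotate through the pool, and a provider that fails or is rate-limited is skipped in favour of the next one. The provider list, model names, and environment variables are placeholders.

```typescript
// One pool per feature; requests round-robin across providers with fallback.
interface Provider {
  name: string;
  baseUrl: string; // OpenAI-compatible API base URL
  apiKey: string;
  model: string;
}

class LlmPool {
  private cursor = 0;
  constructor(private providers: Provider[]) {}

  async chat(prompt: string): Promise<string> {
    for (let i = 0; i < this.providers.length; i++) {
      const p = this.providers[this.cursor];
      this.cursor = (this.cursor + 1) % this.providers.length;
      try {
        const res = await fetch(`${p.baseUrl}/chat/completions`, {
          method: "POST",
          headers: {
            Authorization: `Bearer ${p.apiKey}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({
            model: p.model,
            messages: [{ role: "user", content: prompt }],
          }),
        });
        if (!res.ok) continue; // rate-limited or provider error: try the next provider
        const data = await res.json();
        return data.choices[0].message.content;
      } catch {
        continue; // network error: try the next provider
      }
    }
    throw new Error("Every provider in this pool failed.");
  }
}

// Example: a pool for one feature, mixing providers as suggested (Groq, OpenAI, ...).
const summarizerPool = new LlmPool([
  { name: "groq",   baseUrl: "https://api.groq.com/openai/v1", apiKey: process.env.GROQ_API_KEY!,   model: "llama-3.1-8b-instant" },
  { name: "openai", baseUrl: "https://api.openai.com/v1",      apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" },
]);
```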
Hey I am on Medial • 5m
Thank you Meta
Iamaj7champ • 1m
Thank you Universe
Hey I am on Medial • 1y
Thank you Hindenburg 🙏
Founder of VedspaceA... • 1y
Thank you for 20
Hakuna matata • 4m
thank you ai 😃
Nature based Health ... • 5m
Thank you Idiot Meta
Entrepreneur | Build... • 1y
Thank you everyone 🔥 Didn't even notice it, and here we are, 500+ connections. Niket Raj Dwivedi, thank you brother 🔥
Hungry? DM me • 1y
Thank you for 100 followers!! 🙏
Hakuna matata • 6m
Always say thank you 😉😂
Aspiring Entrepreneu... • 19d
Thank you Accel Atoms, will push harder