Microsoft developing AI chips to reduce ChatGPT's $700k daily operation cost

What You Need to Know

  • A new report estimates ChatGPT's daily cost to be around $700,000.
  • There are rumors that Microsoft is developing its own AI chips.
  • Microsoft's own AI chip could reduce costs by a third or more.

AI is in demand right now, with millions of users turning to ChatGPT daily. With such demand, it's not surprising to learn that ChatGPT is quite expensive to run. Dylan Patel, Chief Analyst at research firm SemiAnalysis, estimates that ChatGPT currently costs OpenAI $700,000 per day, or 36 cents per query.
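For a sense of scale, those two figures imply a rough daily query volume. The sketch below is our own back-of-the-envelope arithmetic based on the reported numbers, not a figure from the SemiAnalysis report:

```python
# Rough check of the SemiAnalysis estimate: $700,000/day at $0.36/query.
# The implied query volume is derived arithmetic, not a reported figure.
daily_cost_usd = 700_000
cost_per_query_usd = 0.36

implied_queries_per_day = daily_cost_usd / cost_per_query_usd
print(f"Implied volume: ~{implied_queries_per_day:,.0f} queries/day")
# Roughly 1.9 million queries per day at those rates.
```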

That's a lot of money, and it's not helped by the fact that AI companies, including Microsoft, have to purchase NVIDIA GPUs in bulk to sustain their AI services as demand increases. Industry analysts believe that OpenAI may need to acquire an additional 30,000 NVIDIA GPUs this year to support its business.

Building its own AI chip could help reduce those costs, and that seems to be exactly what Microsoft is doing. Microsoft's AI chip, codenamed "Athena," is reportedly already being tested by select groups inside the company. Microsoft plans to deploy the chip across its Azure AI services next year.

Industry experts say they do not expect Microsoft to replace NVIDIA outright, but rather to reduce its reliance on the GPU maker. Given that ChatGPT runs on Azure, any cost reduction would help OpenAI run ChatGPT more economically, along with any other AI services that choose Azure as their host.

Microsoft has been aggressively pushing AI in recent months, in contrast to Google's stumbling launch of its own AI product, Bard. Recent reports indicate that some Google employees have little faith in Bard, believing that Google rushed the product to market in response to Microsoft's Bing AI efforts.
