At an estimated 4 cents per ChatGPT query, OpenAI looks for cheaper AI chip solutions.
OpenAI, the creator of ChatGPT and DALL-E 3 generative AI products, is exploring the possibility of manufacturing its own AI accelerator chips, according to Reuters. Citing anonymous sources, the report says OpenAI is considering the move because of a shortage of the specialized GPUs that power its models and the high cost of running them.
OpenAI has been evaluating several options to address the problem, including potentially acquiring a chipmaking company and working more closely with other chip manufacturers, including Nvidia. The company has not made a final decision, but the discussions have been under way since at least last year. Nvidia dominates the AI chip market, holding more than 80 percent of the global share for processors best suited to AI applications, and OpenAI CEO Sam Altman has publicly expressed concern over the scarcity and cost of these chips.
The hardware situation is said to be a top priority for OpenAI, as the company currently relies on a massive supercomputer built by Microsoft, one of its largest backers. The supercomputer uses 10,000 Nvidia graphics processing units (GPUs), according to Reuters. Running ChatGPT is expensive: each query costs approximately 4 cents, according to Bernstein analyst Stacy Rasgon. If queries grew to even a tenth of the scale of Google search, the initial investment in GPUs would be around $48.1 billion, with about $16 billion per year needed to keep the system running.
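To put those figures in perspective, here is a rough, illustrative back-of-envelope calculation. Only the 4-cent-per-query estimate comes from the article; the assumed Google query volume is a commonly cited approximation, and the $48.1 billion and $16 billion figures are Bernstein's separate capital estimates rather than outputs of this arithmetic.

```python
# Illustrative back-of-envelope only; not Bernstein's model.
COST_PER_QUERY = 0.04            # USD per query, Bernstein estimate cited above
GOOGLE_QUERIES_PER_DAY = 8.5e9   # assumed; a commonly cited approximation
SCALE_FRACTION = 0.10            # "a tenth of the scale of Google search"

queries_per_day = GOOGLE_QUERIES_PER_DAY * SCALE_FRACTION
daily_serving_cost = queries_per_day * COST_PER_QUERY
annual_serving_cost = daily_serving_cost * 365

print(f"Queries per day at a tenth of Google scale: {queries_per_day:,.0f}")
print(f"Serving cost per day:  ${daily_serving_cost:,.0f}")
print(f"Serving cost per year: ${annual_serving_cost:,.0f}")  # roughly $12 billion

# Note: the $48.1 billion (initial GPUs) and $16 billion per year (upkeep)
# cited in the article are capital-expenditure estimates, separate from this
# per-query serving cost.
```

Even this simplified operating-cost view lands in the tens of billions of dollars per year, which helps explain why cheaper silicon is an attractive option for OpenAI.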