
According to a Reuters article citing people with knowledge of the situation, OpenAI has been exploring alternatives to some of Nvidia's most recent AI chips due to concerns over inference performance. The move could complicate the relationship between two of the biggest players in the AI boom and underlines growing competition in the rapidly changing AI chip market. 

While Nvidia remains the leading supplier of chips used to train large AI models, OpenAI's focus has increasingly shifted to inference, the process by which models like ChatGPT generate responses for users. According to sources, OpenAI is seeking faster hardware for particular tasks such as software development and AI-to-software communication, with the ultimate goal of sourcing about 10% of its inference computing requirements from other suppliers. 

According to reports, OpenAI has spoken with semiconductor makers AMD, Cerebras, and Groq, though negotiations with Groq stalled after Nvidia signed a significant licensing agreement with the startup. Despite these developments, both companies have publicly downplayed any conflict. OpenAI said it still relies on Nvidia for the vast majority of its inference operations, while Nvidia maintained that customers continue to prefer its chips for their performance and affordability. 

OpenAI CEO Sam Altman also affirmed Nvidia's leadership, describing its chips as the best in the world and expressing optimism that OpenAI would remain a significant customer for many years to come.