
At the World Economic Forum in Davos, Cerebras co-founder and CEO Andrew Feldman said India’s role in the global artificial intelligence (AI) ecosystem is impossible to ignore, noting that the company already has an expanding presence in the country, according to a report dated January 19, 2026.
One would be a fool to overlook India
“As a market, as a place to partner, as a place to deploy gear, I think one would be a fool to overlook India,” Feldman reportedly said in an interview with Moneycontrol's Chandra R Srikanth.
“We have a facility in Bangalore, and we’re growing steadily. This year, touch wood, we seek to increase substantially,” he was quoted as saying.
Feldman also highlighted India’s talent pool. “India has some of the best universities in the world. They’ve got the human capital,” he said, adding, “I think India has a collection of some of the world’s leading software engineers, and we’re seeing more and more interesting AI work come out of India.”
All the Ingredients for AI Leadership Are Already in Place
According to him, the core ingredients for leadership in AI already exist in India. “There’s the desire and the willingness to put up the capital,” Feldman reportedly said. “I think all of these things are present in India.”
“If they choose, India can be one of the global AI powerhouses,” he said, according to the report. “It takes work, it takes effort, but the fundamentals are already there.”
He added that India’s importance in the global AI ecosystem will only grow stronger. “This is how AI diffuses through an entire economy,” Feldman said, underscoring why India is increasingly critical to the global AI story.
OpenAI and Cerebras Sign AI Infrastructure Deal
After signing deals to use chips from Nvidia and AMD and to design its own chips with Broadcom, OpenAI recently reached an agreement with another chipmaker, Cerebras.
On January 14, 2026, OpenAI and Cerebras announced that they had signed a multi-year agreement to deploy 750 megawatts of Cerebras wafer-scale systems to serve OpenAI customers. The deployment will roll out in multiple stages beginning in 2026, making it the largest high-speed AI inference deployment in the world.
"The release of ChatGPT set the direction for the entire AI industry. It showed the world what was possible,” the companies said in a joint statement. “We are now in the next phase of AI adoption, the challenge is no longer proving what AI can do, but ensuring its benefits can reach everyone. The history of the technology industry has taught us a simple lesson: speed is the fundamental driver of technology adoption. The PC industry would not exist without the leap from kilohertz to megahertz to gigahertz, and the modern internet would not exist without the transition from dial-up to broadband."
Cerebras said it provides a high-speed solution for AI. "Whether running coding agents or voice chat, large language models on Cerebras deliver responses up to 15× faster than GPU-based systems. For consumers, this translates into greater engagement and novel applications. For the broader economy, where AI agents are expected to be a key growth driver over the coming decade, speed directly fuels productivity growth."
“We are delighted to partner with OpenAI, bringing the world’s leading AI models to the world’s fastest AI processor,” Andrew Feldman said. “Just as broadband transformed the internet, real-time inference will transform AI, enabling entirely new ways to build and interact with AI models.” The capacity will come online in multiple tranches through 2028.