Amazon's cloud computing division (AWS) announced on Tuesday that it will provide free computing power to researchers interested in using its custom artificial intelligence chips, aiming to challenge Nvidia's dominance in the field. The company is investing USD 110 million in the Build on Trainium program, which aims to expand AI research and training opportunities.
Access to AWS Trainium UltraClusters
The initiative will provide university-led research teams with access to AWS Trainium UltraClusters (collections of AI accelerators that work together on complex computational tasks) to explore new AI architectures, machine learning (ML) libraries, and performance optimisations.
Trainium Chips Built for Large-Scale AI Challenges
AWS Trainium, the ML chip purpose-built for deep learning training and inference, will enable researchers to tackle large-scale AI challenges. As part of Build on Trainium, AWS has created a Trainium research UltraCluster with up to 40,000 Trainium chips, which are designed for the unique workloads and computational structures of AI, Amazon said.
As part of Build on Trainium, AWS and AI research institutions are also establishing dedicated funding for new research and student education. "A researcher might invent a new model architecture or a new performance optimisation technique, but they may not be able to afford the high-performance computing resources required for a large-scale experiment," Amazon noted.
Collaborations with AI Research Institutions
The program will support a range of research areas, including algorithmic improvements and distributed systems, and foster collaborations between AI experts and research institutions like Carnegie Mellon University (CMU) and the University of California at Berkeley.
"Trainium is beyond programmable—not only can you run a program, you get low-level access to tune features of the hardware itself," said Christopher Fletcher, an associate professor of computer science research at the University of California at Berkeley, and a participant in Build on Trainium. "The knobs of flexibility built into the architecture at every step make it a dream platform from a research perspective."
Dedicated Funding and Support for AI Researchers
As part of Build on Trainium, selected research teams will receive AWS Trainium credits, technical support, and access to educational resources. Advances made under the program will be released as open source, according to Amazon.
Neuron Kernel Interface (NKI)
Amazon said these advancements are made possible, in part, by a new programming interface for AWS Trainium and Inferentia called the Neuron Kernel Interface (NKI). "This interface gives direct access to the chip's instruction set and allows researchers to build optimised compute kernels (core computational units) for new model operations, performance optimizations, and science innovations," Amazon said.
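To make the idea of an "optimised compute kernel" concrete, here is a toy sketch in plain NumPy. This is illustrative only and is not the actual Neuron Kernel Interface: real NKI kernels are written against Trainium's instruction set using AWS's Neuron SDK, which requires Neuron hardware and tooling. The function name and fusion choice below are invented for the example.

```python
import numpy as np

def fused_scale_add_kernel(a, b, scale):
    """A toy 'kernel': one fused computational unit that a framework
    could dispatch as a single device operation (scale, then add),
    instead of two separate passes over the data."""
    return a * scale + b

a = np.arange(4, dtype=np.float32)   # [0, 1, 2, 3]
b = np.ones(4, dtype=np.float32)    # [1, 1, 1, 1]
out = fused_scale_add_kernel(a, b, 2.0)
# out == [1., 3., 5., 7.]
```

The point of a kernel interface like NKI is that researchers can write such fused operations themselves, tuned to the chip, rather than relying only on the operations a framework already ships.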