Red Hat to Acquire Neural Magic to Drive Gen AI Across Hybrid Cloud Environments

Red Hat, an open-source solutions provider, has announced its acquisition of Neural Magic, a Massachusetts-based company specialising in software and algorithms to accelerate generative AI (gen AI) inference workloads. The acquisition aims to make high-performance AI accessible across hybrid cloud environments, addressing key challenges in deploying large language models (LLMs), which typically require costly and resource-intensive infrastructure.

AI Accessibility and Cost-Efficiency with vLLM

"Neural Magic's expertise in inference performance engineering and commitment to open source aligns with Red Hat’s vision of high-performing AI workloads that directly map to customer-specific use cases and data, anywhere and everywhere across the hybrid cloud," Red Hat said in a statement this week.

Red Hat intends to address the challenges of building cost-efficient and reliable LLM services, which require significant computing power, energy resources and specialised operational skills, by making gen AI accessible to more organisations through the open-source vLLM project.

Originally developed at UC Berkeley, vLLM is a community-driven open-source project for open model serving (the way gen AI models run inference and answer requests), with support for all key model families, advanced inference acceleration research and diverse hardware backends, including AMD GPUs, AWS Neuron, Google TPUs, Intel Gaudi, Nvidia GPUs and x86 CPUs.
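To make the serving model concrete, below is a minimal sketch of vLLM's offline Python API. The Granite model name and the prompts are purely illustrative; any open model that vLLM supports could be substituted.

```python
# Minimal sketch: offline batch inference with vLLM's Python API.
# The model name below is illustrative; any vLLM-supported open model
# hosted on Hugging Face can be substituted.
from vllm import LLM, SamplingParams

prompts = [
    "Explain what hybrid cloud means in one sentence.",
    "List two benefits of quantising a large language model.",
]
sampling_params = SamplingParams(temperature=0.7, max_tokens=128)

# vLLM loads the model once and batches requests for high-throughput serving.
llm = LLM(model="ibm-granite/granite-3.0-8b-instruct")

outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.prompt)
    print(output.outputs[0].text)
```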

Red Hat says Neural Magic's expertise in the vLLM project, combined with Red Hat's portfolio of hybrid cloud AI technologies, will offer organisations an open pathway to building AI strategies that meet their unique needs.

Neural Magic's Expertise

Founded in 2018 as a spinout from MIT, Neural Magic has developed technologies for optimising AI models, particularly in model inference performance. Red Hat intends to leverage this expertise in the open-source vLLM project to democratise AI by offering scalable, cost-efficient options for businesses of all sizes, supporting various model types and hardware backends.

"AI workloads need to run wherever customer data lives across the hybrid cloud; this makes flexible, standardised and open platforms and tools a necessity, as they enable organisations to select the environments, resources and architectures that best align with their unique operational and data needs," said Matt Hicks, President and CEO of Red Hat.

Neural Magic uses its expertise in vLLM to build an enterprise-grade inference stack that enables customers to optimise, deploy and scale LLM workloads across hybrid cloud environments, with full control over infrastructure choice, security policies and model lifecycle. Neural Magic also conducts model optimisation research, develops LLM Compressor (a unified library for optimising LLMs with state-of-the-art sparsity and quantisation algorithms) and maintains a repository of pre-optimised models ready to deploy with vLLM.
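As a rough illustration of the workflow LLM Compressor targets, the sketch below applies one-shot 4-bit weight quantisation to a small open model. The import paths, modifier name, model and calibration dataset are assumptions drawn from the project's published examples and may differ between releases.

```python
# Rough sketch of one-shot weight quantisation with LLM Compressor.
# Import paths and arguments follow the project's published examples as of
# late 2024 and may differ between releases; the model and dataset names
# are illustrative.
from llmcompressor.modifiers.quantization import GPTQModifier
from llmcompressor.transformers import oneshot

# Recipe: GPTQ-style 4-bit weight quantisation, leaving the output head intact.
recipe = GPTQModifier(targets="Linear", scheme="W4A16", ignore=["lm_head"])

oneshot(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # illustrative base model
    dataset="open_platypus",                     # small calibration dataset
    recipe=recipe,
    max_seq_length=2048,
    num_calibration_samples=512,
    output_dir="TinyLlama-1.1B-Chat-v1.0-W4A16",
)
```

The saved checkpoint can then be served with vLLM in the same way as the pre-optimised models Neural Magic publishes.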

Expanding Red Hat’s AI Portfolio

With Neural Magic's capabilities, Red Hat plans to enhance its AI portfolio, including Red Hat Enterprise Linux AI (RHEL AI) for developing foundation models, Red Hat OpenShift AI for machine learning across Kubernetes environments, and InstructLab, a collaborative project with IBM to advance open-source models. This expanded portfolio will enable enterprises to fine-tune and deploy AI models with flexibility and security across corporate data centers, cloud platforms, and edge locations, Red Hat said.

Brian Stevens, CEO of Neural Magic, added: "Open source has proven time and again to drive innovation through the power of community collaboration. At Neural Magic, we’ve assembled some of the industry's top talent in AI performance engineering with a singular mission of building open, cross-platform, ultra-efficient LLM serving capabilities."

Dario Gil, IBM senior vice president and director of Research, said, "As our clients look to scale AI across their hybrid environments, virtualised, cloud-native LLMs built on open foundations will become the industry standard. Red Hat's leadership in open source combined with the choice of efficient, open source models like IBM Granite and Neural Magic's offerings for scaling AI across platforms empower businesses with the control and flexibility that they need to deploy AI across the enterprise."

The transaction is subject to applicable regulatory reviews and other customary closing conditions.
