Red Hat to Acquire Neural Magic to Drive Gen AI Across Hybrid Cloud Environments

Red Hat, an open-source solutions provider, has announced its acquisition of Neural Magic, a Massachusetts-based company specialising in software and algorithms to accelerate generative AI (gen AI) inference workloads. The acquisition aims to make high-performance AI accessible across hybrid cloud environments, addressing key challenges in deploying large language models (LLMs), which typically require costly and resource-intensive infrastructure.

AI Accessibility and Cost-Efficiency with vLLM

"Neural Magic's expertise in inference performance engineering and commitment to open source aligns with Red Hat’s vision of high-performing AI workloads that directly map to customer-specific use cases and data, anywhere and everywhere across the hybrid cloud," Red Hat said in a statement this week.

Red Hat intends to address the challenges of building cost-efficient and reliable LLM services, which require significant computing power, energy resources and specialised operational skills, by making gen AI more accessible to more organisations through the open-source vLLM project.

Originally developed at UC Berkeley, vLLM is a community-driven open-source project for open model serving (the process by which gen AI models run inference and produce answers), with support for all key model families, advanced inference acceleration research and diverse hardware backends, including AMD GPUs, AWS Neuron, Google TPUs, Intel Gaudi, Nvidia GPUs and x86 CPUs.
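To illustrate what serving with vLLM looks like in practice, the minimal sketch below runs offline batch inference with vLLM's Python API. It is an illustrative example only, not something described in the announcement: it assumes vLLM is installed on supported hardware, and the model identifier is a placeholder for any model vLLM supports.

from vllm import LLM, SamplingParams

# Prompts to run through the model in a single batch.
prompts = [
    "Explain hybrid cloud in one sentence.",
    "What does LLM inference serving involve?",
]

# Decoding settings: temperature, nucleus sampling and output length.
sampling_params = SamplingParams(temperature=0.7, top_p=0.95, max_tokens=128)

# Load the model; vLLM picks an available backend (e.g. an Nvidia or AMD GPU).
# The model name here is a placeholder, not one referenced in the article.
llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")

# generate() batches the prompts and returns one result per prompt.
for output in llm.generate(prompts, sampling_params):
    print(output.prompt)
    print(output.outputs[0].text)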

Red Hat says Neural Magic's expertise in the vLLM project, combined with Red Hat's portfolio of hybrid cloud AI technologies, will offer organisations an open pathway to building AI strategies that meet their unique needs.


Neural Magic's Expertise

Founded in 2018 as a spinout from MIT, Neural Magic has developed technologies for optimising AI models, particularly in model inference performance. Red Hat intends to leverage this expertise in the open-source vLLM project to democratise AI by offering scalable, cost-efficient options for businesses of all sizes, supporting various model types and hardware backends.

"AI workloads need to run wherever customer data lives across the hybrid cloud; this makes flexible, standardised and open platforms and tools a necessity, as they enable organisations to select the environments, resources and architectures that best align with their unique operational and data needs," said Matt Hicks, President and CEO of Red Hat.

Neural Magic uses its expertise in vLLM to build an enterprise-grade inference stack that enables customers to optimise, deploy and scale LLM workloads across hybrid cloud environments, with full control over infrastructure choice, security policies and model lifecycle. Neural Magic also conducts model optimisation research, develops the LLM Compressor (a unified library for optimising LLMs with state-of-the-art sparsity and quantisation algorithms) and maintains a repository of pre-optimised models ready to deploy with vLLM.
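As a rough illustration of that optimise-then-serve workflow, the sketch below applies one-shot INT8 quantisation with the LLM Compressor library and notes how the result can be served with vLLM. It is based on the library's public documentation rather than anything stated in the article; the module paths, arguments, model and dataset names are assumptions and may differ between releases.

from llmcompressor import oneshot
from llmcompressor.modifiers.quantization import GPTQModifier

# Recipe: GPTQ-style weight/activation quantisation, skipping the output head.
# Scheme and targets are illustrative defaults, not values from the article.
recipe = GPTQModifier(targets="Linear", scheme="W8A8", ignore=["lm_head"])

# One-shot compression with a small calibration dataset; the model, dataset
# and output directory are placeholders chosen for this example.
oneshot(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    dataset="open_platypus",
    recipe=recipe,
    output_dir="TinyLlama-1.1B-W8A8",
    max_seq_length=2048,
    num_calibration_samples=512,
)

# The compressed checkpoint can then be served directly with vLLM:
#   from vllm import LLM
#   llm = LLM(model="TinyLlama-1.1B-W8A8")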


Expanding Red Hat’s AI Portfolio

With Neural Magic's capabilities, Red Hat plans to enhance its AI portfolio, including Red Hat Enterprise Linux AI (RHEL AI) for developing foundation models, Red Hat OpenShift AI for machine learning across Kubernetes environments, and InstructLab, a collaborative project with IBM to advance open-source models. This expanded portfolio will enable enterprises to fine-tune and deploy AI models with flexibility and security across corporate data centres, cloud platforms, and edge locations, Red Hat said.

Brian Stevens, CEO of Neural Magic, added, "Open source has proven time and again to drive innovation through the power of community collaboration. At Neural Magic, we’ve assembled some of the industry's top talent in AI performance engineering with a singular mission of building open, cross-platform, ultra-efficient LLM serving capabilities."

Dario Gil, IBM senior vice president and director of Research, said, "As our clients look to scale AI across their hybrid environments, virtualised, cloud-native LLMs built on open foundations will become the industry standard. Red Hat's leadership in open source combined with the choice of efficient, open source models like IBM Granite and Neural Magic's offerings for scaling AI across platforms empower businesses with the control and flexibility that they need to deploy AI across the enterprise."

The transaction is subject to applicable regulatory reviews and other customary closing conditions.

Reported By

Kirpa B is passionate about the latest advancements in Artificial Intelligence technologies and has a keen interest in telecom. In her free time, she enjoys gardening or diving into insightful articles on AI.
