Spirent Communications today announced a test solution capable of emulating realistic Artificial Intelligence (AI) workloads over Ethernet. The company noted that Ethernet, the networking world's technology of choice, serves as the backbone of the cloud, making it crucial to emulate realistic AI traffic and test its impact on AI data center networks and interconnects.
Eric Updyke, CEO of Spirent Communications, emphasized the importance of AI in today's landscape, with applications like ChatGPT, Lensa, and Copilot reshaping communications.
"Hyperscale cloud providers are shifting investment from their traditional data center front-end focus to new back-end infrastructures needed to manage the explosion in AI applications and workloads," Updyke said. "These new environments are increasingly being built and operated separately from traditional data centers and are physically very different in order to cope with the specific needs of AI."
Simplified Testing Solutions
"Our new solution will enable engineers to test their Ethernet fabric without having to go to the expense of building a whole new lab of costly xPU servers and configure test cases to generate AI workloads using these real servers," said Updyke.
Spirent's new AI test solution, running on the A1 400G platform, is designed to emulate high-density 400G xPU workloads for AI environments. This solution allows customers to test their Ethernet fabrics in existing environments without the need for costly xPU server-equipped labs.
Multi-Purpose Platform Advantages
Using the RoCEv2 protocol, the platform offers ease of use, straightforward configuration, and consistent results, reducing the complexity of testing AI use cases. Additionally, it serves as a multi-purpose platform capable of testing both AI and routing/switching use cases concurrently, the company said.
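Spirent's emulation engine itself is proprietary, but the framing that any RoCEv2 workload generator must produce is defined by open standards: RoCEv2 carries an InfiniBand Base Transport Header (BTH) inside a UDP datagram addressed to the IANA-assigned port 4791. As an illustrative sketch only (not Spirent's implementation), the following Python packs a minimal BTH and a stream of RC SEND-only packets with incrementing packet sequence numbers, the kind of sequence a workload emulator generates and tracks:

```python
import struct

ROCEV2_UDP_PORT = 4791  # IANA-assigned well-known UDP port for RoCEv2

def build_bth(opcode, dest_qp, psn, se=0, migreq=1, pad=0, tver=0, pkey=0xFFFF):
    """Pack a 12-byte InfiniBand Base Transport Header (BTH),
    the header a RoCEv2 packet carries directly in its UDP payload."""
    flags = (se << 7) | (migreq << 6) | (pad << 4) | tver
    return struct.pack(
        ">BBHII",
        opcode,                # e.g. 0x04 = Reliable Connection SEND-only
        flags,                 # solicited event, migration, pad count, version
        pkey,                  # partition key
        dest_qp & 0x00FFFFFF,  # destination queue pair (24 bits; top byte reserved)
        psn & 0x00FFFFFF,      # packet sequence number (24 bits; AckReq bit left 0)
    )

# A synthetic burst of RC SEND-only packets with incrementing PSNs.
packets = [build_bth(opcode=0x04, dest_qp=0x12, psn=n) for n in range(4)]
```

A real emulator layers this inside Ethernet/IP/UDP at line rate and validates PSN ordering and congestion behavior on the receive side; the sketch only shows the header arithmetic involved.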
"We are proud to be first to market with our AI workload emulation platform as the AI landscape continues to develop and data center architectures evolve to cater to AI/ML workloads," Updyke added.