
WEKApod appliance built for Nvidia GPUs a first for company

Weka enters new territory with WEKApod, an appliance purpose-built for Nvidia DGX SuperPod. The offering is part of a major push from storage vendors vying to work with the GPU maker.

Weka has released its first hardware offering, designed specifically for Nvidia's DGX SuperPod AI data center infrastructure.

WEKApod, unveiled at this week's Nvidia GTC 2024, is an Nvidia-certified storage appliance, a credential indicating that the hardware and software meet Nvidia's standards to underpin its DGX H100 systems. The parallel file system and software-defined storage vendor joins a string of other SuperPod-certified storage providers, including DataDirect Networks (DDN), IBM, NetApp and Vast Data, but Weka said its appliance can deliver high performance while using less space and energy than competitors.

Storage vendors are racing to partner with GPU maker Nvidia as AI takes hold in the enterprise, but Weka is attempting to set itself apart with its new storage appliance's form factor and performance claims, according to Nick Patience, an analyst at 451 Research, part of S&P Global Market Intelligence.

"Customers like to have boxes … to reduce the complexity of implementation. They like to have everything in a set form factor of hardware, [and in this case] with Weka software," he said.

While WEKApod is part of an AI stack designed to run on premises, Weka's software-defined storage can be used independently in the cloud for the same use cases, Patience said.

WEKApod and high performance

Weka partnered with an original design manufacturer to produce the branded hardware, but the appliance is supported by Weka. The hardware uses the latest PCIe Gen 5 and delivers 18.3 million IOPS and 765 GBps of throughput in a single 1 PB cluster, according to the vendor. Weka said a minimum cluster of eight nodes can deliver this level of performance.
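To put those vendor-quoted figures in per-node terms, here is a minimal back-of-envelope sketch in Python. It assumes the quoted cluster totals correspond to the eight-node minimum configuration Weka cites; the per-node values are derived estimates, not numbers Weka has published.

```python
# Back-of-envelope math, assuming the vendor's quoted cluster figures
# (18.3 million IOPS, 765 GBps) apply to the eight-node minimum configuration.
cluster_iops = 18_300_000  # vendor-quoted IOPS for a 1 PB cluster
cluster_gbps = 765         # vendor-quoted throughput in GBps
min_nodes = 8              # minimum cluster size Weka cites

iops_per_node = cluster_iops / min_nodes
gbps_per_node = cluster_gbps / min_nodes

print(f"~{iops_per_node:,.0f} IOPS per node")  # ~2,287,500 IOPS
print(f"~{gbps_per_node:.1f} GBps per node")   # ~95.6 GBps
```

Under those assumptions, each appliance node would account for roughly 2.3 million IOPS and about 96 GBps of throughput.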

The software-hardware combination and its related metrics show Weka can provide the storage infrastructure for the highest-performance workloads, according to Steve McDowell, an analyst and founding partner at NAND Research. He specifically pointed to WEKApod's ability to achieve such high bandwidth, which means Weka can meet customer demand while also providing room for growth.

"This is proof that Weka is fast, or faster than anybody else who can deliver for the SuperPod," he said.

Currently, Weka is targeting early AI adopters or companies that need upper-tier performance, according to Patience. But he said that in the next two years, he expects to see more widespread adoption and an adjustment in AI demands.

"As the shift moves from a focus on training -- a practice undertaken by cutting-edge customers [such as] financial services, life sciences, etc. -- to inferencing, there will be a bigger opportunity for a company like Weka and its competitors," Patience said.

AI-specific storage, reducing power and footprint

This isn't Weka's first time working with Nvidia. It is also a certified partner for Nvidia DGX BasePod, where Nvidia recommends a reference architecture but storage vendors like Weka can customize the offering. DGX SuperPod, on the other hand, is a turnkey product, offering storage vendors less flexibility.

Weka also announced at GTC that it is part of a new storage validation program for Nvidia OVX computing systems, which combine high-performance GPUs, high-speed storage access and low-latency networking. OVX use cases include digital twins, AI training and inferencing.

Weka claims WEKApod is one of the few storage products on the market designed specifically for AI. Some vendors are rebranding existing storage as AI-ready, but those products weren't purpose-built to use data pipelines efficiently or to feed them with the volume of data AI workloads require, according to Weka.

On top of this, Weka underscored that WEKApod takes up less rack space in a company's data center and provides more performance per server, thereby consuming less power than competitors.

"If you offer better performance, you need fewer servers and you use less power," said Dave Raffo, an independent storage analyst.

Other vendors build products that fit Weka's definition of purpose-built AI storage, according to Raffo. He pointed to DDN, whose AI400X2 appliance is also SuperPod-certified, as an example. DDN also made news at GTC, unveiling its AI400X2 Turbo appliance at the show.

Power and footprint are important for cloud providers that offer GPU as a service, McDowell said. This is an area where vendors such as Weka or Vast Data will architect storage differently.

"For people building big training clusters, this use case quickly outgrows traditional storage architecture," McDowell said.

Adam Armstrong is a TechTarget Editorial news writer covering file and block storage hardware, and private clouds. He previously worked at StorageReview.com.
