Dell AI Factory curates AI tech for customers
Dell showcased its AI Factory at Dell Technologies World 2024, and some early customers are already talking about the lift AI has given their businesses.
LAS VEGAS -- Dell made it clear at its annual user show: It wants to provide a one-stop shop for AI infrastructure.
The thread that keynotes and company executives wove through Dell Technologies World 2024 was singular: The Dell AI Factory -- a portfolio of Dell infrastructure from storage to servers to PCs, combined with a partner ecosystem and additional services -- is aimed at making it easier to build and run AI in the enterprise.
"The Dell AI Factory is a means for making the adoption of AI simpler for the average customer," according to Matt Baker, senior vice president of AI strategy at Dell.
The Dell AI Factory also encompasses Dell's partnerships with Nvidia, AMD and Intel for GPUs, Broadcom for networking and Microsoft for its Azure AI Services, which provide access to APIs, models and out-of-the-box AI tools. Dell is also partnering with Meta, maker of the Llama 3 large language model (LLM), and Hugging Face, an open source AI platform, for on-premises AI model deployment. Indeed, Dell unveiled at the conference its Enterprise Hub on the Hugging Face platform, a portal that provides access to LLMs that customers can deploy on Dell infrastructure.
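For readers unfamiliar with what on-premises deployment of an open model looks like in code, the sketch below shows a generic path using the open source Hugging Face transformers library. It is an illustration only, not Dell's documented Enterprise Hub workflow; the model ID, precision and generation settings are assumptions for the example.

```python
# Generic sketch of pulling an open LLM from Hugging Face for local inference.
# Not Dell Enterprise Hub code; model ID and settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # example open model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # reduced precision to fit GPU memory
    device_map="auto",           # place layers on available accelerators
)

prompt = "Summarize the benefits of running open models on-premises."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The appeal of a curated portal is that it bundles this kind of setup -- model weights, dependencies and hardware-matched configuration -- so customers don't have to assemble it piece by piece.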
Dell is also offering access to its AI Professional Services, which provide implementation services for Microsoft Copilot and accelerator services for Dell Enterprise Hub on the Hugging Face platform.
"Most enterprises will not train their own large language models. But they will use open source models like Llama 3, Mistral and others," said Jeff Clarke, chief operating officer at Dell, during his keynote on the second day of the show. Later, during a Q&A that directly followed his talk, he added: "We clearly can't scale to the entirety of the needs out there, so you'll see us partner to expand the capabilities."
Collectively, this is Dell's version of an existing concept known as the AI data pipeline, or the interconnection of tools that takes users from data to insights, according to Keith Townsend, principal of CTO Advisor, part of the Futurum Group.
"Dell AI Factory is the marketing term for the framework they are doing," he said.
In speaking with customers, Townsend said he saw a shift in how the message resonated before and after the keynotes. The conversation between Dell CEO Michael Dell and Nvidia CEO Jensen Huang on the first day of Dell Tech World added a layer of clarity to how customers see the AI Factory and how they can better use it.
Easing use through the factory
Customers on the showroom floor expressed cautious excitement at the potential of integrating Dell AI Factory technologies into their workloads. Most said they see AI as augmenting their work, and several said Dell's approach of packaging up the necessary tools is a good way to roll out the accelerated technology.
One Dell customer already using technologies in the AI Factory portfolio, McLaren Racing, pointed to Dell's PowerEdge XE9680 rack server as its "AI brain" for training and inference, according to Daniel Keyworth, the team's director of business technology.
The PowerEdge XE9680 was first introduced late last year and is a GPU-dense server that can house Nvidia, AMD and Intel accelerators. At the show, Dell introduced the latest version of the rack server, the Direct Liquid Cooled PowerEdge XE9680L, as part of its expanded Dell AI Factory with Nvidia. The rack server uses liquid cooling to allow for denser CPU and GPU configurations and is optimized for the Nvidia HGX B200 server board.
McLaren uses machine learning to run millions of simulations that help shape the design lifecycle of a car.
"AI on an exponential curve just allowed us really to supercharge our strategy around it," Keyworth said.
Using AI has helped McLaren make tweaks and changes in the digital world instead of making physical changes to a car, Keyworth said. Given the nature of the sport, the team has embraced AI because it takes over mundane aspects of the job, freeing up effort that can be redeployed elsewhere.
"AI isn't labor replacement, it is laborious replacement," Keyworth said.
Yan Chen, studio architect at Kennedy Miller Mitchell, one of the oldest existing Australian production companies and a Dell customer, echoed this sentiment. For visual effects, AI is less about replacing jobs and more about shortening the duration of tasks. An example he cited was matte painting, or creating the appearance of a realistic background in film. A matte painting may take an artist two weeks to complete, but with AI, the same artist can complete the painting within a few hours.
"That doesn't mean you're happy with the iteration," Chen said. "In the same two-week time frame, the artist can do 20 to 30 iterations until the director is happy."
Using technologies within the AI Factory, such as PowerScale for storage and Dell Precision workstations for rendering, can also save time in visual effects-heavy films, Chen said. Instead of shooting against a green screen that's then handed over to the special effects team to layer in the director's vision, AI allows for pre-visualization and post-visualization -- visualizing scenes before they are shot and adding effects after. Adding effects over the shot gives a blueprint of what is needed.
"With AI Factory, we're able to create an 80% look very quickly with real time technology," Chen said.
Lightning on the horizon
Storage was a major focus at Dell Tech World, with Dell unveiling a new PowerStore, a unified storage offering for the bulk of data enterprises generate, and a new PowerScale, a scale-out file system used for unstructured data and AI training. Both product lines are part of the Dell AI Factory portfolio.
"It all starts with data, because if you don't have any data, you don't have AI," Dell said at a press briefing following his opening keynote.
The new PowerStore Prime targets midrange unified and block storage systems that will power the buildout of AI workloads needed for local language model training, according to Brent Ellis, an analyst at Forrester Research.
With PowerScale, Dell is looking to expand on larger systems focused on training and saturating GPUs. Dell is currently developing the software capabilities to do that with a new parallel file system it's calling Project Lightning.
"Project Lightning, which isn't quite out yet, is forward-looking and really kind of targeting a world where AI use cases and HPC use cases start to emerge," Ellis said.
Project Lightning is being developed specifically for PowerScale to be used for AI training, Dell's Clarke said during the second-day keynote at Dell Tech World.
"GPUs are becoming much more powerful, which require more data and more throughput," Clarke said.
Clarke said Project Lightning will deliver a 20-fold performance increase and an 18.5-fold throughput increase compared with competitors, but he didn't offer much more detail.
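The throughput concern is easiest to see in a training loop: the GPU can only compute as fast as storage delivers batches. The sketch below is a generic PyTorch input pipeline reading from a shared file system mount -- it is not Dell or Project Lightning code, and the mount path, batch size and worker counts are illustrative assumptions.

```python
# Generic illustration of keeping GPUs fed from shared storage.
# Not Dell/Project Lightning code; paths and sizes are assumptions.
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

DATA_ROOT = "/mnt/shared_fs/training_data"  # assumed file system mount

dataset = datasets.ImageFolder(
    DATA_ROOT,
    transform=transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ]),
)

loader = DataLoader(
    dataset,
    batch_size=256,
    shuffle=True,
    num_workers=16,     # parallel reader processes pulling from storage
    pin_memory=True,    # faster host-to-GPU copies
    prefetch_factor=4,  # queue batches ahead of the GPU
)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
for images, labels in loader:
    # If this loop stalls waiting on the next batch, the accelerator sits
    # idle and storage throughput, not compute, is the limiting factor.
    images = images.to(device, non_blocking=True)
    labels = labels.to(device, non_blocking=True)
    # ... forward and backward passes would run here ...
    break  # one iteration shown for illustration
```

A parallel file system attacks exactly that stall point by letting many reader processes across many nodes pull data from storage at once.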
For Townsend, this minimal focus on Project Lightning is a bit of a missed opportunity for Dell.
"Dell could have focused on its leadership in data storage and being able to solve the I/O bottleneck problem of getting your data into the AI chips into their services into their AI factory," he said.
Lack of details isn't quelling customer interest. Storage innovations don't stand still, particularly in terms of scale and performance, McLaren's Keyworth said.
"Anything that is all-flash and super-fast is always on our radar," he said.
Adam Armstrong is a TechTarget Editorial news writer covering file and block storage hardware and private clouds. He previously worked at StorageReview.com.