Lessons in AI from Dell Technologies World 2024
A main focus of the Dell Technologies World 2024 conference was AI and how it impacts infrastructure environments. Dell spotlighted three AI infrastructure strategies to consider.
The Dell Technologies World 2024 conference came with an extra level of excitement compared to years past. That excitement was fueled by two letters: AI.
In the opening keynote, when Dell Technologies CEO Michael Dell mentioned the PowerEdge XE9680 server -- a server introduced over a year ago that can support eight Nvidia GPUs -- the audience erupted in applause. Infrastructure took center stage this year, as organizations have taken notice of the direct connection between the latest infrastructure technology and success with AI initiatives.
The keynote also featured Nvidia founder and CEO Jensen Huang, who reinforced the importance of infrastructure to the AI era. "Generative AI requires a new type of computing infrastructure -- an AI factory that produces intelligence," Huang said.
Generative AI and infrastructure environments
At TechTarget's Enterprise Strategy Group, we have been closely monitoring both the excitement around generative AI and the resulting impact on infrastructure environments. According to Enterprise Strategy Group research, 54% of organizations said they expect to have a generative AI project in production in the next 12 months. However, there is no single right infrastructure location for generative AI; the ideal deployment varies based on the organization and the use case.
Our study of both predictive and generative AI projects found that the most commonly identified deployment model is hybrid cloud, at 30%. Among the remaining deployment options, the percentage of organizations that identified public cloud providers, such as AWS, Azure or Google Cloud, as their primary AI infrastructure provider was slightly higher than the combined percentage of respondents that selected either data center or edge locations.
At the conference, Dell offered more detail on the Dell AI Factory with Nvidia, which was first announced at the Nvidia GTC event in March. This platform combines an organization's data and desired use case with Nvidia processors, Nvidia and third-party software, Dell services and infrastructure from both Dell and Nvidia.
Customers can either purchase integrated capabilities tailored to their specific needs or select prevalidated services for specific AI uses, such as deploying digital assistants for end users. The end goal of the Dell AI Factory with Nvidia is to accelerate time to value for AI initiatives.
How to modernize your infrastructure
Beyond the AI factory news, Dell made several additional announcements to help organizations modernize their infrastructures to support the requirements of either predictive or generative AI projects:
- PowerEdge XE9680L. A follow-on to the XE9680, this server supports eight Nvidia Blackwell GPUs in a denser 4U (rack unit) form factor and uses direct liquid cooling to improve overall cooling efficiency.
- PowerScale F910. All-flash file storage that offers improved performance and density over previous versions. It is Nvidia DGX SuperPod certified, providing an Ethernet-based storage option to support AI initiatives.
- Project Lightning. A new parallel file system software architecture that Dell anticipates integrating into PowerScale to accelerate file storage performance, likely for high-performance computing and AI training initiatives.
- PowerStore Prime. A higher-performing PowerStore model with a 5:1 data reduction guarantee.
It is essential for any organization exploring generative or predictive AI initiatives to consider all deployment options. According to a forthcoming Enterprise Strategy Group study, 84% of organizations said they agree that "The growth of AI (including generative AI) has us reevaluating our application deployment strategy."
3 bonus AI infrastructure strategy factors
In addition to the essential criteria organizations must evaluate regarding new AI initiatives, such as security, cost, data locality, data governance and performance, Dell Technologies World gave organizations three more AI infrastructure strategy factors to consider.
The necessity of third-party services
During the Day 2 keynote session, Dell Technologies COO Jeff Clarke explained an important reality in generative AI adoption, saying, "Most enterprises will not train their own models." This is by design. With the increasing availability of open source models, organizations can create value through the lower-cost tuning or augmentation of existing models. This space is evolving quickly.
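To make the augmentation approach concrete, the sketch below shows one common pattern, retrieval-augmented generation, in which an application supplies relevant enterprise content to an existing model at prompt time rather than retraining it. This is a minimal illustration, not Dell's or Nvidia's implementation; the sample documents, the keyword-based retriever and the choice of an open source model loaded through the Hugging Face transformers pipeline are assumptions made for the example.

```python
# Minimal retrieval-augmented generation sketch. Assumptions: a small open source
# model loaded through the Hugging Face transformers pipeline, and a naive keyword
# retriever standing in for a real vector database.
from transformers import pipeline

DOCUMENTS = [
    "Refunds are processed within 10 business days of receiving the returned item.",
    "Standard shipping takes three to five business days within the continental U.S.",
]

def retrieve(question: str) -> str:
    """Return documents that share a keyword with the question (toy retriever)."""
    words = {w.strip("?.,").lower() for w in question.split()}
    hits = [d for d in DOCUMENTS if any(w and w in d.lower() for w in words)]
    return "\n".join(hits) if hits else "No relevant documents found."

def answer(question: str) -> str:
    # distilgpt2 is only a stand-in; any locally hosted open source LLM could be used.
    generator = pipeline("text-generation", model="distilgpt2")
    prompt = (
        "Answer the customer question using only the context below.\n"
        f"Context:\n{retrieve(question)}\n\nQuestion: {question}\nAnswer:"
    )
    return generator(prompt, max_new_tokens=60)[0]["generated_text"]

if __name__ == "__main__":
    print(answer("When will my refund be processed?"))
```

In a real deployment, the retriever would typically be a vector database and the model a larger, instruction-tuned LLM sized to the use case, which is exactly where model selection and rightsizing expertise comes in.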
Identifying the correct model and rightsizing the infrastructure requires expertise that many organizations lack in-house but that is essential to ensuring AI initiatives meet the organization's ROI goals. Multiple leaders in this space, including Dell Technologies, Nvidia, major cloud providers and leading consulting firms, offer AI services. For example, Dell Technologies offers services to help organizations define their AI strategy, identify and prioritize use cases, prep data, and select and tune models.
The value of a larger infrastructure modernization approach for AI
With AI initiatives expected to drive budgets for the next 12 to 24 months, organizations have an opportunity to do more than just deploy a new silo of infrastructure. For example, internally developed applications that use generative AI models will likely be container-based. Using AI investments to accelerate container-based modernization can help increase the ROI of generative AI initiatives. On the show floor at the event, for instance, Dell Technologies and Red Hat presented a demo of a validated design of Red Hat OpenShift AI on the Dell Apex Cloud Platform.
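To illustrate why container-based deployment is a natural fit, the sketch below wraps a generative model call in a small HTTP service of the kind that would be packaged into a container image and run on a platform such as Red Hat OpenShift. The FastAPI framework, the endpoint name and the stubbed generate() call are assumptions for illustration; this is not part of the Dell and Red Hat validated design.

```python
# Minimal containerizable inference service sketch. Assumptions: FastAPI for the
# HTTP layer and a stubbed generate() call standing in for a real model client.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="genai-assistant")  # hypothetical service name

class Prompt(BaseModel):
    text: str

def generate(prompt: str) -> str:
    # Placeholder for a call to a locally hosted or managed LLM endpoint.
    return f"[model response to: {prompt}]"

@app.post("/generate")
def generate_endpoint(prompt: Prompt) -> dict:
    return {"completion": generate(prompt.text)}

# Run locally with: uvicorn <module_name>:app --port 8080
```

Packaging a service like this into a container image is what lets the same generative AI workload move between on-premises clusters and cloud container platforms.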
The need for modern liquid cooling options
Direct liquid cooling is a key technology behind the GPU density of the newly announced PowerEdge XE9680L. For decades, IT teams have been trained to keep liquids away from technology. To take advantage of the latest GPU-dense infrastructure, however, the industry needs to become more willing to consider and embrace liquid-cooled architectures.
The AI space is quickly evolving, and Dell continues to invest in innovation, expertise, services and partnerships to stay at or near the forefront. Businesses also need to move fast, or they risk being left behind. As Michael Dell said on stage, "If you have a call center and you are not already leveraging a large language model to assist, you are already behind."
Scott Sinclair is Practice Director with TechTarget's Enterprise Strategy Group, covering the storage industry.
Enterprise Strategy Group is a division of TechTarget. Its analysts have business relationships with technology vendors.