With Project Lightning, Dell to strike out in new directions

Dell's Project Lightning is a parallel file system that will be tied to the vendor's AI Factory portfolio, but the promise of new hardware could also mean entry into new markets.

Dell Technologies will soon become a parallel file system provider to support AI workloads, and doing so could push the vendor into new markets or into a more specialized storage role.

In May at Dell Tech World 2024, Dell executives unveiled Project Lightning, an early-phase parallel file system for the vendor's scale-out unstructured data storage technology, PowerScale. Parallel file systems use software to distribute massive amounts of data across multiple arrays and servers while providing fast, concurrent access. High-speed storage and parallel access are often cited as two attributes needed to fully utilize the GPUs that have become critical to AI workloads.
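To illustrate the basic idea, here is a minimal conceptual sketch in Python -- not Dell's code, and the stripe map and node names are hypothetical -- showing why parallel access helps: a file striped across several storage nodes can be fetched with concurrent requests, so read throughput scales with the number of nodes instead of being capped by a single server.

```python
# Conceptual sketch only -- not Dell's implementation. A parallel file
# system stripes a file across storage nodes; reading the stripes
# concurrently multiplies aggregate throughput.
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stripe map: (node, path, offset, length) for each chunk.
STRIPES = [
    ("node-1", "/data/model.bin.0", 0 * 64 * 2**20, 64 * 2**20),
    ("node-2", "/data/model.bin.1", 1 * 64 * 2**20, 64 * 2**20),
    ("node-3", "/data/model.bin.2", 2 * 64 * 2**20, 64 * 2**20),
]

def read_stripe(stripe):
    node, path, offset, length = stripe
    # In a real system this would be a network read from the node;
    # the print stands in for the actual I/O call.
    print(f"fetching {length >> 20} MiB from {node}:{path} at offset {offset}")
    return b""  # placeholder for the fetched bytes

# All stripes are requested at once and reassembled in file order.
with ThreadPoolExecutor(max_workers=len(STRIPES)) as pool:
    data = b"".join(pool.map(read_stripe, STRIPES))
```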

In its initial rollout, Project Lightning will become part of the new Dell AI Factory, a portfolio of infrastructure and services for AI. The project also marks a new technology category for the vendor, built for existing PowerScale hardware. This is the second new category for the vendor this year, with the first being the Dell Data Lakehouse, which launched in March.

During his keynote on Day 2 of Dell Tech World, Dell COO Jeff Clarke called Project Lightning "game-changing." But executives shared few other details, noting only that the technology will be designed in-house and that Dell will not partner with or acquire the technology from a third party.

Still, experts have some ideas about what the new technology might mean for Dell. Aside from AI, high-performance computing (HPC) workloads -- which are complex, data-intensive workloads that require large amounts of compute -- are increasingly moving to the enterprise, according to Brent Ellis, an analyst at Forrester Research.

"There are not a lot of standard IT enterprise groups that can tolerate the amount of architecture decisions that have to go into implementing HPC," Ellis said.

Project Lightning would commercialize that architecture and software as a feature in PowerScale, which would reduce the training needed to run HPC on premises and make HPC more of an off-the-shelf service for customers, he said.

One-stop shop for AI?

Dell projects are internal initiatives that are still being researched and refined by the company, but the vendor offers them as a preview of upcoming technology.

In 2020, for example, Dell previewed Project Apex -- now Dell Apex -- as an everything-as-a-service cloud console that eventually unified all of Dell's as-a-service models. In 2022, Dell introduced Project Alpine as a series of cloud storage offerings that went on to become Apex Block Storage for Public Cloud, available on AWS and Azure, the following year.

Martin Glynn, senior director of product management for Dell's storage business unit, said Project Lightning will provide parallel access from a software layer, but he did not give details on possible hardware changes or a time frame.

According to Ellis, one avenue for Dell is to use some form of parallel NFS (pNFS), which separates metadata operations from data I/O so that clients can read and write to storage servers directly and in parallel, overcoming a common bottleneck. In 2010, pNFS was added as an optional feature of NFS version 4.1, a standard that supports clustered deployment and scalable parallel file access. PowerScale's operating system, OneFS, has supported NFS 4.1 since its 9.3 release in 2021.

Ellis said Dell could use NFS 4.1 to add Project Lightning as a feature to OneFS and that it "wouldn't be a big architectural change." But, he added, customers would likely need to invest in higher-performance hardware to reap all the benefits.
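To make the pNFS concept concrete, here is a rough Python sketch of that control/data split -- all class, server, and file names are hypothetical illustrations, not Dell's design or the actual NFS protocol. The point is that the client contacts a metadata server once for a "layout" describing where a file's segments live, then fetches those segments from the data servers directly and in parallel.

```python
# Rough illustration of the pNFS control/data separation -- hypothetical
# names, not a real NFS client. The metadata server hands out a layout;
# data then flows directly between the client and the data servers.
from concurrent.futures import ThreadPoolExecutor

class MetadataServer:
    """Stands in for the pNFS metadata server (MDS)."""
    def get_layout(self, filename):
        # A layout maps byte ranges of the file to data servers.
        return [
            {"data_server": "ds-1", "offset": 0,   "length": 100},
            {"data_server": "ds-2", "offset": 100, "length": 100},
        ]

def read_segment(segment):
    # In pNFS this would be a READ sent straight to the data server
    # named in the layout -- the metadata server is not in the data path.
    print(f"reading {segment['length']} bytes from {segment['data_server']}")
    return b"\x00" * segment["length"]

mds = MetadataServer()
layout = mds.get_layout("/ifs/training-set.bin")  # one metadata round trip

# The data segments are then fetched from the data servers concurrently.
with ThreadPoolExecutor() as pool:
    data = b"".join(pool.map(read_segment, layout))
```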

If Dell wants to be a one-stop shop for AI today, it needs to be prepared, according to Henry Baltazar, an analyst at 451 Research, which is part of S&P Global Market Intelligence. Having a parallel file system is part of that preparation, and Dell building its own technology instead of working with a third party could ensure seamless integration into the Dell stack.

"A lot of vendors have a hard time just acquiring something and dropping it in," he said.

Dell's track record of acquiring and integrating technology -- such as the scale-out file system Isilon, acquired by EMC in 2010, and data deduplication storage company Data Domain, acquired by EMC in 2009 -- has been successful, Baltazar said. However, integration can be a hurdle.

Beyond AI

Parallel file systems have found a place in the research, financial services, and media and entertainment industries, but things are changing, Baltazar said. Data-intensive workloads are becoming more prominent, with AI leading the charge.

"Now, you're talking about generative AI ... or large amounts of video like the Vegas Sphere, then you start thinking, 'Maybe now we do need a parallel file system to make sure we're getting the right outcomes,'" he said.

But a parallel file system like Project Lightning isn't limited to AI, Baltazar said. Dell might make Project Lightning part of its AI Factory, but data proliferation is occurring even without AI, and parallel file systems could become essential to storage and to how companies use the data they're generating.

"[Companies are] retaining more and more of it, it's hard to delete, and data doesn't stand still," he said.

Video and medical image resolution keeps going up, for example, which creates more data that will need to be stored and accessed.

Alvin Nguyen, an analyst at Forrester, echoed the point. Any use case that has multiple streams of data, particularly when there is a lot of data to read, can benefit from a parallel file system.

"There are a number of use cases, but video streaming and genomic sequencing are the two that would be able to push at the level we're seeing with generative AI," Nguyen said.

Those use cases that extend beyond AI might be beneficial to Dell in the long term, Ellis said.

"If you're Dell and your goal is to be able to continue to sell hardware to businesses, then you're going to have to start to move into less commodity use cases," he said.

Commodity hardware and software, or generic and interchangeable components, are cheaply provided by public clouds or managed service providers, and Dell needs to look into differentiating what it's offering to customers, Ellis said. A parallel file system could be part of that strategy.

"The world of selling generic x86 servers is becoming increasingly low-margin," he said. "A lot of it is eaten up by the cloud."

Adam Armstrong is a TechTarget Editorial news writer covering file and block storage hardware and private clouds. He previously worked at StorageReview.com.
