
It's time to rethink enterprise storage for the AI era

Pure's platform-centric storage strategy will enable customers to scale their storage infrastructure to keep pace with the fast-evolving AI landscape.

At Pure Storage's recent Accelerate conference, the all-flash storage pioneer shared its strategy and vision spanning security, AI, new software, hardware, as-a-service delivery and more.

The announcement that struck me the most was Pure's move to embed its Fusion management and control software directly into the core Purity operating system. The implications for Pure customers, prospective customers and the market more broadly could be significant.

To understand why, let's take a step back and consider that the way enterprises buy, deploy, configure and manage their data center storage has not fundamentally changed over the last 30 years.

The pattern is familiar: business leaders decide they need a new application or service, and eventually the storage and infrastructure team buys or configures a storage system to support that new workload. Though this process satisfies the individual business requirement, the net result for most organizations has been a patchwork of storage islands, each supporting different technologies, protocols and service-level agreements. For most, this is an inefficient and complex mess that further burdens already stretched IT teams.

Research from TechTarget's Enterprise Strategy Group found this storage complexity manifests in numerous ways: storage provisioning is still a laborious, time-consuming process for many, and finding the skills to maintain such environments is a struggle. Complexity hampers broader IT automation efforts, and storage becomes a barrier to adopting modern application architectures, such as containers. And this is before you factor in that organizations increasingly want to manage storage both on premises and in the public cloud. Net-net, this silo-centric mindset ultimately slows an organization's ability to deliver new projects and initiatives on time and on budget.

None of these are new problems, but they are becoming more challenging in the face of continued data growth, hybrid cloud and the rise of new application architectures designed to support, among other things, AI workloads. In particular, the success of an AI project depends heavily on having the right data, at the right time, in the right location. At a time when just about every organization is figuring out its AI play, the imperative to solve these underlying storage architecture issues has never been stronger.

Pure's platform play

Back to Pure Storage and Fusion. Pure's big idea is that it can uniquely enable a transition from a product-centric approach to a platform-oriented one. If that sounds vague in an industry where every vendor seems to have a platform, think of it instead as managing a fleet of storage systems rather than individual arrays.

This message should particularly resonate with larger customers, who might have tens or even hundreds of arrays, each of which must be configured and managed individually. But any organization with more than one array installed should be able to benefit.

By embedding Fusion into its Purity storage operating system, Pure is creating a control plane for data storage that spans all its arrays -- and is backward compatible -- incorporating block, file and object, scale-up and scale-out, and high performance as well as high-efficiency storage. The resulting "cloud of storage" enables data across all systems to be managed globally on a policy basis and dynamically adjusted over time to cater to changing requirements across a full span of workloads, including structured and unstructured data, and across on-premises and cloud-based locations.
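
To make the fleet-and-policy idea more concrete, here is a minimal sketch of how policy-driven placement across a pool of arrays might work. It is illustrative only: the array names, policy fields and selection logic are assumptions made for the example, not how Fusion actually implements this.

    # Illustrative only: a toy model of policy-based placement across a
    # fleet of arrays. Array names, policy fields and selection logic are
    # assumptions for this example, not Pure's actual Fusion behavior.

    FLEET = [
        {"name": "array-a", "tier": "performance", "free_tib": 40,  "protocols": {"block", "file"}},
        {"name": "array-b", "tier": "efficiency",  "free_tib": 250, "protocols": {"file", "object"}},
        {"name": "array-c", "tier": "performance", "free_tib": 5,   "protocols": {"block"}},
    ]

    POLICY = {"tier": "performance", "protocol": "block", "min_free_tib": 10}


    def place_volume(size_tib: float, policy: dict, fleet: list) -> str:
        """Pick the first array that satisfies the policy and has headroom."""
        for array in fleet:
            if (array["tier"] == policy["tier"]
                    and policy["protocol"] in array["protocols"]
                    and array["free_tib"] - size_tib >= policy["min_free_tib"]):
                return array["name"]
        raise RuntimeError("No array in the fleet satisfies this policy")


    if __name__ == "__main__":
        # The control plane, not the admin, decides where the volume lands.
        print(place_volume(8, POLICY, FLEET))  # -> array-a
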

This means capacity previously "trapped" in one system could easily be tapped to support a fast-growing workload on another system, without having to buy a new array. That could be a major benefit in an industry where average storage system utilization still struggles to reach 50%. It would also benefit developers who need storage for a new application, since they could locate and provision it themselves via APIs. This could be particularly important as organizations continue to mature their AI strategies, and as data engineers look to quickly tap storage resources to feed AI models for inference, retrieval-augmented generation, prompt engineering and the like.
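
The developer self-service scenario could look something like the sketch below, which requests a volume from a hypothetical fleet-management REST endpoint and lets the control plane decide where it lands. The URL, payload fields and token handling are illustrative assumptions, not Pure's actual Fusion API.

    # Illustrative only: self-service provisioning against a hypothetical
    # fleet-management REST endpoint. The URL and payload fields are
    # assumptions for this example, not the real Fusion API.
    import os
    import requests

    API = "https://fleet.example.internal/api/v1"        # hypothetical endpoint
    TOKEN = os.environ.get("STORAGE_API_TOKEN", "demo")  # hypothetical auth token


    def provision_volume(name: str, size_tib: float, policy: str) -> dict:
        """Ask the control plane for a volume; it picks the array per policy."""
        resp = requests.post(
            f"{API}/volumes",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={"name": name, "size_tib": size_tib, "placement_policy": policy},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()  # e.g. which array the volume landed on


    if __name__ == "__main__":
        vol = provision_volume("rag-embeddings-01", 2, policy="performance-block")
        print(vol)
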

Pure says it is uniquely positioned to deliver this, not only because of the new Fusion capability, but also because it can deliver nondisruptive upgrades through its Evergreen subscriptions, as well as its increasingly popular storage-as-a-service offering. This combination is designed to ensure that Pure's customers can evolve their storage environments to meet critical current and future challenges: AI readiness, cyber resilience, modern application support and hybrid cloud realization.

Pure will not be the only storage vendor talking up a platform approach: NetApp is also beginning to couch its capabilities in these terms as part of its broader rebranding around the "intelligent data infrastructure," and others will follow.

Nonetheless, Pure has a strong track record of innovation -- not just at the product level, but in helping organizations simplify the way they buy and manage enterprise storage. The latest set of announcements from Accelerate demonstrates that Pure remains resolutely committed to continuing along that path and to helping customers ensure their critical storage infrastructure keeps pace with the fast-evolving IT landscape.

Simon Robinson is principal analyst covering infrastructure at TechTarget's Enterprise Strategy Group.

Enterprise Strategy Group is a division of TechTarget. Its analysts have business relationships with technology vendors.
