Understanding data storage and its import eludes younger IT folks
Jon Toigo has a lot to say about the consistent disparagement and disregard many young people in the IT industry show toward data storage technology and management.
I was recently ranting about the lack of common sense and historical wisdom that I've been seeing among younger IT folks when it comes to understanding data storage technology. To assuage me, an associate quoted one of the ancient philosophers (historians are unclear about who actually said it) as saying the children of his day were "impatient of all restraint ... [and] talk as if they alone knew everything and what passes for wisdom with us is foolishness with them." Little has changed in 2,500 years.
A meeting I attended at a data storage software vendor is what set me off. I was invited with other analysts and consultants to a technical roadmap session. The company had recently changed management, and the goal was to help the new guy chart a course forward.
Of course, we all opined about changes on the horizon and the opportunities or obstacles they might create for the company. In my view, the challenges are simple: We've drifted into a technical idiocracy, in which the foundations of everything computer science has created since the von Neumann machine are being questioned. And this is occurring at a time when technology adoption in every aspect of our daily lives has accelerated, and more data is being produced than we have media of any type to store it on. From my perspective, this is a disaster in the making.
Nothing new
Market forces, not scientific experimentation, have mostly driven this deconstruction of technology. Storage is increasingly direct-attached and software-defined, and not because these approaches are original or new. We mainly used DAS when I started in IT more than 30 years ago, and it predated my career by at least 20 years.
I'm also increasingly convinced software-defined doesn't mean anything when it comes to understanding data storage technology today. IBM had software-defined storage down cold in the early 1990s with system-managed storage. However, this model for software-defined and centralized storage gave way to distributed systems, and we saw the rise of proprietary storage systems using onboard array controllers with lots of embedded value-add software. Software-defined moved from the server to the array controller.
Most of the kids I talk to in the industry know nothing of this. They only know that software-defined is the newish buzzword that vendor marketers everywhere use. These young folks are genuinely puzzled when I tell them that all storage is software-defined, and that the return to direct-attached topologies, now popular in discussions of software-defined hyper-converged data centers, is a step backwards -- a de-evolution. We've moved data storage topology away from shared, centrally managed storage and back toward locally managed, difficult-to-share storage silos.
The revolution that wasn't
Many young folks think the latest storage memes are revolutionary, and that this revolution began in the early 2000s. Because mine was one of the early voices decrying the mess the industry made of SAN and NAS, most youngsters take me for an early revolutionary -- a friend of the cloud and software-defined everything. They're surprised when I tell them hypervisor vendors hijacked that particular revolution early on. Instead of improving what needed to be improved -- like adding real management for heterogeneous kit and reducing the complexity and cost of storage arrays by sharing value-add software in a server-side, software-defined stack -- the hypervisor vendors did the opposite.
First, they abandoned the idea of commodity storage and substituted licensed storage nodes as a building block for their proprietary virtual SANs. And they did little or nothing to improve manageability and storage-sharing efficiency. This form of software-defined produced more siloed infrastructure that proved as expensive as the monolithic gear it replaced.
In other words, VMware -- and to some extent Microsoft and some of the other hypervisor vendors -- sought to become what IBM had been in the '70s: the big cheese. They created the fiction of the I/O blender effect and used popular resentment against the cost and complexity of contemporary SAN and NAS to revive a software-defined and DAS model. However, in my experience, younger IT folks often don't know that.
The software-defined storage revolution wasn't intended to be a radical return to proprietary, direct-attached kit. Mainly, it was an effort to automate and make transparent the delivery of data storage resources and services to workloads and their data in a more agile way, facilitating the rise of cloud computing and the decline of the IT-educated consumer. Unfortunately, all of the hype around this effort has led some people to believe it's already happened. It hasn't.
Losing sight of the truth
Vendors and customers talk about cloud storage as if it were the equivalent of flash, disk, optical and tape. They don't take it for what it is: a work-in-progress service-delivery model that uses the same storage media companies deploy in on-premises data centers, as old-fashioned as that sounds. One young fellow -- a self-styled entrepreneur -- at the software company's meeting explained that neither he nor his peers cared one whit about the physicality of storage anymore, meaning the media, cabling, topology or anything like that.
He routinely sets up businesses in a cloud in a by-the-numbers way. He finds suppliers of goods he can buy at wholesale and sell on the web at retail. Next, he signs on with a cloud service and uses predefined processes to build a web store and make an app with which to access it.
Infrastructure, like storage, doesn't enter into the equation. Building the virtual storefront is easy: He uses pictures of products secured from the web and templates for the storefront site, and he purchases services as needed from the cloud service provider for processes such as order taking, security checking, payment processing and order fulfillment. In short, he creates a business by linking those services, and he doesn't need to pay attention to the infrastructure underneath, especially the storage.
Because the world's becoming more and more like this, he argued, storage technology itself is obsolete and understanding data storage is unnecessary. I responded that the customer for infrastructure had simply changed from the business owner to the cloud service provider. Even in his business development model, someone somewhere had to provide the storage infrastructure.
He disagreed. New methods, such as containers, are enlisted to expedite app development and deployment, he said, and quicker application development has always been a goal in the do-everything-faster world of the cloud. With containers, programmers don't even concern themselves with data storage, mainly because the technology currently lacks any provision for data permanency: Turn the app off and you lose your data. He took this to mean data had become so ephemeral in value that its retention wasn't a priority. Storage is obsolete.
I was astounded. When I used to run data centers, if a programmer asked permission for a new application development effort that provided no data permanency, I'd have told him or her, "Don't let the door hit you on the way out." If I'd had time to vent, I would have talked about the regulatory and legal mandates around data retention that make proper storage so important. Or I might have waxed philosophical about the historical value of data or the definition of good data stewardship. I might even have pointed out the many, though admittedly disjointed, efforts to add data permanency to container programming -- most of which require an intimate understanding of how data makes its way from system memory through various I/O channels to target storage devices.
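To make that last point concrete, here's a minimal sketch of how a programmer closes that permanency gap, assuming the Python docker-py SDK and a local Docker daemon; the alpine image and the appdata volume name are purely illustrative choices, not anything discussed at the meeting:

```python
# A sketch of container data permanency, assuming the docker-py SDK
# ("pip install docker") and a local Docker daemon. The image (alpine)
# and the volume name (appdata) are illustrative placeholders.
import docker

client = docker.from_env()

# Ephemeral case: the file lands in the container's writable layer and is
# destroyed along with the container. Turn the app off, lose the data.
client.containers.run(
    "alpine",
    "sh -c 'mkdir -p /data && echo hello > /data/greeting.txt'",
    remove=True,
)

# Persistent case: /data is bound to a named volume, so the data outlives
# the container, but only because real storage still sits underneath it.
volume_binding = {"appdata": {"bind": "/data", "mode": "rw"}}
client.containers.run(
    "alpine",
    "sh -c 'echo hello > /data/greeting.txt'",
    volumes=volume_binding,
    remove=True,
)

# A brand-new container can read the file back, proving the data survived.
print(client.containers.run(
    "alpine",
    "cat /data/greeting.txt",
    volumes=volume_binding,
    remove=True,
).decode())
```

Which is exactly my point: That appdata volume still has to land on a disk or flash device somewhere, one that somebody has to provision, manage and protect.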
I'll be honest, that meeting and recent trips to Silicon Valley and elsewhere have me concerned for the future. With regard to understanding data storage, if younger folks continue to disregard and disparage this fundamental technology, next-generation IT pros might do an even worse job of preserving and protecting the bits than we do. That philosopher's words may have withstood thousands of years of technology change, but I have my doubts that they will make it another 2,500 years.