Data center infrastructure spending gets AI boost

IT solutions providers are finding a role in helping clients create the artificial intelligence infrastructure they need to handle emerging technology requirements.

AI and other transformative emerging technologies may command the spotlight, but behind the scenes, they drive data center infrastructure spending and plenty of business for channel partners.

Indeed, enterprises often need to update compute, storage, networking and data architecture when they adopt AI and other tech developments, such as IoT, edge computing and multi-cloud deployment. As a result, solutions providers report strong demand for those infrastructure essentials, which ride the coattails of the latest developments.

Tyler Beecher, CEO at Trace3, a solutions provider in Irvine, Calif., said part of his company's mission is to help clients get an early look at what's coming around the technology corner. The company's venture capital briefings introduce CIOs to Silicon Valley investors and entrepreneurs who provide a sense of the shape of things to come. Yet, Beecher noted, customers continue to spend heavily on the basics, even as they embrace the cutting edge.


"We found that, for every $1 spent on emerging technology, clients are spending $8 to $10 on core technology to support the changes or support bridging the gap from where they are to where they need to go," Beecher said.

Beecher said he sees continuing demand for foundational data center products and engineering services. The data center work is "in support of the business outcomes [that clients] are trying to drive with emerging technologies," he added. "We don't see that changing, but you need to show them tomorrow before they will invite you to execute on today."

Artificial intelligence infrastructure: A familiar ring

AI stands out as particularly prominent among the emerging technologies reshaping the data center. North American IT professionals polled by TechTarget ranked the desire to harness AI as among the leading purchase drivers for network servers, storage and data integration products (see graphic).

[Graphic: AI-driven infrastructure requirements are influencing data center infrastructure spending.]

The demand for IT essentials surrounding AI is reminiscent of the tech industry's experience during earlier technology waves. The rise of relational databases in the 1980s, for example, required supporting technologies for management and security.


"These new workloads are no different," said Juan Orlandini, chief architect for cloud and data center transformation at Insight, an IT solutions provider based in Tempe, Ariz. "You still have to do the blocking and tackling of it. How do I protect it? How do I secure it? How do I give it an appropriate amount of compute, networking, storage and connectivity?"


What is different is the vast amounts of data involved in deploying technologies like AI. "In the past, we used to think that gigs or terabytes were big numbers, and now we are talking about environments where petabytes are the norm," Orlandini said.

Against that backdrop, solutions providers must advise clients on the types of servers needed to support AI. That could mean machines with GPUs, application-specific integrated circuits or built-in AI acceleration engines. "You have to reconsider what kind of computation engine you need to provide in order to support the workload," Orlandini said.
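
As a rough illustration of that reconsideration, here is a minimal sketch, assuming PyTorch (the article names no framework), that selects the best available computation engine at runtime:

```python
# Minimal sketch: picking a computation engine for an AI workload.
# Assumes PyTorch; the article does not name a specific framework.
import torch

def pick_device() -> torch.device:
    """Prefer a CUDA GPU; fall back to the CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = pick_device()
model = torch.nn.Linear(1024, 256).to(device)  # placeholder model
batch = torch.randn(32, 1024, device=device)   # placeholder workload
output = model(batch)
print(f"Inference ran on: {device}")
```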

A storage upgrade may be on the data center infrastructure to-do list as well. AI's need for clean, reusable data sets is driving demand for storage and data warehousing, noted Charles Fullwood, director of the software solutions practice at Force 3, a cloud, networking and security solutions provider in Crofton, Md. He said emerging technologies are contributing to the demand for hyper-converged storage, for example.


Orlandini said vendors such as NetApp, Pure Storage and Dell offer specific products for addressing rigorous data workloads. Network-attached storage systems may be called upon to handle the unstructured data associated with AI. And storage devices may need to be heavily flash-optimized because of the performance requirements, he added.

Orlandini pointed to the storage demands of Nvidia's DGX-2 systems, which target AI and analytics, as a case in point. The DGX-2 comes equipped with eight 100 Gb network interfaces. "Keeping those eight interfaces busy takes a tremendous amount of performance out of the storage platforms," he said.
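
The arithmetic behind that observation is worth making explicit. A quick back-of-the-envelope calculation (ours, not Orlandini's) shows the aggregate bandwidth those interfaces represent:

```python
# Back-of-the-envelope math for the DGX-2 networking figure quoted above.
interfaces = 8
gbits_per_interface = 100                       # eight 100 Gb interfaces
total_gbits = interfaces * gbits_per_interface  # 800 Gb/s aggregate
total_gbytes = total_gbits / 8                  # ~100 GB/s the storage must feed
print(f"Aggregate: {total_gbits} Gb/s, or about {total_gbytes:.0f} GB/s")
```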

Networking is another component that may need an upgrade to work as part of an artificial intelligence infrastructure. An AI deployment may call for new connectivity between data sources and the storage system receiving the feeds, and between the storage system and the AI workload engine, Orlandini said.

Data as infrastructure

Raj Patil, CEO of Orion Business Innovation, a systems integrator based in Edison, N.J., said he views data as another infrastructure component. Organizations may have to revisit data architecture, which includes data governance and integration, as they look to adopt AI.

"AI has a lot of promise and a lot of value, but if your data is not in the form or shape to drive that, then your AI is not going to be of value," Patil said. "If the data is a mess, the AI is going to be a mess."

Addressing the infrastructure opportunity

Even as IT solutions providers ponder the infrastructure details of atypical applications, they may still find themselves in familiar territory.

Partner engagements may have a business-as-usual flavor, particularly among customers with well-disciplined IT organizations. New technology deployments, for example, require data protection and security measures just as previous generations of technology did. Customers with that understanding make the infrastructure task easier.

"There are some changes in what you are buying, but what you are doing tends not to change," Insight's Orlandini said.

Scoping out appropriate use cases for a technology such as AI poses a greater challenge than deploying the data center infrastructure on which it will reside. "The hard part is not really in the infrastructure," Orlandini said. "You still have to figure it out, but once you figure out what you are doing with AI, what to buy … it tends to be similar across all clients."

That said, solutions providers may need to learn how to work with new types of customers when working on emerging technology infrastructure.

"We don't need to become data scientists to support an AI workload, but we sure do need to understand how data scientists think," Orlandini said. "That is the heavy lift that we have to do."

Orlandini also stressed the importance of creating usable data.

"The hard part of AI is not the computation itself," he said. "The data prep and making sure the data is in a good state so computation can be done is more time-consuming and, in many ways, more difficult."

Customers planning AI systems need to consider governance -- where is the data going to come from, and what are the relevant regulatory constraints to consider in its use? -- and then devise a plan for making data usable for the business, Patil said.

When it comes to application data, other architectural changes are in store -- namely, microservices and containerization. Enterprises recognize a need to build code that is self-contained and can readily move among different cloud providers or different infrastructure stacks, Patil said. This ease of movement makes data available to AI services, which may run in different cloud providers' environments.

Microservices and containers "are certainly big drivers of the … new architectural thinking," he noted.
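
As a hypothetical sketch of the self-contained services Patil describes, the snippet below wraps a stub model behind an HTTP endpoint using Flask (our choice; the article names no framework). Packaged into a container image, the same service could move among cloud providers unchanged:

```python
# Hypothetical sketch of a self-contained, container-ready AI microservice.
# Flask is an assumed framework choice, and predict() is a stand-in model.
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict(features: list[float]) -> float:
    """Stand-in for a real model; returns a trivial score."""
    return sum(features) / max(len(features), 1)

@app.route("/predict", methods=["POST"])
def predict_endpoint():
    payload = request.get_json(force=True)
    score = predict(payload.get("features", []))
    return jsonify({"score": score})

if __name__ == "__main__":
    # Bind to all interfaces so the service is reachable inside a container.
    app.run(host="0.0.0.0", port=8080)
```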

AI, from cloud to on premises

Cloud-based AI services move the basic infrastructure to an external provider -- AWS, Google Cloud Platform or Microsoft Azure, for instance. Most customers launch their AI forays in the cloud, Insight's Orlandini said. With public cloud platforms, organizations can quickly develop a proof of concept or a working product, he noted.

"It's a great place to start," Orlandini said of AI in the cloud. "For many, it will stick there."

But other clients that begin AI pilots in the cloud may balk at the cost of building a production system in that setting. The expense of lifting and shifting massive amounts of data from on-premises data center infrastructure to the public cloud may prove prohibitive, Orlandini said. A client's regulatory and compliance environment may also compel it to launch AI systems on its own infrastructure, he added.

In general, enterprises recognize an ongoing role for in-house processing amid the growth of cloud computing.

"Customers want to go to the cloud and they are … going there in a controlled and sensible way," Force 3's Fullwood said. "They realize certain applications and workloads are not a good fit to move to the public cloud, and they are leaving them where they are."

As AI tools and models become containerized, the resulting portability will also let enterprises run AI in a distributed manner, Fullwood noted. Containers enable organizations to run AI models at the source of the data, permitting faster processing on a smaller scale, he noted. This edge computing approach provides an alternative to creating large, centralized data warehouses and consolidating data.

"With containers, you can run those AI models on the edge," Fullwood said.

Other factors influencing infrastructure requirements

Naturally, enterprises are responding to data center infrastructure purchasing drivers beyond AI. Fullwood cited the need for greater efficiency and automation, along with the infrastructure-as-code concept, as factors influencing investment in the data center.
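
As one hedged illustration of the infrastructure-as-code concept, the sketch below declares a storage bucket with Pulumi's Python SDK (a tool chosen for illustration; Fullwood names no product), so the environment lives in version control and can be recreated on demand:

```python
# Hedged sketch of infrastructure as code using Pulumi's Python SDK.
# The bucket name and tags are illustrative; run with the Pulumi CLI.
import pulumi
import pulumi_aws as aws

# Declaring the bucket in code means it can be reviewed, diffed and
# recreated on demand, rather than configured by hand in a console.
data_bucket = aws.s3.Bucket(
    "ai-staging-data",
    tags={"team": "data-platform", "purpose": "ai-staging"},
)

pulumi.export("bucket_name", data_bucket.id)
```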

Hybrid and multi-cloud deployments also play a role.

A hybrid cloud requires investment in "bridging" infrastructure, Fullwood noted. That bridging begins with the deployment of a new compute platform onto which an organization migrates applications and workloads, creating a private cloud. Once those assets are containerized and made portable, the private cloud becomes a steppingstone to the public cloud. Such hybrid environments also call for cloud management software to move workloads between private and public clouds, Fullwood said.

"Cloud is actually causing customers to spend more on the data center just to accommodate the hybrid cloud approach," he added.
