
HCI, acquisitions led 2018 data center news

In 2018, major data center vendors expanded portfolios with acquisitions for HCI and software-defined data center tech -- bolstering admin tool sets for cloud, containers and VMs.

This year's data center news focused on hyper-converged infrastructure, cloud and artificial intelligence. These developments can help admins expand their tool sets for software-defined data center management, cloud setups and shared resource allocation.

Here's a snapshot of the top data center developments and acquisitions from 2018.

Dell rearranges its HCI portfolio

Dell started 2018 with a reorganization of its converged platforms and solutions division. The company split up the technology for hyper-converged infrastructure (HCI) and converged infrastructure (CI) in hopes of gaining more HCI market share.

The restructuring placed HCI offerings -- including VMware products -- into the server division. Customers will find CI options within the storage group.

"Dell can gain economies of scale and commoditize CI and HCI through streamlining," said Mike Matchett, principal consultant and technology strategist at Small World Big Data. "Most users will see this move as leading to lower price points."

HPE acquires Plexxi

In 2018, Hewlett Packard Enterprise (HPE) picked up Plexxi and rebranded the company's networking technology as HPE Composable Fabric. The acquisition expanded HPE's HCI offerings and can help admins simplify software-defined data center installations.

HPE Composable Fabric delivers intent-based networking for the data center, which allows admins to define the network state and use automation to meet those specified goals.
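In practice, intent-based networking boils down to a declare-and-reconcile loop: compare the declared state with the observed state, then apply only the changes needed to close the gap. The Python sketch below illustrates that loop with an invented Intent type and an in-memory fabric stub; the names are ours for illustration and are not HPE Composable Fabric's actual API.

    # Hypothetical sketch of the intent-based pattern: the admin declares
    # the desired state, and a reconciliation loop closes the gap.
    # These names are illustrative only, not HPE Composable Fabric's API.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Intent:
        vlan_id: int
        allowed_ports: frozenset  # switch ports that should carry this VLAN

    class FabricStub:
        """In-memory stand-in for a real switch fabric."""
        def __init__(self):
            self.vlans = {}

        def ports_in_vlan(self, vlan_id):
            return set(self.vlans.get(vlan_id, set()))

        def add_port(self, port, vlan_id):
            self.vlans.setdefault(vlan_id, set()).add(port)

        def remove_port(self, port, vlan_id):
            self.vlans.get(vlan_id, set()).discard(port)

    def reconcile(intent, fabric):
        # The admin never scripts individual steps; the loop computes and
        # applies only the changes needed to reach the declared state.
        current = fabric.ports_in_vlan(intent.vlan_id)
        for port in intent.allowed_ports - current:
            fabric.add_port(port, intent.vlan_id)
        for port in current - intent.allowed_ports:
            fabric.remove_port(port, intent.vlan_id)

    fabric = FabricStub()
    reconcile(Intent(vlan_id=10, allowed_ports=frozenset({"eth1", "eth2"})), fabric)
    print(fabric.vlans)  # {10: {'eth1', 'eth2'}}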

"The next phase in HCI is blending in composable infrastructure to give larger IT shops more flexibility in drawing on a common pool of resources to feed different servers from bare metal, containers and cloud," said Ric Lewis, HPE's senior vice president and general manager of the company's software-defined and cloud group.

IBM rolls out private cloud designed for data

IBM announced the Cloud Private for Data tool in a move to improve AI and machine learning in the data center and to help admins use public cloud features for on-premises data workloads.

Through a partnership with Red Hat, IBM's Cloud Private for Data uses OpenShift to containerize legacy applications such as DB2, Cognos and SPSS; manage and clean data; and connect to public cloud applications for processing and storage.

"It is all about having just one cloud architecture to deliver applications and data across private and public clouds," said Rob Thomas, general manager of analytics at IBM. "And from a data perspective, it helps users establish the building blocks for AI."

Broadcom buys CA Technologies in surprise deal

Industry analysts were intrigued by Broadcom Inc.'s sudden purchase of CA Technologies because the two companies' portfolios have so little overlap. CA Technologies brings mainframe and enterprise infrastructure software expertise, which could help Broadcom build out its offerings to become an across-the-board infrastructure technology company, said Hock Tan, Broadcom's president.

Even with concerns regarding Broadcom's ability to manage CA Technologies' offerings, some analysts see opportunities for the company to expand into IoT tools and network management technology.

"They need to diversify their offerings to be more competitive given they primarily focus on chips, networking and the hardware space," said Judith Hurwitz, president and CEO of Hurwitz & Associates. "CA [Technologies] has done a lot of work on the operational and analytics side, so maybe [Broadcom] is looking at that as a springboard into the software enablement space."

Public cloud adoption climbs, prompts SD-WAN evaluation

The public cloud services market grew 21% in 2018, from $145.3 billion to $175.8 billion, according to Gartner, which forecasts another 17.3% of growth in 2019. The technology's top benefits include the flexibility to shift workloads and to host data off site.

This increase has led organizations to re-examine software-defined WAN (SD-WAN) technology as a way to curb rising usage costs.

"Enterprises are searching for ways to monitor their entire network, from the data centers to the WAN, in a more consistent fashion," said Joe Skorupa, vice president distinguished analyst at Gartner.

Cisco servers tap into Nvidia GPUs

In 2018, Cisco announced the UCS C480 ML M5 rack server, which contains Nvidia Tesla V100 Tensor Core GPUs linked with NVLink. Admins can manage the GPU-based server from the cloud to automate policies and operations across their infrastructures.

Cisco launched the UCS C480 ML in Q4, aiming the offering at sectors such as finance, healthcare and manufacturing.

The move positions Cisco to address growing interest among organizations in AI and machine learning workloads. According to a Gartner study, only 4% of CIOs currently run AI projects, but approximately 46% plan to pursue such projects or have already kicked off pilot programs.
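For teams piloting such workloads, the first step is usually just confirming the GPUs are visible to the training framework. The following sketch uses stock PyTorch -- not a Cisco tool -- and assumes a CUDA-enabled PyTorch build is installed on the server.

    # Minimal GPU smoke test with stock PyTorch; assumes a CUDA-enabled
    # PyTorch build. On a UCS C480 ML M5, the device names should report
    # Tesla V100 GPUs.
    import torch

    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            print(i, torch.cuda.get_device_name(i))
        x = torch.randn(4096, 4096, device="cuda")
        y = x @ x                  # quick matrix multiply to exercise a GPU
        torch.cuda.synchronize()   # wait for the kernel to finish
        print("GPU matmul OK:", tuple(y.shape))
    else:
        print("No CUDA devices visible")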

Red Hat releases RHEL 7.6, rolls 8.0 into beta

Red Hat saw big changes in 2018, most notably the announced IBM buyout and new Red Hat Enterprise Linux (RHEL) releases. RHEL 7.6 is touted as a foundation for hybrid cloud workflows, and it includes no changes driven by IBM.

RHEL 7.6 also adds support for Trusted Platform Module (TPM) 2.0, enhancements to the nftables firewall and support for RHEL System Roles. These updates make it easier for admins to automate and remotely manage enterprise Linux deployments.
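As a rough illustration of the firewall scripting this enables, the sketch below drives the standard nft command-line tool from Python to build a minimal ruleset. It assumes root privileges on an nftables-capable host, and the ruleset itself is our example, not Red Hat's.

    # Sketch: scripting a minimal nftables ruleset through the nft CLI.
    # Assumes root privileges; the ruleset is illustrative only.
    import subprocess

    def nft(*args):
        # Run one nft command and raise if it fails.
        subprocess.run(["nft", *args], check=True)

    # A table plus an input chain that drops traffic by default.
    nft("add", "table", "inet", "filter")
    nft("add", "chain", "inet", "filter", "input",
        "{ type filter hook input priority 0 ; policy drop ; }")

    # Allow replies to established connections, plus inbound SSH.
    nft("add", "rule", "inet", "filter", "input",
        "ct", "state", "established,related", "accept")
    nft("add", "rule", "inet", "filter", "input",
        "tcp", "dport", "22", "accept")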

Shortly after the 7.6 release, Red Hat announced the RHEL 8.0 beta. The biggest change for RHEL 8.0 is the addition of Application Streams, which divides user-space packages from the operating system kernel. This feature lets admins run developer applications on RHEL minor releases. Plus, the new Composer tool helps build packages and deploy RHEL-based images across hybrid clouds.

"RHEL 8.0 has a delivery stream for everything in the user space," said Ron Pacheco, director of product management at RHEL. "There will be one for MongoDB, Node.js or MySQL. So as these are being developed, they are sent upstream where app developers can access them from a minor RHEL release and not have to wait for a major release."
