
Cloud maturation, data mesh among 2021 top analytics trends

After years of slow growth, this was the year analytics in the cloud became the norm and on-premises BI receded to the background. It was also the year data mesh emerged.

The analytics trends that shaped 2021 favored function over flash.

While vendors continue to make progress with augmented intelligence and machine learning capabilities such as natural language processing (NLP) and AutoML, the functionality remains limited, and research by David Menninger, an analyst at Ventana Research, shows that user adoption remains low.

So rather than a surge in the use of natural language query tools or features designed to enable users who aren't familiar with code to build and train machine learning models, one of the main analytics trends that shaped 2021 was realizing the cloud's potential.

While the cloud has grown in use in recent years, most prominent and proven analytics vendors were startups before the advent of the cloud. They built their platforms to run on premises, and it's taken time for them to rebuild for the cloud.

In 2021, however, the cloud took center stage.

"Cloud deployments of analytics have really caught up to the hype or the promise," Menninger said. "That's probably the biggest trend. We're trending toward natural language processing, but the reason I'm not calling it the biggest trend is that … NLP is still used by a small minority of organizations. Vendors may be making progress there, but it's not widely adopted yet."

Beyond the cloud, other functional but not flashy analytics trends that were prominent in 2021 include data mesh and data governance.

Here's a look at some of the analytics trends that shaped 2021, according to industry insiders.


The cloud comes of age

Analytics vendors simply needed time.

The concept of cloud computing dates back to the 1960s, but it wasn't until the second decade of the 21st century that tech giants AWS, Google and Microsoft built easy-to-use clouds where organizations could store and access data.

Many analytics vendors, however, had already been founded and had developed platforms aimed at on-premises users.

Domo, founded in 2010, was one of the first to be cloud-native from its inception. But even ThoughtSpot, born in 2012 and forward-thinking enough to develop a platform built on AI and machine learning capabilities, didn't immediately recognize the promise of the cloud.

In recent years, as cloud data warehouses such as Amazon Redshift, Google BigQuery and Microsoft Azure Synapse Analytics have become more popular, and newcomers like Snowflake and Databricks have exploded onto the scene, analytics vendors have recognized the need to become cloud-first.

Speed and scalability are two of the benefits of the cloud, which analytics vendors are now prioritizing over on-premises enterprise platforms.

As a result, analytics vendors such as MicroStrategy, Qlik and ThoughtSpot have all made strategic shifts in the past couple of years. Broad-based tech companies like IBM, Oracle and SAP, all of which offer BI platforms in addition to other capabilities, also rebuilt their platforms to make them cloud-first.

"Most vendors were on premises," Menninger said. "There were only a few, like Domo, that were cloud-native. For those other vendors to get there, they had to do a lot of work. All have now made significant strides to where they're talking about or have achieved parity on the web versus on premises. Before this year, not all vendors had achieved parity."

Likewise, Elif Tutuk, vice president of innovation and design at Qlik, noted that a significant analytics trend is cloud-first architectures finally becoming the norm.

And by pairing those cloud-first analytics platforms with cloud data warehouses, analytics vendors are enabling customers to query data and get responses much faster than in the past, resulting in near real-time insights.

"There has been a focus on re-architecting platforms for the cloud -- that's a trend we're seeing more," she said. "With that, there's the movement of data to hyperscale warehouses in the cloud. That is a shift that's happening more on the data side, but it's affecting analytics providers who have to enable users to have access to that data in real time."

Data mesh

Data mesh, a term coined and defined in 2019 by Zhamak Dehghani, now director of emerging technologies at ThoughtWorks, is a decentralized approach to analytics.

And it may just be the next big analytics trend.

A typical data architecture includes tools for data ingestion, data storage in warehouses and data lakes, staging to create data sets ready for analysis, and business intelligence for the analysis that leads to insights and decisions. And most data architectures are overseen by a single IT or data operations department.

It's straightforward but monolithic, and in many cases all the tools are provided by a single tech giant such as AWS, Microsoft or Oracle. Now, even vendors like Qlik and Tableau that formerly focused solely on one aspect of the analytics process offer end-to-end platforms.
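To illustrate that centralized pattern, here is a minimal, hypothetical sketch in Python of a single team's pipeline, with ingestion, staging and the BI-facing data set all owned in one place. The data, table and function names are invented for illustration and aren't tied to any particular product.

```python
# A minimal, hypothetical sketch of a centralized analytics pipeline:
# one team owns ingestion, staging and the BI-facing data set end to end.
# All names are illustrative, not tied to any specific product.

RAW_ORDERS = [
    {"order_id": "1001", "region": "EMEA", "amount": "120.50"},
    {"order_id": "1002", "region": "AMER", "amount": "80.00"},
]


def ingest(raw_rows):
    """Land raw records as-is (the data lake step)."""
    return list(raw_rows)


def stage(landed_rows):
    """Clean and type the data into an analysis-ready data set."""
    return [
        {"order_id": r["order_id"], "region": r["region"], "amount": float(r["amount"])}
        for r in landed_rows
    ]


def build_bi_summary(staged_rows):
    """Aggregate into the table a dashboard would query."""
    totals = {}
    for row in staged_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals


# The whole flow runs under one central team's control, start to finish.
print(build_bi_summary(stage(ingest(RAW_ORDERS))))
```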

Data mesh, however, is a distributed method of data management and analytics that emphasizes domain expertise while easing the burden on centralized teams that otherwise must manage exponentially growing amounts of data.

Data mesh is a federated approach built on an organizational microservices architecture with each department -- or domain -- overseeing and working with its own data.
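To make that federated pattern concrete, here is a minimal, hypothetical sketch in which each domain registers the data product it owns in a shared catalog that other teams can search. The class and field names are invented for illustration and don't come from any particular data mesh tooling.

```python
# A minimal, hypothetical sketch of the data mesh idea: each domain team
# owns and publishes its data as a "data product" that other teams can
# discover through a shared, lightweight catalog. Names are illustrative only.
from dataclasses import dataclass, field


@dataclass
class DataProduct:
    """A data set owned and served by a single domain team."""
    domain: str          # owning domain, e.g. "sales" or "shipping"
    name: str            # product name within that domain
    owner_email: str     # who to contact about quality or schema changes
    schema: dict         # column name -> type, published by the domain
    endpoint: str        # where consumers read the data (table, API, topic)


@dataclass
class DataProductCatalog:
    """Federated registry: domains register products; consumers discover them."""
    _products: dict = field(default_factory=dict)

    def register(self, product: DataProduct) -> None:
        self._products[f"{product.domain}.{product.name}"] = product

    def find(self, domain: str) -> list:
        return [p for p in self._products.values() if p.domain == domain]


# Usage: the sales domain publishes its own product; analysts discover it.
catalog = DataProductCatalog()
catalog.register(DataProduct(
    domain="sales",
    name="daily_orders",
    owner_email="sales-data@example.com",
    schema={"order_id": "string", "order_date": "date", "amount": "decimal"},
    endpoint="warehouse.sales.daily_orders",
))
print([p.name for p in catalog.find("sales")])
```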

"Data mesh has really taken off this year," said Donald Farmer, founder and principal of TreeHive Strategy. "Rather than being single-product architecture, it's a way of piecing together an enterprise architecture out of many components which are put together somewhat ad-hoc.

"It's a very distributed architecture, and it's about how you join those pieces together rather than any monolithic architecture," he continued.

He added that 2021 was the year vendors began supporting and enabling data mesh to appeal to both their traditional customers and those looking for a new approach to their data architecture.

"It's been around for a few years," Farmer said. "There's no product that makes data mesh happen, but that's part of the attraction. It fits in with data lakehouse really well. I rarely have a data and analytics conversation that does not include data meshes."

Better governance

Enhanced data governance isn't an eye-catching analytics trend, but it's a critical one.

As more platforms enable self-service analytics and make data exploration and analysis available to people in organizations who don't have backgrounds in computer science and statistics, guardrails need to be in place. These access and security measures aim to ensure that users explore and analyze data in ways that don't harm the organization and also give them confidence in the work they're doing.

It's a delicate balance, but an important one.

Without strong data governance, organizations risk publicly exposing sensitive data and running afoul of regulations. And without it, business users can be uncomfortable working with data and often wind up avoiding it altogether when making business decisions.

As a result, data governance is taking on growing importance, and vendors are responding.

"Vendors are providing more governance to enable more SaaS capabilities," Tutuk said. "There are more regulations and governance needed for data, but also from the SaaS perspective, it's enabling users so they can easily search for data, understand data and then ask questions using that data."

Among the data governance tools gaining popularity are data catalogs, and organizations are emphasizing data lineage.

"Those make sure the user understands what data exists behind the numbers they're seeing and create an explainable BI experience," Tutuk said. "Explainable experiences have been coming up more and more with AI, but we have the same problem with BI, and now we're in a position where we can build better trust."

Qlik, as Tutuk noted, is one vendor adding more data governance capabilities. Among others, Tableau's most recent platform update prioritized governance and security, and Alteryx has made governance a focus.

Meanwhile, some vendors, among them Alation, Collibra, Informatica and Talend, focus squarely on data management and governance rather than analytics itself.

New personas

As analytics evolves, so too do the people needed to enable data exploration and analysis.

Self-service analytics is the ultimate goal: the independent use and analysis of data in a safe and secure way that leads to data-driven decision-making without having to consult with a centralized team of data experts.

But it takes more than just a platform with no-code or embedded BI capabilities to enable business users. There are organizational challenges like implementing a strong data governance framework, and there are people who need to build analytics products -- including reports, dashboards and models -- that can be embedded into users' workflows.

A rising analytics trend, therefore, is the growing importance of developers, according to Mike Leone, an analyst at Enterprise Strategy Group.

"With organizations focused on enabling more stakeholders to access and analyze more data, developers are increasingly being tasked with creating applications that incorporate data and analytics capabilities," he said. "These apps, when built right, can empower a wider audience to experiment and analyze data in a controlled environment."

But developers aren't the only employees organizations are relying on to enable self-service analytics. IT staff are also important enablers, according to Leone.

"For IT, it's about ensuring the BI platforms can effectively deliver on end-user requirements," he said. "Areas like performance, scale and reliability of BI platforms are increasingly falling under the purview of IT, which is also increasingly becoming the de facto line of BI support."

AI and automation

Analytics trends in 2021 weren't all about function. There was some flash as well.

Though user adoption of augmented analytics capabilities such as NLP and AutoML remains low, a host of vendors added or enhanced those capabilities within their platforms over the past year.

For example, Yellowfin recently added natural language query (NLQ) capabilities, Amazon QuickSight added an NLQ tool, and vendors such as Sisense, Tableau and ThoughtSpot continued to enhance existing NLP features.

The past year may not have been the moment when large numbers of users started taking advantage of improved augmented analytics capabilities, but adoption grew, and the groundwork was laid for even more in the years to come.

"Natural language interactions are flourishing, and natural language query interfaces are being unchained from BI and analytics platforms and embedded within transactional and productivity applications," said Doug Henschen, an analyst at Constellation Research. "NLQ responses only improve when questions are asked within the context of specific applications and decision points."

Beyond improved natural language interactions, vendors added automation capabilities to lessen the burden of doing the same tasks over and over again.

Alteryx and Qlik both formed partnerships with robotic process automation vendor UiPath to improve their automation capabilities, and Tibco's parent company reached an agreement to acquire RPA vendor Blue Prism and roll it into Tibco upon closing.

"Workflow and automation are on fire," Henschen said. "Why bog people down with repetitive manual tasks? And why demand that people navigate to separate analytical platforms to find and interpret reports and dashboards?

"When there's confidence that analytical alerts and thresholds signal specific outcomes, organizations are using them as triggers for workflows and automated actions," he continued.

And when there's not that confidence, process automation tools are able to surface exceptions so people only have to address those.

"They're … lightening the load and liberating humans from repetitive, rubber-stamp tasks," Henschen said.

Enterprise Strategy Group is a division of TechTarget.
