Confluent improves Kafka event streaming in the cloud
Confluent unveiled the first new capabilities of its Project Metamorphosis, designed to help users deploy and manage event streaming in the cloud.
Streaming data vendor Confluent on Wednesday released a new set of capabilities under its Project Metamorphosis umbrella, aimed at improving the scalability of Kafka event streaming in the cloud.
While Confluent groups its latest features and capabilities under Project Metamorphosis, the vendor's core technology is based on open source Apache Kafka, an event streaming platform widely used by organizations of all sizes to stream messages and data. Confluent provides a commercially supported edition of Kafka known as the Confluent Platform, as well as the Confluent Cloud service.
While Kafka itself can scale to handle high volumes of data, a key challenge is scaling elastically, with resources growing or shrinking as needed. The first major set of features Project Metamorphosis delivers is elastic scaling for Kafka deployments running in Confluent Cloud. Among those capabilities are self-serve provisioning, which lets users rapidly deploy Kafka, and self-balancing clusters, which automatically handle workload balancing across resources.
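For context on what these managed capabilities sit on top of, the following is a minimal sketch of publishing events to a Kafka topic with Confluent's open source Python client, confluent-kafka; the bootstrap server, credentials and topic name are placeholders for illustration, not values from the announcement.

```python
from confluent_kafka import Producer

# Connection details for a Confluent Cloud cluster (placeholders).
producer = Producer({
    'bootstrap.servers': '<BOOTSTRAP_SERVER>',
    'security.protocol': 'SASL_SSL',
    'sasl.mechanisms': 'PLAIN',
    'sasl.username': '<API_KEY>',
    'sasl.password': '<API_SECRET>',
})

def delivery_report(err, msg):
    # Called once per message to confirm delivery or report an error.
    if err is not None:
        print(f'Delivery failed: {err}')
    else:
        print(f'Delivered to {msg.topic()} [partition {msg.partition()}]')

# Publish a single event to a hypothetical 'orders' topic.
producer.produce('orders', key='order-123',
                 value='{"id": "order-123", "amount": 42.50}',
                 callback=delivery_report)
producer.flush()
```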
Project Metamorphosis itself is a packaging of Confluent's product roadmap for the remainder of the year, said David Menninger, an analyst at Ventana Research. In the modern world of agile software with frequent releases, it's an interesting way to tie together upcoming releases with some common themes, he said, and it reflects Confluent's overall strategy of enabling organizations to reorient their data architectures to be focused on event streaming.
"This goal aligns with my vision of where the market is headed; we are in the midst of a transition from data at rest as the default information architecture to data in motion as the default," Menninger said.
Why elastic scaling matters for Kafka event streaming
Menninger noted that the first element of Project Metamorphosis focuses on making it easier to scale Kafka, which is important because scaling an Apache Kafka configuration is complex.
"Elastic scaling is a critical capability for all elements of an organization's information architecture to support the growth and/or fluctuation in data processing needs," Menninger said.
As to whether elastic scaling will make a significant difference, Menninger noted that users don't want to waste time modifying scripts and configuration files or rebalancing data. Elasticity could free up that time so users can focus on the value in their data rather than on maintenance.
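To illustrate the kind of manual work Menninger describes, the hedged sketch below uses the admin API in the confluent-kafka Python client to add partitions to a topic; the broker address, topic name and partition count are hypothetical, and even after this step, redistributing existing data across brokers remains a separate manual chore.

```python
from confluent_kafka.admin import AdminClient, NewPartitions

# Connect to an existing cluster (placeholder address).
admin = AdminClient({'bootstrap.servers': '<BOOTSTRAP_SERVER>'})

# Grow a hypothetical 'orders' topic to 12 partitions
# to absorb more producer throughput.
futures = admin.create_partitions([NewPartitions('orders', 12)])

for topic, future in futures.items():
    try:
        future.result()  # Raises an exception if the operation failed.
        print(f'Partitions added to {topic}')
    except Exception as exc:
        print(f'Failed to add partitions to {topic}: {exc}')

# Adding partitions is only part of the job: moving existing data onto
# new brokers still requires separate rebalancing tooling or scripts,
# which is the manual overhead elastic scaling aims to remove.
```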
Event streaming improvements build on Confluent momentum
The vendor first hinted at the new elastic capabilities in Confluent Cloud on April 21, when it unveiled a $250 million Series E round of funding. Total funding for Confluent now stands at $456 million, as interest in and demand for event streaming technologies continue to grow.
Project Metamorphosis also builds on a set of recent technology releases, including the open source Apache Kafka 2.5 update, which Apache made generally available on April 16. Confluent also released its commercially supported Confluent Platform 5.5, which is based on Kafka 2.5, on April 24. Among the new features in the Confluent Platform 5.5 update are support for more data serialization formats and improved multilanguage development capabilities.
"The features in Confluent Platform 5.5 are primarily aimed at making the platform more broadly accessible to developers of all backgrounds," said Mauricio Barra, product marketing manager at Confluent.
How elastic scaling differs from what cloud providers already deliver
Priya Shivakumar, senior director for Confluent Cloud, said it was important for Confluent to get elastic scaling out now, given the business environment created by the global COVID-19 pandemic.
"Elasticity is especially relevant in these times where we've seen demand sort of skyrocket in certain industries and fall rapidly in others," she said.
Elasticity, especially in the cloud, is not a new concept. Amazon's core cloud service is known as the Elastic Compute Cloud (EC2), though the type of elasticity cloud providers already enable isn't what Confluent said its new features are aimed at.
"Elasticity from a cloud provider really is elasticity in your compute, but it is not elasticity in your software," said Dan Rosanova, group product manager for Confluent Cloud. "So adding more VMs [virtual machines] is not a big challenge, but actually making those VMs become part of your workload dynamically without having any downtime is actually quite a big challenge."
Rosanova noted that while there are existing tools to help with scaling Kafka, it's still mostly a manual process for users. Among those tools is the open source Cruise Control project developed by LinkedIn.
"Cruise control is a cool tool, but it's very complicated tool," Rosanova said. "So just learning how to use that tool is a lot of work and then you're still responsible for using the tool."
Confluent's elasticity efforts are targeted at automating the process and removing management overhead for users to make it easier to scale Kafka.
The elasticity improvements are the first set of new features from the Project Metamorphosis effort. The vendor plans to deliver a steady stream of additional features incrementally over the coming months.
"We want to bring together the benefits of cloud computing and event streaming to build a next-generation event streaming platform that's both easy to get started with, and can scale as you go," Shivakumar said.