
New Confluent Platform release boosts event streaming quality

Based on the open-source Kafka event streaming platform, the Confluent Platform 5.4 update adds new capabilities to help meet enterprise data management requirements.

Event streaming is a critical component of modern data management and analysis, bringing real-time data to organizations. One of the most popular tools for event streaming is the open source Apache Kafka technology that is at the foundation of the commercial Confluent platform.

The vendor, based in Mountain View, Calif., has enhanced the platform with capabilities that make event streaming more secure and resilient.

The Confluent Platform 5.4 event streaming update became generally available Wednesday and benefits from improvements that first landed in the Apache Kafka 2.4 update that was released on Dec. 18. Beyond what's available in the Kafka update, Confluent's new release adds role-based access control (RBAC) security, improved disaster recovery and enhanced schema validation for data quality.

Confluent is on a path to improve the usability and manageability of Kafka, said Maureen Fleming, an IDC analyst.

"The introduction of Confluent Schema Registry simplifies and improves control over schema validation and enforcement," Fleming said. "This aligns well with efforts enterprises are going through to ensure their data is trustworthy."

The Confluent Platform update also introduces support for CloudEvents, an open specification for describing event data. Fleming noted that improvements in audit logging and support for the CloudEvents specification provide mechanisms for more sophisticated monitoring and use of security-related anomaly detection algorithms.
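CloudEvents defines a small set of required attributes that every event envelope must carry. A minimal sketch of wrapping a payload in a CloudEvents 1.0 envelope (the event type, source, and payload here are hypothetical examples, not drawn from Confluent's implementation):

```python
import json
import uuid
from datetime import datetime, timezone

def make_cloudevent(event_type, source, data):
    """Wrap a payload in a CloudEvents 1.0 envelope (required attributes only)."""
    return {
        "specversion": "1.0",          # CloudEvents spec version
        "id": str(uuid.uuid4()),       # unique identifier for this event
        "source": source,              # URI identifying the producer
        "type": event_type,            # reverse-DNS event type name
        "time": datetime.now(timezone.utc).isoformat(),
        "datacontenttype": "application/json",
        "data": data,
    }

event = make_cloudevent(
    "com.example.order.created",
    "/services/orders",
    {"order_id": 42, "total": 99.95},
)
print(json.dumps(event, indent=2))
```

Because every event shares the same envelope attributes, downstream monitoring and anomaly detection tools can inspect streams without knowing each producer's payload format.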

[Screenshot: Confluent Platform schema validation process]

"The improved logging also supports regulatory compliance requirements of Confluent's customers," she said.

Securing event data

RBAC is a critical security mechanism that ensures only authorized users get access to a given service. Confluent Platform has previously integrated with directory-based security policy systems, including Microsoft Active Directory, noted Addison Huddy, group product manager at Confluent. He said the new RBAC system provides more control than what Confluent previously delivered.


"What role-based access control does is it allows you to take the groups that you have defined already inside of something like Active Directory, and you tie those to roles that we defined in the system," Huddy said.

Confluent Platform 5.4 has a component that enables administrators to define roles and then have policies on those roles enforced across the platform, he added.
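The mapping Huddy describes can be illustrated conceptually: directory groups bind to roles, roles carry permissions, and the platform checks those permissions when a user acts. This is a simplified sketch of the RBAC idea, not Confluent's actual role definitions or API; the group names, role names, and permission sets are hypothetical:

```python
# Hypothetical Active Directory group-to-role bindings.
GROUP_TO_ROLES = {
    "eng-data-platform": ["DeveloperWrite"],
    "analytics-readers": ["DeveloperRead"],
}

# Hypothetical role definitions: each role grants a set of actions.
ROLE_PERMISSIONS = {
    "DeveloperRead": {"read"},
    "DeveloperWrite": {"read", "write"},
}

def allowed(user_groups, action):
    """Return True if any role bound to the user's groups grants the action."""
    for group in user_groups:
        for role in GROUP_TO_ROLES.get(group, []):
            if action in ROLE_PERMISSIONS.get(role, set()):
                return True
    return False

print(allowed(["analytics-readers"], "write"))  # False: read-only role
print(allowed(["eng-data-platform"], "write"))  # True
```

The key point is that administrators manage membership in the directory they already operate, while the streaming platform only needs the group-to-role bindings.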

Schema Registry improves data quality

The Confluent Schema Registry is a centralized location where teams can upload their data schemas. Kafka as a platform generally has more users who read and consume data from an event stream than users who write data, Huddy noted.

"So now if I'm writing an application, that's a consumer of data I don't have to coordinate with directly to say, "Hey, what serialization format did you use?'" Huddy said. "I can go out and reach the schema registry to grab that data."

Going a step further, the schema registry can also be used to enforce data quality by accepting only data that adheres to a schema defined in the registry.
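That enforcement step can be sketched conceptually: a record is accepted only if it supplies every field the registered schema requires, with the expected type. This is an illustration of the idea, not the Schema Registry's actual implementation (which validates against Avro, JSON Schema, or Protobuf schemas); the schema and records are hypothetical:

```python
# Hypothetical registered schema: field name -> expected Python type.
SCHEMA = {"order_id": int, "customer": str, "total": float}

def conforms(record, schema):
    """Accept a record only if every schema field is present with the right type."""
    return all(
        field in record and isinstance(record[field], ftype)
        for field, ftype in schema.items()
    )

print(conforms({"order_id": 1, "customer": "acme", "total": 9.5}, SCHEMA))  # True
print(conforms({"order_id": "1", "customer": "acme"}, SCHEMA))              # False
```

Rejecting nonconforming records at write time means every consumer downstream can trust the shape of the data it reads.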

ksqlDB now in preview

The new Confluent Platform release also includes a technical preview of ksqlDB, the event streaming database that first became generally available as an open source project on Nov. 20.

A primary goal for Confluent customers is building enterprise event streaming applications, said Praveen Rangnath, senior director of product marketing at Confluent. Without ksqlDB, building such applications would be a more complicated process involving stitching multiple distributed systems together, he said.

"What we're trying to do with ksqlDB is essentially integrate those systems into a single solution to just make it super easy for developers to build event streaming applications," Rangnath said.
