Confluent announces new capabilities to ensure data quality

Joseph Brunoli, May 16, 2023, 4:40 pm

This week, Confluent announced new Confluent Cloud capabilities that should “give customers confidence that their data is trustworthy and can be easily processed and securely shared.” The data streaming specialist is launching a range of new features aimed at making data reliable enough for decision-making.

Confluent is introducing what it calls Data Quality Rules, an expansion of the Stream Governance suite. The feature aims to help organizations resolve data quality issues quickly, so that data can be relied on for business-critical decisions.

In addition, Confluent’s new Custom Connectors, Stream Sharing, the Kora Engine, and an early-access program for managed Apache Flink make it easier for companies to gain insights from their data on one platform. This should reduce operational burdens while delivering industry-leading performance.

Confluent made these announcements at Kafka Summit London, a leading event for developers, architects, data engineers, DevOps professionals, and anyone looking to learn more about streaming data and Apache Kafka.

Introducing Data Quality Rules

To address the need for more comprehensive data contracts, Confluent’s Data Quality Rules enable organizations to deliver trusted, high-quality data streams across the organization using customizable rules that ensure data integrity and compatibility. With Data Quality Rules, schemas stored in Schema Registry can now be augmented with several types of rules. Using these rules, teams can ensure high data integrity by validating and constraining the values of individual fields within a data stream.

The new rules also help users quickly resolve data quality issues with customizable follow-up actions on incompatible messages. They also “simplify schema evolution using migration rules to transform messages from one data format to another”, Confluent says.
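To illustrate, the rules described above are attached to a schema as part of its registration with Schema Registry. The following is a minimal sketch of what such a registration payload can look like, based on Confluent’s public data-contracts documentation: a rule written as a Common Expression Language (CEL) condition validates a field’s value, with failing messages routed to a dead-letter queue. The `Order` schema, the rule name, and the DLQ action are illustrative assumptions, and the exact payload fields may vary by Schema Registry version.

```json
{
  "schemaType": "AVRO",
  "schema": "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[{\"name\":\"amount\",\"type\":\"double\"}]}",
  "ruleSet": {
    "domainRules": [
      {
        "name": "checkAmountPositive",
        "kind": "CONDITION",
        "type": "CEL",
        "mode": "WRITE",
        "expr": "message.amount > 0",
        "onFailure": "DLQ"
      }
    ]
  }
}
```

In this sketch, any produced message whose `amount` field fails the CEL condition would be diverted to a dead-letter queue rather than written to the main topic, which is the kind of customizable follow-up action on incompatible messages the announcement describes.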

Securing “real-time data”

Shaun Clowes, Chief Product Officer at Confluent, hailed the new features. “Real-time data is the lifeblood of every organization, but it’s extremely challenging to manage data coming from different sources in real time and guarantee that it’s trustworthy,” he said.

“As a result, many organizations build a patchwork of solutions plagued with silos and business inefficiencies. Confluent Cloud’s new capabilities fix these issues by providing an easy path to ensuring trusted data can be shared with the right people in the right formats”.

Tags:

Apache Kafka / Confluent / Data quality / real time data streaming / Stream Sharing
