Techzine Global

Confluent announces new capabilities to ensure data quality

Joseph Bruno­li, May 16, 2023

This week, Confluent announced new Confluent Cloud capabilities that should “give customers confidence that their data is trustworthy and can be easily processed and securely shared.” The data streaming specialist is launching a range of new features to secure data for decision-making.

Confluent is introducing what it calls Data Quality Rules, an expansion of the Stream Governance suite. The feature aims to help organizations easily resolve data quality issues so data can be relied on for business-critical decisions.

In addition, Confluent’s new Custom Connectors, Stream Sharing, the Kora Engine, and an early access program for managed Apache Flink make it easier for companies to gain insights from their data on a single platform. This should reduce operational burdens and ensure industry-leading performance.

Confluent made these announcements at Kafka Summit London, a leading event for developers, architects, data engineers, DevOps professionals, and anyone looking to learn more about streaming data and Apache Kafka.

Introducing Data Quality Rules

To address the need for more comprehensive data contracts, Confluent’s Data Quality Rules enable organizations to deliver trusted, high-quality data streams across the organization using customizable rules that ensure data integrity and compatibility. With Data Quality Rules, schemas stored in Schema Registry can be augmented with several types of rules, allowing teams to ensure high data integrity by validating and constraining the values of individual fields within a data stream.

The new rules also help users quickly resolve data quality issues with customizable follow-up actions on incompatible messages. They also “simplify schema evolution using migration rules to transform messages from one data format to another”, Confluent says.
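The article does not detail how such rules are expressed, but the idea of attaching field-level constraints to a schema, with a follow-up action for messages that fail, can be sketched conceptually. The names below (`Rule`, `validate_message`, the dead-letter action) are hypothetical illustrations, not Confluent’s actual API:

```python
# Illustrative sketch only: these names are hypothetical and do not
# represent Confluent's actual Data Quality Rules API.
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class Rule:
    field: str                              # field the rule constrains
    check: Callable[[Any], bool]            # predicate on the field's value
    on_failure: str = "dead-letter-queue"   # follow-up action for bad messages


def validate_message(message: dict, rules: list[Rule]) -> list[str]:
    """Return the follow-up actions triggered by rules the message violates."""
    actions = []
    for rule in rules:
        if not rule.check(message.get(rule.field)):
            actions.append(rule.on_failure)
    return actions


# Example: require an 11-character SSN string and a positive amount.
rules = [
    Rule("ssn", lambda v: isinstance(v, str) and len(v) == 11),
    Rule("amount", lambda v: isinstance(v, (int, float)) and v > 0),
]

print(validate_message({"ssn": "123-45-6789", "amount": 10}, rules))  # []
print(validate_message({"ssn": "bad", "amount": -1}, rules))
```

In practice such checks would run in the serialization path of producers and consumers against the registered schema, rather than as standalone functions.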

Securing “real-time data”

Shaun Clowes, Chief Product Officer at Confluent, hailed the new features. “Real-time data is the lifeblood of every organization, but it’s extremely challenging to manage data coming from different sources in real time and guarantee that it’s trustworthy,” he said.

“As a result, many organizations build a patchwork of solutions plagued with silos and business inefficiencies. Confluent Cloud’s new capabilities fix these issues by providing an easy path to ensuring trusted data can be shared with the right people in the right formats.”

Tags:

Apache Kafka / Confluent / Data quality / real time data streaming / Stream Sharing
