Confluent announces new capabilities to ensure data quality

Joseph Brunoli, May 16, 2023, 4:40 pm

This week, Confluent announced new Confluent Cloud capabilities that should “give customers confidence that their data is trustworthy and can be easily processed and securely shared.” The data streaming specialist is launching a range of new features to ensure reliable data for decision-making.

Confluent is introducing what it calls Data Quality Rules, an expansion of its Stream Governance suite. The feature aims to help organizations resolve data quality issues quickly so data can be relied on for business-critical decisions.

In addition, Confluent’s new Custom Connectors, Stream Sharing, the Kora Engine, and an early access program for managed Apache Flink make it easier for companies to gain insights from their data on a single platform. This should reduce operational burdens and ensure industry-leading performance.

Confluent made these announcements at Kafka Summit London, a leading event for developers, architects, data engineers, DevOps professionals, and anyone looking to learn more about streaming data and Apache Kafka.

Introducing Data Quality Rules

To address the need for more comprehensive data contracts, Confluent’s Data Quality Rules enable organizations to deliver trusted, high-quality data streams across the organization using customizable rules that ensure data integrity and compatibility. With Data Quality Rules, schemas stored in Schema Registry can now be augmented with several types of rules, so teams can enforce data integrity by validating and constraining the values of individual fields within a data stream.
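
To make the idea more concrete, here is a minimal sketch of how such a rule might be attached to a schema when registering it through Schema Registry’s REST interface. The cluster URL, subject, credentials, and field names are invented for illustration, and the payload shape (a CEL-based rule inside a “ruleSet”) follows Confluent’s data contracts documentation rather than this announcement, so it should be checked against the current docs.

```python
import json

import requests  # assumed HTTP client; any client would do

# Illustrative values only -- not taken from Confluent's announcement.
REGISTRY_URL = "https://psrc-example.us-east-1.aws.confluent.cloud"
SUBJECT = "orders-value"
API_KEY, API_SECRET = "SR_API_KEY", "SR_API_SECRET"  # placeholder credentials

# A simple Avro schema for a hypothetical "orders" stream.
order_schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"},
        {"name": "email", "type": "string"},
    ],
}

# The Data Quality Rule itself: a CEL condition that every produced record
# must satisfy. Records that fail the check are handled by the onFailure
# action, here routed to a dead-letter queue.
payload = {
    "schemaType": "AVRO",
    "schema": json.dumps(order_schema),
    "ruleSet": {
        "domainRules": [
            {
                "name": "validateOrder",
                "kind": "CONDITION",
                "type": "CEL",
                "mode": "WRITE",
                "expr": "message.amount > 0 && message.email.contains('@')",
                "onFailure": "DLQ",
            }
        ]
    },
}

resp = requests.post(
    f"{REGISTRY_URL}/subjects/{SUBJECT}/versions",
    auth=(API_KEY, API_SECRET),
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    json=payload,
)
resp.raise_for_status()
print("Registered schema version with rule set, id:", resp.json()["id"])
```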

The new rules also help users quickly resolve data quality issues through customizable follow-up actions on incompatible messages, and they “simplify schema evolution using migration rules to transform messages from one data format to another,” Confluent says.
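
As a rough illustration of those migration rules, the fragment below declares a JSONata transform that renames a field when consumers on a newer schema version read records written against an older one. The field names and expression are hypothetical, and the rule format is again a sketch to be verified against Confluent’s documentation.

```python
# Hypothetical migration rule: when upgrading, drop the legacy "cost" field
# and re-add its value under the new name "amount".
migration_rule_set = {
    "migrationRules": [
        {
            "name": "renameCostToAmount",
            "kind": "TRANSFORM",
            "type": "JSONATA",
            "mode": "UPGRADE",
            "expr": "$merge([$sift($, function($v, $k) {$k != 'cost'}), {'amount': $.'cost'}])",
        }
    ]
}
# This could be supplied in the "ruleSet" of the registration payload shown
# earlier, alongside the domain rules.
```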

Securing “real-time data”

Shaun Clowes, Chief Product Officer at Confluent, hailed the new features. “Real-time data is the lifeblood of every organization, but it’s extremely challenging to manage data coming from different sources in real time and guarantee that it’s trustworthy,” he said.

“As a result, many organizations build a patchwork of solutions plagued with silos and business inefficiencies. Confluent Cloud’s new capabilities fix these issues by providing an easy path to ensuring trusted data can be shared with the right people in the right formats”.

Tags:

Apache Kafka / Confluent / Data quality / real time data streaming / Stream Sharing

"*" indicates required fields

This field is for validation purposes and should be left unchanged.

Stay tuned, subscribe!

Nieuwsbrieven*

Related

IBM to acquire Confluent for $11 billion

Confluent crowns the new AI kingmaker, real-time data in context

How important is data analytics in cycling?

Confluent introduces Streaming Agents for real-time AI agents

Editor picks

IBM to acquire Confluent for $11 billion

Update - IBM confirms an $11 billion (€9.4 billion) agreement with ...

Ataccama raises data trust and money from Snowflake Ventures

Snowflake Ventures is taking a strategic stake in Ataccama, which aim...

The data center of the future: high voltage, liquid cooled, up to 4 MW per rack

AI upends everything at the moment. The design of data centers is no ...

IBM: Without talent there will be no breakthrough for quantum computing

Companies currently spend 11 percent of their R&D budget on quant...

Techzine.tv

NetSuite founder reveals AI transformation 5 years in the making

NetSuite founder reveals AI transformation 5 years in the making

SAP Business Network: $6.5 trillion B2B collaboration platform

SAP Business Network: $6.5 trillion B2B collaboration platform

SAP's AI migration tools from ECC to S/4HANA: faster and cheaper ERP transitions

SAP's AI migration tools from ECC to S/4HANA: faster and cheaper ERP transitions

Slack is evolving into a work operating system

Slack is evolving into a work operating system

Read more on Applications

NetSuite founder reveals AI transformation 5 years in the making
Top story

NetSuite founder reveals AI transformation 5 years in the making

What happens when five years of strategic platform development collides with the generative AI revolution? Or...

Coen van Eenbergen December 8, 2025
The state of cloud in 2026
Top story

The state of cloud in 2026

Cloud computing has now entered its mature adolescence i.e. it’s still surprisingly developmental, changeab...

Adrian Bridgwater December 3, 2025
OpenAI partners with Disney and receives a $1B investment

OpenAI partners with Disney and receives a $1B investment

AI video platform Sora can now generate over 200 Disney characters. However, the deal between The Walt Disney...

Erik van Klinken 13 hours ago
Agentic AI browser Opera Neon now generally available

Agentic AI browser Opera Neon now generally available

Opera has made Opera Neon, its new AI browser with agentic features, publicly available. Users pay $19.90 per...

Erik van Klinken 14 hours ago

Expert Talks

How our team optimizes infrastructure for minimal AI video processing latency 

How our team optimizes infrastructure for minimal AI video processing latency 

Over the past year, AI-generated video diffusion models have enabled ...

Redefining the Software Development Lifecycle in the Age of AI

Redefining the Software Development Lifecycle in the Age of AI

For developers, the best coding happens when they’re in a state of ...

AI Integrity: The Invisible Threat Organizations Can’t Ignore

AI systems are increasingly making decisions that impact people, proc...

Three Ways Secure Modern Networks Unlock the True Power of AI

AI is rapidly becoming the main driver of innovation for businesses, ...

Tech calendar

Appdevcon

March 10, 2026 Amsterdam

Webdevcon

March 10, 2026 Amsterdam

Dutch PHP Conference

March 10, 2026 Amsterdam

GITEX ASIA 2026

April 8, 2026 Singapore

SAS Innovate 2026

April 27, 2026 Grapevine

Team '26

May 5, 2026 Anaheim

Whitepapers

Experience Synology’s latest enterprise backup solution

Experience Synology’s latest enterprise backup solution

How do you ensure your company data is both secure and quickly recove...

How to choose the right Enterprise Linux platform?

How to choose the right Enterprise Linux platform?

"A Buyer's Guide to Enterprise Linux" comprehensively analyzes the mo...

Enhance your data protection strategy for 2025

The Data Protection Guide 2025 explores the essential strategies and...

Strengthen your cybersecurity with DNS best practices

The white paper "DNS Best Practices" by Infoblox presents essential g...

Techzine Global

Techzine focusses on IT professionals and business decision makers by publishing the latest IT news and background stories. The goal is to help IT professionals get acquainted with new innovative products and services, but also to offer in-depth information to help them understand products and services better.

Follow us

Twitter
LinkedIn
YouTube

© 2025 Dolphin Publications B.V.
All rights reserved.

Techzine Service

  • Become a partner
  • Advertising
  • About Us
  • Contact
  • Terms & Conditions
  • Privacy Statement