Techzine Global

Confluent announces new capabilities to ensure data quality

Joseph Brunoli – May 16, 2023, 4:40 pm

This week, Confluent announced new Confluent Cloud capabilities that should “give customers confidence that their data is trustworthy and can be easily processed and securely shared.” The data streaming specialist is launching a range of new features to safeguard data quality for decision-making.

Confluent is introducing what it calls Data Quality Rules, an expansion of its Stream Governance suite. The feature aims to help organizations easily resolve data quality issues so data can be relied on for making business-critical decisions.

In addition, Confluent’s new Custom Connectors, Stream Sharing, the Kora Engine, and an early access program for managed Apache Flink make it easier for companies to gain insights from their data on one platform. This should help reduce operational burdens and ensure industry-leading performance.

Confluent made these announcements at Kafka Summit London, a leading event for developers, architects, data engineers, DevOps professionals, and others looking to learn more about streaming data and Apache Kafka.

Introducing Data Quality Rules

To address the need for more comprehensive data contracts, Confluent’s Data Quality Rules enable organizations to deliver trusted, high-quality data streams across the organization using customizable rules that ensure data integrity and compatibility. With Data Quality Rules, schemas stored in Schema Registry can now be augmented with several types of rules. In the process, teams can ensure high data integrity by validating and constraining the values of individual fields within a data stream.
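As an illustration, Confluent’s data contracts documentation describes such rules as conditions (for example, Google CEL expressions) attached to a schema’s rule set when it is registered in Schema Registry. The sketch below is hypothetical — the subject, field names, and expression are invented for illustration, not taken from Confluent’s announcement:

```json
{
  "schemaType": "AVRO",
  "schema": "{\"type\":\"record\",\"name\":\"Order\",\"fields\":[{\"name\":\"totalCents\",\"type\":\"long\"}]}",
  "ruleSet": {
    "domainRules": [
      {
        "name": "nonNegativeTotal",
        "kind": "CONDITION",
        "type": "CEL",
        "mode": "WRITE",
        "expr": "message.totalCents >= 0",
        "onFailure": "ERROR"
      }
    ]
  }
}
```

A producer whose message violates the condition would then fail at write time instead of polluting the stream with bad data.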

The new rules also help users quickly resolve data quality issues with customizable follow-up actions on incompatible messages. They also “simplify schema evolution using migration rules to transform messages from one data format to another”, Confluent says.
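The follow-up actions and migration rules can be sketched in the same rule-set format. In Confluent’s data contracts documentation, a failed condition can route the offending message to a dead-letter queue (`"onFailure": "DLQ"`), and migration rules are transformations (for example, JSONata expressions) applied when consumers read across incompatible schema versions. The field rename below is a hypothetical example, not from the announcement:

```json
{
  "ruleSet": {
    "migrationRules": [
      {
        "name": "renameAmountField",
        "kind": "TRANSFORM",
        "type": "JSONATA",
        "mode": "UPGRADE",
        "expr": "$merge([$sift($, function($v, $k) { $k != 'amount' }), { 'totalCents': amount }])"
      }
    ]
  }
}
```

Here, consumers on the new major version would transparently see `totalCents` even when reading records written with the old `amount` field.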

Securing “real-time data”

Shaun Clowes, Chief Product Officer at Confluent, hailed the new features. “Real-time data is the lifeblood of every organization, but it’s extremely challenging to manage data coming from different sources in real time and guarantee that it’s trustworthy,” he said.

“As a result, many organizations build a patchwork of solutions plagued with silos and business inefficiencies. Confluent Cloud’s new capabilities fix these issues by providing an easy path to ensuring trusted data can be shared with the right people in the right formats.”

Tags:

Apache Kafka / Confluent / Data quality / real time data streaming / Stream Sharing

"*" indicates required fields

This field is for validation purposes and should be left unchanged.

Stay tuned, subscribe!

Nieuwsbrieven*
