
Confluent announces new capabilities to ensure data quality

Joseph Brunoli, May 16, 2023

This week Confluent announced new Confluent Cloud capabilities that should “give customers confidence that their data is trustworthy and can be easily processed and securely shared.” The data streaming specialist is launching a range of new features to secure data for decision-making.

Confluent is introducing what they call Data Quality Rules, an expansion of the Stream Governance suite. It aims to help organizations easily resolve data quality issues so data can be relied on for making business-critical decisions.

In addition, Confluent’s new Custom Connectors, Stream Sharing, the Kora Engine, and an early access program for managed Apache Flink make it easier for companies to gain insights from their data on one platform. This should help reduce operational burdens and ensure industry-leading performance.

Confluent made these announcements at the Kafka Summit London, a leading event for developers, architects, data engineers, DevOps professionals, and anyone looking to learn more about streaming data and Apache Kafka.

Introducing Data Quality Rules

To address the need for more comprehensive data contracts, Confluent’s Data Quality Rules enable organizations to deliver trusted, high-quality data streams across the organization using customizable rules that ensure data integrity and compatibility. With Data Quality Rules, schemas stored in Schema Registry can now be augmented with several types of rules. In the process, teams can ensure high data integrity by validating and constraining the values of individual fields within a data stream.
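The article does not show what such a rule looks like in practice. As a minimal sketch: in Confluent's Schema Registry, rules of this kind are attached to a schema as a rule set when the schema is registered, with conditions written in Common Expression Language (CEL). The subject, field names, and rule name below are invented for illustration, and the exact payload shape should be checked against Confluent's documentation:

```python
import json

# Hypothetical Avro schema for an "Order" record.
AVRO_SCHEMA = json.dumps({
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
})

# A rule set attached to the schema: a CEL condition every produced
# message must satisfy, plus a follow-up action for messages that fail.
rule_set = {
    "domainRules": [
        {
            "name": "amountIsPositive",   # invented rule name
            "kind": "CONDITION",          # validate, don't transform
            "mode": "WRITE",              # checked when producing
            "type": "CEL",                # Common Expression Language
            "expr": "message.amount > 0.0",
            "onFailure": "DLQ",           # send bad messages to a dead-letter queue
        }
    ]
}

# Request body for registering the schema plus its rules with Schema Registry.
request_body = json.dumps({
    "schemaType": "AVRO",
    "schema": AVRO_SCHEMA,
    "ruleSet": rule_set,
})

print(request_body)
```

In this sketch, the body would be sent to a Schema Registry subject-registration endpoint; the `onFailure` action is what the article refers to as a customizable follow-up action on incompatible messages.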

The new rules also help users quickly resolve data quality issues with customizable follow-up actions on incompatible messages. They also “simplify schema evolution using migration rules to transform messages from one data format to another”, Confluent says.
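The migration rules mentioned here transform messages written under one schema version so they can be read under another. A hedged sketch of what such a rule could look like, assuming the JSONata-based migration-rule shape Confluent documents for Schema Registry; the field rename and rule name are invented for illustration:

```python
import json

# Hypothetical schema evolution: version 2 renames the field "ssn"
# to "social_security_number". A JSONata transform lets consumers on
# the newer schema read messages written under the older one.
rule_set = {
    "migrationRules": [
        {
            "name": "renameSsnField",  # invented rule name
            "kind": "TRANSFORM",       # rewrite the message, not just validate
            "mode": "UPGRADE",         # applied when reading older messages
            "type": "JSONATA",
            # Drop the old key, then re-add its value under the new name.
            "expr": "$merge([$sift($, function($v, $k) {$k != 'ssn'}),"
                    " {'social_security_number': ssn}])",
        }
    ]
}

print(json.dumps(rule_set, indent=2))
```

A corresponding `DOWNGRADE` rule would map the new field name back to the old one, so producers and consumers on different schema versions can keep interoperating.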

Securing “real-time data”

Shaun Clowes, Chief Product Officer at Confluent, hailed the new features. “Real-time data is the lifeblood of every organization, but it’s extremely challenging to manage data coming from different sources in real time and guarantee that it’s trustworthy,” he said.

“As a result, many organizations build a patchwork of solutions plagued with silos and business inefficiencies. Confluent Cloud’s new capabilities fix these issues by providing an easy path to ensuring trusted data can be shared with the right people in the right formats.”

Tags:

Apache Kafka / Confluent / Data quality / real time data streaming / Stream Sharing

