Klarrio uses open source expertise to build foundational data platforms

Belgian company Klarrio stands out as a platform engineering player in a competitive market. Using open source technology, it builds complete data platforms for customers. The core of the team has been working in open source for more than 25 years, from the first ADSL modems to modern cloud-native solutions. We spoke with Kurt Jonckheer, CEO, and Dirk Van de Poel, CPO, about how that experience creates value today.

Klarrio works almost exclusively with open source technology, but not by simply picking tools off the shelf. The company focuses on what Jonckheer describes as the foundation for data. “We build everything that goes into the basement. All the new pipes, all the electricity. Everything you essentially need to create a solid foundation.”

Klarrio is not a standard system integrator, but an architect and builder of the foundation itself. Data-driven organizations usually end up at Klarrio when they encounter complexity and skyrocketing costs. Through platform engineering, Klarrio builds the underlying data layer, focusing on integrations, platform architecture, data flows, security, governance, and scalability. The customer continues to handle analytics, dashboards, and data science; Klarrio ensures the data infrastructure on which these applications run is robust, future-proof, and controllable.

Building such a platform requires significant effort. Open source offers plenty of components that are extremely good and will get you a long way, but they must be coordinated and brought in line with the customer’s security requirements. “If you deploy open source directly, so to speak, it doesn’t meet the requirements dictated by the CISO or local legislation,” Jonckheer explains. “It has to be made compliant and checked for security.”

Building between existing systems

Klarrio’s customers almost always have existing systems and platforms. These include CRM systems, ERP solutions, cloud platforms, and legacy environments that are the starting point for keeping the business running. Klarrio builds the data layer between them that enables integration, processing, and access. For example, if a company uses Salesforce for CRM and wants to apply more analytics to that CRM data, that information must be transferred to a data platform. “Those are the things we can arrange for the customer,” Jonckheer confirms.

In some cases, this means replacing existing platforms such as Snowflake or Databricks. In others, the platform to be built serves as a system that prepares data before it is sent to one of the existing platforms. The system Klarrio builds then performs data preparation, filtering, or anonymization before transferring it to commercial analytics environments. This allows control over data flows without dependence on specific platform providers. “Perhaps you can first remove the sensitivities from the data and start working with a smaller dataset in another data platform,” adds Van de Poel.
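The preparation step Van de Poel describes can be sketched as a small transformation: hash or drop sensitive fields and filter the records before anything leaves the platform. This is a minimal illustration of the pattern, not Klarrio’s actual tooling; the field names and the filtering rule are invented for the example.

```python
import hashlib

# Invented field names for illustration; a real platform would
# derive these from a governance policy, not a hard-coded set.
SENSITIVE_FIELDS = {"email", "phone"}


def prepare(records, keep_if):
    """Anonymize and filter records before handing them to an external
    analytics platform: sensitive fields are replaced by a one-way hash,
    and only records matching keep_if survive."""
    prepared = []
    for record in records:
        if not keep_if(record):
            continue  # filtering shrinks the dataset that leaves the platform
        cleaned = {
            key: hashlib.sha256(value.encode()).hexdigest()
            if key in SENSITIVE_FIELDS
            else value
            for key, value in record.items()
        }
        prepared.append(cleaned)
    return prepared


# Example: only active customers are forwarded, with e-mail addresses hashed.
crm_rows = [
    {"id": "1", "email": "a@example.com", "active": True},
    {"id": "2", "email": "b@example.com", "active": False},
]
subset = prepare(crm_rows, keep_if=lambda r: r["active"])
```

The hashed values still allow joins and deduplication downstream, while the original identifiers never reach the external platform, which is the essence of working with “a smaller dataset” elsewhere.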

Klarrio as a platform builder

In fact, any company facing a specific cloud- or data-related problem can turn to Klarrio. At some point in their data and cloud journey, businesses encounter limitations. We will discuss those limitations later in this article. In any case, companies have reached a point where they need to redesign their environment and remove or replace certain components. Klarrio first maps the organization’s current landscape and then determines the best route to a new architecture.

Klarrio then builds a complete, customized data platform based on open source technology. This is not done by simply implementing tools, but by designing and building fundamental architectures. As mentioned at the beginning, Klarrio’s role lies explicitly below the application layer. “We ensure that data is made available to the customer’s teams in a controlled, secure, and efficient manner,” says Jonckheer.

This also means that open source is not applied in a plug-and-play manner. Security, compliance, and governance are built in from the ground up. Open source components are adapted to local regulations, internal security requirements, and organizational processes. The platform becomes part of the business architecture itself rather than a separate technical layer.

25 years of technical discipline

Jonckheer explains that Klarrio’s approach is unique because it is rooted in software engineering. In the past, the core team built embedded software, firmware, and network infrastructure for telecom environments. This background has fostered a culture of efficient programming, resource-conscious design, and technical discipline.

According to Jonckheer, these principles are directly relevant in modern cloud environments. Inefficiency translates directly into costs. Architectural errors become apparent in consumption peaks, scaling problems, and budget overruns. By structurally embedding this discipline in the development process, scalability remains manageable. After 25 years of experience, efficient programming is second nature, and this principle from the embedded era has direct value in the cloud. “If you don’t apply or recognize those basic principles, and you don’t know about buffers or registers, your costs will skyrocket,” he says.

Knowledge is actively passed on through mentoring, internal training, and pairing structures. New Klarrio employees benefit from the experience of Klarrio’s core team so that they, too, can reach a high level. Plus, knowledge transfer to the customer is part of the business model. A certain level of maturity is required on the customer’s part, but Klarrio always sees it as part of a project’s task to brush up the customer’s knowledge. This prepares the customer to manage the platform much better from the very beginning.

Fundamental choices

Klarrio applies clear principles to what it wants to achieve with platform engineering. All code developed is the intellectual property of the customer. The company deliberately avoids creating dependency relationships. Transparency, transferability, and control are central.

Not all assignments, however, are accepted. The company deliberately avoids technology choices that are not future-proof, architectures that are strategically unsustainable, and ethically sensitive projects. This principled stance limits short-term growth, but according to Jonckheer, it strengthens the company’s long-term position.

Through the roof

Jonckheer sees that, sooner or later, many organizations will need to adopt this approach. A cloud or data project usually starts with a limited scope, a pilot, a proof of concept, and a small user group. In that phase, costs seem manageable and the complexity seems under control. But as soon as applications are rolled out organization-wide, consumption grows exponentially. This applies to storage and computing power, as well as to additional services, integrations, and platform components. “What starts with ten or twenty users quickly becomes an organization-wide platform problem,” Jonckheer explains. “And at that point, costs often skyrocket.”

According to Van de Poel, this dynamic is not an isolated phenomenon. It is a structural consequence of how cloud architectures are built today: fragmented, tool-driven, and without long-term design.

Complexity is a structural problem

The core of the problem, Jonckheer says, aside from cloud prices, lies in the complexity of modern cloud architectures. Organizations face different building blocks. Think of data lakes, data warehouses, streaming platforms, message layers, API gateways, protocol adapters, and analytics environments. Each component has its own cost model, scaling mechanism, and dependencies. Without a deep understanding of data flows, workloads, and usage patterns, cost control quickly becomes a theoretical exercise.

In addition, many IT departments are structurally burdened with operational legacy issues. A large part of their capacity is spent on maintenance, management, and incident handling. The scope for making in-depth architectural choices or understanding complex cloud models is limited. As a result, forecasting cloud costs often becomes a best-effort approach rather than a substantiated calculation. “Companies then inevitably face the question: do we have the right skills in-house for this?” says Jonckheer.

The dependence on cloud providers’ pricing models also plays a role here. Suppliers implement price changes and new rate structures, and consumption models change. Customers have little influence on this. In combination with consumption-driven SaaS and PaaS services, this creates a cost structure that is becoming increasingly unpredictable. “Essentially, you are handing over your own checkbook,” Jonckheer believes.

The pitfall of lift-and-shift

A common approach in digitization projects is lift-and-shift: migrating existing infrastructure one-to-one to the cloud without any fundamental redesign. Legacy architectures remain intact, but now run on cloud platforms. Instead of simplification, fragmentation occurs: multiple platforms, suppliers, cost models, and management layers coexist.

Company data is spread across different systems: it sits in the CRM package and the ERP environment, but also in cloud applications and internal databases. On top of that, each platform requires its own integrations, governance structures, and security measures. The overview disappears while dependencies increase. Instead of flexibility, complexity arises. According to Jonckheer, this leads to financial overruns and manageability problems.

That is why he and Van de Poel both stress the importance of asking fundamental questions at the start of every cloud project. “What requirements must a cloud environment meet in terms of governance and sovereignty? And what role will the cloud play in the overall architecture? Is it exclusively about infrastructure, or will core functionalities also become dependent on SaaS and PaaS services?” are among the questions they ask. The answers will largely determine an organization’s future agility.

Core business requires control

Ultimately, not every workload is suitable for complete outsourcing to cloud platforms. Dependency becomes a risk, especially when it comes to core processes, intellectual property, or strategic data assets. Jonckheer argues that organizations must explicitly ask themselves which parts of their IT landscape are core to their business and which are not.

When analytics, data models, or algorithms create direct value for the organization, they constitute intellectual property. In such cases, it is far from obvious that this logic should be fully embedded in the proprietary services of external platforms. Similar considerations apply to sensitive data and strict latency requirements. How much control does an organization want to retain over its digital foundation?

This is where open source as an architectural principle comes into play. Building applications and platform components on open standards and open source technology creates infrastructure independence. The software layer is decoupled from the underlying infrastructure layer. This creates flexibility to move workloads without rebuilding the entire application architecture.
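The decoupling described above can be sketched as a thin abstraction between application logic and infrastructure. In this hedged example, application code depends only on a provider-neutral interface; swapping the cloud backend then means swapping one implementation class, not rewriting the application. The interface and class names are invented for illustration.

```python
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """Provider-neutral storage interface. Application code depends only
    on this contract, never on a specific cloud provider's SDK."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    """Stand-in backend for the sketch. An implementation backed by an
    S3-compatible open source store (or any cloud provider) would satisfy
    the same interface, so workloads can move without an application rewrite."""

    def __init__(self):
        self._objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


def archive_report(store: ObjectStore, name: str, payload: bytes) -> str:
    # Application logic is written once, against the interface only.
    key = f"reports/{name}"
    store.put(key, payload)
    return key
```

Because the software layer talks only to `ObjectStore`, the infrastructure underneath becomes the interchangeable commodity the article describes.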

Open source as a structural exit strategy

According to Jonckheer, open source is the right choice to address data sovereignty concerns. It offers companies the structural exit strategy they need. By avoiding proprietary software deeply intertwined with a single platform, organizations retain their agility. The infrastructure layer becomes a commodity. Issues relating to virtual machines, storage, and network functionality can be arranged through different providers.

This also shifts the cost model. Investments in software development become capex-driven and predictable. Infrastructure costs remain variable, but are more manageable because the software is not dependent on specific cloud services. This separation enables long-term planning.

At the same time, Jonckheer warns that open source, in itself, does not guarantee sovereignty. Open source software can just as easily run on non-European infrastructure. A legal framework such as the US CLOUD Act makes it clear that data access is not only a technical issue, but a geopolitical one as well. According to Jonckheer, true data sovereignty requires a combination of open source and European infrastructure or proprietary environments.

Hybrid architecture: a realistic model

Practice requires nuance. Not everything has to be on-premises; not everything has to be in the cloud. The reality is hybrid. Certain workloads benefit from the scalability and flexibility of public cloud platforms. Other components require local control, legal certainty, or architectural autonomy.

Jonckheer therefore advocates hybrid architectures in which core components remain under their own management, while less critical workloads can be rolled out flexibly. This approach requires platform-level design choices, not project-level ones. Architecture becomes a strategic tool rather than a technical prerequisite.

Control your own destiny

Klarrio’s core philosophy can be summarized in one principle: control your own destiny. “Keep your core expertise in-house. Today, everything is software-driven; even the smallest device contains software. So you can no longer say that you are not a technology company. You have to embrace software, not necessarily everything, but the core,” says Jonckheer. And organizations must control that digital core, even when they use external platforms and services.

By combining open source, hybrid architectures, and platform thinking, a model emerges in which organizations retain flexibility without becoming dependent. Not by avoiding technology, but by strategically positioning it.

The expertise Klarrio has built up throughout the years forms the foundation for this. It is embedded in design choices, development processes, architectural principles, and organizational culture. According to Jonckheer, this is not a characteristic that can be easily copied. “It’s in the company’s DNA. In how we think, design, and build.” Klarrio positions itself as a builder of digital foundations. Foundations on which organizations can build their own data economy without relinquishing control.