Has the value of data increased?

Data is often likened to oil, gold, money and other commodities of value, largely because we make things from it, build with it and even trade with it. But in a world of big (and sometimes massive) data analytics, with the tipping point from exabytes to zettabytes and yottabytes on the horizon, has the core value of data itself actually changed? Spoiler alert: obviously yes, it has. But how and why?

The impact of AI

Dheeraj Pandey is founder and CEO of DevRev, an AI-native platform company focused on tools that unite product development, support and customer experience in one system of record. He says that AI has fundamentally recalibrated how we think about enterprise data.

“It’s no longer just about storing transactions or measuring volume and velocity. It’s organising information into knowledge graphs that capture the deep context and relationships across an entire business. This allows AI to form precise connections between business records, customer interactions and operational processes. Systems evolve from simple keyword matching to semantic search, from answering questions to true reasoning, unlocking insights that were previously buried in fractured silos. The result? Data’s value is measured by how effectively it powers reasoning and decision-making,” explained Pandey, on a press and analyst keynote call this week.
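Pandey's contrast between keyword matching and semantic search can be illustrated with a minimal sketch. The vectors below are hand-assigned stand-ins for what a real embedding model would produce; all names and values here are hypothetical, purely to show why vector similarity finds connections that word overlap misses.

```python
import math

# Toy "embeddings": hand-assigned vectors standing in for a real embedding
# model's output (hypothetical values, for illustration only).
EMBEDDINGS = {
    "refund my purchase":     [0.9, 0.1, 0.0],
    "return an order":        [0.8, 0.2, 0.1],
    "update billing address": [0.1, 0.9, 0.2],
}

def keyword_match(query: str, doc: str) -> bool:
    # Naive keyword search: any shared word counts as a hit.
    return bool(set(query.lower().split()) & set(doc.lower().split()))

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

query = "refund my purchase"
# Keyword matching misses the semantically closest record...
print(keyword_match(query, "return an order"))  # False: no shared words
# ...while vector similarity surfaces it.
scores = {doc: cosine(EMBEDDINGS[query], vec)
          for doc, vec in EMBEDDINGS.items() if doc != query}
best = max(scores, key=scores.get)
print(best)  # "return an order"
```

A production system would of course use a trained embedding model and a knowledge graph of business records rather than three toy vectors, but the mechanism of ranking by semantic proximity instead of shared keywords is the same.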

James Fisher, chief strategy officer at data integration, data quality and data analytics company Qlik, is of much the same mind. He says that as AI capabilities like foundation models and basic use cases (summarisation, chatbots) become commoditised, competitive differentiation shifts to the business setting.

A new measure of competitive differentiation

“Here, it’s not the model but the quality, context and accessibility of the data (and how it’s governed, integrated and embedded into workflows) that creates value. Without this foundation, even the most advanced AI delivers unreliable or biased outcomes. Organisations that invest in governance and real-time readiness are already turning AI from experiments into measurable business results,” said Fisher.

“We’ve seen that AI’s true potential is unlocked by connecting trusted, governed data – structured and unstructured – with real-time analytics and decision intelligence. With the rise of agentic AI, the next wave of value creation will come from intelligent systems that don’t just interpret data, but continuously and autonomously act on it at scale. Put simply, AI isn’t a shortcut to insight – it’s a multiplier of value, if the data is ready. Enterprises that treat data as an afterthought will fall behind, while those that treat it as a strategic asset will lead,” added the Qlik CSO.

Garima Kapoor, co-founder and co-CEO of high-performance object storage company MinIO adds to this discussion by saying that data is indeed the currency of the AI economy, but currency is useless without a means of understanding and capitalising on its value.

A new data-centric centre of gravity 

“Industry research suggests that in the past six months alone, companies have spent more than $47 billion on AI projects, yet only about 11% have produced meaningful business results. The reason is that many companies tried to run AI on top of slow, fragmented and expensive data systems. AI is not simply about adopting the latest model, but about re-architecting how data is stored, moved and made accessible at scale. This shift is being accelerated by industry-wide forces – such as model proliferation, rising cloud fees and national data regulations, which are forcing companies to control where their data is stored and how it’s moved (sovereignty) – that are reshaping the competitive landscape. These forces make data, not models, the centre of gravity in AI strategy,” said Kapoor.

From her perspective, Kapoor thinks that this all means that the value of data is in its fluidity. In other words, the value of data is in ensuring that every dataset, no matter where it lives, can move seamlessly with built-in security, governance and scale. 

“In this AI economy, compute power may set the pace, but data sets the ceiling. MinIO raises that ceiling, transforming scattered, hard-to-reach datasets into a living, high-performance fabric that fuels every AI prompt and initiative. With MinIO AIStor, organisations gain the ability to store and understand data. Data that is secure, fluid and always ready for action is a competitive weapon,” added Kapoor.

How ‘fit’ is your data?

“AI has made everyone take a hard look at how to define what makes data valuable. We’ve quickly realised value goes beyond volume,” stated Maggie Laird, president at Pentaho. She says that it’s more about truly understanding the fitness of the data, which relates to its accuracy, completeness, timeliness, and contextual relevance through metadata. Pentaho is a business analytics platform used to integrate, prepare, and analyse diverse data sources to create customised business intelligence solutions.

“For AI to really deliver, organisations must evaluate data across multiple dimensions: bias, lineage, role-based usage, age, and adaptability. This helps to ensure data is both fit for AI-driven decision-making and is being used within the bounds of regulations and business rules. Platforms that unify integration, governance, and analytics play a critical role, preparing and delivering quality data for action and trustworthy AI outcomes while minimising risk,” stated Pentaho’s Laird.
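The multi-dimensional evaluation Laird describes can be sketched as a simple fitness report. The field names, thresholds and rules below are illustrative assumptions, not Pentaho's actual API; the point is that fitness is a conjunction of independent checks (completeness, timeliness, lineage, role-based usage), not a single volume metric.

```python
from datetime import date, timedelta

# Hypothetical fitness check inspired by the dimensions Laird lists;
# field names and thresholds are illustrative, not Pentaho's API.
def fitness_report(dataset: dict, max_age_days: int = 30) -> dict:
    checks = {
        # Completeness: no missing values in the record's fields.
        "complete": all(v is not None for v in dataset["fields"].values()),
        # Timeliness: data refreshed within the allowed window.
        "timely": (date.today() - dataset["last_updated"]).days <= max_age_days,
        # Lineage: at least one recorded upstream source.
        "has_lineage": bool(dataset.get("lineage")),
        # Role-based usage: the consuming role must be permitted.
        "role_allowed": dataset["consumer_role"] in dataset["allowed_roles"],
    }
    # Fit for AI only if every dimension passes.
    checks["fit_for_ai"] = all(checks.values())
    return checks

record = {
    "fields": {"customer_id": 42, "region": "EMEA"},
    "last_updated": date.today() - timedelta(days=3),
    "lineage": ["crm_export", "dedupe_job"],
    "consumer_role": "analyst",
    "allowed_roles": {"analyst", "data_steward"},
}
print(fitness_report(record)["fit_for_ai"])  # True
```

Dropping any one dimension (say, clearing the lineage list) flips the overall verdict to unfit, which is exactly the behaviour a governance platform needs before feeding data to a model.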

Data recalibration situation

Rounding out this discussion is Steve Neat, chief revenue officer at data lineage company Solidatus. Neat says that AI has recalibrated the value of data from quantity to qualities like provable provenance, timeliness, permission and fitness for purpose. 

“Data that is fresh, well described and policy aware beats bigger but blind datasets because it can be safely composed, reused and measured for impact, with the lineage to show teams what to trust and what to fix so they can ship faster,” said Neat. “Governance creates value only when it runs at runtime. Lineage makes that real by showing what to trust, what to fix and who owns the fix. It also prices risk because you can’t put sensitive fields into RAG or agents unless you can trace them and enforce policy at read time. In my view, the recalibration means that the winners will be the ones that stop hoarding and start curating data so that models learn faster and waste less.”
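Neat's point that you "can't put sensitive fields into RAG or agents unless you can trace them and enforce policy at read time" suggests a read-time gate. The sketch below is a toy version under assumed rules (the field names, lineage map and clearance model are hypothetical, not Solidatus functionality): a field reaches the model only if its provenance is recorded and the caller is permitted to read it.

```python
# Illustrative read-time policy gate: a field can enter a RAG pipeline
# only if it is traceable to a known source and, when sensitive, the
# caller has explicit permission. Names and rules are hypothetical.
SENSITIVE = {"ssn", "salary"}
LINEAGE = {  # field -> recorded upstream source (toy lineage graph)
    "name": "hr_system",
    "salary": "payroll_feed",
    "notes": None,  # no recorded provenance
}

def readable_for_rag(field: str, caller_clearance: set) -> bool:
    if LINEAGE.get(field) is None:
        return False  # untraceable data never reaches the model
    if field in SENSITIVE and field not in caller_clearance:
        return False  # sensitive field without explicit permission
    return True

print(readable_for_rag("name", set()))        # True: traceable, not sensitive
print(readable_for_rag("salary", set()))      # False: sensitive, no clearance
print(readable_for_rag("salary", {"salary"})) # True: sensitive but permitted
print(readable_for_rag("notes", {"notes"}))   # False: no lineage at all
```

Note that the lineage check runs first: even a fully cleared caller cannot read a field with no provenance, which is the "runtime governance" behaviour Neat argues actually creates value.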

There is no real question that the value of data has increased, nor that the proliferation of AI has been fundamental to that escalation. But the mechanics variously described here should point us towards the new wave of emerging truths in this space.
