No, cloud computing is not going away. This (perhaps obvious) negative-positive statement is still worth saying out loud; that’s because there’s an undercurrent (if not a groundswell) of strategic effort designed to put (or keep) parts of the data and application stack in on-premises technologies, especially where cloud computing is not necessarily as cost-efficient, secure, accessible or functional as the alternatives. So what’s going on?
If we think back to the evangelistic proclamations made in the early cloud years, we were repeatedly told that “cloud is for everyone, but not necessarily everything”. That core truth has perhaps been brushed under the carpet somewhat in the years since, i.e. the public cloud providers rarely admit that there is any portion of an enterprise’s entire IT estate for which they can’t find a good home on their backbone.
Even if a business definitively identifies a need for a proportion of private cloud, on-premises compute and storage, the public hyperscaler answer is often the tautologically tangled option of so-called public-private cloud, or virtual private cloud (VPC): a private cloud that uses a public cloud’s shared resources to create an isolated environment. But that’s hardly a concerted, deliberate move back to on-premises, is it?
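For readers who haven’t met the term, a short, hypothetical sketch may help (assuming AWS and its boto3 Python SDK, with the region and address ranges chosen purely for illustration): the “isolation” a VPC provides is logical rather than physical, which is exactly why it still isn’t on-premises.

```python
# Illustrative sketch: carving an isolated "virtual private cloud" out of a
# public cloud's shared resources (assumes AWS, boto3 and configured
# credentials; region and CIDR ranges are arbitrary examples).
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Create the isolated network environment with its own private address space.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# Carve out a subnet inside it; workloads placed here still run on the
# hyperscaler's shared hardware, just logically fenced off from other tenants.
subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")
print(vpc_id, subnet["Subnet"]["SubnetId"])
```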
It’s another world
One man who thinks he can identify areas where on-premises data analytics, storage and core compute can improve performance is Larry O’Connor, founder and CEO of Other World Computing (OWC). In an era when even Mark Zuckerberg thinks his cloud services bills are too high, O’Connor and team talk about why they think companies may now prioritise security, performance and financial control with more clearly defined on-premises strategies.
This is not necessarily a question of growing cloud vulnerabilities and the possibility of system-level misconfigurations (we have plenty of security platform specialists such as Qualys or infrastructure-as-code purists like Progress to come in and shore up those fragilities), but the spectre of flaky data management and exposure to large-scale breaches in public cloud will always feature as part of this conversation.
Local data storage (especially in the case of smaller businesses) is often argued to be a far less attractive target for cyber-attacks.
Freebie heebie-jeebie kriebels
“The cost of cloud storage in particular has gone from nearly given away to becoming significantly expensive. The freebies that drew people in have been slowly but surely pulled away and with a growing cost to the storage. I have spoken to some in the service space that a decade ago got into the business of driving customers to cloud storage services who are now finding good business driving them back to on-premises,” said O’Connor.
While we clearly need to remind ourselves that CEO O’Connor – who operates a business initially founded on hard drive sales and today known for its extensive line of physical storage devices that can extend an organisation’s on-premises footprint – would clearly want to extol the virtues of “not only cloud” these days, the move to gain a little more agility across more intelligently architected hybrid environments is no doubt real. He thinks it’s not that public cloud distributed storage providers do not offer value – it’s all about the right services for the right need.
Banjaxed by bandwidth
“There’s no reason to depend on the cloud for all or even a majority of your data needs. It’s not cost-effective to do so vs. easy-to-deploy, faster, on-premises options. The cloud also requires and costs you bandwidth and also time,” said O’Connor. “If your confidential data is on the cloud, you obviously have a greater risk of being part of a massive, large-scale breach. It’s less of a risk to use the cloud for external data sharing but not for corporate infrastructure. Keeping your data local, as a smaller target, is often more secure.”
He insists that the cloud for backup “really should be tertiary”, in his humble opinion.
“Whereas having a good backup strategy locally is going to be more cost-effective and give you much greater accessibility. If something goes down locally and you need to recover, it’s faster and more convenient if you do so locally, rather than having to pull it off the cloud,” he said.
Is on-premises AI smarter?
The OWC chief also thinks that bringing AI capabilities on-premises allows organisations that may not have the budget for extensive cloud-based AI to benefit from powerful data processing and analysis tools. That’s not necessarily an argument that would hold water with some of the open source and embryonic experimental projects emanating from the periphery of the hyperscalers, but it may have some part to play in the overall argument.
The suggestion here is that when AI is run locally, sensitive and proprietary data such as unique algorithms and business strategies stays within the company’s secure environment. Keeping data further from the risk of exposure or leakage in the cloud is argued by some to help avoid so-called “data bleed”, where proprietary information could unintentionally enhance third-party models or be accessed intentionally by external entities.
“On-premises AI will democratize some of the AI learning and capabilities that smaller businesses and institutions will have access to. AI in the cloud is already really expensive,” asserted O’Connor. “Having on-prem AI means that your data and IP gets to stay on-premises. There’s not one bit of risk that there’s any bleed-over from the datasets you provide for AI, ultimately helping other systems and potential competitors learn and benefit from your private data and knowledge. You also have an ongoing benefit, not a forever cost expenditure.”
Long-lasting LLM AI
O’Connor clarified his position further on AI and said that it’s more an argument from him that, at scale, owning the equipment for AI gives a long-lasting, secure benefit. It could also be noted that the same open source models you can run in the cloud can be run securely on-premises as well.
“Certainly, when experimenting at a small scale, the cloud options are fantastic to be testing the waters… but once you see a strong application opportunity that you’re going to put a larger scale data set to learn/train to – that’s where all the on-premise opportunities ought to be weighed,” underlined O’Connor.
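As a concrete (and purely illustrative) sketch of that point, the snippet below assumes a Python environment with the open source transformers and torch packages installed; the model name is just an example of an open-weight model that can be downloaded once and then run entirely on local hardware, so prompts and data never leave the machine at inference time.

```python
# Minimal sketch: running an open-weight language model on-premises.
# Assumptions: Python, the "transformers" and "torch" packages installed,
# a local GPU (or enough RAM to run on CPU), and an example model name.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # example open-weight model
    device_map="auto",                           # use local GPU(s) if available
)

# The prompt (and any proprietary data in it) is processed locally;
# nothing is sent to a third-party inference API.
prompt = "Summarise the benefits of keeping backup data on-premises."
print(generator(prompt, max_new_tokens=80)[0]["generated_text"])
```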
There are of course arguments on both sides of the coin here.
Public cloud will always have massive scale, speed, scope, solidity and – by definition – services, services and yet more services to wrap customers up in the warm glow of managed environments that benefit from a whole army of datacentre minions running around looking after an organisation’s objectives and goals. Equally, when a firm is able to operate locally, over plug-and-play cabling or networking, at interface speeds of up to 7,000MB/s versus a typical best of around 100MB/s in the cloud (1/70th of the speed), it might just feel the “drag” of remote cloud dependency vs. what it can do internally.
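To put that 70x gap in context, here’s a rough, back-of-the-envelope calculation (using the throughput figures quoted above and a hypothetical 1TB transfer, so purely illustrative) of what the difference means in practice.

```python
# Rough, illustrative timing: moving 1 TB at local vs. typical cloud throughput,
# using the MB/s figures quoted in the article.
size_mb = 1_000_000      # 1 TB expressed in MB
local_mb_per_s = 7_000   # local plug-and-play interface speed cited above
cloud_mb_per_s = 100     # typical best-case cloud throughput cited above

print(f"Local transfer: {size_mb / local_mb_per_s / 60:.1f} minutes")  # ~2.4 minutes
print(f"Cloud transfer: {size_mb / cloud_mb_per_s / 3600:.1f} hours")  # ~2.8 hours
```

In other words, a restore that finishes over a coffee break locally can stretch to most of a morning over a typical cloud link.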
“I am a believer that there is a lot of good we can do on-premises while still of course opportunities where cloud is the only way. Certainly, for the largest demands, the cost to build on-prem just doesn’t make sense compared to the value for intense loads you can only get from the cloud. But for so many small and even medium organisations – that have significant data sets and benefits from how they can deploy AI applications from their data – in ways most haven’t even conceived or imagined of yet – on-premises solutions will be more cost-effective over the long haul and more resilient as well,” concluded O’Connor.
Waves of cloud repatriation?
For an external view from outside OWC, Vadim Tkachenko, technology fellow and co-founder at Percona, thinks that whether or not we’ll see a massive wave of data repatriation take place in 2025 is still hard to say.
“However, I am confident that it will almost certainly mark a turning point for the trend. Yes, people have been talking about repatriation off and on and in various contexts for quite some time. I firmly believe that we are facing a real inflection point for repatriation where the right combination of factors will come together to nudge organisations towards bringing their data back in-house to either on-premises or private cloud environments which they control, rather than public cloud or as-a-Service options,” he said.
Tkachenko further states that companies across the private sector (and tech in particular) are tightening their purse strings considerably.
“We’re also seeing more work on enhanced usability, ease of deployment, and of course, automation. The easier it becomes to deploy and manage databases on your own, the more organizations will have the confidence and capabilities needed to reclaim their data and a sizeable chunk of their budgets,” said the Percona man.
It turns out then that cloud is still here, on-premises is still here and… actually, a hybrid world is typically the most prudent route to go down. Tkachenko leaves us with a comment to ponder: it’s interesting to realise today how many things break when an internet outage occurs… so perhaps the more hybridised we are, the more resilient we are.