In that time, the technology community has been debating the relative merits of this approach vs. traditional on-premises models, and the broad conclusion seems to be that while the cloud does provide some advantages, by itself it does not fully support what the enterprise needs going forward. (Also read: 7 Reasons Why You Need a Database Management System.)
According to some experts, the cloud enables a foundation for what does need to happen for DBMS going forward, but only higher level architectures will truly bring about the flexibility and advanced service levels that will define the enterprise in the age of connected digital services and the Internet of Things (IoT).
Cloud, In Context
It is important to note that this view does not contradict anything that Gartner has said about DBMS and the cloud; it simply holds that focusing strictly on cloud-based infrastructure does not tell the complete story.
In its blog post last June entitled “The Future of the DBMS Market is the Cloud,” Gartner’s Adam Ronthal notes that cloud-based DBMS adoption is already on the rise and will dominate the field in short order.
This is mainly because the cloud is where much of the innovation and cost optimization is taking place, while on-prem will soon be relegated to specialty applications and jobs that require legacy compatibility.
While this may be true up to a point, service providers like Altinity note that these facts need a little context.
For one thing, pricing models that show capex dropping while opex remains the same or rises only slightly are not panning out in the real world. Cloud architectures, in fact, tend to hit opex hard as environments scale.
In addition, while cloud innovation is impressive, the DBMS market is also being influenced by open source, Kubernetes, AI and other developments that can be deployed in the cloud or on-prem. (Read: How do companies use Kubernetes?)
Open source in particular has been a hotbed of DBMS innovation for at least two decades, according to Altinity. Many of the most disruptive data management technologies were developed within open source projects, and the continued inflow of venture capital into this market ensures that this will continue for some time. (Also read: Open Source: Is It Too Good to Be True?)
For anybody contemplating a change to their data management strategies, the cloud is certainly worthy of attention but it will likely come up short without a healthy dose of open source.
In this light, DBMS is just like any other application where success is not determined simply by moving to the cloud but by how you do it.
Aiven’s Gilad David Maayan suggests the following five best practices for cloud-based DBMS:
Develop a strategy first, then migrate
The IT universe is littered with failed cloud initiatives because too much attention was paid to how it was going to work in the end rather than how to make the transition smoothly.
Encrypt and tokenize data at rest
Hackers prefer to target data at rest because no one is paying attention to it at the moment. But it still contains valuable, privileged information like personal finances and trade secrets. (Read: What benefit can real-time data provide that data at rest can't?)
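As a rough illustration, tokenization replaces sensitive values at rest with opaque stand-ins, while the real values live in a separately secured vault. The sketch below is a minimal, hypothetical Python tokenizer; the class, the in-memory vault and the card number are all illustrative, not a production design:

```python
import hashlib
import hmac
import secrets

# Hypothetical tokenizer: swaps sensitive values for opaque tokens and keeps
# the originals in a vault (here a dict; in practice a secured, audited store).
class Tokenizer:
    def __init__(self):
        self._vault = {}                      # token -> original value
        self._key = secrets.token_bytes(32)   # per-deployment secret

    def tokenize(self, value: str) -> str:
        # HMAC makes the token deterministic per key, so the same input maps
        # to the same token and joins on the tokenized column still work.
        token = hmac.new(self._key, value.encode(), hashlib.sha256).hexdigest()
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with vault access can recover the original value.
        return self._vault[token]

tk = Tokenizer()
token = tk.tokenize("4111-1111-1111-1111")
```

A stolen copy of the tokenized table would expose only the HMAC digests; recovering the payment data requires both the vault and its access controls.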
Secure with Identity and Access Management (IAM)
A standard authorization procedure is critical to ensure data and infrastructure protection, particularly in large environments where access must be updated constantly.
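To make the idea concrete, here is a minimal, hypothetical sketch of a role-based authorization check in Python. The role and permission names are invented for illustration; a real deployment would lean on a managed IAM service rather than hand-rolled tables:

```python
# Hypothetical role-to-permission mapping; names are illustrative only.
ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "dba": {"read", "write", "admin"},
}

def is_allowed(role: str, action: str) -> bool:
    # Unknown roles get an empty permission set, so access is denied by default.
    return action in ROLE_PERMISSIONS.get(role, set())
```

The default-deny behavior for unknown roles is the important part: in a large environment where access lists change constantly, anything not explicitly granted should fail closed.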
Protect in-transit with encryption and VPNs
Data in motion is subject to theft as well, so a secure, encrypted digital tunnel is a highly effective way to keep it safe.
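In Python, for example, the standard library's ssl module already defaults to the settings such a tunnel needs on the client side; the snippet below simply shows what a properly configured TLS context looks like:

```python
import ssl

# A client-side TLS context with certificate verification and hostname
# checking enabled -- these are the defaults from create_default_context().
ctx = ssl.create_default_context()

# These two settings are what make the encrypted tunnel trustworthy:
# the server must present a valid certificate matching its hostname.
# ctx.check_hostname -> True
# ctx.verify_mode    -> ssl.CERT_REQUIRED

# In use, the context would wrap a database connection's socket, e.g.:
#   with socket.create_connection((host, port)) as sock:
#       with ctx.wrap_socket(sock, server_hostname=host) as tls_sock:
#           ...
```

Most database drivers accept an SSL context or equivalent flags directly; the risk to watch for is code that disables verification to silence certificate errors, which turns the tunnel into mere obfuscation.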
Automate routine monitoring tasks
Tedious but important jobs are best handled by automation, which can be configured around key aspects of the monitoring process like reporting, testing and integration.
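The pattern can be sketched in a few lines of Python: a runner walks a set of named health checks and builds a report that a scheduler could then push to a dashboard or mailing list. The check names and thresholds below are placeholders:

```python
# Hypothetical automation runner: evaluates named health checks and
# collects the results into a report. Names/thresholds are placeholders.
def run_checks(checks):
    report = {}
    for name, check in checks.items():
        try:
            report[name] = "ok" if check() else "fail"
        except Exception as exc:
            # A crashing check is reported, not silently skipped.
            report[name] = f"error: {exc}"
    return report

# Stand-in checks; real ones would query the DBMS or its metrics endpoint.
checks = {
    "replication_lag_under_1s": lambda: 0.4 < 1.0,
    "disk_free_over_20pct": lambda: 35 > 20,
}

report = run_checks(checks)
```

Catching exceptions per check matters in practice: one broken probe should degrade the report, not take down the whole monitoring run.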
A Cloud Management Nightmare?
Organizations should also be aware that the cloud can easily turn into a management nightmare without the proper controls, says Deloitte’s David Linthicum.
Many clouds, in fact, devolve into the same silo-laden infrastructure that currently inhibits workflows in the data center, particularly when individual business units are left to create and manage their own clouds without IT supervision.
To counter this, consider adopting a master data management plan, along with data virtualization and other means to federate the environment in order to create a single source of truth for the entire organization.
Ultimately, he says, the enterprise should strive for “self-identifying data” in which the data itself is imbued with the intelligence to know what it is and how it can be used. If implemented correctly, this could deliver a 20-fold reduction in data complexity.
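What self-identifying data might look like in practice is open to interpretation; one minimal reading, sketched below in Python, is a record that carries its own classification and permitted uses as metadata. All field names here are illustrative assumptions, not part of any standard:

```python
from dataclasses import dataclass

# One possible reading of "self-identifying data": the record itself declares
# what it is and how it may be used. Field names are illustrative only.
@dataclass
class SelfDescribingRecord:
    payload: dict
    classification: str = "internal"     # e.g. public / internal / restricted
    allowed_uses: tuple = ("analytics",)

    def permits(self, use: str) -> bool:
        # Downstream systems ask the record, not a central registry.
        return use in self.allowed_uses

rec = SelfDescribingRecord(
    payload={"customer_id": 42},
    classification="restricted",
    allowed_uses=("billing",),
)
```

The point of the sketch is where the policy lives: because the metadata travels with the data, every consumer can enforce the same rules without consulting a separate catalog.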
To put all of this in perspective, consider the cloud to be a brand new pick-up truck. Sure, it’s bigger, can haul more and has all sorts of bells and whistles like back-up cameras and WiFi, but ultimately it is up to the driver to get the most out of it and to keep it clean and in good working order.
Putting DBMS on the cloud is the easy part. Making it all work properly takes a bit more effort.