Where The Cloud Goes From Here: The Big Interview With Deloitte’s David Linthicum


In a thought-provoking interview, Techopedia speaks with David Linthicum, chief cloud strategy officer at Deloitte Consulting LLP, about the future of cloud computing and how it enables enterprise AI and ubiquitous computing.

About David Linthicum

With more than 30 years of experience in enterprise technology, Linthicum is a globally recognized thought leader in cloud computing, artificial intelligence (AI), and cybersecurity.

He is also a frequent keynote speaker, podcast host, and media contributor on digital transformation, cloud architecture, AI, and cloud security topics.

Linthicum leads the cloud technology strategy and architecture practice at Deloitte Consulting, where he creates solutions for global clients across various industries.

He is the author of more than 15 best-selling books, more than 7,000 articles, and more than 50 courses on LinkedIn Learning.

Linthicum also serves as an adjunct instructor at Louisiana State University and as a mentor for Deloitte’s cloud talent development program.


Key Takeaways

  • Cloud is no longer a one-size-fits-all solution; a flexible approach now spans edge computing, micro clouds, on-premises solutions, and vertically aligned clouds.
  • Linthicum sees generative AI as a major focus in the cloud market in 2024, with cloud providers investing in GenAI and companies planning for its integration, even if execution is not widespread yet.
  • Ubiquitous computing involves running applications and datasets on platforms that provide the best value, regardless of location.
  • This can be edge computing, traditional data centers, colocation providers, and even personal devices like wristwatches and mobile phones.

The Future of Cloud Computing

Q: Let’s start with where we are today. How do you see cloud computing changing?

A: Cloud computing is not always the slam-dunk answer that it probably was a couple of years ago because people began to realize that, in some instances, it’s not as cost-effective for some applications and some workloads. The big pattern that’s happening right now is that people are moving to more of a heterogeneous, ubiquitous model.

The future of computing is a variety of platforms: edge computing, micro clouds (small cloud providers, such as AI providers, that serve just a particular function), on-premises systems, and vertically aligned clouds.

So, it’s best described as more of a distribution of workloads, even though the cloud providers are still going to grow. It’s just a different perspective in terms of how we’re dealing with the economic viability of certain workloads and datasets and where they should reside.

Q: What trends do you see driving the cloud market in 2024?

A: Everything is going to be generative AI-focused. It’s largely a retooling of the market, i.e., cloud and enterprise technology providers adjusting how they invest, and that investment is shifting to generative AI.

Cloud providers are building in the back room now, whether they already have a generative AI presence or not, either to get deeper into generative AI or to build new applications that use it.

And on the demand side, the companies that consume the technology are talking about and planning for generative AI, even if they’re not doing a lot of it yet.

We’re not seeing it being executed yet, but we’re seeing it in research, and that’s fine. It’s a big change and a big expense, and companies should walk into it slowly.

The Cloud and Enterprise AI

Q: How does the cloud enable enterprise AI?

A: The cloud is the path of least resistance when it comes to deploying AI technology. So we can provision things on demand — we don’t have to buy hardware; we don’t have to set up networks; we don’t have to build big databases; we don’t have to install AI software… It automatically comes with the cloud.

We can provision a generative AI system, or a machine learning system, or a deep learning system. And at the same time, we can provision the storage and the compute, and even the specialized compute resources.
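As a rough illustration of the on-demand provisioning Linthicum describes, here is a minimal sketch using the AWS SDK for Python (boto3); the AMI ID, instance type, region, and bucket name are placeholders chosen for the example, and a real deployment would also configure networking, identity, and error handling.

```python
# Minimal provisioning sketch with the AWS SDK for Python (boto3).
# The AMI ID, instance type, region, and bucket name are placeholders.
import boto3

REGION = "us-east-1"
ec2 = boto3.client("ec2", region_name=REGION)
s3 = boto3.client("s3", region_name=REGION)

# Request a single GPU-backed instance for model training or inference.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="g5.xlarge",         # example GPU instance type
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched instance {instance_id}")

# Provision object storage for training data and model artifacts.
s3.create_bucket(Bucket="example-ml-artifacts-bucket")  # bucket names must be globally unique
print("Created storage bucket")
```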

As I’ve said many times, the cloud is the most convenient way to implement AI, versus buying hardware, negotiating with a managed service provider, or buying colocation space and then having to maintain that yourself.

The cloud enables companies to get something up and running quickly. For a particular AI application and dataset, the cloud is going to be the best-optimized way to make that happen.

The cloud should always be considered because that is going to make our lives easier as we move into this particular piece of technology, and it’s going to provide an ecosystem.

All of these services that are already tested to play well and work together are going to allow us to build and deploy these AI systems.

I always tell people that if they’re looking to build something fast, the cloud is where you want to look. But you do have to look at the economies of it, and you have to look at what it’s going to cost to operate.


But in many instances, the price is justified because of the value that the businesses are able to get out of it. The cloud enables them to get to market faster, and it allows them to pivot to different areas.

Companies are able to leverage that as a force multiplier that comes back in huge amounts of revenue — so that makes the cloud the best value for deploying AI technology.

Ubiquitous Computing for Better Performance

Q: What exactly is ubiquitous computing?

A: It’s basically the end state I described in answering the first question. In other words, we move to whichever platforms are the most viable and the most optimized for the business’s applications and workloads, and those platforms can run at any location.

It’s going to be edge computing, it’s going to be traditional systems within data centers, it’s going to be colocation providers, it’s going to be managed service providers.

It’s going to be my wristwatch, it’s going to be our mobile phones — basically running at any location — that are going to provide the best bang for the buck.

It’s the ability to really understand that what we’re doing is looking for the best platform to run our applications. And sometimes that’s going to be under a desk, sometimes it’s going to be a public cloud provider or an existing data center that we may have.

It’s understanding that our workloads will be scattered hither and yon, and they will be distributed. So, the ubiquity of computing means that the applications and datasets are going to reside everywhere. It’s really opening your mind and not necessarily pushing everything toward a particular kind of platform.

Q: What are the benefits of ubiquitous clouds for reducing costs, managing workloads, and boosting operational efficiency?

A: Because we can put those application workloads and datasets on the platforms that will bring the most value back to the business.

In other words, we’re putting the application workloads and datasets where they’re going to do the most good or where they’re going to run in the most optimized way.

And they’re going to run for the least amount of cost and provide the maximum value returned to the business. And if that’s the public cloud, that’s fine. We can do those metrics and determine where they should exist.

However, they could be on a private server someplace if that’s where they can run for the least cost and the maximum value.
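To make “we can do those metrics” a bit more concrete, the following hypothetical sketch scores candidate platforms by estimated monthly cost and a business-value score and picks the placement with the best value per dollar; the platform names and figures are illustrative assumptions, not numbers from the interview.

```python
# Hypothetical workload-placement sketch: pick the platform with the best value per dollar.
from dataclasses import dataclass

@dataclass
class Platform:
    name: str
    monthly_cost_usd: float      # estimated cost to run the workload there
    business_value_score: float  # e.g., derived from latency, compliance, revenue impact

def best_placement(candidates: list[Platform]) -> Platform:
    """Return the candidate with the highest business value per dollar spent."""
    return max(candidates, key=lambda p: p.business_value_score / p.monthly_cost_usd)

# Illustrative numbers only; a real assessment would use measured costs and agreed value metrics.
candidates = [
    Platform("public-cloud", monthly_cost_usd=12_000, business_value_score=90),
    Platform("on-prem-server", monthly_cost_usd=4_000, business_value_score=55),
    Platform("edge-cluster", monthly_cost_usd=6_500, business_value_score=70),
]

print(f"Run the workload on: {best_placement(candidates).name}")
```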

Leveraging the Right Technology for the Business

Q: What are the benefits of a modified, simplified cloud tech stack with diverse, choose-as-you-go offerings in mobile computing, micro clouds, and on-premises?

A: It’s the ability to configure the stack and tailor it exactly to our needs. We’re not adapting to how a public cloud provider wants to do it. We’re going to leverage the technology using our rules. We’re mitigating complexity.

We’re dealing with financial management and doing so in a way that empowers us to make the right decisions — no matter what those decisions are — using any hardware or any cloud-based system to get us to what’s best for the business.

Q: How will enterprises that adopt a ubiquitous computing approach to multi-cloud gain a new sense of autonomy and control over cloud operations and the ability to leverage generative AI capabilities between multiple cloud providers?

A: That’s a great question. The thing is, it empowers you to make those decisions — you’re not going to try to fit everything within a public cloud provider.

You’re not having to pick their databases, and their storage systems, and their compute systems. You’re free to pick whatever you need to support the workload.

That, in essence, empowers them to get to a state where they can get more value out of their compute infrastructure.

And as we move to generative AI, this is going to be even more important because, by running generative AI systems in a public cloud, you’re going to get the mother of all bills, driven by the huge amounts of storage and processing involved.

Those systems need specialized processors, such as GPUs and TPUs, and it may well make sense to host them with the cloud providers because there’s an ecosystem there.

We may like a particular cloud provider brand, but in many instances, we can run them in other places where it’s going to be a lot less expensive.
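As a back-of-the-envelope illustration of that trade-off (the figures below are hypothetical placeholders, not Deloitte’s or Linthicum’s numbers), the break-even point between renting a cloud GPU and hosting one elsewhere depends largely on utilization:

```python
# Hypothetical break-even sketch: rented cloud GPU vs. a GPU hosted elsewhere.
# All prices are illustrative placeholders.
CLOUD_GPU_PER_HOUR = 3.00      # on-demand hourly rate for a GPU instance
OWNED_GPU_MONTHLY = 1_200.00   # amortized hardware, power, colocation, and staff

def monthly_cloud_cost(gpu_hours: float) -> float:
    return CLOUD_GPU_PER_HOUR * gpu_hours

for hours in (100, 400, 730):  # light use, moderate use, and running 24/7
    cloud = monthly_cloud_cost(hours)
    cheaper = "cloud" if cloud < OWNED_GPU_MONTHLY else "owned/hosted"
    print(f"{hours:>3} GPU-hours/month: cloud ~${cloud:,.0f} vs. owned ~${OWNED_GPU_MONTHLY:,.0f} -> {cheaper}")
```

At low utilization, pay-as-you-go pricing tends to win; at sustained, near-constant utilization, the economics often tilt toward owned or colocated hardware, which is the point Linthicum is making about where generative AI workloads should live.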

And you have to remember that companies only have a finite amount of money, and they have to allocate that money in the best way to bring the most value back from these investments — because that’s how they win the game.

And in many instances, they blew all their budgets on the cloud bills that came in. And now they’re going to look to leverage whatever they need to empower them to make a decision and leverage whatever technology they need.

So cloud only, or no cloud, or “you’re moving to this particular brand”, or “you’ve partnered with these guys, and this is all you’re going to leverage”: that’s not going to work, and it’s not going to scale.

You’re going to get to the big bills again because you’re going to be limited by your inability to leverage best-of-breed solutions.

You need to pick best-of-breed solutions, whatever clouds they exist in, to bring value back to the business, and that should be the key metric.

Ubiquitous Computing for What’s to Come

Q: How can ubiquitous computing create use cases for the future of advanced cloud and generative AI operations? 

A: I think what it can do is open up the innovation path for what things are to come.

So again, we’re opening up our minds to the fact that we’re not just going to leverage one particular platform and certainly not just on the cloud.

We’re considering a completely blank slate in terms of innovation and the ability and the power to pick whatever technologies will facilitate that innovation to become a force multiplier.

Ubiquitous computing can really open it up and have the innovators in the company use any tools they need — any platforms they need, any locations they need — to build the next generation of technology.

In many instances, that will be generative AI running on clouds — and in many instances it’s not. And the ability to mix and match that stuff actually makes the cloud solution stronger.

So, if I’m able to leverage certain services from a major cloud provider and mix those with other services that exist outside of the cloud using ubiquitous computing models, I may be able to manage those things more cheaply and pay less for them.

And I can enhance my ability to be innovative, because if those services aren’t in the cloud but are on other systems or in a multi-cloud deployment, then I’m winning.

So we’re saying: “You’re not limited”.

In many instances, sitting in rooms, we hear, “Well, we’re not allowed to use services outside of our preferred cloud provider”.

But companies that are trying to leverage technology as a force multiplier for innovation are lifting that restriction.

So businesses need to build the ability to innovate, to provide a better customer experience, to deliver better value propositions for customers, and to drive digital transformation.

And that needs to include every piece of technology available to make that happen.

We’re not going to be limited, and we’re going to leverage whatever we need to take the business to the next level.


Linda Rosencrance
Tech Journalist

Linda Rosencrance is a freelance writer and editor based in the Boston area with expertise ranging from AI and machine learning to cybersecurity and DevOps. She has covered IT topics since 1999 as an investigative reporter for several newspapers in the greater Boston area. She also writes white papers, case studies, e-books, and blog posts for a variety of corporate clients, interviewing key stakeholders including CIOs, CISOs, and other C-suite executives.