Utility Computing

Definition - What does Utility Computing mean?

Utility computing is the process of providing computing services through an on-demand, pay-per-use billing method. It is a computing business model in which the provider owns, operates and manages the computing infrastructure and resources, and subscribers access them as and when required on a rental or metered basis.
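To make the pay-per-use idea concrete, the short sketch below computes a metered bill from recorded usage. The resource names and per-unit rates are purely illustrative assumptions, not the pricing of any particular provider.

```python
# Hypothetical metered-billing sketch: the provider records usage per
# resource and charges only for what was consumed in the billing period.
# All resource names and rates are illustrative assumptions.

RATES = {
    "cpu_hours": 0.05,          # price per CPU-hour
    "storage_gb_months": 0.02,  # price per GB stored for a month
    "bandwidth_gb": 0.01,       # price per GB transferred
}

def metered_bill(usage: dict) -> float:
    """Return the total charge for the metered usage in one period."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# A subscriber who used 120 CPU-hours, 50 GB of storage and 200 GB of
# bandwidth pays only for that consumption.
print(metered_bill({"cpu_hours": 120, "storage_gb_months": 50, "bandwidth_gb": 200}))
# -> 9.0  (120*0.05 + 50*0.02 + 200*0.01)
```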

Techopedia explains Utility Computing

Utility computing is one of the most popular IT service models, primarily because of the flexibility and economy it provides. The model mirrors conventional utilities such as telephone service, electricity and gas. The principle behind utility computing is simple: the consumer has access to a virtually unlimited supply of computing solutions over the Internet or a virtual private network, which can be sourced and used whenever required. The management and delivery of the back-end infrastructure and computing resources are handled by the provider.
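As a sketch of that principle, the snippet below models a consumer requesting a virtual server on demand and releasing it when finished, while the provider manages the underlying pool. The `UtilityProvider` class and its methods are hypothetical stand-ins, not a real provider's API.

```python
import uuid

# Hypothetical sketch of the utility model: the provider owns and manages
# the resource pool; the consumer requests capacity only when needed and
# releases it afterwards. No real provider API is implied.

class UtilityProvider:
    def __init__(self):
        self._active = {}  # server id -> requested capacity

    def request_server(self, cpu: int, memory_gb: int) -> str:
        """Provision a virtual server on demand and return its id."""
        server_id = str(uuid.uuid4())
        self._active[server_id] = (cpu, memory_gb)
        return server_id

    def release_server(self, server_id: str) -> None:
        """Return the server to the provider's pool; metering stops here."""
        self._active.pop(server_id, None)

provider = UtilityProvider()
server = provider.request_server(cpu=4, memory_gb=16)  # use only while needed
# ... run the workload ...
provider.release_server(server)                        # stop paying when done
```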

Utility computing solutions can include virtual servers, virtual storage, virtual software, backup and most other IT services.

Cloud computing, grid computing and managed IT services are based on the concept of utility computing.