Utility Computing

Definition - What does Utility Computing mean?

Utility computing is the provision of computing services through an on-demand, pay-per-use billing method. It is a computing business model in which the provider owns, operates and manages the computing infrastructure and resources, and subscribers access them as and when required on a rental or metered basis.
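The metered-billing idea can be sketched in a few lines of Python. The resource names and per-unit rates below are invented for illustration; they do not come from any real provider's price list.

```python
# Hypothetical metered-billing sketch: rates and usage figures are
# illustrative assumptions, not any real provider's pricing.

# Per-unit rates in dollars
RATES = {
    "cpu_hours": 0.05,         # per virtual-CPU hour
    "storage_gb_month": 0.02,  # per GB stored for a month
    "bandwidth_gb": 0.09,      # per GB transferred out
}

def monthly_bill(usage):
    """Sum each metered resource's consumption times its unit rate."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# The subscriber pays only for what was actually consumed this month.
usage = {"cpu_hours": 400, "storage_gb_month": 250, "bandwidth_gb": 100}
print(round(monthly_bill(usage), 2))  # 400*0.05 + 250*0.02 + 100*0.09 = 34.0
```

The key point is that there is no fixed capacity charge: if the usage dictionary is empty for a month, the bill is zero, just as with a conventional electricity or gas meter.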

Techopedia explains Utility Computing

Utility computing is one of the most popular IT service models, primarily because of the flexibility and economy it provides. The model is based on that of conventional utilities such as telephone service, electricity and gas. The principle behind utility computing is simple: the consumer has access to a virtually unlimited supply of computing solutions over the Internet or a virtual private network, which can be sourced and used whenever required. The provider manages the back-end infrastructure and handles the management and delivery of the computing resources.

Utility computing solutions can include virtual servers, virtual storage, virtual software, backup and most other IT solutions.

Cloud computing, grid computing and managed IT services are based on the concept of utility computing.