On-demand computing (ODC) is an enterprise-level model of technology and computing in which resources are provided on an as-needed, when-needed basis. ODC makes computing resources such as storage capacity, computational speed and software applications available to users as required, whether for specific temporary projects, known or unexpected workloads, routine work, or long-term technological and computing requirements.
Web services and other specialized tasks are sometimes referenced as types of ODC.
ODC is succinctly defined as “pay and use” computing power. It is also known as OD computing or utility computing.
The major advantage of ODC is its low initial cost: computational resources are essentially rented as they are required, which provides savings over purchasing them outright.
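The rent-versus-buy economics can be illustrated with a short back-of-the-envelope calculation. The figures below (purchase price, hourly rate, project length) are purely hypothetical assumptions chosen for illustration, not real provider pricing:

```python
# Illustrative rent-vs-buy comparison for a short project.
# All figures are hypothetical assumptions, not actual provider prices.

purchase_cost = 10_000.0      # assumed one-time cost to buy a server outright
hourly_rate = 0.50            # assumed on-demand rental price per hour
project_hours = 3 * 30 * 24   # a three-month project, billed by the hour

on_demand_cost = hourly_rate * project_hours
print(f"On-demand: ${on_demand_cost:,.2f} vs. purchase: ${purchase_cost:,.2f}")
```

Under these assumed numbers, renting capacity for the duration of the project costs a fraction of buying the hardware outright; the trade-off reverses only when utilization stays high over a long enough period.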
The concept of ODC is not new. John McCarthy of the Massachusetts Institute of Technology (MIT) made the prophetic observation in 1961 that computing might someday be organized to provide services much like public utilities do. Over the following two decades, IBM and other mainframe providers made computing power and database storage available to banks and other large organizations around the world. Later, the business model changed as low-cost computers became ubiquitous in the business world.
By the late 1990s, computer data centers were filled with thousands of servers, and utility computing emerged. On-demand computing, software-as-a-service and cloud computing are all models for repackaging computational, software application and network services.
The technologies that allow providers to deliver ODC services include virtualization, computer clusters, supercomputers and distributed computing.