Definition - What does Metacomputing mean?
Metacomputing is a technology designed to integrate multiple computing resources so they can be used as a single system for developing a variety of applications related to business, management, industry and software. Metacomputing technology is also used to gather, interpret and analyze data drawn from many different databases and devices.
The goal of a metacomputing system is to make resource and network heterogeneity transparent to users and applications while making effective use of all of the grid's resources.
Techopedia explains Metacomputing
The metacomputing concept was developed at the National Center for Supercomputing Applications (NCSA) in the late 1980s, when engineers realized that growing computational demands required connecting multiple computing systems. More recent metacomputing developments include large computer grids that operate like virtual networked supercomputers.
A metacomputing system is made up of the following components (a minimal sketch of how they fit together follows the list):
- A set of loosely coupled nodes
- A comprehensive set of services that allows the network to perform beyond the capacity of any single system
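The service layer is what lets the loosely coupled nodes behave like one machine. The Python sketch below is a hypothetical illustration rather than any real metacomputing framework's API: the names Node and MetaScheduler, the random placement policy, and the use of local threads to stand in for remote machines are all assumptions made for the example.

```python
import random
from concurrent.futures import ThreadPoolExecutor

class Node:
    """One loosely coupled node; 'speed' stands in for heterogeneous hardware."""
    def __init__(self, name: str, speed: float):
        self.name = name
        self.speed = speed

    def run(self, task: str) -> str:
        # A real node would execute the task remotely; this toy version
        # just reports where the work landed.
        return f"{task} completed on {self.name} (relative speed {self.speed})"

class MetaScheduler:
    """Toy service layer: hides which node does the work behind submit()."""
    def __init__(self, nodes):
        self.nodes = nodes
        self.pool = ThreadPoolExecutor(max_workers=len(nodes))

    def submit(self, task: str):
        # Naive placement policy: pick a node at random. Real systems would
        # use load, locality and capability information instead.
        node = random.choice(self.nodes)
        return self.pool.submit(node.run, task)

if __name__ == "__main__":
    grid = MetaScheduler([Node("alpha", 1.0), Node("beta", 2.5), Node("gamma", 0.7)])
    futures = [grid.submit(f"task-{i}") for i in range(5)]
    for f in futures:
        print(f.result())  # callers never choose or see a node up front
```

The point of the sketch is the transparency described above: callers hand work to submit() and never need to know which node, or what kind of node, actually runs it.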
Metacomputing advantages include:
- Superior graphics
- Solutions to complex distributed computing problems
- High-performance computing for data-intensive applications
- Reduced bandwidth requirements, since computers in different locations are connected over a single high-speed network