What Does Metacomputing Mean?

Metacomputing is a technology designed to integrate multiple computing resources in order to support applications in business, management, industry and software development. Metacomputing technology is also used to gather, interpret and analyze data from a variety of databases and devices.


The goal of a metacomputing system is to make the heterogeneity of the underlying resources and networks transparent to users, while making effective use of all of the grid's resources.

Techopedia Explains Metacomputing

The metacomputing concept was developed at the National Center for Supercomputing Applications (NCSA) in the late 1980s, as engineers realized that growing computational demands required connecting multiple computing systems. More recent metacomputing developments include large computer grids that operate like virtual networked supercomputers.

A metacomputing system is made up of the following components:

  • A set of loosely coupled nodes
  • A comprehensive set of services, allowing a network to perform beyond single system capacity
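The two components above can be illustrated with a minimal sketch. The class and node names below are hypothetical, not part of any real metacomputing toolkit: a facade object accepts tasks and schedules them across loosely coupled nodes, so callers never need to know which node does the work.

```python
from dataclasses import dataclass, field
from itertools import cycle


@dataclass
class Node:
    """A loosely coupled compute node (hypothetical example)."""
    name: str
    completed: list = field(default_factory=list)

    def run(self, task):
        result = task()              # execute the task locally
        self.completed.append(result)
        return result


class Metacomputer:
    """Facade that hides node heterogeneity: callers submit tasks,
    and a simple round-robin scheduler picks the node that runs each one."""

    def __init__(self, nodes):
        self.nodes = nodes
        self._rotation = cycle(nodes)

    def submit(self, task):
        return next(self._rotation).run(task)


# Three nodes of different kinds behave as one virtual machine.
nodes = [Node("workstation"), Node("cluster"), Node("supercomputer")]
meta = Metacomputer(nodes)

results = [meta.submit(lambda n=n: n * n) for n in range(6)]
print(results)                              # [0, 1, 4, 9, 16, 25]
print([len(n.completed) for n in nodes])    # work spread evenly: [2, 2, 2]
```

Real metacomputing middleware would add the "comprehensive set of services" on top of this scheduling core: resource discovery, authentication, data movement and fault tolerance.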

Metacomputing advantages include:

  • Superior graphics capabilities
  • The ability to solve complex distributed computing problems
  • High-performance computing for data-intensive applications
  • Reduced bandwidth requirements, since a single high-speed network connects computers in different locations

Margaret Rouse
Senior Editor

Margaret is an award-winning technical writer and teacher known for her ability to explain complex technical subjects to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles by the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.