Metacomputing

What Does Metacomputing Mean?

Metacomputing is an approach that integrates multiple computing resources into a single system in order to support applications spanning business, management, industry and software development. Metacomputing technology is also used to gather, interpret and analyze data from many different databases and devices.

The goal of a metacomputing system is to make the heterogeneity of the underlying resources and networks transparent to users while making effective use of every resource available on the grid.

Techopedia Explains Metacomputing

The metacomputing concept was developed at the National Center for Supercomputing Applications (NCSA) in the late 1980s, when engineers realized that growing computational demands required connecting multiple computing systems together. More recent metacomputing developments include large computer grids that operate like virtual networked supercomputers.

A metacomputing system is made up of the following components (see the sketch after the lists below):

  • A set of loosely coupled nodes
  • A comprehensive set of services, allowing a network to perform beyond single system capacity

Metacomputing advantages include:

  • Superior graphics
  • Solves complex distributed computing problems
  • Provides high-performance computing for data-intensive applications
  • Reduces bandwidth requirements by connecting computers in different locations over a single high-speed network
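
To make the components listed above concrete, here is a minimal Python sketch of the coordinator-and-nodes pattern that a metacomputing system generalizes: a large job is split into chunks, farmed out to a pool of worker "nodes," and the partial results are combined. In a real metacomputer the nodes would be remote, heterogeneous machines reached through grid middleware; in this sketch they are simulated with local processes so the example stays self-contained, and all names are illustrative.

from concurrent.futures import ProcessPoolExecutor


def node_task(chunk):
    """Work assigned to one loosely coupled "node": sum the squares of a chunk."""
    return sum(x * x for x in chunk)


def metacompute(data, num_nodes=4):
    """Split a job across the node pool and combine the partial results."""
    chunk_size = max(1, len(data) // num_nodes)
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ProcessPoolExecutor(max_workers=num_nodes) as pool:
        # Each chunk is handed to a separate worker process,
        # standing in for one loosely coupled grid node.
        partial_results = pool.map(node_task, chunks)
    return sum(partial_results)


if __name__ == "__main__":
    print(metacompute(list(range(1_000_000))))

The same pattern scales up when the local worker pool is replaced by remote machines and a scheduler that hides their differences in hardware and location, which is exactly the transparency a metacomputing system aims to provide.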

Margaret Rouse
Technology expert

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret’s idea of a fun day is to help IT and business professionals learn to speak each other’s highly specialized languages.