Definition - What does Computer Cluster mean?
A computer cluster is a single logical unit consisting of multiple computers that are linked through a local area network (LAN). The networked computers essentially act as a single, much more powerful machine. A computer cluster provides much faster processing speed, larger storage capacity, better data integrity, superior reliability and wider availability of resources.
Computer clusters are, however, much more costly to implement and maintain, which results in much higher running overhead compared to a single computer.
Techopedia explains Computer Cluster
Many organizations use computer clusters to maximize processing time, increase database storage and implement faster data storage and retrieval.
There are many types of computer clusters, including:
- Load-balancing clusters
- High availability (HA) clusters
- High performance (HP) clusters
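A load-balancing cluster distributes incoming work across its member machines. The sketch below illustrates the idea with a simple round-robin policy; the node names are hypothetical placeholders, and a real cluster would dispatch to actual hosts rather than strings.

```python
from itertools import cycle

# Hypothetical node names; a real load balancer would target actual hosts.
nodes = ["node-1", "node-2", "node-3"]

def round_robin(nodes):
    """Return a scheduler that hands out nodes in rotation,
    a minimal load-balancing policy."""
    return cycle(nodes)

scheduler = round_robin(nodes)

# Assign six incoming tasks; each node receives an equal share.
assignments = [next(scheduler) for _ in range(6)]
print(assignments)
# → ['node-1', 'node-2', 'node-3', 'node-1', 'node-2', 'node-3']
```

Round-robin is only one policy; production load balancers also weigh nodes by current load or connection count, but the goal is the same: no single machine becomes a bottleneck.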
The major advantages of using computer clusters are clear when an organization requires large scale processing. When used this way, computer clusters offer:
- Cost efficiency: The cluster technique is cost-effective for the processing power it delivers, and is typically much cheaper than alternatives such as setting up mainframe computers.
- Processing speed: Multiple high-speed computers work together to provide unified processing, and thus faster processing overall.
- Improved network infrastructure: Different LAN topologies are implemented to form a computer cluster. These networks create a highly efficient and effective infrastructure that minimizes bottlenecks.
- Flexibility: Unlike mainframe computers, computer clusters can be upgraded to enhance existing specifications, or extended by adding extra nodes to the system.
- High availability of resources: If any single component fails in a computer cluster, the other machines continue to provide uninterrupted processing. This redundancy is lacking in mainframe systems.
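The high-availability point can be made concrete with a small failover sketch. This is an illustrative assumption, not a real cluster API: tasks are tried against each node in order, and a failed node is simply skipped so processing continues uninterrupted.

```python
def submit(task, nodes, is_healthy):
    """Run a task on the first healthy node, falling back to the
    next node if one has failed -- the essence of failover."""
    for node in nodes:
        if is_healthy(node):
            return f"{task} ran on {node}"
    raise RuntimeError("all nodes failed")

# Simulate node-1 going down; the remaining nodes absorb the work.
healthy = {"node-1": False, "node-2": True, "node-3": True}
result = submit("backup-job", ["node-1", "node-2", "node-3"], healthy.get)
print(result)
# → backup-job ran on node-2
```

Real HA clusters detect failures with heartbeats and restart services automatically, but the redundancy principle is the same: losing one machine does not stop the cluster.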