Definition - What does Fault Tolerance mean?
Fault tolerance is the way in which an operating system (OS) responds to a hardware or software failure. The term essentially refers to a system's ability to continue operating despite failures or malfunctions, and this ability may be provided by software, hardware or a combination of both. To handle faults gracefully, some computer systems have two or more duplicate (redundant) systems.
Techopedia explains Fault Tolerance
Fault tolerance software may be part of the OS interface, allowing the programmer to check critical data at specific points during a transaction.
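One common way a program checks critical data at specific points is to checkpoint that state to durable storage so it can be restored after a crash. Below is a minimal, hypothetical sketch in Python (the function names and the JSON file format are illustrative assumptions, not part of any particular OS interface):

```python
import json
import os
import tempfile


def checkpoint(state, path):
    """Save critical state at a safe point in a transaction.

    Writing to a temporary file and then renaming it means a crash
    mid-write cannot leave a half-written (corrupt) checkpoint:
    the rename either happens completely or not at all.
    """
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)  # atomic replacement on POSIX systems


def restore(path, default):
    """Reload the last good checkpoint, or fall back to a default
    if no valid checkpoint exists (e.g. first run, or corruption)."""
    try:
        with open(path) as f:
            return json.load(f)
    except (OSError, json.JSONDecodeError):
        return default
```

A program would call `checkpoint()` after each critical step and `restore()` on startup, so a failure costs at most the work done since the last checkpoint.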
Fault tolerance can include:
- Responding to a power failure (the lowest level of fault tolerance)
- Immediately using a backup system in the event of a system failure
- Allowing mirrored disks to immediately take over for a failed disk
- Multiple processors working together, comparing data and output for errors, then immediately correcting any detected errors
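Two of the patterns above can be sketched in a few lines of Python: immediate failover to a backup, and majority voting across redundant processors (the function names and callables here are illustrative assumptions, not a specific system's API):

```python
from collections import Counter


def call_with_failover(primary, backup):
    """Try the primary system first; on any failure, immediately
    switch to the backup, so the caller never sees the fault."""
    try:
        return primary()
    except Exception:
        return backup()


def majority_vote(results):
    """Mask a faulty replica by taking the value most replicas
    agree on (the voting scheme behind triple modular redundancy).
    Raises if no strict majority exists, since the fault then
    cannot be masked."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority among replicas")
    return value
```

For example, running the same computation on three processors and feeding the three results to `majority_vote()` tolerates one faulty processor without interrupting the output.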
In general, 100% fault tolerance can never be achieved due to cost constraints.