Definition - What does Cache mean?
A cache, in computing, is a data storage technique that makes data or files available at higher speed. Caches are implemented in both hardware and software. Caching places an intermediary layer between primary storage and the hardware or software component that consumes the data, reducing the latency of data access.
Techopedia explains Cache
A cache works in both hardware and software to provide similar functionality. In its hardware form, it is a small, fast block of internal memory that holds copies of the instructions and data most frequently fetched from main memory, so that the CPU can access them quickly when they are requested again.
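The software side of the same idea can be sketched as a small in-memory cache with a least-recently-used (LRU) eviction policy, one of the most common replacement strategies. This is an illustrative sketch, not a specific product's implementation; the class name and capacity are assumptions chosen for the example.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal software cache sketch: keeps the most recently used entries."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._store = OrderedDict()  # insertion order doubles as recency order

    def get(self, key):
        if key not in self._store:
            return None                   # cache miss: caller falls back to primary storage
        self._store.move_to_end(key)      # mark entry as most recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict the least recently used entry

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")         # touching "a" makes it most recently used
cache.put("c", 3)      # capacity exceeded: "b" is evicted
print(cache.get("b"))  # None (miss)
print(cache.get("a"))  # 1 (hit)
```

On a miss, the caller retrieves the value from the slower primary store and calls put() so the next access is fast; that fallback-and-fill loop is the intermediary role described above.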