Distributed Cache

Definition - What does Distributed Cache mean?

A distributed cache extends the traditional concept of caching, in which data is placed in fast local storage for quick retrieval. A distributed cache is cloud computing in scope: multiple machines or servers each contribute a portion of their cache memory to a large shared pool that can be accessed by many nodes and virtual machines. The concept and purpose of caching remain the same; only the process of assembling that large pool of cache is relatively new in concept and technology.
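The idea of many nodes pooling their cache memory can be sketched in a few lines. This is a minimal illustration, not any particular product's API: the node names and the in-memory dictionaries standing in for each node's cache are assumptions made for the example. Each key is hashed so that it always routes to the same node in the pool.

```python
import hashlib

class DistributedCacheClient:
    """Sketch of a client that spreads keys across a pool of cache nodes.

    The dicts standing in for each node's cache memory are illustrative
    assumptions; a real deployment would talk to remote servers instead.
    """

    def __init__(self, nodes):
        self.nodes = nodes                      # e.g. ["node-a", "node-b", "node-c"]
        self.stores = {n: {} for n in nodes}    # each dict plays one node's cache

    def _node_for(self, key):
        # Hash the key so it deterministically maps to one node in the pool.
        digest = hashlib.sha256(key.encode()).hexdigest()
        return self.nodes[int(digest, 16) % len(self.nodes)]

    def set(self, key, value):
        self.stores[self._node_for(key)][key] = value

    def get(self, key):
        return self.stores[self._node_for(key)].get(key)

client = DistributedCacheClient(["node-a", "node-b", "node-c"])
client.set("session:42", {"user": "alice"})
print(client.get("session:42"))  # prints {'user': 'alice'}
```

Adding a server to the `nodes` list grows the pool's total capacity, which is the scalability property described above (production systems use consistent hashing so that adding a node remaps as few keys as possible).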

Techopedia explains Distributed Cache

A distributed cache is widely used in cloud computing systems and virtualized environments because it provides scalability and fault tolerance. Because it may span multiple nodes or servers, its capacity can grow simply by adding more servers. A cache has traditionally served as a very fast way to save and retrieve data, and as such has mostly been implemented in fast hardware located close to whatever uses it. A distributed cache, however, must often be reached over network links rather than a hardware-level bus, which adds overhead and makes it somewhat slower than a traditional hardware cache.

Because of this, a distributed cache is best suited to storing application data that would otherwise reside in databases, as well as Web session data. It favors workloads that read far more often than they write, such as product catalogs or sets of images that change infrequently but are accessed by many users at the same time. It provides little benefit for data that is unique to each user and may change dynamically; such data is better served by a local cache.
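The read-heavy usage described above is typically implemented with the cache-aside pattern: check the cache first, and only on a miss fall back to the database and populate the cache for later readers. The sketch below assumes a plain dict as a stand-in for the shared cache and a fake database loader; the names `TTL_SECONDS` and `load_product_from_db` are illustrative, not from any specific library.

```python
import time

cache = {}          # stand-in for a distributed cache shared by many app servers
TTL_SECONDS = 300   # assumption: catalog entries change rarely, so a long TTL is safe

def load_product_from_db(product_id):
    # Stand-in for the real database query; the slow path we want to avoid.
    return {"id": product_id, "name": f"Product {product_id}"}

def get_product(product_id):
    """Cache-aside read: serve from the cache while the entry is fresh,
    otherwise hit the database and store the result for subsequent readers."""
    entry = cache.get(product_id)
    if entry and time.time() - entry["stored_at"] < TTL_SECONDS:
        return entry["value"]                       # cache hit: fast path
    value = load_product_from_db(product_id)        # cache miss: slow path
    cache[product_id] = {"value": value, "stored_at": time.time()}
    return value
```

Because catalog-style data is read by many users but rarely written, nearly every call after the first is a cache hit, which is why this workload shape benefits most from a distributed cache.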

Although not as fast as a traditional local cache, distributed caching has become practical because main memory has become very cheap and because network cards, and networks in general, have become very fast.
