Cache Coherence

What Does Cache Coherence Mean?

Cache coherence is the consistency of shared data that ends up stored in multiple local caches. Maintaining consistency between the caches and memory is imperative in multiprocessor and distributed shared memory (DSM) systems, where several processors may hold copies of the same memory location. Cache management is structured to ensure that no copy becomes stale and no update is lost.


Different techniques may be used to maintain cache coherency, including directory-based coherence, bus snooping and snarfing. To maintain consistency, a DSM system emulates these hardware techniques in software and relies on a coherency protocol, which is essential to system operation.

Cache coherence is also known as cache coherency or cache consistency.

Techopedia Explains Cache Coherence

The majority of coherency protocols that support hardware multiprocessors target a sequential consistency model. DSM systems typically use a weaker model, such as weak or release consistency.
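The release-consistency idea can be sketched with ordinary Python threads: a writer's updates are only guaranteed visible to a reader after the writer releases, and the reader acquires, the same synchronization object. This is a minimal illustration of the programming discipline, not of hardware behavior (Python's runtime already provides stronger guarantees than a weakly consistent machine); all names here are illustrative.

```python
import threading

# Shared data plus a synchronization object acting as the release/acquire point.
shared = {"value": 0}
ready = threading.Semaphore(0)

def writer():
    shared["value"] = 42   # ordinary write, not yet guaranteed visible
    ready.release()        # "release": publishes all writes made so far

def reader(out):
    ready.acquire()        # "acquire": waits for the writer's release
    out.append(shared["value"])  # now guaranteed to observe 42

result = []
t_writer = threading.Thread(target=writer)
t_reader = threading.Thread(target=reader, args=(result,))
t_reader.start()
t_writer.start()
t_writer.join()
t_reader.join()
print(result)  # [42]
```

Under release consistency, the hardware or DSM layer only has to make writes visible at these synchronization points, which is much cheaper than keeping every write instantly visible everywhere.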

Coherence management imposes the following conditions on read/write (R/W) operations:

Writes to the same location are seen in the same sequence by all processors.
Write operations appear to take effect instantaneously.
Program order is preserved for R/W operations on the same data.
A coherent view of memory is maintained, so that shared memory always supplies consistent values.

Several types of cache coherence mechanisms may be utilized by different architectures, as follows:

Directory-based coherence: Uses a directory that tracks which caches hold each block of shared memory. When data in a memory area changes, the directory updates or invalidates the affected cached copies. Scales to systems with many processors.
Bus snooping: Each cache monitors the shared bus and is notified when there is a write operation, so it can invalidate or update its own copy. Used in smaller systems with fewer processors, since broadcast traffic grows with system size.
Snarfing: A cache monitors both the address and the data on the bus and updates its own copy with the newly written value, rather than merely invalidating it. Requires more bandwidth and resources than directory-based coherence or bus snooping.
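The bus-snooping mechanism can be sketched as a small write-invalidate simulation: every write is broadcast on a shared bus, and each other cache that snoops the write drops its now-stale copy, so a later read re-fetches the fresh value. This is a minimal sketch, not a model of real hardware; the class and method names are illustrative.

```python
class SnoopingCache:
    def __init__(self, bus):
        self.lines = {}   # address -> value for lines currently held valid
        self.bus = bus
        bus.attach(self)

    def read(self, addr):
        # On a miss, fetch the line from shared memory and cache it.
        if addr not in self.lines:
            self.lines[addr] = self.bus.memory.get(addr, 0)
        return self.lines[addr]

    def write(self, addr, value):
        # Update the local copy and broadcast the write on the bus
        # so the other caches can snoop it.
        self.lines[addr] = value
        self.bus.broadcast_write(self, addr, value)

    def snoop(self, addr):
        # Another cache wrote this address: invalidate our stale copy.
        self.lines.pop(addr, None)


class Bus:
    def __init__(self):
        self.caches = []
        self.memory = {}   # backing shared memory

    def attach(self, cache):
        self.caches.append(cache)

    def broadcast_write(self, writer, addr, value):
        self.memory[addr] = value        # write-through to shared memory
        for cache in self.caches:
            if cache is not writer:
                cache.snoop(addr)        # invalidate all other copies


bus = Bus()
c0, c1 = SnoopingCache(bus), SnoopingCache(bus)
c0.write(0x10, 7)
print(c1.read(0x10))   # 7: c1 misses and fetches the fresh value
c1.write(0x10, 9)      # invalidates c0's copy of 0x10
print(c0.read(0x10))   # 9: c0 re-fetches because its line was invalidated
```

A snarfing cache would differ only in the snoop step: instead of dropping the line, it would store the broadcast value directly, at the cost of moving data as well as addresses across the bus.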
