Garbage Collection (GC)
Definition - What does Garbage Collection (GC) mean?
Garbage collection (GC) is a dynamic approach to automatic memory management and heap allocation that identifies dead (unreachable) memory blocks and reclaims their storage for reuse. The primary purpose of garbage collection is to prevent memory leaks.
GC implementations generally follow one of three primary approaches:
- Mark-and-sweep - When available memory runs low, the GC marks all accessible (reachable) memory, then sweeps the rest and reclaims it as free.
- Reference counting - Each allocated object carries a count of the references pointing to it. When that count drops to zero, the object is garbage; it is destroyed and its memory returns to the heap.
- Copy collection - Memory is divided into two partitions. When the first partition fills, the GC locates all accessible data structures and copies them into the second partition, compacting memory in the process and leaving a continuous block of free memory.
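The mark-and-sweep approach above can be sketched over a toy object graph (an illustrative model, not a real collector; the `Obj` class and function names are hypothetical):

```python
# Minimal mark-and-sweep sketch: objects hold references to other objects,
# roots are the program's entry points, and anything not reachable from a
# root is swept (reclaimed).

class Obj:
    def __init__(self, name):
        self.name = name
        self.refs = []       # outgoing references to other Obj instances
        self.marked = False

def mark(obj):
    """Mark phase: recursively mark every object reachable from obj."""
    if obj.marked:
        return
    obj.marked = True
    for ref in obj.refs:
        mark(ref)

def sweep(heap):
    """Sweep phase: keep only marked objects; reset marks on survivors."""
    live = [o for o in heap if o.marked]
    for o in live:
        o.marked = False
    return live

def collect(heap, roots):
    for root in roots:
        mark(root)
    return sweep(heap)

# Tiny heap: a references b; c is unreachable garbage.
a, b, c = Obj("a"), Obj("b"), Obj("c")
a.refs.append(b)
heap = collect([a, b, c], roots=[a])
print([o.name for o in heap])  # ['a', 'b'] -- c was swept
```

Real collectors work on raw memory blocks rather than Python objects, but the two phases - trace everything reachable, then reclaim the rest - are the same.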
Some programming languages and platforms with built-in GC (e.g., Java, Lisp, and C# on .NET) manage heap memory automatically, preventing most memory leaks and allowing for more efficient programming.
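Python is another such language: CPython combines reference counting with a cycle-detecting collector exposed through the standard `gc` module. A short sketch of why the cycle detector is needed:

```python
# Two objects that reference each other form a cycle: their reference
# counts never drop to zero, so reference counting alone cannot free
# them. CPython's gc module detects and reclaims such cycles.
import gc

class Node:
    pass

gc.disable()                 # rely on reference counting alone for now
a, b = Node(), Node()
a.partner, b.partner = b, a  # a <-> b: a reference cycle
del a, b                     # names gone, but the cycle keeps both alive
collected = gc.collect()     # cycle detector finds and frees them
print(collected >= 2)        # True: at least the two Node objects were freed
gc.enable()
```

`gc.collect()` returns the number of unreachable objects it found, which here includes the two `Node` instances (and may include their attribute dictionaries, hence the `>=` check).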
Techopedia explains Garbage Collection (GC)
Garbage collection's automatic approach to heap management addresses common and costly allocation errors that, when undetected, often result in real-world program defects.
Because allocation errors are difficult to identify and repair, they are costly. Thus, garbage collection is considered by many to be an essential language feature that makes the programmer's job easier by reducing manual heap management. However, GC is not perfect, and the following drawbacks should be considered:
- When freeing memory, GC consumes computing resources.
- The GC process runs at unpredictable times, resulting in pauses scattered across a session.
- When references to unused objects are not manually cleared, the GC cannot reclaim those objects, causing logical memory leaks.
- GC does not always know when to run within the virtual memory environments of modern desktop computers.
- The GC process can interact poorly with cache and virtual memory systems, creating performance-tuning difficulties.
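The logical-memory-leak drawback is worth illustrating: a collector can only free objects that nothing references, so a long-lived container that retains stale references defeats it. A sketch (the `cache` and `Session` names are illustrative):

```python
# A "logical" memory leak: the GC works correctly, but a long-lived list
# still references an object the program will never use again, so the
# object can never be reclaimed until the reference is manually disposed.
import gc
import weakref

class Session:
    pass

cache = []                    # long-lived container

s = Session()
cache.append(s)               # stale reference retained indefinitely
tracker = weakref.ref(s)      # weak reference: observes without keeping alive
del s
gc.collect()
print(tracker() is not None)  # True: still reachable via cache -> leaked

cache.clear()                 # manually dispose the stale reference
gc.collect()
print(tracker() is None)      # True: now unreachable and collected
```

The fix in practice is either to clear such references explicitly, as above, or to hold them weakly (e.g., `weakref.WeakSet`) so the collector remains free to reclaim the objects.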