Definition - What does Buffer Overflow mean?
A buffer overflow occurs when more data is written to a buffer than it can hold. The excess data spills into adjacent memory, overwriting the contents of those locations and causing unpredictable program behavior. Buffer overflows happen when there is improper validation (no bounds checking) before the data is written. It is considered a bug or weakness in the software.
Techopedia explains Buffer Overflow
Attackers can exploit a buffer overflow bug by supplying input crafted to trigger the overflow: the initial part of the data fills the buffer, and the rest is written to the memory adjacent to it. The overflow data may contain executable code that allows the attackers to run larger, more sophisticated programs or grant themselves access to the system.
Buffer overflows are among the worst bugs an attacker can exploit, largely because they are very hard to find and fix, especially in software consisting of millions of lines of code. Even the fixes for these bugs can be complicated and error-prone. That is why it is nearly impossible to eliminate this type of bug entirely.
Although programmers are well aware of the threat buffer overflows pose, such vulnerabilities still appear regularly in both new and old software, regardless of how many fixes have already been applied.