
Packet Buffer

Definition - What does Packet Buffer mean?

A packet buffer is memory space set aside for storing packets that are awaiting transmission over a network or that have been received over a network. This memory is located either in a network interface card (NIC) or in the computer that holds the card.

Packets are stored temporarily during transmission to create a reserve that can be drawn on during transmission delays or retransmission requests. In streaming media systems, packet buffering reduces the effects of packet delay and packet loss; it provides the time needed to synchronize packets and to request and replace any packets lost in transit.
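As a rough illustration of the streaming case, the sketch below (a hypothetical `JitterBuffer` class, not any particular product's implementation) holds received packets until a playout threshold is reached, so that short network delays do not immediately interrupt playback:

```python
from collections import deque

class JitterBuffer:
    """Minimal sketch: hold packets until a playout threshold is reached,
    building a reserve that absorbs short network delays."""

    def __init__(self, threshold):
        self.threshold = threshold     # packets required before playback starts
        self.packets = deque()
        self.playing = False

    def receive(self, packet):
        self.packets.append(packet)
        if len(self.packets) >= self.threshold:
            self.playing = True        # enough reserve to start playback

    def next_frame(self):
        """Return the next packet to play, or None while rebuffering."""
        if self.playing and self.packets:
            return self.packets.popleft()
        if not self.packets:
            self.playing = False       # buffer drained: pause and rebuild reserve
        return None
```

With a threshold of 2, playback does not start until two packets are buffered; once the buffer drains, the player pauses and rebuilds its reserve before resuming.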

Techopedia explains Packet Buffer

Packet buffers are normally located in receiving devices, although in some cases they are used in sending devices to permit the rapid selection and retransmission of packets requested by devices on the receiving end.

Packets from multiple applications are multiplexed into a single stream. A packet buffer management algorithm determines whether each arriving packet is accepted or rejected. Accepted packets are placed into logical first-in, first-out (FIFO) queues, with each application given its own queue within the packet buffer. An accepted packet remains in the buffer until its application retrieves it; newly arrived packets are rejected when the buffer is full.
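The accept-or-reject scheme described above can be sketched as follows. This is a minimal illustration assuming a shared capacity and simple tail drop (rejecting new arrivals when full); the class and method names are invented for the example:

```python
from collections import deque

class PacketBuffer:
    """Minimal sketch: shared-capacity buffer with one FIFO queue per application."""

    def __init__(self, capacity):
        self.capacity = capacity       # total packets the buffer can hold
        self.size = 0
        self.queues = {}               # app_id -> FIFO queue of packets

    def accept(self, app_id, packet):
        """Tail drop: reject the newly arrived packet when the buffer is full."""
        if self.size >= self.capacity:
            return False               # rejected
        self.queues.setdefault(app_id, deque()).append(packet)
        self.size += 1
        return True                    # accepted

    def retrieve(self, app_id):
        """An accepted packet stays buffered until its application retrieves it."""
        q = self.queues.get(app_id)
        if not q:
            return None
        self.size -= 1
        return q.popleft()             # first in, first out
```

Each application drains only its own queue, and retrieving a packet frees capacity for new arrivals.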

A parallel packet buffer combines multiple individual dynamic random-access memory (DRAM) modules to emulate a single common memory buffer, where every module has the same size, data width and access time. The total amount of data buffered is the aggregate capacity of the memory modules. Read and write operations are pipelined across the individual modules: while one packet is being written to one module, newly arrived packets are written to modules that are not currently being accessed. This pipelined, simultaneous access to individual memory modules boosts aggregate bandwidth and reduces the load on each module.
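The module-selection idea can be illustrated with a small sketch. This is a simplified model (invented class and timing scheme, not a hardware design): each module is busy for a fixed access time after a write starts, and an arriving packet is steered to any module that is not currently being accessed:

```python
class ParallelPacketBuffer:
    """Minimal sketch: identical DRAM modules with a fixed access time.
    Arriving packets are written to a module that is not busy."""

    def __init__(self, num_modules, access_time):
        self.access_time = access_time
        self.busy_until = [0] * num_modules   # time at which each module frees up

    def write(self, now):
        """Return the index of the module chosen for this write, or None if
        every module is currently being accessed."""
        for m, free_at in enumerate(self.busy_until):
            if free_at <= now:                # module idle: start the write here
                self.busy_until[m] = now + self.access_time
                return m
        return None                           # all modules busy at this instant
```

With two modules and an access time of 2, two writes at time 0 occupy both modules, a write at time 1 finds no idle module, and a write at time 2 lands on the first module again: the writes overlap in time, which is what raises the aggregate bandwidth.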
