Definition - What does Packet Buffer mean?
A packet buffer is memory set aside for storing packets that are awaiting transmission over a network or that have been received over a network. This memory is located either in a network interface card (NIC) or in the computer that hosts the card.
Packets are stored temporarily during transmission to provide a reserve that covers transmission delays and retransmission requests. In media streaming, packet buffering reduces the effects of packet delay and packet loss: the buffer provides the time needed to synchronize packets and to request and replace those lost in transit.
Techopedia explains Packet Buffer
Packet buffers are normally located in receiving devices, although in some cases they are also used in sending devices to allow the rapid selection and retransmission of packets requested by the receiving end.
Packets from every application are multiplexed into a single stream. A packet buffer management algorithm determines whether each incoming packet is accepted or rejected. Accepted packets are placed into logical first in, first out (FIFO) queues, with each application assigned its own queue within the packet buffer. An accepted packet remains in the buffer until its application retrieves it; when the buffer is full, newly arriving packets are rejected.
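The accept/reject and per-application FIFO behavior described above can be sketched in a few lines of Python. This is a minimal illustration, not a real NIC implementation; the class name, the single shared capacity limit, and the tail-drop policy are assumptions made for the example.

```python
from collections import deque

class PacketBuffer:
    """Toy packet buffer: one FIFO queue per application,
    sharing a single capacity limit (a hypothetical design)."""

    def __init__(self, capacity):
        self.capacity = capacity   # total packets the buffer can hold
        self.size = 0
        self.queues = {}           # application id -> FIFO queue

    def accept(self, app_id, packet):
        """Admit a packet, or reject (drop) it when the buffer is full."""
        if self.size >= self.capacity:
            return False           # newly arrived packet rejected
        self.queues.setdefault(app_id, deque()).append(packet)
        self.size += 1
        return True

    def retrieve(self, app_id):
        """An application retrieves its oldest buffered packet (FIFO order)."""
        q = self.queues.get(app_id)
        if not q:
            return None            # nothing buffered for this application
        self.size -= 1
        return q.popleft()

buf = PacketBuffer(capacity=2)
buf.accept("voip", "pkt1")         # accepted
buf.accept("video", "pkt2")        # accepted
buf.accept("voip", "pkt3")         # rejected: buffer is full
print(buf.retrieve("voip"))        # prints "pkt1" (oldest packet first)
```

Note that each application only ever sees its own queue, so a burst from one application can still starve others of buffer space; real buffer managers often add per-queue limits to prevent exactly that.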
A parallel packet buffer combines multiple individual dynamic random-access memory (DRAM) modules to emulate a single common memory buffer, where every module has the same size, data width and access time. The total amount of data that can be buffered is the aggregate capacity of all the modules. Read and write operations are pipelined across the modules: while one module is busy servicing a packet, newly arriving packets are written to modules that are not currently being accessed. This pipelined, simultaneous access to individual modules boosts aggregate bandwidth while reducing the load on each individual module.
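The module-interleaving idea can be sketched as follows. This is a simplified model under stated assumptions: Python lists stand in for DRAM modules, a round-robin pointer stands in for the scheduler that steers writes toward modules not currently being accessed, and the class and method names are invented for the example.

```python
class ParallelPacketBuffer:
    """Toy parallel packet buffer: N identical memory modules
    emulating one common buffer whose capacity is their sum."""

    def __init__(self, num_modules, module_capacity):
        self.modules = [[] for _ in range(num_modules)]  # stand-ins for DRAM chips
        self.module_capacity = module_capacity           # same size for every module

    @property
    def total_capacity(self):
        # aggregate buffering capacity of all memory modules
        return len(self.modules) * self.module_capacity

    def write(self, packet):
        """Write to the next module with free space, round-robin,
        so consecutive packets land in different modules."""
        for _ in range(len(self.modules)):
            m = self.modules[getattr(self, "_ptr", 0)]
            self._ptr = (getattr(self, "_ptr", 0) + 1) % len(self.modules)
            if len(m) < self.module_capacity:
                m.append(packet)
                return True
        return False  # every module is full: packet rejected

ppb = ParallelPacketBuffer(num_modules=4, module_capacity=2)
for i in range(8):
    ppb.write(i)
print([len(m) for m in ppb.modules])  # prints [2, 2, 2, 2]: load spread evenly
```

Spreading consecutive writes across modules is what lets the hardware overlap accesses in a pipeline; in this single-threaded sketch the benefit shows up only as the even load distribution across modules.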