

Definition - What does Latency mean?

Latency is a networking term for the time it takes a data packet to travel from one node to another. In some contexts it instead refers to the round trip: the total time from when a packet is transmitted until a response arrives back at its source. More generally, latency is the delay that occurs while one system component waits for another component to do something.

Techopedia explains Latency

In data communication, digital networking and packet-switched networks, latency is used in two major contexts: one-way and round-trip. One-way latency is the total time it takes a packet to travel from its source to its destination.

Round-trip latency is the one-way latency from source to destination plus the time it takes a response to travel back from the destination to the source. By convention, round-trip latency excludes the time the destination spends processing the packet. The ping utility is commonly used to measure round-trip latency.
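Where ping (which uses ICMP and may require elevated privileges) is unavailable, round-trip latency can be approximated by timing a TCP handshake. The sketch below is illustrative, not a ping replacement; the host, port, and timeout values are assumptions chosen for the example.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Approximate round-trip latency by timing a TCP connection setup.

    Like ping, this measures a round trip (SYN out, SYN-ACK back),
    but over TCP, so no raw-socket privileges are needed.
    """
    start = time.perf_counter()
    # create_connection blocks until the handshake completes or times out.
    with socket.create_connection((host, port), timeout=timeout):
        elapsed = time.perf_counter() - start
    return elapsed * 1000.0  # report in milliseconds

# Example (requires network access):
# print(f"RTT: {tcp_rtt_ms('example.com'):.1f} ms")
```

Note that this includes TCP connection overhead on both ends, so it slightly overstates pure network latency compared with an ICMP ping.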

In formal network transmission, four elements contribute to latency:

  1. Delay in Storage: Reading and writing data to hard disks and other storage devices takes time, because the device must locate the exact blocks involved before data can be read or written. Intermediate devices such as switches and hubs can add further delay.
  2. Device Processing: Latency is not limited to storage devices; network devices introduce it as well. For example, when a router receives a data packet, it holds the packet briefly while it reads the header and updates information such as the hop count before forwarding it.
  3. Transmission: Every transmission medium, from fiber optics to coaxial cable, takes some time to place a packet on the link. Transmission delay depends on packet size: a smaller packet takes less time to serialize onto the medium than a larger one.
  4. Propagation: Even at close to the speed of light, a signal needs time to cover the distance between nodes, so propagation delay grows with link length.
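The transmission and propagation components above can be put into numbers. The sketch below assumes illustrative figures: a 1500-byte packet, a 100 Mbit/s link, and a signal speed in fiber of roughly 200,000 km/s (about two-thirds of the speed of light in a vacuum).

```python
def transmission_delay_ms(packet_bytes: int, bandwidth_bps: float) -> float:
    """Time to serialize every bit of the packet onto the link."""
    return packet_bytes * 8 / bandwidth_bps * 1000.0

def propagation_delay_ms(distance_km: float,
                         signal_speed_km_s: float = 200_000.0) -> float:
    """Time for the signal to travel the physical length of the link."""
    return distance_km / signal_speed_km_s * 1000.0

# A 1500-byte packet on a 100 Mbit/s link spanning 1000 km of fiber:
tx = transmission_delay_ms(1500, 100e6)   # 0.12 ms to serialize
prop = propagation_delay_ms(1000)         # 5.0 ms in flight
```

In this example the propagation delay dominates, which is why long-distance links stay slow no matter how much bandwidth is added.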
