Definition - What does Throughput mean?
Throughput refers to the rate at which a computing service or device completes tasks over a specific period. It measures the amount of work completed against the time consumed and may be used to gauge the performance of a processor, memory and/or network communications.
Techopedia explains Throughput
Throughput was originally conceived to evaluate the productivity of computer processors. It was generally calculated in batch jobs or tasks per second, or in millions of instructions per second (MIPS). Some derivative metrics gauge a system's overall throughput by evaluating the amount and complexity of the work, the number of simultaneous users and application/system responsiveness.
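The tasks-per-second idea above can be sketched in a few lines: run a workload repeatedly for a fixed window and divide completed tasks by elapsed time. The workload and duration here are hypothetical placeholders, not anything prescribed by the article.

```python
import time

def tasks_per_second(task, duration: float = 1.0) -> float:
    """Run `task` repeatedly for roughly `duration` seconds and
    return the completed-tasks-per-second rate."""
    count = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        task()
        count += 1
    elapsed = time.perf_counter() - start
    return count / elapsed

# Example workload: a small fixed computation standing in for a real job.
rate = tasks_per_second(lambda: sum(range(1000)), duration=0.2)
print(f"{rate:.0f} tasks/sec")
```

A real benchmark would control for warm-up, scheduling noise and I/O, but the core measurement is exactly this ratio of completed work to time.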
Similarly, for network communications, throughput is measured by calculating the amount of data transferred between locations during a specified period. It is generally expressed as bits per second (bps), with larger rates commonly reported in bytes per second (Bps), kilobytes per second (KBps), megabytes per second (MBps) or gigabytes per second (GBps).
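The unit relationships above reduce to simple arithmetic: divide a bit rate by 8 to get bytes per second, then scale by powers of 1,000. This sketch assumes decimal (SI) prefixes, which is the common convention for network rates; storage contexts sometimes use binary (1,024-based) prefixes instead.

```python
def bits_to_byte_units(bps: float) -> dict:
    """Convert a bits-per-second rate into byte-based units,
    using decimal (SI) prefixes."""
    Bps = bps / 8  # 8 bits per byte
    return {
        "Bps": Bps,
        "KBps": Bps / 1_000,
        "MBps": Bps / 1_000_000,
        "GBps": Bps / 1_000_000_000,
    }

# Example: a 100 Mbit/s link moves 12.5 megabytes per second.
rates = bits_to_byte_units(100_000_000)
print(f"{rates['MBps']:.1f} MBps")  # → 12.5 MBps
```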