Millisecond (ms or msec)
Definition - What does Millisecond (ms or msec) mean?
A millisecond (ms or msec) is a unit of time equal to 1/1000th of a second. It is one of the more useful chronological measurements for describing cycle speeds, central processing unit (CPU) operations and microprocessor design, as well as some forms of data transfer.
There are also measurements between seconds and milliseconds: A centisecond (cs or csec) is 10 ms (1/100th of a second), and a decisecond (ds or dsec) is 100 ms (1/10th of a second).
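These unit relationships can be expressed as a short sketch in Python; the constant names below are illustrative, not part of any standard library:

```python
# Each unit expressed as a fraction of one second.
SECOND = 1.0
DECISECOND = SECOND / 10      # 1 ds = 100 ms
CENTISECOND = SECOND / 100    # 1 cs = 10 ms
MILLISECOND = SECOND / 1000   # 1 ms = 0.001 s

# Converting a 250 ms interval into the larger units:
elapsed_ms = 250
seconds = elapsed_ms * MILLISECOND
centiseconds = seconds / CENTISECOND
deciseconds = seconds / DECISECOND

print(f"{elapsed_ms} ms = {seconds} s "
      f"= {centiseconds:g} cs = {deciseconds:g} ds")
```

The division by each unit's size in seconds is the general pattern for converting between any two of these units.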
Techopedia explains Millisecond (ms or msec)
The millisecond is one of several increasingly small units used to measure time. At these scales, it becomes harder to measure performance directly and to make such extremely short time frames meaningful for evaluating hardware and software.
A millisecond can be used to assess computing activity in many ways. For example, this range is useful for measuring the access time of a disk drive or the time required to retrieve an object from storage.
The average access time of a typical consumer hard disk drive is generally estimated to be in the range of 9-15 ms.
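A disk access time like the one cited above can be measured directly. The following is a minimal sketch in Python that times a file read and reports the result in milliseconds; the 1 MiB size and temporary file are illustrative choices, and real benchmarks would need to account for operating system caching:

```python
import os
import tempfile
import time

# Write 1 MiB of random bytes to a temporary file.
data = os.urandom(1024 * 1024)
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
    f.write(data)

# Time the read and convert seconds to milliseconds.
start = time.perf_counter()
with open(path, "rb") as f:
    contents = f.read()
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"Read {len(contents)} bytes in {elapsed_ms:.2f} ms")
os.remove(path)
```

`time.perf_counter()` returns seconds, so multiplying by 1000 yields the millisecond figure that disk specifications quote.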