What Does Millisecond Mean?

A millisecond (ms or msec) is a unit of time that represents 1/1000th of a second. It is one of the more useful time measurements for describing cycle speeds, central processing unit (CPU) operations and microprocessor design, as well as some forms of data transfer.


There are also measurements in between seconds and milliseconds: A centisecond (cs or csec) is 1/100th of a second, or 10 ms, and a decisecond (ds or dsec) is 1/10th of a second, or 100 ms.
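These conversions are easy to check in code. The sketch below is a minimal illustration in plain Python; it expresses each unit in milliseconds (integers, to avoid floating-point rounding) and verifies the chain 1 s = 10 ds = 100 cs = 1000 ms:

```python
# Each unit expressed in milliseconds
MS_PER_SECOND      = 1000  # 1 s  = 1000 ms
MS_PER_DECISECOND  = 100   # 1 ds = 100 ms
MS_PER_CENTISECOND = 10    # 1 cs = 10 ms

# How many of each smaller unit fit in one second
print(MS_PER_SECOND // MS_PER_DECISECOND)   # 10 deciseconds per second
print(MS_PER_SECOND // MS_PER_CENTISECOND)  # 100 centiseconds per second
```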

Techopedia Explains Millisecond

The millisecond is one of several progressively smaller units used to measure time. At these scales, it becomes harder to measure a system's performance accurately and to make such extremely short time frames meaningful in an evaluation.

A millisecond can be used to assess computing activity in many ways. For example, this range is useful in assessing the access time of a disk drive — the time required to locate a piece of data and make it available.

Experts estimate the average disk access time of a typical personal computer's hard drive to be in the range of 9-15 ms.
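To get a feel for these magnitudes, a program can measure an operation and report the elapsed time in milliseconds. The sketch below is a simple illustration (not a rigorous benchmark) using Python's standard time.perf_counter to time a small write-and-read round trip through a temporary file:

```python
import os
import tempfile
import time

# Create a temporary file to work with
with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name

start = time.perf_counter()
with open(path, "wb") as f:
    f.write(b"x" * 4096)   # write one 4 KB block
with open(path, "rb") as f:
    data = f.read()        # read it back
elapsed_ms = (time.perf_counter() - start) * 1000  # seconds -> milliseconds

print(f"Round trip took {elapsed_ms:.3f} ms")
os.remove(path)
```

Because the operating system caches file data, a timing like this will often come out well under the 9-15 ms quoted for raw disk access; it simply demonstrates how elapsed time is converted to and reported in milliseconds.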
