Millisecond

What Does Millisecond Mean?

A millisecond (ms or msec) is a unit of time equal to one thousandth (1/1000) of a second. It is one of the more useful chronological measurements for describing cycle speeds, central processing unit (CPU) operations and microprocessor design, as well as some forms of data transfer.


There are also measurements between seconds and milliseconds: A decisecond (ds or dsec) is 1/10 of a second, or 100 ms, and a centisecond (cs or csec) is 1/100 of a second, or 10 ms.
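As a quick illustration of how these subdivisions relate, here is a minimal Python sketch (written for this explanation; the constant and function names are illustrative) that converts a duration in seconds into each unit:

```python
# Subdivisions of one second, expressed in milliseconds.
MS_PER_SECOND = 1000       # 1 s  = 1000 ms
MS_PER_DECISECOND = 100    # 1 ds = 100 ms (1/10 of a second)
MS_PER_CENTISECOND = 10    # 1 cs = 10 ms  (1/100 of a second)

def seconds_to_subunits(seconds: float) -> dict:
    """Convert a duration in seconds to deciseconds, centiseconds and milliseconds."""
    ms = seconds * MS_PER_SECOND
    return {
        "deciseconds": ms / MS_PER_DECISECOND,
        "centiseconds": ms / MS_PER_CENTISECOND,
        "milliseconds": ms,
    }

print(seconds_to_subunits(0.5))
# {'deciseconds': 5.0, 'centiseconds': 50.0, 'milliseconds': 500.0}
```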

Techopedia Explains Millisecond

The millisecond is one of several increasingly small units used to measure time. At these chronological ranges, it becomes more difficult to relate raw technological capability to human experience, which makes extremely short time frames harder to interpret when evaluating performance.

A millisecond can be used to assess computing activity in many ways. For example, this range is useful for expressing the access time of a disk drive or the time required to make a stored object available.

Experts estimate the average disk access time of an above-average personal computer to be in the range of 9–15 ms.
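One common way to observe millisecond-scale timings in practice is with a high-resolution timer. The following Python sketch (illustrative only; the file path and read size are placeholders) times a small disk read and reports the result in milliseconds:

```python
import time

def time_read_ms(path: str, num_bytes: int = 4096) -> float:
    """Return the wall-clock time, in milliseconds, taken to read num_bytes from path."""
    start = time.perf_counter()          # high-resolution timer, in seconds
    with open(path, "rb") as f:
        f.read(num_bytes)
    elapsed = time.perf_counter() - start
    return elapsed * 1000                # convert seconds to milliseconds

# Example usage (the path is a placeholder):
# print(f"Read took {time_read_ms('/var/log/syslog'):.2f} ms")
```

Note that because of operating-system caching, a measurement like this reflects the whole read path rather than the raw seek time of the disk, so repeated runs will often report far less than the 9–15 ms cited above.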


