Millisecond


What Does Millisecond Mean?

A millisecond (ms or msec) is a unit of time equal to 1/1000th of a second. It is one of the more useful chronological measurements for describing cycle speeds, central processing unit (CPU) operations and microprocessor design, as well as some forms of data transfer.


There are also measurements in between seconds and milliseconds: A decisecond (ds or dsec) is 1/10th of a second, or 100 ms, and a centisecond (cs or csec) is 1/100th of a second, or 10 ms.
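These relationships are simple powers of ten, so they can be sketched as constants and a small conversion helper (a minimal illustration; the names are not from any standard library):

```python
# Fractions of a second, expressed in milliseconds (ms).
MS_PER_SECOND = 1000       # 1 s  = 1000 ms
MS_PER_DECISECOND = 100    # 1 ds = 1/10 s  = 100 ms
MS_PER_CENTISECOND = 10    # 1 cs = 1/100 s = 10 ms

def seconds_to_ms(seconds: float) -> float:
    """Convert a duration in seconds to milliseconds."""
    return seconds * MS_PER_SECOND

print(seconds_to_ms(0.25))  # 250.0 ms
```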

Techopedia Explains Millisecond

The millisecond is one of several increasingly small units used to measure time. At these scales, it becomes harder to relate raw hardware capability to everyday experience and to make such extremely short time frames meaningful in performance evaluation.

A millisecond can be used to assess computing activity in many ways. For example, this range is useful in assessing the access time of a disk drive or the time required to make an object available.

Experts estimate the average disk access time of an above-average personal computer to be in the range of 9-15 ms.
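One way to see millisecond-scale timing in practice is to measure an operation such as reading a file back from disk. The sketch below uses Python's `time.perf_counter` and converts the elapsed seconds to milliseconds (the temporary file and its 1 MiB size are illustrative choices, and the result will vary by machine and caching):

```python
import os
import tempfile
import time

def time_file_read_ms(path: str) -> float:
    """Return the wall-clock time, in milliseconds, taken to read a file."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        f.read()
    end = time.perf_counter()
    return (end - start) * 1000  # seconds -> milliseconds

# Write a small temporary file, then time how long it takes to read it back.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(1024 * 1024))  # 1 MiB of data
    path = tmp.name

elapsed_ms = time_file_read_ms(path)
print(f"Read took {elapsed_ms:.3f} ms")
os.unlink(path)
```

Note that on a modern machine the read will often hit the operating system's page cache, so the measured time can be well under the 9-15 ms quoted for physical disk access.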



Margaret Rouse
Technology Expert

Margaret is an award-winning technical writer and teacher known for her ability to explain complex technical subjects to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles by the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.