Micron

What Does Micron Mean?

A micron is a unit of measurement often used to express the size of features on microprocessors and other semiconductor components. One micron is 1/1,000th of a millimeter, or roughly 1/25,000th of an inch.


Micron is short for micrometer, one millionth of a meter.
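As a quick sanity check on those conversion factors, here is a minimal sketch in Python; the constants are the standard metric and imperial definitions rather than anything taken from this article:

```python
# Convert a length in microns to millimeters and inches.
# 1 millimeter = 1,000 microns; 1 inch = 25.4 mm = 25,400 microns.
MICRONS_PER_MM = 1_000
MICRONS_PER_INCH = 25_400

def micron_to_mm(microns: float) -> float:
    return microns / MICRONS_PER_MM

def micron_to_inch(microns: float) -> float:
    return microns / MICRONS_PER_INCH

print(micron_to_mm(1))    # 0.001 mm
print(micron_to_inch(1))  # ~0.0000394 inch, i.e. about 1/25,000th of an inch
```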

Techopedia Explains Micron

In microchips, microns are usually used to express line width, the width of the smallest circuit feature the manufacturing process can produce. For example, Pentium 4 microprocessors were manufactured with line widths of 0.13 to 0.18 microns. In more recent years, these feature sizes have shrunk even further, to 0.022 microns (22 nanometers) and below.
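Because such small line widths are now usually quoted in nanometers, a short conversion sketch may help relate the two units; the sample values below are simply the figures mentioned above:

```python
# 1 micron = 1,000 nanometers, so a 0.13-micron line width is 130 nm
# and 0.022 microns corresponds to a 22 nm process.
NM_PER_MICRON = 1_000

def micron_to_nm(microns: float) -> float:
    return microns * NM_PER_MICRON

for width in (0.18, 0.13, 0.022):
    print(f"{width} microns = {micron_to_nm(width):.0f} nm")
```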



Margaret Rouse
