Micron


What Does Micron Mean?

A micron is a unit of measurement often used to express the size of features in processors, microprocessors and their components. It is one-thousandth of a millimeter, or approximately 1/25,000th of an inch.


Micron is short for micrometer (symbol: µm).
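
The arithmetic behind those conversions is straightforward. The following Python sketch (with hypothetical helper names, not part of the original definition) illustrates how one micron relates to millimeters and inches:

```python
# A minimal sketch of the conversions described above:
# 1 micron = 1/1,000 mm, or roughly 1/25,000 of an inch.

MICRONS_PER_MM = 1_000        # 1 millimeter = 1,000 microns
MICRONS_PER_INCH = 25_400     # 1 inch = 25.4 mm = 25,400 microns

def microns_to_mm(microns: float) -> float:
    """Convert a length in microns to millimeters."""
    return microns / MICRONS_PER_MM

def microns_to_inches(microns: float) -> float:
    """Convert a length in microns to inches."""
    return microns / MICRONS_PER_INCH

if __name__ == "__main__":
    print(microns_to_mm(1))      # 0.001 mm
    print(microns_to_inches(1))  # ~0.0000394 inch (about 1/25,000 inch)
```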

Techopedia Explains Micron

In microchips, microns are usually used to express line width. For example, the line width of a Pentium 4 microprocessor ranges from 0.13 to 0.18 microns. In more recent years, however, feature sizes have shrunk even further, to 0.022 microns or below.
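
Because modern process nodes are usually quoted in nanometers, it can help to convert the figures above between the two units. This short Python sketch (a hypothetical helper, not from the article) shows that 0.18, 0.13 and 0.022 microns correspond to 180 nm, 130 nm and 22 nm, respectively:

```python
# Minimal sketch: converting the line widths mentioned above from microns
# to nanometers (1 micron = 1,000 nanometers).

NANOMETERS_PER_MICRON = 1_000

def microns_to_nanometers(microns: float) -> float:
    """Convert a feature size in microns to nanometers."""
    return microns * NANOMETERS_PER_MICRON

for node in (0.18, 0.13, 0.022):
    print(f"{node} microns = {microns_to_nanometers(node):g} nm")
# 0.18 microns = 180 nm
# 0.13 microns = 130 nm
# 0.022 microns = 22 nm
```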



Margaret Rouse
Technology Specialist

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other's highly specialized languages.