Definition - What does Micron mean?

A micron is a unit of measurement often used to express the size of microprocessors and their components. It is one millionth of a meter: 1/1,000 of a millimeter, or roughly 1/25,000 of an inch.

Micron is short for micrometer (symbol: µm).
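As a quick sanity check on those conversions, here is a minimal Python sketch; the constants are standard unit definitions, not taken from the article itself:

# Convert 1 micron to other common units.
MICRONS_PER_MM = 1_000            # 1 mm = 1,000 microns
MM_PER_INCH = 25.4                # exact by definition

micron_in_mm = 1 / MICRONS_PER_MM              # 0.001 mm
micron_in_inches = micron_in_mm / MM_PER_INCH  # ~3.94e-05 inch

print(f"1 micron = {micron_in_mm} mm")
print(f"1 micron = {micron_in_inches:.2e} inch "
      f"(about 1/{round(1 / micron_in_inches):,} of an inch)")

Running this confirms that one micron is about 1/25,400 of an inch, which rounds to the "roughly 1/25,000 of an inch" figure above.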

Techopedia explains Micron

In microchips, microns are typically used to express line width, the smallest feature a fabrication process can draw. For example, Pentium 4 microprocessors were manufactured with line widths of 0.13 to 0.18 microns. In more recent years, process feature sizes have shrunk further, to 0.022 microns (22 nanometers) or below, which is why modern process nodes are usually quoted in nanometers instead.
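Because modern nodes are quoted in nanometers, it can help to see the conversion both ways. This short Python sketch converts the node sizes mentioned above (the list of nodes is illustrative, drawn from the examples in this entry):

NM_PER_MICRON = 1_000   # 1 micron = 1,000 nanometers

nodes_nm = [180, 130, 22]  # process nodes mentioned above, in nanometers
for nm in nodes_nm:
    print(f"{nm} nm = {nm / NM_PER_MICRON} microns")
# Output: 180 nm = 0.18 microns, 130 nm = 0.13 microns, 22 nm = 0.022 microns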
