Definition - What does Nanometer mean?
A nanometer (nm) is a unit of measurement equal to one billionth of a meter. It is widely used as the scale for describing tiny, complex, atomic-scale computing and electronic components, particularly in nanotechnology.
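To make the scale concrete, here is a minimal sketch of the meter-to-nanometer relationship (1 nm = one billionth of a meter). The function names and example values are illustrative only, not part of any standard library.

```python
# Illustrative conversion between meters and nanometers (1 nm = 1e-9 m).
NM_PER_METER = 1_000_000_000  # one billion nanometers in a meter

def meters_to_nanometers(meters: float) -> float:
    """Convert a length in meters to nanometers."""
    return meters * NM_PER_METER

def nanometers_to_meters(nanometers: float) -> float:
    """Convert a length in nanometers to meters."""
    return nanometers / NM_PER_METER

# A 100 nm transistor feature expressed in meters:
print(nanometers_to_meters(100))   # 1e-07 (meters)
# One meter expressed in nanometers:
print(meters_to_nanometers(1))     # 1000000000 (nanometers)
```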
Techopedia explains Nanometer
Nanometers are used to measure extremely small objects, typically on the scale of atoms and molecules. The term appears most often in the context of miniature computing devices, such as the integrated circuits (ICs) and transistors embedded within a processor. The size of the transistors on a semiconductor-based processor is typically expressed in nanometers: individual transistors can be around 100 nanometers wide, which allows more than 1 billion of them to fit on a single microchip die.
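The back-of-the-envelope sketch below illustrates why billions of 100 nm-wide features can fit on one die. The 10 mm × 10 mm die size and the assumption that each transistor occupies roughly a 100 nm × 100 nm footprint are illustrative assumptions, not figures from the article; real transistor layouts are far more complex.

```python
# Rough estimate: how many 100 nm-wide features fit on a hypothetical
# 10 mm x 10 mm die? (Both figures are illustrative assumptions.)
die_side_nm = 10_000_000       # 10 mm expressed in nanometers
feature_pitch_nm = 100         # assumed width of one transistor feature

features_per_side = die_side_nm // feature_pitch_nm   # 100,000
total_features = features_per_side ** 2                # 10,000,000,000

print(f"{total_features:,} features")  # on the order of 10 billion
```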