Definition - What does Microchip mean?
A microchip is a small semiconductor module of packaged computer circuitry that performs a specific function in relation to other microchips in a computer hardware system. The term also refers to the small wafer of semiconductive material on which an integrated circuit (IC) is fabricated.
A microchip is also known as an integrated circuit (IC).
Techopedia explains Microchip
Microchips are used in virtually all electronic devices, from small flash drives to complex computers and even some motor vehicles.
After the transistor was invented, subsequent advances allowed for a dramatic reduction in size and for complex circuits to be placed on a small piece of semiconductive material, usually silicon, known as a chip. This was a far cry from the vacuum tubes that characterized early electronic circuits.
One of the earliest milestones in microchip development came in 1949, when Werner Jacobi, a German engineer at Siemens AG, filed a patent for an IC-like semiconductor amplification device. He suggested the device could be used to build hearing aids.