A microchip is a small module of packaged computer circuitry, built on semiconductor material, that serves a specific role in relation to other components in a computer hardware system. The term also refers to the small wafer of semiconductive material on which an integrated circuit (IC) is fabricated.
A microchip is also known as an integrated circuit (IC).
Microchips are used in virtually all electronic devices, from small flash drives to complex computers and even some motorized vehicles.
After the invention of the transistor, advances in fabrication allowed for a dramatic reduction in size and made it possible to place complex circuits on a small piece of semiconductive material, usually silicon, known as a chip. This was a far cry from the vacuum tubes that characterized early electronic circuits.
An early milestone in microchip development came in 1949, when Werner Jacobi, a German engineer at Siemens AG, filed a patent for an IC-like amplification device, which he proposed could be used to create hearing aids.