Microchip

What Does Microchip Mean?

A microchip is a small semiconductor module of packaged computer circuitry that serves a specific role in relation to other microchips in a computer hardware system. It also refers to the small wafer of semiconductive material used to make an integrated circuit (IC).

A microchip is also known as an integrated circuit (IC).

Techopedia Explains Microchip

Microchips are used in virtually all electronic devices, from small flash drives to complex computers, and even in motor vehicles.

After the transistor was invented, advances in fabrication allowed for a dramatic reduction in size and for complex circuits to be placed on a small piece of semiconductive material, usually silicon, known as a chip. This is a far cry from the old vacuum tubes that characterized early electronic circuits.

An early milestone in microchip development came in 1949, when Werner Jacobi, a German engineer at Siemens AG, filed a patent for an IC-like semiconductor amplification device, which he suggested could be used in hearing aids.

