Microchip

What Does Microchip Mean?

A microchip is a small semiconductor module of packaged computer circuitry that serves a specific function in relation to other components in a computer hardware system. The term also refers to the small wafer of semiconducting material on which an integrated circuit (IC) is fabricated.

A microchip is also known as an integrated circuit (IC).

Techopedia Explains Microchip

Microchips are used in virtually all electronic devices, from small flash drives to complex computers, and even in motor vehicles.

After the transistor was invented, subsequent advances allowed for a dramatic reduction in size and made it possible to place complex circuits on a small piece of semiconducting material, usually silicon, known as a chip. This was a far cry from the vacuum tubes that characterized early electronic circuits.

Microchip development dates back to 1949, when Werner Jacobi, a German engineer at Siemens AG, filed a patent for an IC-like semiconductor amplification device. He suggested that the device could be used to build hearing aids.

Margaret Rouse
Technology expert

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other's highly specialized languages.