Definition - What does Microcontroller mean?
A microcontroller is a compact computer on a single integrated circuit, dedicated to performing one task and running one specific application.
It contains a processor, memory, and programmable input/output peripherals on the same chip. Microcontrollers are designed mainly for embedded applications and are heavily used in automatically controlled electronic devices such as cell phones, cameras, microwave ovens, and washing machines.
Techopedia explains Microcontroller
Features of a microcontroller:
- Far more economical for controlling electronic devices and processes, since the size and cost involved are lower than with other methods.
- Operates at a low clock frequency, often uses small word sizes (commonly 4-, 8-, or 16-bit), and is designed for low power consumption.
- Architecture varies greatly with purpose, from general to application-specific, and in its mix of processor, ROM, RAM, and I/O functions.
- Has a dedicated input device and often a display for output.
- Usually embedded in other equipment and used to control features or actions of that equipment.
- The program the microcontroller runs is stored in on-chip ROM.
- Used in situations where only limited computing functions are needed.