Definition - What does Multimeter mean?
A multimeter is an electronic tool used to measure voltage, current and resistance across circuits. By attaching two leads to different parts of an electrical system, professionals can use multimeters to detect levels of voltage and resistance, or changes in electrical current.
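The three quantities a multimeter measures are related by Ohm's law (V = I × R), which is why measuring any two of them lets you derive the third. As a minimal sketch, the helper function below is illustrative, not part of any multimeter's firmware or API:

```python
def resistance_ohms(voltage_v, current_a):
    """Ohm's law: R = V / I. Derive resistance from a voltage and current reading."""
    return voltage_v / current_a

# Example: a 9 V reading across a component drawing 3 mA
print(resistance_ohms(9.0, 0.003))  # -> 3000.0 ohms
```

In practice a meter measures resistance directly by sourcing a small known current and measuring the resulting voltage drop, which is the same relationship applied in reverse.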
This tool may also be known as a volt-ohm meter or volt-ohm-milliammeter (VOM).
Techopedia explains Multimeter
New digital multimeters have advanced to the point that they can measure extremely small differences or fluctuations. Experts point out, however, that there is a trade-off: when a multimeter is set to a higher voltage range, it becomes less able to resolve small changes within that range.
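This range-versus-resolution trade-off can be illustrated with a simple calculation. Digital meters are often rated in "counts" (the number of distinct values the display can show); the smallest step the meter can resolve is the full-scale range divided by the count. The 2000-count figure below is an illustrative assumption, not a property of any specific meter:

```python
def resolution_volts(full_scale_range_v, counts=2000):
    """Smallest voltage step a digital meter can display on a given range.

    `counts` is an assumed display rating (2000-count meters are common),
    used here purely for illustration.
    """
    return full_scale_range_v / counts

print(resolution_volts(20))   # -> 0.01 (10 mV steps on a 20 V range)
print(resolution_volts(200))  # -> 0.1  (100 mV steps on a 200 V range)
```

Switching from the 20 V range to the 200 V range makes the smallest detectable step ten times coarser, which is exactly the effect described above.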
Multimeters have many practical applications in IT. In hardware troubleshooting, professionals may use a multimeter to determine whether individual hardware devices are receiving enough current, or whether anything has changed in an existing IT setup. Although many think of the multimeter as a tool found in a residential or commercial electrician’s toolbox, IT professionals also use it to diagnose power supply issues behind advanced data systems.