A multimeter is an electronic tool used to measure voltage, current and resistance across circuits. By attaching two leads to different parts of an electrical system, professionals can use multimeters to measure voltage and resistance, or to detect changes in electrical current.
This tool may also be known as a volt-ohm meter or volt-ohm-milliammeter (VOM).
Modern digital multimeters have advanced to the point that they can measure extremely small differences or fluctuations. However, because a meter's display shows a fixed number of digits, resolution drops as the range increases: a multimeter set to a higher voltage range cannot resolve changes as fine as it can on a lower range.
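The range-versus-resolution trade-off can be sketched with a quick calculation. This assumes a 3½-digit meter (1,999 counts at full scale), a common illustrative spec rather than a claim about any particular model:

```python
# Sketch: how display resolution varies with the selected range on a
# digital multimeter. Assumes a 3 1/2-digit display (1,999 counts full
# scale) -- an illustrative assumption, not a specific product's spec.
COUNTS = 1999  # maximum value the display can show on any range

def resolution(full_scale_volts: float) -> float:
    """Smallest voltage step the display can distinguish on a given range."""
    return full_scale_volts / COUNTS

for rng in (2, 20, 200):
    print(f"{rng} V range: ~{resolution(rng) * 1000:.1f} mV per count")
```

On the 2 V range this meter resolves roughly 1 mV per count, but on the 200 V range each count is about 100 mV, which is why small fluctuations disappear at higher range settings.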
Multimeters have many practical applications in IT. In hardware troubleshooting, professionals may use a multimeter to determine whether individual hardware devices are receiving enough current, or whether anything has changed in an existing IT setup. Although many think of the multimeter as a residential or commercial electrician's tool, IT professionals also use it to diagnose power supply issues behind advanced data systems.
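A troubleshooting reading only becomes useful once it is compared against an expected value. The sketch below shows one way that comparison might look; the 5% tolerance and the 12 V nominal rail are assumptions for illustration, so real limits should come from the hardware's documentation:

```python
# Sketch: interpreting a multimeter voltage reading during hardware
# troubleshooting. The 5% tolerance is an illustrative assumption;
# consult the device's specifications for actual limits.
def within_tolerance(measured: float, nominal: float, tolerance: float = 0.05) -> bool:
    """Return True if a measured voltage is within +/- tolerance of nominal."""
    return abs(measured - nominal) <= nominal * tolerance

# Example: a supply rail nominally at 12 V
print(within_tolerance(11.7, 12.0))  # True: within 5% of 12 V
print(within_tolerance(10.9, 12.0))  # False: sagging well below nominal
```

A reading outside the tolerance band suggests the device is not getting the power it expects, which is exactly the kind of change in an existing setup a multimeter helps surface.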