Introduction to Digital Multimeters - Part 1

August 14th, 2020

Digital Multimeters (or DMMs) have been described as the tape measure of the new millennium! But what exactly is a digital multimeter, and what can you do with it? How do you make measurements safely, and what features should you look for? What is the best way to get the most out of your DMM? These and other questions will be answered in our introduction to digital multimeters, Parts 1 and 2.

Technology is rapidly changing the world around us. Electrical and electronic circuitry seems to permeate everything and continues to become more complex while shrinking in size. The communications industry is booming with smartphones, tablets, smart TVs and AI devices, and the growth of internet-connected equipment has put increased pressure on electronics technicians. Servicing, repairing and installing this complex equipment requires diagnostic tools that provide accurate information. Tools such as DMMs.

A digital multimeter (or DMM) is simply an electronic tape measure for making electrical measurements. While many DMMs include a number of additional features, the main purpose is to measure Volts, Ohms and Amperes (or current).

Choosing your DMM

Choosing the right DMM for the job requires not only looking at basic specifications, but also looking at features, functions, safety ratings and the overall value represented by the meter’s design. Reliability, especially under tough conditions, is more important than ever today. Most DMMs undergo rigorous testing and evaluation before they are ever packaged and sent out. User safety is also a primary concern, to help prevent injury or meter damage if the meter is used improperly.

While most DMMs handle the standard Volts, Ohms and Amperes measurements, many also measure things like Capacitance, Frequency, Process signals and Temperature. Additional features include backlit displays, Bluetooth capability, datalogging and True-RMS measurements. On the safety side, DMMs are typically rated for CAT III-600V, CAT III-1000V or CAT IV-600V. These are all things to consider when picking out the proper DMM for the application.

The Basics: Resolution, Digits & Counts

Resolution refers to how fine a measurement the meter can make. By knowing the resolution of a meter, you can determine whether it is possible to see a small change in the measured signal. For example, if the DMM has a resolution of 1 mV on a 4V range, it is possible to see a change of 1 mV (1/1000 of a volt) while reading 1V.

You wouldn’t buy a ruler marked in 1 inch (or 1 centimeter) segments if you needed to measure down to a quarter of an inch (or 1 millimeter). Likewise, a thermometer that only measures in whole degrees isn’t much use when your normal temperature is 98.6°F. You would need a thermometer with 0.1-degree resolution.

The terms “digits” and “counts” are used to describe a meter’s resolution, as DMMs are typically grouped by the number of counts or digits they display. A 3½-digit meter can display three full digits ranging from 0 to 9 and one “half” digit, which only displays a 1 or is left blank. A 3½-digit meter will display up to 1,999 counts of resolution, while a 4½-digit meter will display up to 19,999 counts of resolution.

It is more precise to describe a meter by counts of resolution than by digits. Today’s 3½-digit meters may have enhanced resolutions of up to 3,200, 4,000 or 6,000 counts. For certain measurements, 3,200-count meters offer a better resolution. For example, a 1,999-count meter won’t be able to measure down to a tenth of a volt if you are measuring 320 volts or more. However, a 3,200-count meter will display a tenth of a volt up to 320 volts. This is the same resolution as a more expensive 20,000-count meter until you exceed 320 volts.
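The count/resolution trade-off above can be sketched in a few lines of Python (the function name is illustrative, not from any meter's documentation; the count passed in is the highest number the display can show, e.g. 1,999 for a 3½-digit meter):

```python
def max_reading(max_count, resolution):
    """Largest value a meter can display at a given resolution.

    max_count is the highest number the display can show (e.g. 1999),
    resolution is the value of one count (e.g. 0.1 V).
    """
    return round(max_count * resolution, 4)

# A 1,999-count meter loses 0.1 V resolution just below 200 V...
print(max_reading(1999, 0.1))  # 199.9
# ...while a 3,200-count meter holds 0.1 V resolution up to ~320 V.
print(max_reading(3199, 0.1))  # 319.9
```

This is why two meters with the same digit rating can behave differently near range boundaries: the count rating, not the digit count, determines where the meter must switch to a coarser range.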

Analog vs Digital Displays

For high accuracy and resolution, the digital display excels, displaying three or more digits for each measurement. The analog needle display is less accurate and has lower effective resolution because you have to estimate values between the lines. A bar graph shows changes and trends in a signal just like an analog needle, but is more durable and less prone to damage.


Accuracy

Accuracy is the largest allowable error that will occur under specific operating conditions. In other words, it is an indication of how close the DMM’s displayed measurement is to the actual value of the signal being measured.

Accuracy for a DMM is usually expressed as a percent of reading. An accuracy of 1% of reading means that for a displayed reading of 100 volts, the actual value of the voltage could be anywhere between 99 volts and 101 volts. Specifications may also include a range of digits added to the basic accuracy specification. This indicates how many counts the digit to the extreme right of the display may vary. So the preceding accuracy example might be stated as +/- (1% + 2). Therefore, for a display reading of 100.0 volts on a meter with 0.1 volt resolution, the actual voltage would be between 98.8 volts and 101.2 volts.
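The +/- (1% + 2) arithmetic can be sketched as follows (a minimal illustration; the function name and the assumption of 0.1 V resolution for the rightmost digit are ours, not from a meter specification):

```python
def reading_bounds(reading, pct, counts, resolution):
    """Worst-case bounds for a DMM spec of +/- (pct% of reading + counts).

    counts is the allowed variation in the rightmost display digit,
    and resolution is the value of that digit (e.g. 0.1 V).
    """
    err = reading * pct / 100 + counts * resolution
    return (round(reading - err, 4), round(reading + err, 4))

# +/- (1% + 2) on a 100.0 V reading with 0.1 V resolution:
print(reading_bounds(100.0, 1, 2, 0.1))  # (98.8, 101.2)
```

Note that the "+ 2" term matters most at the bottom of a range, where two counts can be a large fraction of the reading.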

Analog meter specifications are determined by the error at full scale, not at the displayed reading. Typical accuracy for an analog meter is +/- 2% or +/- 3% of full scale. At one-tenth of full scale, these become 20% or 30% of reading. Typical basic accuracy for a DMM is between +/- (0.7% + 1) and +/- (0.1% + 1) of reading, or better.
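The analog penalty described above can be made concrete with a small sketch (hypothetical function; the 3% and one-tenth-of-full-scale figures come from the text):

```python
def pct_of_reading(pct_full_scale, full_scale, reading):
    """Convert an analog meter's full-scale error spec into the
    equivalent percent-of-reading error at a given needle position."""
    return pct_full_scale * full_scale / reading

# A +/-3% of full scale meter, read at one-tenth of full scale:
print(pct_of_reading(3, 100, 10))  # 30.0 (percent of reading)
```

Because the absolute error is fixed by the range, an analog meter is at its best only near full-scale deflection, whereas a DMM's percent-of-reading spec holds across most of its range.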

Ohm’s Law

Voltage, Current and Resistance in any electrical circuit can be calculated using Ohm’s Law, which states that Voltage equals Current times Resistance (V = I x R). Thus, if any two values in the formula are known, the third value can be easily determined.

A DMM makes use of Ohm’s Law to directly measure and display Volts, Amps or Ohms. Thus, you can use most standard DMMs to find the answers you need.
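The two-values-known idea can be sketched as a small helper (purely illustrative; this is not how a DMM computes internally):

```python
def ohms_law(volts=None, amps=None, ohms=None):
    """Solve V = I * R for the one quantity left as None."""
    if volts is None:
        return amps * ohms        # V = I * R
    if amps is None:
        return volts / ohms       # I = V / R
    if ohms is None:
        return volts / amps       # R = V / I
    raise ValueError("leave exactly one quantity as None")

print(ohms_law(amps=2.0, ohms=6.0))    # 12.0 volts
print(ohms_law(volts=12.0, amps=2.0))  # 6.0 ohms
```

For example, measuring 12 volts across a load drawing 2 amps tells you the load is 6 ohms, without ever touching the resistance function.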

Continued in our Introduction to Digital Multimeters - Part 2

Note: Information taken from an application note courtesy of Fluke Electronics Corporation (which can be found HERE). Ram Meter Inc. sells and stocks an assortment of Fluke digital multimeters and other products, which can all be found on our website at