Posted Thu, 29 Sep 2022 23:05:32 GMT by User, Forum

As part of our EMI requirements, we need all exposed metallization to be electrically connected to the PWB ground plane with <2.5 milliohm resistance. By using the RATE and FILTER functions we are able to get rid of the noise, but the readings often drift into the negative ohms range. Can we interpret such a reading as an absolute value, or is the measurement so far down in the mud that the meter tolerances are causing this? If it can't be taken as an absolute value, can we at least assume that the measurement is close to 0.000 ohms?

Posted Thu, 29 Sep 2022 23:10:33 GMT by Expert, Tektronix Applications

The instrument's resistance accuracy is specified at roughly 30 ppm of range after 24 hours and roughly 40 ppm of range after 1 year. On its minimum resistance range of 100 ohms, that works out to 3-4 milliohms of accuracy or uncertainty, and the error can be either positive or negative.

However, that range uses a 1 mA test current, which means the voltage drop across a 3 milliohm resistance is only 3 uV. Any EMI, or even just external electric fields, static electricity, or junction voltages from other sources, will easily affect a measurement at that level.
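To put numbers on that for your 2.5 milliohm limit, here is a rough back-of-the-envelope sketch in Python (the range, ppm figure, and test current are the ones quoted above; treat them as approximate):

range_ohms = 100.0          # minimum resistance range of the DMM
ppm_of_range = 40e-6        # ~40 ppm of range after 1 year
test_current = 1e-3         # 1 mA test current on that range
bond_resistance = 2.5e-3    # the 2.5 milliohm limit from the question

uncertainty_ohms = range_ohms * ppm_of_range   # ~4 milliohms
signal_volts = test_current * bond_resistance  # ~2.5 microvolts

print(f"range-based uncertainty ~ {uncertainty_ohms * 1e3:.1f} mOhm")
print(f"signal across the bond  ~ {signal_volts * 1e6:.1f} uV")

Because the 2.5 milliohm limit you are testing against is smaller than the 3-4 milliohm range uncertainty, occasional slightly negative readings are exactly what you would expect rather than a fault.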

You could add shielding around the test leads, connected to the instrument chassis but isolated from the leads themselves; this may reduce EMI susceptibility.

A different instrument that uses a higher source current will raise the measured voltage, bringing it above the small error voltages and thus out of the "mud".

Our DMM6500 uses a 10 mA test current on its 1 ohm range, which raises the voltage level 10x compared to the Model 2000 DMM. However, it may still be susceptible to error voltages if they are large enough.

We have many customers measuring milliohms or even micro-ohms with a Source Measure Unit (SMU) such as the 2450. With that instrument you could source 100 mA or even 1 A and measure voltage levels well above the noise and error floor.
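As a rough illustration of that approach, here is a minimal PyVISA/SCPI sketch for a 4-wire low-resistance measurement with a 2450. The VISA address is a placeholder and the exact commands should be checked against the 2450 reference manual for your firmware; this is a sketch, not a verified test program:

import pyvisa

rm = pyvisa.ResourceManager()
smu = rm.open_resource("USB0::0x05E6::0x2450::INSTR")  # placeholder address

smu.write("*RST")
smu.write(":SOUR:FUNC CURR")        # source a known test current
smu.write(":SOUR:CURR 0.1")         # 100 mA test current
smu.write(":SOUR:CURR:VLIM 2")      # 2 V voltage limit for safety
smu.write(':SENS:FUNC "VOLT"')      # measure the voltage drop
smu.write(":SENS:VOLT:RSEN ON")     # 4-wire (Kelvin) sensing
smu.write(":OUTP ON")

volts = float(smu.query(":READ?"))  # voltage across the connection
print(f"R = {volts / 0.1 * 1e3:.3f} mOhm")

smu.write(":OUTP OFF")
smu.close()

At 100 mA, a 2.5 milliohm bond drops about 250 microvolts, which is comfortably above the few-microvolt error contributors discussed above.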
