Posted Wed, 18 Oct 2023 09:22:57 GMT by Chan, John
Hi everyone! I am relatively new to oscilloscopes and was unsure what the measurement uncertainty would be if I were to record a single value from an input/output signal.
Say I had a sine wave with an amplitude of 2 V and wanted to use the cursors to figure out how long it takes to reach a value of 1.5 V. What is my uncertainty there? I could take this measurement either with a single cursor at 1.5 V, or with two cursors, one at 0 V and one at 1.5 V; either way the measurement is the same (though I am not sure how this affects the error). Say it took 0.5 seconds to reach 1.5 V. I am a bit confused about how to apply this to what I believe is the delta-time (dT) uncertainty.

My manual for the TDS2012 says:
Single-shot, Sample mode = ±(1 sample interval + 100 parts per million × |reading| + 0.6 ns)
So would I find my uncertainty to be
Single-shot, Sample mode = ±(1 sample interval + 100 parts per million × |0.5 s| + 0.6 ns)
But here I am confused about what the sample interval is. Is it related to the sample rate? I believe the sample interval equals 1/(sample rate).
I also thought that the sample rate = 1/dt. So, assuming I have the right formula for the uncertainty, my main confusion is how the sample rate is found.


Thank you all for helping!
 
Posted Fri, 20 Oct 2023 21:59:35 GMT by Teles, Afonso
Hi John,

The manual specifies the delta-time measurement accuracy at full bandwidth for single-shot, sample mode as: ±(1 sample interval + 50 ppm × |reading| + 0.6 ns).
You are correct that the sample interval is 1/(sample rate), and the sample rate is set by the user through the time/div setting. The scope has a record length of 2500 samples. If you set the timebase to 100 ms/div, and considering there are a total of 10 horizontal divisions, the captured window is 1 s, so the sample rate is 2500 S / 1 s = 2.5 kS/s and the sample interval is 1/(2.5 kS/s) = 400 µs.
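If it helps to see the arithmetic in one place, here is a minimal Python sketch of that calculation. The 100 ms/div setting, the 0.5 s reading, and the 50 ppm figure are just the example numbers from this thread, so swap in your actual settings and the spec line that applies to your acquisition mode.

```python
# Sketch of the TDS2012 delta-time uncertainty calculation
# (single-shot, sample mode), using the example numbers above.

record_length = 2500            # samples per acquisition (TDS2012 record length)
time_per_div = 100e-3           # s/div, set by the user
horizontal_divs = 10            # divisions across the screen

window = time_per_div * horizontal_divs      # 1 s of captured time
sample_rate = record_length / window         # 2.5 kS/s
sample_interval = 1 / sample_rate            # 400 us

reading = 0.5                   # s, the cursor delta-time measurement
ppm = 50e-6                     # 50 parts per million, per the spec quoted above

uncertainty = sample_interval + ppm * abs(reading) + 0.6e-9
print(f"sample interval        = {sample_interval * 1e6:.0f} us")
print(f"delta-time uncertainty = +/- {uncertainty * 1e6:.1f} us")
```

With these numbers the sample-interval term (400 µs) dominates; the 50 ppm term adds 25 µs and the 0.6 ns term is negligible, giving roughly ±425 µs on the 0.5 s reading.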
