RE: Update rate of K2450
Thanks for raising a ticket. 👍

The example you cite with the other instrument is an interesting one (https://forum.tek.com/viewtopic.php?f=14&t=142669). We have done similar measurements with other instruments by starting an acquisition loop just after updating the voltage (a rough sketch of that loop is at the end of this post). This is a good way of characterising transients and seeing how they change with capacitive loading (you can choose how much effort to put into reducing capacitance).

In that post, a ramp in fairly large steps (+1 V every 200 ms) into a 10 GOhm resistor gives a short transient followed by a longer decay. While there is sometimes variation on time scales of ~100s of ms, the 0 to 1 V step seems to settle quickly, within ~5 points, say (which is ~20 ms if I am correct that the rate is 250 S/s). This is the kind of speed we are targeting.

If it turns out we can remove the delay with a firmware update, that would be great.
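For reference, the acquisition loop is roughly the following (a PyVISA sketch against the SCPI command set; the VISA address, measure range, NPLC, and the 500 ms capture window are placeholders rather than anything we have verified on the 2450):

    import time
    import pyvisa

    rm = pyvisa.ResourceManager()
    smu = rm.open_resource("USB0::0x05E6::0x2450::XXXXXXXX::INSTR")  # placeholder address

    smu.write(":SOUR:FUNC VOLT")
    smu.write(":SENS:FUNC 'CURR'")
    smu.write(":SENS:CURR:RANG 1e-6")    # example: 1 uA measure range
    smu.write(":SENS:CURR:NPLC 0.01")    # short integration so the loop runs fast
    smu.write(":OUTP ON")

    # Step the source, then read current as fast as possible to catch the transient
    smu.write(":SOUR:VOLT 1")
    t0 = time.perf_counter()
    samples = []
    while time.perf_counter() - t0 < 0.5:          # record for ~500 ms after the step
        i = float(smu.query(":READ?"))             # trigger and fetch one current reading
        samples.append((time.perf_counter() - t0, i))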
Update rate of K2450
What is the fastest rate I should be able to sweep/update the voltage output on the K2450, using point-by-point level updates (not a source-measure loop controlled by the machine)?
I am trying to understand the delay that occurs when the source level is updated, and how this relates to the current measurement range and any automatically applied source delay. I know why delays are added for current measurement (I have seen ringing on high-capacitance lines on other instruments when the voltage changes), but in this case I just want the fastest source update time.
We discovered that sweeps executed point by point could cause a 'pile-up' of voltage updates. You can drop in an *OPC? query to make sure you don't rush the measurement (this was on another instrument), but the process is slow: hundreds of milliseconds per point.
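For illustration, the point-by-point loop with *OPC? looks roughly like this (a PyVISA sketch in the SCPI command set; the VISA address and step values are placeholders):

    import numpy as np
    import pyvisa

    rm = pyvisa.ResourceManager()
    smu = rm.open_resource("USB0::0x05E6::0x2450::XXXXXXXX::INSTR")  # placeholder address

    smu.write(":SOUR:FUNC VOLT")
    smu.write(":OUTP ON")

    levels = np.linspace(0.0, 1.0, 101)       # example ramp, 10 mV steps
    for v in levels:
        smu.write(f":SOUR:VOLT {v:.6f}")      # point-by-point level update
        smu.query("*OPC?")                    # blocks until the update has completed;
                                              # this is the step that costs ~100s of ms per point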
I tried setting the source delay to zero (:SOUR:VOLT:DEL 0) and confirmed via :SOUR:VOLT:DEL:AUTO? that the auto-delay setting was disabled, but this did not help. Changing the current measurement range to a less sensitive range changed the delay time by a factor of 2, but the delays do not tally with the autodelay numbers in the manual (we see ~170 ms on the 1 uA measure range).
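For completeness, this is what I am running to zero the delay and check the auto setting (a PyVISA sketch; the VISA address is a placeholder):

    import pyvisa

    rm = pyvisa.ResourceManager()
    smu = rm.open_resource("USB0::0x05E6::0x2450::XXXXXXXX::INSTR")  # placeholder address

    smu.write(":SOUR:VOLT:DEL 0")               # request zero source delay
    print(smu.query(":SOUR:VOLT:DEL:AUTO?"))    # expect 0: explicit delay disables auto-delay
    print(smu.query(":SOUR:VOLT:DEL?"))         # the delay value actually in force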
This looks like some artificially added software delay that I have not got under control. The response time on other queries is about normal (~10 ms for a range-setting query, for instance). Any ideas?
PS: I could move the speed/performance slider, but I assume this changes multiple real parameters, such as the integration time on the ammeter and the delay times, and is largely focussed on the noise/resolution of the measurement. I'd rather set the actual parameters.
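By "actual parameters" I mean something like the following (a sketch; the values are illustrative and the VISA address is a placeholder):

    import pyvisa

    rm = pyvisa.ResourceManager()
    smu = rm.open_resource("USB0::0x05E6::0x2450::XXXXXXXX::INSTR")  # placeholder address

    smu.write(":SENS:FUNC 'CURR'")
    smu.write(":SENS:CURR:NPLC 0.01")           # shortest integration time on the ammeter
    smu.write(":SENS:CURR:AZER OFF")            # skip the autozero reference measurements
    smu.write(":SOUR:VOLT:DEL 0")               # no added source delay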