Hello,
I am working on an application where we use a Keithley 6430 source to inject currents into a piece of our own hardware, which for simplicity I will refer to as an I/V converter, as part of a calibration process. We have 5 operating ranges from 2uA to 20mA.
We test at 95%, 75%, 50%, 25%, and 5% of each of our operating ranges. In practice, I am able to get the unit to pass every calibration test EXCEPT 5% of our 20mA range and 50% of our 2mA range. Both of those tests use a 1mA output from the 6430 on its 1mA source range.
The error for these two inputs is significant only in this case (around 2% according to our hardware when injecting 1mA on the 1mA range); other output currents on the 1mA range behave as expected. If I manually inject the same current but use the 6430's 10mA range (1mA output on the 10mA range), the error goes away and we get a result our test would pass.
From the data sheet I would expect the 6430's 1mA range to be more accurate than its 10mA range. But given that all of our other tests for the same circuit pass, the magnitude of the error, and the fact that the pattern shows up on multiple I/V converters I have tested and on two separate 6430 sources, I am wondering if I missed something in how the 6430 should be set up. Any suggestions would be appreciated. For now I am trying to determine if I can force the unit to source 1mA on the 10mA range via LabVIEW.
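In case it is useful to anyone with the same question: a fixed source range can be forced over SCPI by disabling source autorange and setting the range explicitly, and the same command strings can be passed to a VISA Write in LabVIEW. Below is a minimal sketch in Python; the SCPI commands are from the standard Keithley 2400-series/6430 command set, while the GPIB resource name in the comment is an assumption and should be replaced with your own.

```python
def fixed_range_commands(current_a: float, range_a: float) -> list[str]:
    """Build the SCPI sequence that disables source autoranging and
    sources current_a (amps) on the fixed source range range_a (amps)."""
    return [
        "SOUR:FUNC CURR",              # source current rather than voltage
        "SOUR:CURR:RANG:AUTO OFF",     # disable source autoranging
        f"SOUR:CURR:RANG {range_a}",   # lock the requested fixed range
        f"SOUR:CURR:LEV {current_a}",  # set the output level
        "OUTP ON",                     # enable the output
    ]

if __name__ == "__main__":
    # 1 mA output on the fixed 10 mA range
    for cmd in fixed_range_commands(1e-3, 10e-3):
        print(cmd)
    # To actually send these from Python you could use pyvisa, e.g.:
    #   import pyvisa
    #   inst = pyvisa.ResourceManager().open_resource("GPIB0::24::INSTR")
    #   for cmd in fixed_range_commands(1e-3, 10e-3):
    #       inst.write(cmd)
```

In LabVIEW the equivalent is to write these same strings with the VISA Write VI (or the corresponding range/level inputs of the Keithley 24xx driver VIs, if you are using the instrument driver).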