
[SOLVED] 100A DC Current Sensor - Can't get an accurate reading

Posted: Mon Apr 02, 2018 6:34 pm
by unnamed_player
Hello, well, not a whole lot of folks seem to be using these sensors, so I hope someone out there has the answer.

I just acquired a couple of these guys: https://www.phidgets.com/?tier=3&catid= ... prodid=439

which I paired with these: https://www.phidgets.com/?tier=3&catid= ... prodid=118

to get to an interface kit I already had: https://www.phidgets.com/?tier=3&prodid=122

Seems pretty straightforward :) Yet, no matter what current runs through it, I get a high voltage value on the interface kit's analog input. It's always above 5V, ranging from roughly 5.17V to 5.34V.

High current, low current, no current: nothing seems to affect it. Am I doing something wrong?

Any help much appreciated, thanks in advance!

Re: 100A DC Current Sensor - Can't get an accurate reading

Posted: Tue Apr 03, 2018 8:32 am
by mparadis
Are both of your 1145s exhibiting this behaviour?

Out of curiosity, are your 1145s missing the same 3 components as the one in the 1145 product picture? They should be missing; those components were removed late in the design cycle in order to work with the bi-directional current transducers.

Re: 100A DC Current Sensor - Can't get an accurate reading

Posted: Wed Apr 04, 2018 7:45 am
by unnamed_player
Thanks for the idea,

I can't tell whether they're missing, since I don't know what they are, but the ones I have look exactly like the product picture (https://i.imgur.com/jRp8YrJ.jpg).

This morning I plugged in the other 1145 with the other 3588. Same result. Even with no wire running through the 3588.

It's worth saying I had a 1122 on the same port which worked fine; in fact, I have another one, along with a couple of voltage sensors, on this interface kit, and they're all happy. The point being, I think I know how to get a reading from these guys :).

Yet given they both exhibit the same behavior, maybe I'm doing something wrong in my code?

I simply run this (Python):

Code:

device.waitForAttach(10000)                      # wait up to 10 s for the InterfaceKit to attach
print("%d attached!" % device.getSerialNum())
print(device.getSensorValue(0))                  # raw reading on analog input 0
This gives me the voltage at the port and it's always around the same high value.
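
For completeness, the surrounding setup looks roughly like this (Phidget21 Python library, sketched from memory, so treat it as an approximation rather than a verified listing):

Code:

from Phidgets.Devices.InterfaceKit import InterfaceKit

device = InterfaceKit()
device.openPhidget()                             # open the first InterfaceKit found
device.waitForAttach(10000)                      # wait up to 10 s for attachment
print("%d attached!" % device.getSerialNum())
print(device.getSensorValue(0))                  # raw reading on analog input 0
device.closePhidget()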

Here are a few values I get with no wire running through:
> 394916 attached!
> 537.0

> 394916 attached!
> 535.0

> 394916 attached!
> 537.0

> 394916 attached!
> 534.0

If I unplug at the interface kit, the value goes back to 0 as one would expect.

Any other ideas? Thank you!

Re: 100A DC Current Sensor - Can't get an accurate reading

Posted: Wed Apr 04, 2018 8:14 am
by mparadis
Ah, I see what's wrong. "SensorValue" is a number that ranges from 0 to 1000, with 0 being 0V and 1000 being 5V. So the reading of 537 that you're getting actually corresponds to about 2.685V. If you plug this value into the equation for the 3588, you get:

Current (A) = (Vout * 40) - 100 = (2.685 * 40) - 100 ≈ 7.4A
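
In code, starting from the device object in your snippet, the conversion would look something like this:

Code:

sensor_value = device.getSensorValue(0)   # raw 0-1000 reading
vout = sensor_value * 5.0 / 1000.0        # convert to volts (0V to 5V)
current = (vout * 40.0) - 100.0           # 3588 formula: 2.5V -> 0A, 5V -> +100A
print("%.1f A" % current)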

But with no wire running through, this reading doesn't seem right either. Can you test it with a known current to see how accurate it is?

Re: 100A DC Current Sensor - Can't get an accurate reading

Posted: Wed Apr 04, 2018 7:51 pm
by unnamed_player
Well, that's really good to know (that conversion must be included in the formulas I grabbed for the other sensors). But it is as you say: the reading isn't accurate, in that it remains the same no matter what I put through it. It's always something very close to 534. With wire, without wire. I grabbed a clamp meter and measured the solar panels at 3 amps (cloudy :)), and the reading was 534.

Now, to further isolate the problem, I moved to another port that I normally use for another sensor; same stable value.

If I unplug at the 1145, like so:
Interface Kit ----X---- 1145 -------- 3588
I get a reading of 0.

Out of curiosity, I ran the test with only the 3588 unplugged, like so:
Interface Kit -------- 1145 ----X---- 3588
and I get a stable reading of 180 to 200, not that this should tell us much.

So I'm back to square one, having swapped out all the variables. I guess I didn't change the interface kit or the code I'm running, but given that the interface kit gives me good readings with the same code otherwise, I'm disinclined to blame either of them.

You've brought up two valid leads so far; hopefully you have a third one and it's the charm :).

Thanks again.

Re: 100A DC Current Sensor - Can't get an accurate reading

Posted: Thu Apr 05, 2018 12:18 pm
by mparadis
I just got an 1145 and 3588 from our stock and tested them to see if I could replicate your situation. Mine gives a reading close to 2.5V (a little over 500) with no wire, and responds to changes in current through a wire as expected.

Can you try using the Phidget Control Panel so we can rule out anything strange happening in the program? Also, make sure there is no debris contacting the pins on the bottom of the 1145 and that all of the cables are in good condition, to rule out short circuits.

Re: 100A DC Current Sensor - Can't get an accurate reading

Posted: Thu Apr 05, 2018 12:29 pm
by unnamed_player
There is the terrible possibility that I misinterpreted a reading of roughly 2.5V as erroneous, when in fact that's the midpoint (0A) for a sensor that covers negative amp values... Your test suggests exactly that.

I'll try this shortly; today is sunny, so fluctuations in current won't get lost in the noise.

I'm crossing my fingers and will post an update later today.

Re: 100A DC Current Sensor - Can't get an accurate reading

Posted: Thu Apr 05, 2018 1:09 pm
by mparadis
Well, the reading of roughly 2.7V with no wire still seems a bit high. If you find that the transducer does respond to current changes but seems off, you can try adjusting the gain and offset dials on the back. To do this accurately, you'd need a predictable current source or another accurate current sensor to calibrate against.
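
If twiddling the dials isn't convenient, a rough software-side two-point correction against your clamp meter would look something like this (the numbers are placeholders; substitute your own measurements):

Code:

# Two calibration points: raw SensorValue vs. true current from the clamp meter.
raw_zero, amps_zero = 500.0, 0.0    # reading with no current flowing (placeholder)
raw_ref, amps_ref = 515.0, 3.0      # reading at a known reference current (placeholder)

gain = (amps_ref - amps_zero) / (raw_ref - raw_zero)
offset = amps_zero - gain * raw_zero

def to_amps(sensor_value):
    # Map a raw 0-1000 SensorValue to calibrated amps
    return gain * sensor_value + offset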

Re: 100A DC Current Sensor - Can't get an accurate reading

Posted: Fri Apr 06, 2018 5:57 am
by unnamed_player
It is 100% as you say :) I always do a lot of research before asking for help, and this was no exception. Several factors conspired to confuse me here:

- the port reading from 0 to 1000, meaning 0V to 5V
- the midpoint of 2.5V meaning 0A, instead of the intuitive 0V = 0A
- the fact that the sensor isn't perfectly calibrated, so it's not quite 2.5V (I'll remedy that today with the help of the good old clamp meter)
- the fact that when I checked, it was a cloudy day, which meant there was very little current to produce visible fluctuations
- the fact that I had the wire going "backward", so when more current was being produced by the solar panels, the amp reading was moving from an offset positive value toward one around 0 (see the quick check below)
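
A quick back-of-the-envelope check with the nominal 3588 numbers shows why a cloudy-day current hid in the noise (my own sanity-check script, not anything official):

Code:

def expected_sensor_value(amps):
    vout = (amps + 100.0) / 40.0    # nominal 3588 output: 2.5V at 0A, 5V at +100A
    return vout * 1000.0 / 5.0      # back to the 0-1000 SensorValue scale

print(expected_sensor_value(0))     # 500.0
print(expected_sensor_value(3))     # 515.0, barely above the no-current reading
print(expected_sensor_value(-3))    # 485.0, same 3A with the wire reversed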

All kinds of little things aligned to confuse me. Regardless, I don't like to solicit help too much, but I clearly needed it here. I'm super grateful that you took the time to go through each step and even pull a sensor from stock.

I often recommend Phidgets for the incredible documentation, robustness, and broad ecosystem. To this list I'll add incredible support.

Thank you again. Hopefully I'll be more self-sufficient in the future; until now I had always found what I needed in the docs.

Take care.

Re: [SOLVED] 100A DC Current Sensor - Can't get an accurate reading

Posted: Wed Apr 11, 2018 7:34 am
by unnamed_player
More woes, kind of a heads-up to others, and wondering if the all-knowing mparadis has any thoughts on this :).

I did eventually calibrate using a 1011 interface kit (https://www.phidgets.com/?tier=3&catid= ... 1&prodid=4) on my laptop.

Everything was hunky-dory.

Then I moved my super-calibrated 3588s to their final destination, the 1203 interface kit, and now I get a consistent skew of ~7A. (I've done more empirical testing to verify this.)

It looks like there are nuances in the readings depending on the interface kit, or maybe on the hardware it is attached to.
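
For now I'm compensating in software with a per-interface-kit offset, roughly like this (the offset values are placeholders measured against the clamp meter):

Code:

# Each interface kit gets its own empirically measured offset, in amps.
OFFSET_AMPS = {
    "1011": 0.0,    # laptop kit used for calibration (placeholder)
    "1203": -7.0,   # final kit shows a consistent ~7A skew (placeholder)
}

def corrected_amps(raw_amps, kit):
    return raw_amps + OFFSET_AMPS[kit]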

Not a big deal, but definitely something to keep in mind. Throughout this experience I've had to learn to be OK with the imprecision of analog readings. I'm more of a discrete-data type of guy :).

Thanks again for all the help.