diy solar

Clamp meter Ampere testing (Batts to PV)

A J

New Member
Joined
Sep 21, 2019
Messages
79
On 21/09/19 I used my SLA X2 6V (12V) 520Ah Batteries for nearly 14Hrs and dragged them down to 74% SOC.

Today 22/09/19 I am recharging them with a 315W PV; the controller was giving me 17.9A from solar and was producing 18.4A!
With that I grabbed the clamp meter and tested the current at the batts, and it checked out at 18.4A. Moving to the PV, I measured 9.6A.

As a general discussion, I invite people to debate what is going on. (For a lot of you it should be 4 letters.)
But what is really going on?

And for those that don’t know then this is for you!
 
From tech school one billion years ago, I remember the clamp meter being described as an ammeter, measuring how much charge passes a specific point each second. Why does a clamp meter have a higher current reading at the batteries than at the panels, you ask? %#$@ if I know; the flow of electrons (aka current) should be the same at the batteries as at the panels. Maybe there is inductive interference at the batteries from nearby equipment that is messing up the clamp meter's reading?
 
Transformation of one to another. Is that cryptic enough? :sneaky:
 

If you're using an MPPT charge controller, that's what it does. If your panels are in series, they feed a higher voltage into the charge controller, and the MPPT converts that higher voltage into higher charging amps. That's why MPPT is the best way to go for charge controllers. It's also one of the reasons to connect the battery before the solar panels, so the controller can detect the battery voltage first.
On mine, with 3 100W panels in series, I'm producing 60+ volts at maybe 5 amps. My MPPT optimizes that excess voltage by turning it into higher charging amps. On a good day it will produce 18 amps to the battery (or whatever load I'm running at the time) from that 5 amps available at the panels. Series connection works best for me because even when it's cloudy or shaded my panels still produce 30-40 volts; PWM CCs can't take that higher voltage. A parallel panel connection under clouds or shade will sometimes give only 12-14 volts.
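The arithmetic behind the post above is just power conservation: watts in roughly equals watts out, so a lower battery voltage means a higher current. A minimal sketch (the efficiency figure and voltages are illustrative assumptions, not measurements from this thread):

```python
# Why an MPPT controller can show more amps at the battery than at
# the panels: power in ~= power out (minus conversion loss), and
# current = power / voltage. Numbers here are illustrative only.

def mppt_output_current(panel_volts, panel_amps, battery_volts, efficiency=0.95):
    """Estimate the charge current a buck-type MPPT delivers to the battery."""
    panel_watts = panel_volts * panel_amps
    return panel_watts * efficiency / battery_volts

# Three 100W panels in series: roughly 60 V at 5 A into the controller,
# charging a 12 V battery sitting at about 13 V.
amps = mppt_output_current(panel_volts=60, panel_amps=5, battery_volts=13.0)
print(f"{amps:.1f} A into the battery")
```

With these assumed numbers the 5 A from the panels becomes roughly 22 A of charge current, which matches the "5 amps in, ~18 amps out" behaviour described above once real-world losses and less-than-ideal sun are factored in.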
 
I've seen the KEY 4 letters:
"As a general discussion I invite people to debate on what is going on? (For a lot of you it should be 4 letters)
But what is really going on?
"

Yes, that's the answer I was trying to get across for those who ran the test I did and just didn't know what MPPT actually does, apart from
gnubie ;)
MrNatural22 (y)
and b.james :).
Oh, and offgriddle: "From tech school one billion years ago"? That far back? WOW :cool: It does feel like that; just over a decade for me. Not the answer I was trying to get, but THANK YOU anyway. Are you describing PWM, with "the flow of electrons, (aka current), should be the same at the batteries as at the panels"?
There is a mass of RFI, EMI in my shed but no, the meter was dead on with the current sense module (Whizbang Jr.) between the controller & Battery Negs.

Now we know it's MPPT, but what is really going on?
For the record, I have a good idea of what is going on; "for those who don't!" is the answer I am looking for..... Anyone?
 
MPPT moves watts with smaller conductors, by letting high voltage leave the panels and converting it to battery voltage, with minimal loss, as close to the batteries as possible.

This lets you use small conductors from the distant panels and keep the large conductors ($$$) as short as possible.
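The conductor-size point comes straight from I²R loss: moving the same watts at higher voltage means less current, so far less heat wasted in a long panel run. A quick sketch (the wire resistance is an assumed value for a long run of modest-gauge copper, not a measurement):

```python
# I^2 * R loss for carrying a given power over a wire run at two
# different voltages. The resistance is an assumed figure for a
# long round-trip run of modest-gauge copper.

def wire_loss_watts(power_w, volts, wire_resistance_ohms):
    """Heat dissipated in the wire when carrying power_w at volts."""
    current = power_w / volts          # same watts, current scales as 1/V
    return current ** 2 * wire_resistance_ohms

R = 0.17  # assumption: ~20 m round trip of 14 AWG copper

print(wire_loss_watts(300, 60, R))   # series string at panel voltage
print(wire_loss_watts(300, 12, R))   # same watts at battery voltage
```

At 60 V the run loses about 4 W; at 12 V the same wire would burn off over 100 W, which is why the battery-voltage side wants thick, short cables.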
 
The MPPT charger first measures the PV output by applying an increasing load on it, and measuring voltage and current at the various loads. As the load increases, the voltage slowly drops until it approaches the maximum current the panel(s) can supply, then the voltage drops much more quickly, headed for zero. Once it has the load curve, it multiplies the voltage and current numbers at each load point to get a power reading there...it picks the highest power point as the target.
Next it connects itself to the battery and starts charging the battery hard enough to get the voltage and current from the panel(s) to the target point it just measured. This is how it gets the most power from the panel. The maximum power point can change as the sun moves with time of day, and as clouds, etc. change light levels, so it repeats the load sweep to get a new target every few minutes.
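The load sweep described above can be sketched in a few lines: step a modeled panel through its voltage range, record volts times amps at each point, and keep the maximum. The panel curve here is a crude toy model (near-constant current with a sharp knee near open-circuit voltage), not a real 315 W panel:

```python
# Sketch of the MPPT load sweep: sample the panel's I-V curve,
# compute power at each point, and pick the maximum power point.
# panel_current() is a toy model, not real panel data.

def panel_current(v, isc=9.8, voc=40.0):
    """Toy PV curve: roughly constant current with a sharp knee near Voc."""
    if v >= voc:
        return 0.0
    return isc * (1 - (v / voc) ** 20)

def sweep_for_mpp(step=0.1):
    """Sweep voltage from 0 toward Voc and return (volts, watts) at the MPP."""
    best_v, best_p = 0.0, 0.0
    v = 0.0
    while v < 40.0:
        p = v * panel_current(v)
        if p > best_p:
            best_v, best_p = v, p
        v += step
    return best_v, best_p

v, p = sweep_for_mpp()
print(f"MPP around {v:.1f} V, {p:.0f} W")
```

For this toy curve the sweep lands a few volts below open-circuit voltage, at the knee, which is exactly where a real controller sits between sweeps.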
The battery charger circuit is basically a DC-to-DC converter. Most use a buck converter, which can only step the voltage down, but some specialty chargers use a boost converter that can also increase voltage from input to output.
 