diy solar

10w lost. But where???

dragonzzr

New Member
Joined
Dec 1, 2020
Messages
31
I will put it as simply as I can.
I spent the last 6 hours watching the behaviour of my new system, which currently consists only of a 230Ah Basen battery and a Victron 100/20 charge controller.
What I can't figure out is this: the SCC reports a battery voltage of 13.60 V while the battery BMS reports 13.40 V. I started charging with the solar panels, and throughout the whole 6 hours this difference was stable. I also noticed that the SCC was reporting 110 W of solar while the battery BMS was consistently reporting 10 W less the whole time. What am I missing here?
 
There will be voltage drop in the wiring, so while charging the SCC (the higher-potential source) will see a higher voltage at its terminals than the battery does. The missing 10 W could be what's lost in the wire as heat due to this voltage drop.

But these devices aren't exactly accurate to 10 W either.
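As a rough sanity check of the wire-heating idea, here is a quick calculation with assumed numbers (the 8 A figure is estimated from ~110 W at ~13.6 V, not measured on this system):

```python
# Assumed readings: SCC terminals at 13.60 V, battery at 13.40 V,
# roughly 8 A of charge current flowing.
scc_v = 13.60
batt_v = 13.40
current_a = 8.0

drop_v = scc_v - batt_v            # voltage lost across cable and lugs
wire_loss_w = drop_v * current_a   # heat dissipated in the wiring, P = V * I

print(f"drop: {drop_v:.2f} V, wire loss: {wire_loss_w:.1f} W")
# The 0.2 V drop at 8 A only accounts for about 1.6 W, so the wiring
# alone does not explain the full 10 W discrepancy.
```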
 
Different measuring devices may give different results, and where the measurement is taken can also affect the outcome. There are resistance losses in the wire and connections between components. Over time you will get a feel for all the normal conversion and connection losses as well as the metering discrepancies. To be honest, I am beginning to suspect there is a deliberate effort in some SCC/AIO units to give results that hide what is actually going on.
 
I agree about power losses in cables, but here there is only a 16 mm² cable from the battery and a 6 mm² cable from the Victron, all tightened up with cable lugs on a copper busbar. What puzzles me is that the Victron constantly reads 0.2 V higher than the BMS itself. At roughly 8 A of charge current, a 10 W loss would correspond to about 0.16 Ω of wiring resistance (R = P/I²), which I find very hard to believe.
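The implied resistance can be back-calculated two ways, assuming roughly 8 A of charge current (what a 2 × 100 W array would push at ~13.5 V):

```python
current_a = 8.0   # assumed charge current
loss_w = 10.0     # reported power discrepancy
drop_v = 0.2      # SCC vs BMS voltage difference

r_from_power = loss_w / current_a ** 2   # R = P / I^2, if 10 W were all wire heat
r_from_drop = drop_v / current_a         # R = V / I, from the observed drop

print(f"needed for 10 W: {r_from_power:.3f} ohm")
print(f"implied by 0.2 V drop: {r_from_drop:.3f} ohm")
# About 0.156 ohm would be needed to burn 10 W, but the observed 0.2 V
# drop implies only 0.025 ohm (~1.6 W of heat), so most of the 10 W gap
# is more likely metering disagreement than wire loss.
```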
 
You are using two different voltage indicators, one at either end, not a voltage meter. The difference being that a voltage meter would be calibrated to a standard reference and could be used at either end, ensuring you are referencing the same fixed point.

In other words, don't trust the readings on the devices themselves; get yourself a good quality meter and trust that.
 
Beware of the power lost to a loose connection. One might wonder where it is lost: it is converted into heat at the loose connection itself.

Granted, 10 W isn't much, but if you are seeing a 50-100 W difference, I highly recommend checking the connections.
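To illustrate why a loose joint matters so much, here is a quick comparison with hypothetical contact resistances (both values are assumed, not measured):

```python
current_a = 8.0  # assumed charge current

# Hypothetical contact resistances: a good crimped lug vs a loose one.
for name, r_contact in [("good lug", 0.001), ("loose lug", 0.1)]:
    heat_w = current_a ** 2 * r_contact   # P = I^2 * R
    print(f"{name}: {heat_w:.2f} W of heat at the joint")
# A single loose 0.1-ohm joint at 8 A already dissipates 6.4 W,
# concentrated in one spot -- which is why loose lugs get hot.
```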
 
I agree about power losses in cables, but here there is only a 16 mm² cable from the battery and a 6 mm² cable from the Victron, all tightened up with cable lugs on a copper busbar. What puzzles me is that the Victron constantly reads 0.2 V higher than the BMS itself. At roughly 8 A of charge current, a 10 W loss would correspond to about 0.16 Ω of wiring resistance (R = P/I²), which I find very hard to believe.
Measure each with an independent meter (a DMM).
These devices are not NIST-calibrated.
 
Thank you for your replies and help. I just checked the SCC battery terminals and also measured directly on the battery terminals, and confirmed that 0.2 V difference. So at the end of the day, should I trust the BMS more than the SCC? That would mean that, for example, when the SCC sees 13.8 V and goes into absorb mode, the actual battery voltage is only 13.6 V, so it never gets charged to 100% or even 90%?
 
There will be voltage drop in the wiring, so while charging the SCC (the higher-potential source) will see a higher voltage at its terminals than the battery does. The missing 10 W could be what's lost in the wire as heat due to this voltage drop.

But these devices aren't exactly accurate to 10 W either.
I think I may finally have an explanation, but correct me if I am wrong. The SCC reads the battery voltage correctly ONLY when not charging. When I disconnected the solar panels, the difference between the SCC and the BMS went down to only 0.02 V or so.
 
I think I may finally have an explanation, but correct me if I am wrong. The SCC reads the battery voltage correctly ONLY when not charging. When I disconnected the solar panels, the difference between the SCC and the BMS went down to only 0.02 V or so.
(y) V = IR
 
Maybe, and maybe not. Any load will cause the battery voltage to display lower at the SCC, just as charging causes it to display higher. Then there is the loss from the cells to the SCC due to the connections and wire. Heavy loads will cause a voltage slump that goes away once the load is off.

The common DC voltage bus that connects the SCC, battery, inverter and other components (like the control circuit of an AIO) will, if accurate, display a reading somewhere in between.
 
Maybe, and maybe not. Any load will cause the battery voltage to display lower at the SCC, just as charging causes it to display higher. Then there is the loss from the cells to the SCC due to the connections and wire. Heavy loads will cause a voltage slump that goes away once the load is off.

The common DC voltage bus that connects the SCC, battery, inverter and other components (like the control circuit of an AIO) will, if accurate, display a reading somewhere in between.
No loads, just the SCC and the battery. The only other variable is connecting and disconnecting the panels. So I guess that the TRUE battery voltage (as confirmed by the DMM on the terminals) is what the BMS reports, and the SCC just keeps its terminals that 200 mV higher (which is what it reports as battery voltage) to keep the electrons flowing.
 
The 200 mV there is likely just an inaccuracy/calibration disagreement. The load of measuring the voltage should be so low that it wouldn't create a 200 mV drop across that wire.
 
The 200 mV there is likely just an inaccuracy/calibration disagreement. The load of measuring the voltage should be so low that it wouldn't create a 200 mV drop across that wire.
I see your point, but if that's the case, how do you explain the fact that while charging the SCC reported 13.7 V and 110 W going into the battery while the BMS was reporting 13.49 V and 98 W coming in? I can't find any other explanation than what I mentioned... any other ideas?
 
I see your point, but if that's the case, how do you explain the fact that while charging the SCC reported 13.7 V and 110 W going into the battery while the BMS was reporting 13.49 V and 98 W coming in? I can't find any other explanation than what I mentioned... any other ideas?
Oh, I mean to say any difference at rest is unexplained. Differences while charging are explained by voltage drop.
 
BMS needs power to operate. If you are monitoring it, the Bluetooth chip on the BMS board is also being powered. Voltage will sag when there is a load. The bigger the load, the greater the sag. 10W is nothing. Don't worry too much about it.

The voltage at the SCC while the sun is out is going to be misleading. It will be higher than the actual battery voltage because, in order to charge a battery, the charging voltage has to be higher than the battery voltage. If the voltages were equal, the battery would not charge. What you are seeing when the sun is out is the charging voltage, not the battery voltage. The SCC is directly connected to the battery, but it is not going to pause charging every now and then just to give you the battery's resting voltage.
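The behaviour described above can be sketched with a toy model (the 0.025 Ω cable resistance and 8 A current are assumed values, not measurements):

```python
def scc_reading(batt_v, charge_a, wire_r):
    """Voltage seen at the SCC terminals: the true battery voltage
    plus the I*R drop across the cable and connections."""
    return batt_v + charge_a * wire_r

# Charging at 8 A through an assumed 0.025 ohm of cable + lug resistance:
print(scc_reading(13.4, 8.0, 0.025))  # reads about 0.2 V above the battery
# At rest (no current flowing) the two readings converge:
print(scc_reading(13.4, 0.0, 0.025))  # equals the battery voltage
```

This matches what was observed in the thread: the gap between SCC and BMS readings shrinks to near zero as soon as the panels are disconnected and the current drops to zero.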
 
Oh, I mean to say any difference at rest is unexplained. Differences while charging are explained by voltage drop.
"The difference between the SCC and the BMS went down to only 0.02 V." That's easily explained by measurement error; each of them is off by 10 mV, plus or minus...
 
BMS needs power to operate. If you are monitoring it, the Bluetooth chip on the BMS board is also being powered. Voltage will sag when there is a load. The bigger the load, the greater the sag. 10W is nothing. Don't worry too much about it.

The voltage at the SCC while the sun is out is going to be misleading. It will be higher than the actual battery voltage because, in order to charge a battery, the charging voltage has to be higher than the battery voltage. If the voltages were equal, the battery would not charge. What you are seeing when the sun is out is the charging voltage, not the battery voltage. The SCC is directly connected to the battery, but it is not going to pause charging every now and then just to give you the battery's resting voltage.
Thank you! This confirms my understanding that the SCC does NOT measure the real battery voltage while charging, but ONLY at "rest". So basically it stops charging when it THINKS the target voltage is reached, which in my case means the actual battery voltage ends up about 0.2 V below the target. Good to know.
 
That’s voltage drop in the wire (as you confirmed by dropping the current to zero), so either live with 10W of wire heating or upsize the wires.
As I mentioned, I think my wires are more than sufficient for such small currents of 8-10 A max (2 × 100 W solar panels). So upsize the 6 mm² cable or the 16 mm² cable? Wouldn't that be overkill?
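A back-of-the-envelope check supports the skepticism about upsizing. Using the standard resistivity of copper and an assumed 1 m cable run (the actual lengths in this system are not stated):

```python
RHO_CU = 1.68e-8  # resistivity of copper, ohm * metre

def wire_drop(length_m, area_mm2, current_a):
    """Round-trip voltage drop across a copper cable run.
    length_m is the one-way length; the factor 2 covers both conductors."""
    r = RHO_CU * (2 * length_m) / (area_mm2 * 1e-6)
    return current_a * r

# Assumed 1 m runs carrying 8 A:
print(f"6 mm2:  {wire_drop(1.0, 6.0, 8.0) * 1000:.0f} mV")
print(f"16 mm2: {wire_drop(1.0, 16.0, 8.0) * 1000:.0f} mV")
# Both come out at a few tens of millivolts, far short of the observed
# 200 mV difference -- so the cables are not the culprit, and upsizing
# them would indeed be overkill at these currents.
```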
 