Quattro, Smart Li, RS 450 PV, mbs-V2, cerbo.
I am unhappy with, and suspicious of, the default charging settings.
There have been some significant discrepancies between SOC and Vbattery, and the system has tripped out due to low Vbatt even though the displayed SOC was still very high.
What’s going on?
Where I have got to:
Quattro charges from grid (winter) on the default SmartLi settings of Absorption Voltage 56.80 for 1 hr, Float at 54.00V.
There are a variety of charging curves for LiFePO4, all pretty well identical, eg:
They all suggest a float of 55.2 V and absorption at 58.4 V for 30 min.
Victron’s default float of 54 V suggests an SOC of, well, 65% (!) (yes, I know it doesn’t really mean this). Note this claims to be CHARGE voltage; I can find no figures for discharge voltage vs SOC (strangely).
The ‘standard curve’ at 55.2 sets it at ~95%.
Looking back I notice that in the summer when PV power was plentiful the 54V float went on for hours (as in often 6+) and the battery seemed to have plenty of reserves.
This month, with much of the charging done via the Quattro, of course it’s only 1 hour.
So let’s take a look at Vbatt. In summer at the start of discharge it was usually about 54-54.2 (discharging lightly), corresponding to a 95% SOC.
In winter (now) first discharge after charge is about 53V corresponding to an SOC of about 75%.
BUT the system thinks both of these are 100%, and of course adjusts the SOC downwards (roughly) as if they were. This leads to HUGE discrepancies between the actual SOC and the displayed SOC.
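A minimal sketch of why the offset then persists through the whole discharge. It assumes a 400 Ah bank and that the 54 V sync point really corresponds to ~75% true SOC; the capacity, the 75% figure and the function name are mine for illustration, not Victron's:

```python
def displayed_vs_actual(true_soc_at_sync, capacity_ah, discharge_ah):
    """After the monitor syncs to 100% at a voltage that really corresponds
    to true_soc_at_sync, coulomb counting tracks the *change* correctly,
    so the displayed SOC stays offset high by (100 - true_soc_at_sync)."""
    delta = 100.0 * discharge_ah / capacity_ah
    displayed = 100.0 - delta
    actual = true_soc_at_sync - delta
    return displayed, actual

# Draw 140 Ah after a sync that was really at 75%:
d, a = displayed_vs_actual(75.0, 400.0, 140.0)
# displayed 65%, actual 40%: a low-voltage trip while the display looks healthy
```

The offset never shrinks during discharge; only the next genuine full charge (or a correct sync point) removes it.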
I suspect that in order to get a sensible SOC displayed, what you need to do is reset the charging inputs in the Quattro (which controls things in my system, more or less). That means setting new absorption and float voltages, and/or resetting the “state of charge when bulk finished”, which may only be 75% with the default figures, and perhaps (if set really low, as at present) extending the absorption period. It may also be appropriate to adjust your battery capacity downwards.
SO if you think your batteries are underperforming, maybe they are, and the default settings may need adjusting.
It would be nice to see the SOC vs Vbatt on discharging, but this may be a forlorn hope.
First, to check the shunt accuracy. It’s been suggested they are not as accurate as one might expect. It’s got excellent resolution of 0.01 A (!), but accuracy is quoted as ±0.4%, presumably of full scale, as usually quoted for these devices. On a 500 A shunt that’s an accuracy of, er, ±2 A! Similar to what a previous poster measured.
Second, to find out how the shunt is doing the integration, because whilst a lot of Victron kit quotes high accuracy, it’s not clear whether it actually achieves it, nor do Victron seem to tell anyone, at least in the datasheets.
So a shunt reading may be no more believable than anything else.
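For scale, a sketch of what a constant offset error would do to a coulomb count, if the ±0.4%-of-full-scale reading of the spec were the right one (the bank size is my illustrative assumption):

```python
def worst_case_soc_drift(offset_amps, hours, capacity_ah):
    """Worst-case SOC drift (in % points) from a constant current-offset
    error integrated over time by a coulomb counter."""
    drift_ah = offset_amps * hours
    return 100.0 * drift_ah / capacity_ah

# ±2 A (0.4% of a 500 A shunt) for 24 h on a 400 Ah bank:
drift = worst_case_soc_drift(2.0, 24.0, 400.0)
# -> 12% of SOC per day, worst case, if the error really were full-scale
```

That is the pessimistic reading; a proportional (% of reading) error integrates to far less at typical house currents.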
For example their batteries do not even have charge and discharge curves published as far as I can find.
Default charging is just fine.
Charge the battery to 3.55 V/cell, hold for 2 hours in absorption for the cells to balance.
To repeat, the only times Li battery voltage is useful are when the battery is under charge and full (3.55 V/cell, 100%), and when the battery is depleted and the load drags the cell voltage down to 2.8 V/cell (~1%).
Your voltage/Soc chart is wrong.
Look, I am an engineer who has been on the net for decades. Anyone can claim anything and say anything so decades ago I tended to discount anyone who just makes statements with no evidence.
Please point me to the URL with the ‘correct’ voltage/SOC because frankly I have come across many from reputable sources, they agree quite well with each other.
However, one thing that ALL agree on is that 100% charge is 3.65 V/cell, held for 30 min.
Victron’s battery supplier may have a different process, but since they give no figures it’s hard to tell, other than their default, which is 3.55 V.
So are Victron’s cells going to supply 80% of their rated output between Vcell = 3.55 V and 20% SOC? That is, will I get 320 Ah or ~16 kWh from this charge? Because it doesn’t look like it to me right now (actually to 40%, but pro-rata).
You are not asking the right question.
There is a difference in Soc @ cell voltage depending on whether the cell is charging, or discharging.
And the C rate this is occurring.
Of course a battery at rest would have cell voltages mid way between the charge/discharge curves.
Victron rated battery capacity is achieved when following the recommended charge procedures. LSB manual. Bumping up the charge voltages WILL NOT INCREASE capacity in any meaningful amount.
If I were to follow my cell manufacturer’s advice, I should charge my cells to 4 V.
FANTASTIC!
For clarification on 500A smartshunt.
Tested from zero to 10 A, the accuracy was ±10–20 mA, which is excellent.
A previous poster suggested >1 A out at modest currents, but this is not true of this one.
I assume the 0.4% is the accuracy of (effectively) the shunt resistance but that is not a problem.
They state a ‘resolution’ of “0.01 A”, but resolution is often (usually) more related to the readout than to the accuracy (which is often absolutely dire). What they have is best described as:
“Accuracy is <0.02 A or 0.4%, whichever is greater”.
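That one-liner spec, written out (the "0.4% of reading" interpretation is my assumption; the datasheet does not say which it is):

```python
def shunt_error_bound(current_amps):
    """Error bound in amps under the 'greater of 0.02 A or 0.4% of the
    reading' interpretation of the spec."""
    return max(0.02, 0.004 * abs(current_amps))

shunt_error_bound(10.0)   # 0.04 A at 10 A (the 0.4% term dominates)
shunt_error_bound(1.0)    # 0.02 A floor at low currents
```

Note how different this is from 0.4% of full scale: at 10 A the bound is 40 mA, not 2 A, which matches the bench test above.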
Now I have it, I may put it in. Don’t even have to balance cables!
Hi,
I have seen many curves from many manufacturers, but although they all pretty much agree, none match yours very well. What was the source? It looks more like an extract from an academic or quasi-academic paper. It’s exactly what I have been looking for, with both charge and discharge voltages; it only lacks temperature …
Interesting: a 0.12 V difference for 0.2C dis/charge, or circa 3.5% loss before inverter losses are taken into account. OK, admittedly much better than lead.
I note the manual suggests a much longer absorption time (2–8 hrs, per month) than the 1 hr default (but suggests weekly)…
Completely agree with your comments on assessing what IS the voltage you measure on a cell; it’s affected by quite a few things (including rest time; it’s chemistry, after all, and not in equilibrium), but I still think there is a reasonable algorithm to give a reasonable estimate of the ACTUAL SOC (say to <5%) that takes battery degradation into account.
So there still remains the problem of how to find the actual capacity of the batteries as they age. In my case, which is pretty well off grid most of the year, I suspect I have several to lots of charge-discharge cycles per day, so battery degradation may be quite high.
As with another poster elsewhere I have been caught with low battery voltage tripping the system (set at Vcell = 3.25V => 40% SOC) with ~70% SOC displayed by the system.
I have a SmartShunt, which I will fit, but despite statements made here, I doubt it will make much difference. Putting a finger on why this happens is more useful. (PS: it has gone in the middle of the night with very low power drains, a few hundred watts at most, so not voltage drops due to high current.)
For a Soc of 50%, at worst the cell voltage could be between 3.0 and 3.6v.
With a more moderate C rate @ 50% the cell voltage could be between 3.2 and 3.4v.
For a nice healthy cell voltage of 3.2v, the actual Soc could be anywhere between 5% and 70%.
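That ambiguity can be made concrete with a toy model: take an OCV(SOC) table (the numbers below are purely illustrative, not from any datasheet) and treat the measured terminal voltage as OCV plus or minus an I·R term, then invert:

```python
# Assumed, illustrative OCV(SOC) points for an LFP cell (percent, volts):
OCV = [(0, 2.80), (5, 3.05), (10, 3.18), (20, 3.25),
       (80, 3.33), (95, 3.40), (100, 3.55)]

def soc_from_ocv(v):
    """Invert the (monotonic) OCV table by linear interpolation, clamped."""
    if v <= OCV[0][1]:
        return 0.0
    if v >= OCV[-1][1]:
        return 100.0
    for (s0, v0), (s1, v1) in zip(OCV, OCV[1:]):
        if v0 <= v <= v1:
            return s0 + (s1 - s0) * (v - v0) / (v1 - v0)

def soc_band(v_cell, c_rate, r_eff=0.25):
    """SOC band consistent with a terminal voltage when the cell may be
    charging or discharging at up to c_rate. r_eff is an assumed effective
    resistance in volts per 1C, combining IR and polarisation."""
    dv = c_rate * r_eff
    return soc_from_ocv(v_cell - dv), soc_from_ocv(v_cell + dv)

lo, hi = soc_band(3.2, 0.5)  # 3.2 V seen at up to 0.5C either way
# -> roughly 6% to 76%: the same huge spread described above
```

The flat middle of the LFP curve is what blows the band open; the same ±dv at the steep ends pins SOC much more tightly.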
Hi,
Your data is invaluable, which, together with other information, allows me to set the quattro up correctly whilst being aware of the limitations.
Supplier of your datasheet:
Might even be higher spec than Victron!
For information, the only other thing I have come across is the significant (in the scheme of things) noisiness of real-life curves, which makes capacity estimation even more difficult, as the corrected values (for current and temperature) also need to be integrated over time.
Most of this information is scattered over the web and usually has no reference, making the data suspect in part.
For information, I typically discharge at <C/40 (ie ~500 W on a 20 kWh battery) with peaks at <C/5 ish when electrical heating devices are used (kettle, dishwasher etc). PV/mains charging is up to C/3.
The only thing that now remains is a way to estimate TRUE battery capacity. We can assess “100%”, but 20% sadly reverts to battery voltage (with some adjustment for current), and the only other point where all the curves agree is the scary 2.8 V/cell = 0%!
Probably an annual test from 100% down to 10% SOC (via Vb), using the result as the capacity, is about right, but the 10% point would be based on Vb so has its own basic inaccuracies. I prefer to set minimum charge at 40% to 60%, but this (I now realise) must be done using SOC rather than Vb. Vb should be the damage-prevention, pull-the-plug-now level!
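The arithmetic for such a test is just a scaling between two trusted SOC endpoints (the 342 Ah figure below is a made-up example):

```python
def estimated_capacity(ah_discharged, soc_start=100.0, soc_end=10.0):
    """Scale the Ah removed between two known SOC points up to full
    capacity. Assumes both endpoints are trustworthy: 100% from a proper
    full charge, and the low end from a rested voltage reading."""
    return ah_discharged * 100.0 / (soc_start - soc_end)

cap = estimated_capacity(342.0)  # 342 Ah removed over a 100% -> 10% test
# -> 380 Ah estimated true capacity
```

Any error in placing the 10% endpoint by voltage scales straight into the result, which is the basic inaccuracy mentioned above.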
Its likely that the existing system I have (no smartshunt) is not the problem but my overconservative settings in the quattro.
In passing, whilst I know how multicell Li packs are balanced in power tools, does anyone know the mechanism for the Victron Smart Li? Bypassing some of the current around fully charged cells is possible, but there would be a heat penalty, possibly tens of watts per battery, and the electronics in the battery do not seem designed to dissipate a lot of heat. Bypassing a cell completely has all sorts of problems, best avoided (!).
Yes, I know all that, but it doesn’t explain the mechanism.
I imagine (unless there is another mechanism) you have to be charging one cell in the battery whilst NOT charging the others and maintaining 56.8V across the battery to avoid the system detecting a fault.
So the likely technique would be to bleed current around the fully charged cells, but not the undercharged one. For a 200 Ah battery we ought to be talking quite a few amps, maybe 2 A. So that would be about 24 V at 2 A ≈ 50 W for a number of hours. I would expect that to heat the battery significantly and require significant heat sinking on the balancing device; here an example (bottom, 25.6 V 200 Ah) board in situ.
Each cell has passive (resistive) balancers that burn off energy.
And active balancers that move energy from high v cells to lower v cells.
Cell balancing is 1.8 A/cell (except on the smallest battery), split between the active and passive balancers.
Remember, only the highest-voltage cells will be balanced.
And for an 8-cell battery, only a couple of cells will need to be balanced.
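A back-of-envelope sketch of what that 1.8 A buys per cycle (the one-hour absorption window per charge cycle is my assumption, matching the default discussed earlier):

```python
import math

def charge_cycles_to_balance(imbalance_ah, balance_amps=1.8, window_hours=1.0):
    """Balancing hours needed to correct a given cell imbalance, and how
    many absorption windows of window_hours that spreads across."""
    hours = imbalance_ah / balance_amps
    return hours, math.ceil(hours / window_hours)

# A 4 Ah imbalance on one cell at 1.8 A with a 1 h window per cycle:
hours, cycles = charge_cycles_to_balance(4.0)
# -> about 2.2 h of balancing, spread over 3 charge cycles
```

This is why accumulating balance time across many cycles (as described above) works: each absorption only needs to chip away at the imbalance.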
I have read it several times.
By passive, I assume you mean resistive where cells are in parallel, these I count as one cell (obviously).
How do they move energy from high to low voltage cells with a significant DC differential?
OK, possibly PWM inverters operating at 3.6 V; yes, that’s possible and would explain some of the board complexity. The black blocks would be transformers.
Yes, of course that’s how it’s done because they are cheap as chips these days.
It’s how I would do it anyway. Should have thought of that before.
More expensive than the tool-charger method, but a tiny percentage of the cost, and no heat problem.
Neat.
A cheap pic chip could handle this no bother.
In fact, the balancing could be done at any state of charge if done this way; you do not even need to wait for an absorption phase. Absorption would only be required to fully charge where the initial charge rate was very high (eg C/3) and end of bulk was ~95% SOC, although a slow charge (eg C/20 via PV) would get the battery close to 100% at the end of bulk (ie 56.8 V), so some intelligence would be required of the charging system.
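That charge-rate dependence can be sketched with a simple I·R model (r_eff is an assumed effective resistance per cell in volts per 1C, chosen for illustration; not a Victron figure):

```python
def ocv_at_end_of_bulk(v_target_cell, c_rate, r_eff=0.25):
    """OCV (and hence true SOC proxy) at the moment the charger first sees
    the target terminal voltage. A faster charge means a bigger I*R rise,
    so bulk ends at a lower open-circuit voltage and lower true SOC."""
    return v_target_cell - c_rate * r_eff

ocv_at_end_of_bulk(3.55, 1 / 3)   # ~3.47 V: still well short of full
ocv_at_end_of_bulk(3.55, 1 / 20)  # ~3.54 V: nearly full before absorption
```

So a fixed absorption time after a C/3 bulk has much more work to do than the same time after a C/20 bulk, which is the point above.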
Not sure (in fact I doubt) Victron’s algorithm takes the charge rate at end of bulk into account for the absorption phase. Maybe they do, as there are some odd steps in the SOC, going from 95 to 100% sometimes far too quickly given the charge introduced in that period.
Something to look at another day.
Nothing like a chat to solve a problem. Easy once you know how.
Top balancing is preferred during absorption as this is where the cell voltages differ, and when the charge current drops off.
Balance current plus time works here.
Time can be accumulated across a many charge cycles to balance cells.
During the flat part of the SoC/cell-voltage curve, there is not enough voltage differential between the cells for active balancing to occur.
And again at the low-SoC end, where the cells do differ, the battery load current is much higher than any balancer could compensate for.
That’s the beauty of PWM power transfer: the DC differential is irrelevant, and the power moved is a delta, not dependent on the bulk current through the battery either. Any cell can pump a little power to an adjacent one, topping up any that are a bit behind to boost their charge compared to the others.
Really simple, really neat, really efficient and very flexible, and I’m sure it’s how they do it these days. It’s how I would. Anyway, problem solved.
In passing…
I don’t remember seeing it stated anywhere but does anyone know how to configure the Cerbo advanced charts kWh (and other accumulators) so they start accumulating at a set time (dd:)hh:mm and restart after a settable period or date/time?
Currently, mine seem to do it somewhat randomly.
Many thanks.