I am in the same boat, but using a Pace BMS (16S system).
I searched the forums/net and could not find the right solution.
I played with the sustain voltage (ESS Assistant) and set it to 45V and that reduced the number of low battery warnings. My current settings are:
And yeah, you have three fields set to 3.50 V. That makes no sense. I get that people set the float voltage to the same value as their charge voltage, but that's not something you want with LiFePO4 chemistry. Check the datasheet.
I import containers full of prismatic cells, and in every single spec sheet the float value is specified lower than the charge voltage. Also, check the Advanced tab in your VRM.
I have four test battery packs set to different charge voltages, and after over 3300 cycles (that's seven years of work), the one with a charge voltage of 3.460 V has the highest SOH.
Setting the float voltage too high can also lead to overshoots; I've seen over 3.85 V with JK BMSes. And the reality is that at the end of the charge curve your battery gains almost nothing, while it hurts the SOH over time.
Thanks for noticing. I have 16 cells, but LiFePO4 cells can go below 3 V before being fully discharged. So maybe a 48 V range instead of a 47 V range would be better, I think.
These batteries (LiFePO4) don't have a direct relationship between SOC and voltage! Moreover, it is not for nothing that the configurator in the ESS assistant has voltage threshold values for different loads. It must be measured specifically for your battery bank. The voltage drop will depend on the capacity of all batteries.
For example, the voltage drop at a load of 0.25C (3.5 kW) for a 280 Ah battery will be approximately 0.5 V, while for a 560 Ah bank the same 0.25C load (7.0 kW) will cause a smaller drop of approximately 0.35-0.4 V. The determining factor is capacity.
Therefore, all these tables can be thrown in the trash.
You need to measure and calculate for each battery bank, with precise measurements. Also note that an idle battery at 100% SOC will sit at 3.33-3.35 V per cell.
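The voltage-sag argument above can be sketched as a simple Ohmic calculation. The pack resistance values below are illustrative assumptions chosen to reproduce the figures quoted in this thread; they are not datasheet numbers, so measure your own bank.

```python
# Ohmic sag sketch: V_drop = I * R_pack.
# R_pack values are assumptions matching the ~0.5 V (280 Ah) and
# ~0.38 V (560 Ah) drops mentioned above, not measured data.

def voltage_drop(pack_resistance_ohm: float, current_a: float) -> float:
    """Voltage sag across the whole 16S pack under load."""
    return current_a * pack_resistance_ohm

banks = {
    "280 Ah": {"r": 0.0071, "i": 0.25 * 280},   # 0.25C = 70 A
    "560 Ah": {"r": 0.0027, "i": 0.25 * 560},   # 0.25C = 140 A
}
for name, b in banks.items():
    print(f"{name} @ 0.25C: ~{voltage_drop(b['r'], b['i']):.2f} V sag")
```

Note that a bigger bank only sags less if its effective resistance falls faster than its capacity grows, which is why you have to measure it rather than scale it on paper.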
In my situation, the maximum current is 100 A on a 16S 314 Ah pack, so about 0.33C. The voltage drop is maybe 0.6 V over the whole 16S pack, so I can probably use that for the dynamic cut-off together with the standby voltage table.
A cut-off voltage of 47-47.6 V would be good, I suppose, still leaving some charge in the battery. And this is only used when an off-grid situation occurs due to a power grid failure, because the DESS limit is at 25%.
*Provided that there are no losses in the DC wires
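A quick back-of-envelope for that pack, using the numbers from this thread (0.6 V sag, 0.15 V wire loss). Treat it as a worked example for one specific installation, not a recommendation:

```python
# 16S 314 Ah pack, values taken from this thread (illustrative only).
cells = 16
pack_sag = 0.6      # V, measured over the whole 16S pack at ~0.33C
wire_loss = 0.15    # V, over 6 m of 70 mm2 DC cable

# A loaded cut-off voltage corresponds roughly to a resting pack voltage of
# cut_off + sag + wire_loss (ignoring SOC-dependent sag, see below).
for cut_off in (47.0, 47.6):
    resting = cut_off + pack_sag + wire_loss
    print(f"cut-off {cut_off:.1f} V under load ~ {resting:.2f} V at rest "
          f"({resting / cells:.3f} V/cell)")
```

So a 47-47.6 V loaded cut-off lands around 2.98-3.02 V per cell at rest, i.e. with some charge left in the bank.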
In this part we ensure that the BMS never switches off early during normal operation.
Next, we move on to the ESS assistant settings.
This part has a double dependency and requires the utmost attention.
The voltage drop at a 0.25C load with the battery at 100% SOC will be one value, but the same load at 50% SOC will give another value. As I already said, the determining factor is battery capacity: the greater the capacity, the smaller the voltage drop at the various charge levels.
Sustain voltage = approximately the voltage at 20% SOC, about 2.95-3.0 V per cell (47.2-48.0 V for 16S).
Dynamic cut-off at 0.005C = (Victron DC input low shut-down + 1.2-1.5 V)*
*Because at low loads the voltage stays close to the upper level for the current charge level, even though the actual remaining capacity is at the threshold of deep discharge.
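The two rules of thumb above can be written out for a 16S bank. The DC input low shut-down value here is an assumed example setting on the inverter, not a universal default:

```python
# Sketch of the settings rules above for a 16S LiFePO4 bank.
cells = 16
dc_input_low_shutdown = 44.0   # V, assumed inverter setting (example only)

# Sustain voltage ~ voltage at 20% SOC, 2.95-3.0 V/cell.
sustain_low, sustain_high = 2.95 * cells, 3.0 * cells
print(f"sustain voltage range: {sustain_low:.1f}-{sustain_high:.1f} V")

# Dynamic cut-off at 0.005C = DC input low shut-down + 1.2-1.5 V margin.
for margin in (1.2, 1.5):
    print(f"dynamic cut-off @ 0.005C: {dc_input_low_shutdown + margin:.1f} V")
```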
To make your measurements easier, you can use VRM telemetry. Test loads at different battery charge levels, and allow enough time for the voltage to recover after heavy loads.
I am deliberately not giving you specific numbers. All installations are strictly individual and this is a big responsibility! But I am sure you understand what you are doing.
Thank you for this information; I will use it over the following days to calculate good values. The loss over the DC wires (6 m of 70 mm2) is 0.15 V.