MultiPlus 2 low battery alarm with JK-BMS (Configure Victron and JK-BMS voltages)

Just tested at 93% SOC and the total voltage drop is 800 millivolts.

So that makes 500 millivolts across the cells themselves, which is about 31 millivolts per cell (0.5 V / 16 cells).

I also see that it takes a few seconds to stabilize: first it drops by 600 millivolts, and after a few seconds it settles at an 800-millivolt drop.

When I go back to charging at 16 amps, the difference between the discharging and charging voltage is only 500 millivolts at first, and after a minute it is 700 millivolts.
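
For reference, here is that arithmetic as a minimal Python sketch, assuming a 16S LiFePO4 pack (the cell count is inferred from the ~53.4 V pack voltage discussed below); attributing the remaining 300 mV to cabling, busbars, and the BMS is my inference, not a measured value:

```python
# Sketch of the drop arithmetic above, assuming a 16S LiFePO4 pack.
pack_drop_v = 0.800    # total drop at the inverter terminals (measured)
cells_drop_v = 0.500   # summed drop across the cells themselves (measured)
cell_count = 16        # assumption: 16S pack (~53.4 V at 3.33-3.35 V/cell)

per_cell_mv = cells_drop_v / cell_count * 1000
rest_of_path_mv = (pack_drop_v - cells_drop_v) * 1000  # cabling/busbars/BMS (inferred)

print(f"per cell: {per_cell_mv:.0f} mV")               # ~31 mV
print(f"outside the cells: {rest_of_path_mv:.0f} mV")  # ~300 mV
```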

No, no, no.
Jeroen, I told you that for a clean measurement it must be made from the idle state of the battery, when the SOC is 100% and the battery voltage is 53.4 V (3.33-3.35 V per cell).

Subsequent measurements are made in the same way. For accurate conclusions, it is necessary to record only the peak of the drop under load.

For example, here is what I mean:

[Screenshot: idle, SOC 100%]

[Screenshot: consumption at maximum load - the peak drop]

UPD: Before each measurement, leave the battery in idle mode for long enough for the voltage to settle at each charge level.

I charged up to 100% SOC, configured the inverter power to 0, turned DESS off, and set the grid setpoint to -8000. When I raise the inverter power, discharging will begin.

But the battery stays at 55.57 volts (the CVL, I think)?

Do I just have to wait?

Yes, you need to wait until the cells are at rest: 3.33-3.35 V per cell, 53.4 V in total.
Then you apply the load and measure the peak drop.
After that, discharge the battery to 50%, stop the discharge, turn on idle mode, and wait a few hours for the voltage to stabilize.
And so on at each charge level; the next level is 30%.

UPD: Once you have all of the peak voltage drop values, you can make a general assessment and start calculating the values for Dynamic cut-off and adjusting them.
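
A minimal Python sketch of the bookkeeping this protocol produces; the rest and loaded voltages below are placeholders (picked so the drops match the example data posted later in this thread), not real readings:

```python
# Bookkeeping for the rest / load / peak-drop protocol described above.
# All voltages are placeholder values; real ones come from the GX device / BMS.

def peak_drop(rest_v: float, min_v_under_load: float) -> float:
    """Drop = resting voltage minus the lowest (peak-drop) voltage under load."""
    return rest_v - min_v_under_load

# One entry per charge level, measured only after the pack has idled long
# enough for the voltage to settle.
drops = {
    100: peak_drop(rest_v=53.40, min_v_under_load=52.70),
    50:  peak_drop(rest_v=52.80, min_v_under_load=51.85),
    30:  peak_drop(rest_v=52.40, min_v_under_load=51.20),
}

for soc, drop in drops.items():
    print(f"SOC {soc:3d}%: drop {drop:.2f} V")
```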

A few hours later, the battery voltage is still 55.12 volts. Should I still wait?

Or would it be better to discharge to approximately 53.4 volts / 99% SOC, let the system idle for a while, write down the voltage, start discharging, write down the voltage again, and calculate the voltage drop?

Yes, yes, wait.
I specifically gave you telemetry where the timeline is visible.

I'm showing you how important accurate measurements are for calculating the values correctly, and how the voltage drop depends on the residual capacity.

This is my voltage drop at 50% SOC:

[Screenshot: telemetry of the 50% SOC load test]
In fact, we tend to have more losses when the battery charge is low.
My data
SOC 100% - drop 0.70V
SOC 50% - drop 0.95V

That’s why I didn’t give you ready-made Dynamic cut-off values.
There are no template configurations! Never!

Okay, that looks like about the same voltage drop I already measured today, so I expect more or less the same results tomorrow.

We both have a 200A JK-BMS and 314Ah EVE cells, so we will probably get more or less the same results.

Nope!
I have less capacity in this installation: 280Ah.
Your values should be better.

And finally, I have finished the presentation of the measurements and the voltage-drop trend in the battery at different residual capacities.
My data (not applicable to other installations, just an example):
SOC 100% - drop 0.70V
SOC 50% - drop 0.95V
SOC 30% - drop 1.20V
Based on this data, we calculate the ESS “Dynamic cut-off” values.


P.S. In the last test for SOC 30%, I deliberately increased the load to 90A (approximately 4.3kW) to show that there was no clipping.

I have this problem: the MultiPlus charges the battery again after several hours. Probably the repeated absorption time.

Okay, my setup showed a 1-volt drop at 88 amps at 25% SOC, so it is a bit less on my side; that seems okay.

I do not understand what you want to achieve by measuring the voltage drop at exactly 100% SOC.

I can imagine you want the voltage drop measurements at different SOCs so that the highest voltage drop can be used for the calculation.

But why exactly 100% SOC and not 99%?

99 or 100 is not important.
The important thing is the resting state of the cells and an upper voltage of 3.33-3.35 V per cell.
By measuring the entire range, I immediately identify many problems, if they exist.
Experience :wink:

UPD: I was a little alarmed by the values in the start post.
But now I don't see anything critical.
Now you need to adjust the values for the ESS “Dynamic cut-off” and the inverter “DC input low” settings.
And everything will be fine.

Okay, first test results after a few hours of rest:

Start voltage at 0A: 53.39V
Voltage at 80A: 52.6V
Voltage before stopping discharge at 80A: 52.3V
Stop voltage at 0A: 53V

Voltage drop at 99% SOC: 790mV (53.39 V - 52.60 V)
In the first test above: voltage drop at 25% SOC and 88 amps: 1V

So when I take the worst value, a 1-volt drop at 88 amps, I get the following result:

0.005C = 0.0178V drop
0.25C = 0.8920V drop
0.7C = 2.4977V drop
2C = 7.1363V drop
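
That extrapolation is linear scaling of the measured point (1 V at 88 A) to other C-rates on a 314 Ah pack, i.e. a purely ohmic assumption; a minimal sketch that reproduces the numbers above:

```python
# Linear (ohmic) extrapolation of the measured drop to other C-rates.
CAPACITY_AH = 314.0    # EVE cells mentioned above
MEASURED_DROP_V = 1.0  # worst measured drop
MEASURED_AMPS = 88.0   # current at which it was measured

def drop_at(c_rate: float) -> float:
    """Scale the measured drop linearly with current."""
    return MEASURED_DROP_V * (c_rate * CAPACITY_AH) / MEASURED_AMPS

for c in (0.005, 0.25, 0.7, 2.0):
    print(f"{c}C: {drop_at(c):.4f} V drop")
```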

The idle voltage of a cell at 10% SOC is 3 volts and at 0% it is 2.5 volts. So I’ll take the midpoint, 2.75 volts, as the minimum cell voltage (+/- 5%), which is 44 volts for the battery, so it never depletes to 0%.

This gives me the following numbers:

0.005C = 44.98V
0.25C = 44.10V
0.7C = 42.50V
2C = 37.86V
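
The cut-off voltages then follow as the resting floor minus the extrapolated drop. A self-contained sketch that reproduces the list above (to rounding); note that the listed values correspond to a 45.0 V floor (2.8125 V per cell) rather than the 44 V mentioned above, so the floor is kept as an explicit parameter here:

```python
# Cut-off voltage per C-rate: resting floor minus the extrapolated drop.
# The values listed above reproduce with FLOOR_V = 45.0 (not 44.0).
CAPACITY_AH = 314.0
DROP_V_PER_AMP = 1.0 / 88.0  # from the 1 V @ 88 A measurement
FLOOR_V = 45.0               # resting-voltage floor (parameterized)

for c in (0.005, 0.25, 0.7, 2.0):
    drop = c * CAPACITY_AH * DROP_V_PER_AMP
    print(f"{c}C: cut-off {FLOOR_V - drop:.2f} V")
```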

Am I heading in the right direction?

I will also measure voltage drop at 50% SOC later.

What values did you set for the Victron
DC input low shut-down, restart, and pre-alarm?

Are these settings for the BMS valid and final?

Dynamic cut-off 0.7C & 2C = Victron DC input low shut-down
Because you can’t take more than 4-4.3 kW from your battery.
You are limited by the inverter power.
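
A quick sanity check of that limit (the ~48 V working voltage is an assumption):

```python
# Why the 0.7C and 2C rows are unreachable: the inverter caps the current.
MAX_INVERTER_W = 4300.0  # ~4-4.3 kW, per the posts above
WORKING_V = 48.0         # assumed battery voltage under load
CAPACITY_AH = 314.0

max_amps = MAX_INVERTER_W / WORKING_V
print(f"max ~{max_amps:.0f} A -> ~{max_amps / CAPACITY_AH:.2f}C")  # ~90 A -> ~0.29C
```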

I understand. But are the calculated values valid?