Tail current setting

If I understand all this correctly, I would need to tweak my Tail Current to 0.2% or possibly 0.05%.

Let me explain. I have a 560Ah 12V LiFePO4 battery bank with a Victron controller and four 100W solar panels. My panels typically put out about 75V.

To get to 4% Tail Current (i.e. 22A), I would need to generate 1650W. Correct?

I typically get 250W peak (125W avg over 6 hours) on a summer day, and 90W peak (35W avg) in the fall/winter. So I either need to increase my solar array substantially - which is impractical on a sailboat - or decrease the Tail Current setting accordingly.

Did I get this right? Is this reasonable?

Your maths / understanding is out.

On 560Ah a 4% tail current is

560 * 4 / 100 = 22.4 A as you say.

This tail current is the current flowing into the battery, which is charging at 14.2V.

Power = 14.2 x 22.4 = 318W.

My guess is you used the solar panel voltage; the tail current is not measured at the panels.
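In other words (assuming that is where your 1650W came from):

75V x 22A = 1650W (panel side - not the relevant figure)
14.2V x 22.4A ≈ 318W (battery side - this is what the tail current check uses)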

Lithium batteries may be set with a 2% tail current if they charge OK.

When you have low sun and lower charging, you may not get to absorption voltage anyway, and the tail current only applies once you reach absorption.

I manage on my boat with 680W solar on 690Ah batteries in the UK.

Thanks so much for such a quick response. I see where my mistake was (I computed power from the solar panel voltage times current, instead of the battery voltage times the charging current).
The numbers you used as an example make more sense and correspond to what I see on my boat (no big problem with SoC during the summer, but certainly during the winter when the output peaks at a piddly 30W or so).
Now that I’ve moved the boat to a marina for the winter, this won’t make a difference, but it will come in handy in the spring when I get back to my mooring.

I was back at the boat this weekend and decided to review the current settings and the docs. Our current settings are:
Absorption voltage 14.2V
Float Voltage 13.5V
Charged Voltage 13.5V
Tail Current 4%
Charge Detection Time 3min

I also noted that the MPPT Solar Charger manual states on P34: "Default settings for LiFePO4 batteries - The default absorption voltage is set to 14.2V (28.4V) and the absorption time is fixed and set to 2 hours. The float voltage is set at 13.5V (27V)." which matches the above.

However, the manual for the Battery Monitor and SmartShunt states on P39: “It is also possible that the battery monitor synchronises too early. This can happen in solar systems or in systems that have fluctuating charge currents. If this is the case change the following settings:
Increase the “charged voltage [21]” to slightly below the absorption charge voltage. For example: 14.2V in case of 14.4V absorption voltage (for a 12V battery). "
And then on P21, that manual also states: “The “charged voltage” parameter should be set to 0.2V or 0.3V below the float voltage of the charger.”

As I try to put this all together, I land on leaving the Absorption Voltage at 14.2V for my LiFePO4 batteries, but then it’s not clear what to do with Float and Charged. It seems that in my situation I would want to keep these high, and the Charged Voltage should be 0.2V to 0.3V below Float. If this is all correct, then I should probably increase Float to 14V and Charged to 13.7V. Is that right?

As to the Tail Current, it’s become clear that the winter sun levels don’t matter, and I should aim to optimize the system for my charging capacity in spring and summer (which is when I’m off the grid). A 2% Tail Current with the voltage changes above may gain me greater reliability in the reported SoC.
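If I have the arithmetic right this time: 560 * 2 / 100 = 11.2A, and 14.2V x 11.2A ≈ 159W at the battery, which is comfortably under my ~250W summer peak (unlike the ~318W a 4% tail would need).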

Final note: while reading the manuals, I noted that Tail Current is stated as a percentage in Victron Connect, but as an Amp value on P34 of the MPPT Solar Charger manual. Is it just that in this case the Amp value was 0, which would also be 0%?

Hi @xavierlh1

I think you’re mixing your ‘Tails’. The one in the MPPT (in Amps) is to tell it when to switch from Absorption to Float. With LFP batts you don’t need it, as you have a fixed charge profile (which you should adhere to). It should be set to zero, or whatever’s the default in the chosen charge profile.

‘Tail’ (in %) in a Battery Monitor is to determine when to sync the battery to 100%. Your LFPs are full when they can’t accept current, so they can hold a voltage (the Charged Voltage) for a set time (the Charge Detection Time) without drawing more than the Tail current. The Tail set there could be quite low, but it needs to allow for possible bursts of cell balancing.
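Roughly, the sync condition described above looks like this (a simplified sketch only, not Victron's actual implementation; the names and values are just illustrative):

```python
# Simplified sketch of the battery monitor's sync-to-100% condition as
# described above. NOT Victron's actual code; names, settings values and
# the timer handling are purely illustrative.

CAPACITY_AH = 560.0            # bank capacity
CHARGED_VOLTAGE = 13.5         # "Charged Voltage" setting (V)
TAIL_PERCENT = 4.0             # "Tail Current" setting (% of capacity)
CHARGE_DETECTION_TIME_S = 180  # "Charge Detection Time" (3 min)

TAIL_AMPS = CAPACITY_AH * TAIL_PERCENT / 100.0  # 22.4 A in this example


def looks_full(battery_voltage: float, charge_current: float) -> bool:
    """True while the battery looks 'full': voltage at or above the
    Charged Voltage and charge current at or below the tail threshold."""
    return battery_voltage >= CHARGED_VOLTAGE and charge_current <= TAIL_AMPS


# The monitor synchronises SoC to 100% once looks_full() has held
# continuously for CHARGE_DETECTION_TIME_S. A cell-balancing burst that
# pushes the current back above TAIL_AMPS restarts that timer, which is
# why the tail shouldn't be set lower than those balancing bursts.
```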

Thanks for clarifying the confusion. Being a newbie at these things, and as neither ‘Tail’ was qualified in the docs, I simply assumed they were the same. I’ll make a note of this for future reference, but Victron might think of adding this clarification to the documentation for future readers.

As you have solar, I suggest setting the charged voltage on the SmartShunt to 14.0V. In a system that has AC charging only, setting this below float is OK. On solar systems, setting it below absorption is better.


If you set the absorption voltage to 14.2V, make sure that your BMS will balance the cells at that voltage. Some BMS require 14.4V to start the balancing process.

Thanks for the input. 14.2V is the original setting by the consultant who put the system together. The cells have been behaving well and staying close to each other in voltage, so I’m not overly worried, but it won’t hurt to double-check what the manufacturers recommend.

I just checked the BMS info on my batteries (2 x SOK SK12V280H) and it states that the charge voltage (which I assume is what’s called Absorption Voltage in the Victron docs) should be 14.6V (min 14.4V).