That is incorrect.
Please read some cell datasheets. Cell manufacturers have very clear definitions of 0% and 100%. For a 100 Ah cell, it goes like this:
To get to 100% SOC, charge the cell at a constant current of 50A (0.5C) until the voltage reaches 3.65V, then hold a constant 3.65V until the current drops below 5A (0.05C), and then cut off.
To get to 0% SOC, discharge the cell at a constant current of 50A until the voltage reaches 2.50V, and then cut off.
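To make that concrete, here is a rough sketch of that datasheet procedure as code. It is only a sketch: the hardware functions (read_voltage, read_current, set_current, hold_voltage) are hypothetical placeholders, not any manufacturer's or charger's API.

```python
# Rough sketch of the datasheet-style CC/CV definition of 100% and 0% SOC
# for a 100 Ah LiFePO4 cell. The four hardware functions are hypothetical
# placeholders, not a real charger/load API.
import time

CHARGE_CURRENT   = 50.0   # A, 0.5C for a 100 Ah cell
CHARGE_VOLTAGE   = 3.65   # V, charge cutoff voltage
TAIL_CURRENT     = 5.0    # A, 0.05C end-of-charge current
DISCHARGE_CUTOFF = 2.50   # V, discharge cutoff voltage

def read_voltage() -> float:
    raise NotImplementedError("hook up your voltmeter here")

def read_current() -> float:
    raise NotImplementedError("hook up your current sensor here")

def set_current(amps: float) -> None:
    raise NotImplementedError("hook up your charger/load here")

def hold_voltage(volts: float) -> None:
    raise NotImplementedError("hook up your charger here")

def charge_to_100_percent() -> None:
    """CC at 0.5C up to 3.65V, then CV at 3.65V until the current tapers below 0.05C."""
    set_current(CHARGE_CURRENT)            # constant-current phase
    while read_voltage() < CHARGE_VOLTAGE:
        time.sleep(1)
    hold_voltage(CHARGE_VOLTAGE)           # constant-voltage ("absorption") phase
    while read_current() > TAIL_CURRENT:
        time.sleep(1)
    set_current(0.0)                       # cut off: this point is defined as 100% SOC

def discharge_to_0_percent() -> None:
    """CC discharge at 0.5C down to 2.50V, then cut off."""
    set_current(-CHARGE_CURRENT)
    while read_voltage() > DISCHARGE_CUTOFF:
        time.sleep(1)
    set_current(0.0)                       # cut off: this point is defined as 0% SOC
```

The constant-voltage "tail" in charge_to_100_percent() is the absorption phase discussed further down; at a lower absorption voltage the same tail-current condition simply takes longer to reach.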
That is incorrect.
A cell can reach nominal 100% SOC even at 3.375V if you charge with less than 0.5 C-rate and give it enough “Absorption” time.
3.5V per cell will be the compromise between the quickest charging time and the lowest voltage.
3.4V per cell is where the charging time starts to become a nuisance, and you usually won't finish the charge within that solar day.
Anyway, his conclusion was that 3.475V per cell (13.9V for 4S, 52.12V for 15S and 55.6V for 16S) is not a bad thing.
In my experience, charging at 3.5V per cell, the last 2%, from 98% to 100% (as reported by the BMS), takes between 25 and 60 minutes, depending on the initial SOC and cell imbalance, while the main charging takes about 4-5 hours.
So the last article confirms for me that charging at 3.5V per cell is pretty conservative, given that the final 2% alone takes roughly 15% of the total charging time.
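Just to show the arithmetic behind that figure, using the midpoints of my own numbers above:

```python
# Rough arithmetic behind that figure, using the midpoints of my numbers above.
tail_minutes = (25 + 60) / 2        # last 2% (98% -> 100%): about 42.5 min
main_minutes = (4 + 5) / 2 * 60     # main charging: about 270 min

share = tail_minutes / (main_minutes + tail_minutes)
print(f"Final 2% is about {share:.0%} of the total charging time")  # ~14%, i.e. roughly 15%
```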
@MondeoMan
I was referring to the cell voltage vs. SOC correlation in the study, the one that you linked. I have now edited the comment to make that clearer.
The abbreviation SOC is thrown around too easily. I do think more attention needs to be paid to cell voltage vs how the battery manufacturer uses the term SOC.
I know how they are used otherwise.
The misconception that the end user has to limit the charging SOC is based on half-reading the paper.
Most manufacturers have been aware of optimal operating ranges for years and already apply the best practice.
No offense, but do you have any opinion of your own and/or any personal experience to share regarding this matter?
Up until now you have just contradicted everyone who tried to share from their own experience or from what some manufacturers say.
Now we have reached a point where, according to you, not even the manufacturers are reliable in their advice and/or actions, which is odd to say the least…
Like Trevor said above: do you have any practical suggestions for applying this information in the context of Victron equipment, or is the purpose of the information simply to reinforce the assertion of the original poster?
Well, on another note with Pylon.
We have just had one or two units from a heavily cycled large bank give a bit of trouble, and had zero problems with support. (The batteries are just over 7 years old, in a fairly poor operating environment compared to what we would like to have them in.)
@alexpescaru
It was actually good to see the study and Pylon's own statement in agreement.
No offense, but: Kiwirob opened this topic to ask for advice on how to limit charging to 80%.
And then all you and a few others have done is contradict him, telling him he doesn't need to do that.
My opinion is that there is nothing inherently wrong in contradicting and pointing out technical mistakes and misconceptions regarding technical topics on a technical forum.
My opinion is that manufacturers should not be considered the absolute best source of information. My opinion is that people responding to Tech Support emails are not the most knowledgeable. If they were, they wouldn’t be working in Tech Support.
My opinion is that yes, there is value in reducing the average SOC, but in the Victron ecosystem this is difficult to accomplish (take, for instance, the minimum Absorption time of 1 hour).
And I can present more than personal evidence to support my opinions.
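For what it's worth, the kind of thing I mean would be a small hysteresis loop on the BMS-reported SOC, for example as a script or Node-RED flow on the GX device, rather than anything the stock charge algorithm offers out of the box. This is only a sketch: read_bms_soc() and set_max_charge_current() are hypothetical placeholders for however you read the SOC and write the charge current limit on your own installation, and the thresholds are just examples.

```python
# Hedged sketch: keep the battery around 80% by throttling the charge current
# limit, with hysteresis so the charger doesn't toggle constantly around the
# threshold. read_bms_soc() and set_max_charge_current() are hypothetical
# placeholders for your own integration (e.g. dbus/MQTT/Node-RED on a GX device).
import time

SOC_STOP            = 80.0    # %: pause charging at this BMS-reported SOC
SOC_RESUME          = 75.0    # %: resume charging below this
NORMAL_CHARGE_LIMIT = 100.0   # A: whatever your normal charge current limit is

def read_bms_soc() -> float:
    raise NotImplementedError("read the BMS-reported SOC here")

def set_max_charge_current(amps: float) -> None:
    raise NotImplementedError("write the charge current limit here")

def run() -> None:
    set_max_charge_current(NORMAL_CHARGE_LIMIT)   # start from the normal limit
    charging_allowed = True
    while True:
        soc = read_bms_soc()
        if charging_allowed and soc >= SOC_STOP:
            set_max_charge_current(0.0)                   # pause charging
            charging_allowed = False
        elif not charging_allowed and soc <= SOC_RESUME:
            set_max_charge_current(NORMAL_CHARGE_LIMIT)   # resume charging
            charging_allowed = True
        time.sleep(60)   # SOC moves slowly, polling once a minute is plenty
```

The gap between the stop and resume thresholds is there so the charger doesn't switch on and off every few minutes while hovering around 80%.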