Hello everyone. I’d like to reopen this thread and raise awareness among both Victron developers and operators about a crucial issue: State of Charge (SoC) determination! I believe we all agree that determining the SoC, and thus calculating the stored energy, is fundamental for the correct functioning of the DESS (Dynamic Energy Storage System). Elsewhere in the forum there’s discussion about adding a decimal place, but this is of little help as long as we can’t accurately determine the actual SoC value in the first place.

There are two methods available for determining the SoC. I won’t elaborate further on method 1 (determining the SoC via the BMS), as it’s been discussed often enough: it’s nothing more than a rough estimate, and the values are completely unsuitable for precise DESS control. The second method, using a shunt, is much more accurate and suitable, but from my perspective and observations it also has a fundamental weakness: the battery’s own power consumption, i.e. the consumption of its internal components, is not taken into account. Most of these batteries contain a BMS, balancers, heaters, displays, or similar components that draw power downstream of the shunt. In summer, with high solar yields and regular SoC calibration, this problem might be negligible; otherwise it leads to miscalculations by the DESS.

Currently the problem is addressed by regular battery balancing and the associated 100% calibration. In my opinion, however, this merely papers over the real issue. A solution, in my view, would be to integrate a mechanism, either in the shunt or directly in the DESS, to account for and calibrate this battery standby consumption. I’ve been observing the development of the DESS for years now and believe that much of the sometimes justified criticism stems from this. The numerous discussions about a reserve SoC or a separate DESS SoC are likely related to this problem in many respects.
Please consider this issue. Otherwise, thank you for the new developments, and I wish everyone a happy 2026.
I have exactly the same problem: every load that has anything to do with the battery runs through the shunt, yet I still lose 1% of SOC per day without the shunt accounting for it in any way. I have to readjust it every two to three days, which is frustrating.
From my perspective, this should be integrable. I’m not a technician or programmer, but simply adding a line to the configuration where the user can roughly configure the standby power consumption, and then factoring that value into the calculations, would be enough. But perhaps I’m oversimplifying things.
First of all, it has nothing to do with self-discharge. From my perspective, it’s the power consumption of some kind of electronic component beyond the shunt. These days, there’s practically no object without some kind of indicator light or signal-emitting dongle and the like. But of course, there’s also the standby power consumption of the BMS and balancer. In my case, one BMS and one balancer per battery block. This standby power consumption can be roughly estimated in the datasheets and, if necessary, slightly adjusted after some time. I don’t really see the problem. Both economically and logically, this approach would be more sensible than weekly full charges just to calibrate the state of charge (SoC).
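The correction suggested above, taking a roughly configured standby draw from the datasheets and folding it into the calculation, could be sketched as follows. This is only an illustration of the idea, not Victron code; the constants are assumed example values a user would tune over time.

```javascript
// Sketch: fold a user-configured standby draw into coulomb counting.
// STANDBY_W is an assumption -- a figure read from the BMS/balancer
// datasheets and adjusted after some observation, as described above.
const STANDBY_W = 5;       // estimated standby consumption in watts
const CAPACITY_AH = 280;   // battery capacity in Ah (example value)
const V_NOMINAL = 51.2;    // nominal pack voltage

// One integration step of a shunt-style SoC counter, with the standby
// draw added to the measured current so it is no longer "invisible".
// measuredAmps: positive = charging, negative = discharging.
function socStep(socPercent, measuredAmps, dtHours) {
  const standbyAmps = STANDBY_W / V_NOMINAL;        // ~0.1 A here
  const effectiveAmps = measuredAmps - standbyAmps; // extra hidden discharge
  const deltaAh = effectiveAmps * dtHours;
  return socPercent + (deltaAh / CAPACITY_AH) * 100;
}
```

With these example numbers, a battery that the shunt sees as idle for 24 hours would still be counted down by roughly 0.8% instead of staying frozen.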
Let’s agree to disagree: I can’t see a way to get a permanently precise SOC without factoring into the maths parameters such as cell self-discharge, losses due to internal resistance, charge/discharge efficiency, and losses in the wiring and safety devices located on the “battery” side of the shunt.
Secondly, there are at least two problems regarding the consumption of the battery’s “internal electronics”:
the several datasheets I have lack any info on self-consumption (even a single figure that ignores the different use scenarios),
the lack of info on how consumption changes with different balancing scenarios, battery internal wiring, etc.
I’m also very keen to have a more usable SOC, but it’s a nasty can of worms.
And a weekly full charge has the nice side effect of balancing the cells.
Cell balancing is optional. I don’t need it. I performed my first full charge since November 9, 2025, on January 1, 2026. Because of the day off, I was able to observe how all cells crossed the finish line perfectly in parallel, but that’s a different topic. Back to the SoC. It’s not about having more SoC or capacity. If I wanted that and forced it, I’d certainly open Pandora’s box. It’s about displaying the SoC correctly. Only a near-accurate SoC ensures the correct functioning of the DESS, because all calculations and schedules are based on this value. That’s precisely why I’ve brought this topic up here.
One more thing. User “Dieter Buchholz” mentions a loss of about 1% per day; I estimate roughly half that for myself, and that already accounts for the unrecorded consumption. Whether it’s 599 W or 601 W is secondary; the point is that the calculation would no longer be missing roughly 600 W.
The only way to create a long-term reliable SoC (or SoE) counter is to implement a digital-twin battery that populates a battery map from measured voltages, currents and raw SoC input values. Over a few cycles the digital twin can correctly populate an instantaneous internal-resistance map, Rinst = f(Voltage, Current, Temp, SoH), then calculate a normalized voltage, Vnorm = Voltage − Rinst × Current, which represents the resting-state voltage, and finally calculate SoCcal = Map(Vnorm), which can be used to periodically calibrate the measured SoC.
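The Vnorm and SoCcal steps of that digital twin could be sketched like this. The OCV table below is purely illustrative (per-cell LiFePO4-like numbers I made up for the example); in the real flow the twin would populate its own map from measurements, as described.

```javascript
// Sketch of the calibration step described above. The OCV table is an
// illustrative assumption, not measured data.
const OCV_MAP = [            // [restingCellVoltage, SoC%] pairs, ascending
  [3.00, 9], [3.20, 20], [3.30, 70], [3.40, 97], [3.55, 100],
];

// Vnorm = Voltage - Rinst * Current: back out the IR drop to estimate
// the resting-state voltage while current is flowing.
function normalizedVoltage(v, amps, rInst) {
  return v - rInst * amps;
}

// SoCcal = Map(Vnorm): piecewise-linear interpolation over the OCV table.
function socFromVnorm(vnorm) {
  if (vnorm <= OCV_MAP[0][0]) return OCV_MAP[0][1];
  for (let k = 1; k < OCV_MAP.length; k++) {
    const [v0, s0] = OCV_MAP[k - 1];
    const [v1, s1] = OCV_MAP[k];
    if (vnorm <= v1) return s0 + ((vnorm - v0) / (v1 - v0)) * (s1 - s0);
  }
  return 100;
}
```

The twin would then compare socFromVnorm() against the coulomb-counted SoC and nudge the counter when they diverge.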
For lead-acid and lithium-ion (Li-NCM, for instance) calibration can be done semi-permanently and is not all too difficult; for Li-FePO4 batteries calibration should be limited to either low or high SoC states due to the extremely flat voltage–SoC curve in between.
An added benefit of this approach is that it also provides a direct means to track SoH (actual capacity / initial capacity).
I am building such a digital twin in Node-RED. It feeds into a virtual battery that emulates a perfectly flat-voltage battery (set to Vnominal), allowing a linear relationship between a normalized SoC and current (Anorm) and power (W = Vnominal × A) for improved DESS scheduling, and inversely Anorm = Power / Vnominal for the boat-page display.
Alternatively, especially when using a BMV or SmartShunt, a minimal calibration flow could be implemented based on a limited voltage–SoC map only, as long as that flow only calibrates the SoC when the battery has been idle long enough to measure the actual resting voltage directly.
And you expect to estimate the state of charge from the resting voltage? With such minimal differences, you can hardly call it a calculation. And these differences also vary between cell manufacturers and due to minute fluctuations in chemical composition.
Yes indeed, but with Li-FePO4 you’d need to make sure only to calibrate in the curve either on the low or high SoC end. Not ideal but being able to match resting voltage to SoC below say 3.0V and above 3.3V is still much better than only being able to set SoC to 100% at 3.55V after(during) balancing. Keep in mind this is a Li-FePO4 problem only, other Li chemistries (of good quality) hardly ever need to balance (say once a year, passive balancing 10mV) nor do they have the drawbacks of such a flat curve. Yes drawbacks, not just for SoC calculation but even more so for parallel battery operation.
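The guard logic implied here, only calibrate when the battery has rested long enough and the cell voltage is outside the flat region, could look like this. The 3.0 V / 3.3 V per-cell limits are the ones suggested above for Li-FePO4; the idle current and idle-time thresholds are assumed tuning values.

```javascript
// Sketch: decide whether a resting-voltage SoC calibration is allowed now.
// 3.0 V / 3.3 V are the Li-FePO4 limits suggested above; the 0.5 A and
// 30-minute idle thresholds are assumptions that would need tuning.
const IDLE_AMPS = 0.5;     // |current| below this counts as "resting"
const IDLE_MINUTES = 30;   // how long the battery must have been idle
const LFP_LOW_V = 3.0;     // calibrate only below this cell voltage...
const LFP_HIGH_V = 3.3;    // ...or above this one (flat curve in between)

function mayCalibrate(cellVoltage, amps, idleMinutes) {
  const resting = Math.abs(amps) < IDLE_AMPS && idleMinutes >= IDLE_MINUTES;
  const outsideFlatRegion = cellVoltage < LFP_LOW_V || cellVoltage > LFP_HIGH_V;
  return resting && outsideFlatRegion;
}
```

For other chemistries with a usable slope across the whole curve, the voltage window check could simply be dropped.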
Correct, therefore the digital twin needs to populate its own internal-resistance map and voltage–SoC curve, but only in the curved region (the bend). What happens in between, during normal cycling operation, can then sufficiently be covered by the SmartShunt’s coulomb counter (plus some ‘leakage’ factor for battery-internal power use, if applicable at all, because a good BMS should factor that in itself).
Either way, no matter the actual battery and system design, all the measurements required to implement a digital twin with a substantially more accurate SoC tracker are readily available; it’s just a matter of making the best use of them within the system’s limitations.
I believe that for occasional calibration it’s sufficient to consider standby power consumption. I don’t think anyone intends to fly to the moon or initiate nuclear fusion; just a reasonably accurate calculation of the stored energy is needed. The current loss of 1%, to stick with the example above, seems too high to me personally, especially if it could be remedied simply by taking a rough estimate of the standby consumption into account!
The problem with that is that you then need to manually calibrate your standby consumption, for which you’d need to log your SoC, voltages and currents anyway. I assume it will end up being easier to (learn and) use Node-RED than to take notes manually, and by doing so you would have already spent 90% of the time and attention required to implement a bare-bones auto-calibrating Node-RED flow.
But yeah, if all you want is to calibrate manually and occasionally, you are correct, but then I’m a little confused about what you’d want or even need from Victron.
That said, how about creating a virtual battery to achieve what you want? Then you could account for a constant, albeit small, DC ‘leakage’ load to adjust the instantaneous current and SoC, and occasionally, say once a day or whenever the SoC drift exceeds 0.1%, adjust the BMS or SmartShunt to get ‘back in line’.
I have published a flow to calculate battery capacity from a BMV Smartshunt somewhere in this forum. That flow also calculates a higher precision SOC%. All you’d need to add is your ‘leakage’ current or power and a function node to ‘calibrate’ the BMV when the SoC drift goes over a certain threshold value.
Come to think of it, with a constant ‘leakage’ current you could simply apply a statically timed SoC correction, because SoC is calculated from consumedAh/capacityAh. With a known leakage current you know exactly how long it takes for that leakage to correspond to, say, 0.1%, and you can correct for it on that schedule.
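The arithmetic behind that timed correction is a one-liner. All values below are example assumptions, not measurements.

```javascript
// Sketch of the "statically timed" correction suggested above: with a
// constant leakage current, the interval after which the SoC counter has
// drifted by a fixed step is known in advance. Example values only.
const CAPACITY_AH = 280; // battery capacity in Ah
const LEAKAGE_A = 0.1;   // constant unmetered 'leakage' current
const STEP_PCT = 0.1;    // correction step in % SoC

// Hours until the leakage amounts to STEP_PCT of capacity:
// hours = (STEP_PCT / 100) * capacityAh / leakageA
function hoursPerStep() {
  return (STEP_PCT / 100) * CAPACITY_AH / LEAKAGE_A;
}
```

With these example numbers the counter would be nudged down by 0.1% roughly every 2.8 hours, no measurement needed between corrections.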
First of all, I’m at an age where I’m certainly not going to start learning any programming languages or applications. And I think that’s the case for many users here. I don’t buy a car just to tune it afterward. Anyone who needs that as a hobby, whether it’s car tuning or Node-RED programming of a solar power system, is welcome to do so, but it’s not my life’s purpose. My intention is to provide feedback to Victron and other users based on observations of my system, in order to further develop an already very good product. And after three years of constructive discussions without the often polemical criticism found here, I think I’m succeeding quite well.
To be perfectly clear, this isn’t about solving an individual’s existing problem, but about further developing a product. That’s why I’m offering constructive criticism. I personally check the VRM daily, and it’s incredibly easy to adjust the SoC rate downwards by 0.5% or even 1%. The effort involved in investigating the issue, finding the root causes, and discussing them here is definitely greater than the effort of making the necessary adjustments.
I tend to agree with you on this in general but I also observe a limited interest from Victron to get involved in edge cases (or so I assume they will see this specific topic). Being able to create workarounds in Node-RED is IMHO a double edged sword: on one hand it lowers incentives on Victron to implement fundamental improvements, on the other it allows independent development of proof of concept functionality, that then may help driving implementation into VenusOS. I do appreciate a position that the latter is not optimal but that’s a different topic.
Accept what you like. My experience with Victron after more than three years is different. I’m familiar with the language of many of your posts, so little surprises me.