Battery/System efficiency only 70%?

I think the surprise is that lithium cells are marketed at 95 to 97% efficiency. Real-world figures definitely don't match.

You cannot just look at the quoted battery efficiency and then assume the overall system will have a 97% efficiency.

System losses are the major part, especially when you do AC-DC conversion and later DC-AC conversion again.

Just assuming every conversion is 95% efficient, and the battery also has a 95% efficiency, the resulting system efficiency is down to 0.95^3 ≈ 86%. And this doesn't even cover any losses due to heat, cabling, system self-consumption, etc.
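A quick sketch of that compounding, using the 95% figure from above (the three-stage breakdown is just for illustration):

```python
# Chained efficiencies multiply: AC-DC charge, battery round trip, DC-AC discharge.
stages = {
    "AC-DC charge": 0.95,
    "battery round trip": 0.95,
    "DC-AC discharge": 0.95,
}

system_eff = 1.0
for name, eff in stages.items():
    system_eff *= eff

print(f"System efficiency: {system_eff:.1%}")  # -> 85.7%
```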

Everything put together, however, is what you finally see as the difference between "Energy to battery" and "Energy from battery".

Also look at the typical conversion efficiencies of inverters. At low power, the conversion is at its least efficient - and that's exactly how you typically discharge your batteries: at low power, over the course of 12-14 h.
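A toy model of why that hurts, sketched below. The 25 W standby draw and 4% proportional loss are illustrative assumptions, not datasheet values:

```python
# Toy model of inverter efficiency vs. load: a fixed standby draw plus a
# load-proportional conversion loss. Both constants are assumed for illustration.
P_STANDBY_W = 25.0
PROPORTIONAL_LOSS = 0.04

def inverter_efficiency(p_out_w: float) -> float:
    """AC output power divided by the DC power drawn from the battery."""
    p_in_w = p_out_w * (1 + PROPORTIONAL_LOSS) + P_STANDBY_W
    return p_out_w / p_in_w

for load in (100, 200, 600, 800, 3000):
    print(f"{load:>5} W load -> {inverter_efficiency(load):.1%}")
# Low overnight loads sit well below the mid-load sweet spot.
```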

2 Likes

Panels are no different. It all depends on the conditions under which they were tested.

That's why, when working out battery efficiency, I looked only at what went in and out of the pack. It's not as high in real life as manufacturers state in their technical documents; as Nick noted, test conditions are never the same as use conditions. What I am really saying is that the surprise Alex Pescaru has is one a lot of people who look at the data have. It is much lower than expected, and it makes you start questioning everything.
Pouch cells are considered to be the least efficient.

1 Like

Yes and no. The ratio of device (zero-load) consumption to low loads is high, so efficiency is low there. Inverters are most efficient at moderate load, and efficiency drops a fair amount again at peak production. I certainly don't have low loads, but the system runs in an optimal band most of the time.

With only DC chargers, there should be minimal conversion losses, laws of physics aside. DC/AC losses are more complex.

The Multi RS I have, being an all-in-one device, must have its own means of measuring currents, SOC and consumed amp-hours.
I've monitored the (undocumented) 0xFFC register on the Multi RS, which I believe is the battery's consumed amp-hours.
It starts to increase as soon as the battery starts discharging and decreases as it's charged back. All normal here.
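For anyone curious, here is a minimal sketch of how such a register could be polled with pymodbus, assuming it is reachable over a GX device's Modbus TCP interface. The IP address, unit ID, signedness and 0.1 Ah scaling are all assumptions - the register is undocumented, and it may only be readable over another interface:

```python
# Minimal sketch: poll a holding register over Modbus TCP via a GX device.
# 0xFFC is the undocumented register mentioned above; the address, unit ID,
# signedness and scaling are assumptions for illustration only.
from pymodbus.client import ModbusTcpClient

client = ModbusTcpClient("192.168.1.50")   # hypothetical GX device address
client.connect()
rr = client.read_holding_registers(0x0FFC, count=1, slave=227)
if not rr.isError():
    raw = rr.registers[0]
    # Per the post, the value rises while discharging and falls while
    # charging; assume a signed 16-bit word at 0.1 Ah per bit.
    if raw > 0x7FFF:
        raw -= 0x10000
    print(f"Consumed Ah (assumed 0.1 Ah/bit): {raw / 10:.1f}")
client.close()
```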

What's interesting to note is that the consumed amp-hours are already back to 0 when the BMS reports about 90% and starts to decrease the CCL from the 150 A maximum to 60 A and further down. Look at the picture below.
At the same time, the Pylontech starts to balance the cells, because they are set to start balancing as soon as the cells hit 3.36 V and the difference between cells is bigger than 30 mV.

So the whole final "absorption" phase is done with additional energy, which seems to account for the so-called "losses".

5 Likes

Well, we should define what counts as a "low load" and what doesn't, or else we may be referring to the same load with one of us considering it low and the other high.

https://community.victronenergy.com/questions/56351/multiplus-485000-efficiency-curve.html

For me, it is quite a good match at around 600-800 watts continuously, but in Germany the typical household (^1) would be more like 150-200 watts during the night, I'd say, where efficiency is not yet ideal.

^1 Data has been empirically collected by checking my brother's and father's nightly consumption :grin:

You are talking about efficiencies of inverters, MPPTs and so on.

But, like Alexandra @lxonline pointed out, we (should) talk about the energy measured at the battery terminals, based on a current-measuring device like a shunt: the energy measured in one direction (IN) and the energy measured in the opposite direction (OUT).
And the Multi RS measures it with its own shunt.
And the results are as in the report: 71% efficiency.
Makes you wonder…


How can we improve it, if possible, or is it a lost cause?

One way is to use a single big battery with only one BMS, because I am using 6 modules.
And each module's BMS draws a constant 50 mA.
So, 50 mA x 50 V is about 2.5 W per BMS. Six BMSes x 2.5 W = 15 W. In 6 months, about 65 kWh.
So the "real" cell efficiency, from 526/738 (in my case) = 71.2%, goes up to (526+65)/(738-65) = 87.8%.
I am adding the 65 kWh at discharging because when discharging the cells must supply the BMS current, and subtracting it at charging, because when charging the charger supplies the BMS current.
So, a 12% loss versus a 29% loss is not bad.
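A small script re-doing that correction with the figures from above (the six months are approximated as 4380 hours):

```python
# Re-deriving the BMS-overhead correction from the post's figures.
HOURS_6_MONTHS = 182.5 * 24          # ~4380 h
bms_power_w = 6 * 0.050 * 50         # 6 BMSes x 50 mA x ~50 V = 15 W
bms_energy_kwh = bms_power_w * HOURS_6_MONTHS / 1000   # ~65.7 kWh

e_out_kwh, e_in_kwh = 526, 738       # measured at the battery terminals
raw_eff = e_out_kwh / e_in_kwh
# The cells also supplied the BMS while discharging, and the charger fed it
# while charging, so credit/debit that energy on each side:
cell_eff = (e_out_kwh + bms_energy_kwh) / (e_in_kwh - bms_energy_kwh)
print(f"raw {raw_eff:.1%}, corrected {cell_eff:.1%}")  # ~71% -> ~88%
```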


2 Likes

After all, the real issue is not the (bad) efficiency itself; it's more a very wrong expectation, leading to disappointment.

When I purchased my first battery - a 12 kWh BYD - my maths was too shallow as well: 12 kWh, having read something about 95%, so I can use 11.4 kWh, which should support my 600 W load for 19 hours, right? Well, no.

After all the imprecisions, limitations and losses, about 9 kWh was actually usable - and 600 watts AC turned out to be a ~700 W DC requirement.
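The gap is easy to reproduce (numbers as in the post):

```python
# Expected vs. actual runtime for the 12 kWh BYD example above.
capacity_kwh, claimed_eff = 12.0, 0.95
ac_load_w = 600.0

naive_hours = capacity_kwh * claimed_eff * 1000 / ac_load_w      # ~19 h

usable_kwh = 9.0      # after SOC limits, imprecision and losses (post's figure)
dc_draw_w = 700.0     # DC power actually drawn for a 600 W AC load
actual_hours = usable_kwh * 1000 / dc_draw_w                     # ~12.9 h

print(f"naive {naive_hours:.1f} h vs. actual {actual_hours:.1f} h")
```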

When I purchased my second battery, I just knew that I should add about 30% to what I really want.
Due to module size / count requirements, I went with 28 kWh.

Now, the Pylontech may have a worse efficiency than the BYD HV system, but since the capacity now satisfies my daily needs, I'm happier with it.

3 Likes

This is my report for the last 6 months. That would be a charge/discharge ratio of roughly 63%.
I can't imagine this really shows the efficiency of the battery.

The system is an on-grid ESS with PV excess feed-in and 2 Pylontech US2000C modules.

I concur; for July I have around 75% efficiency. I have 19.2 kWh of Pylontech US5000 (4 batteries). My system has been up and running since mid-March (70% efficiency from then until today), and after reducing the sell price (I removed the energy tax from the price calculation in June), the system favours charging the batteries directly from the MPPT, so there is very little AC-DC conversion for the batteries when charging.

I have a 32 x US5000 array. My VRM reports only 68%, which is less than expected. The chargers are 4 x RS 450/200. Last week I heard from a VARTA customer who suffers only 54% efficiency with his AC battery storage, which is not too far away, but even worse.

At least my last capacity test reported a total of 154 kWh, which is slightly over the datasheet figure. Possibly we can boost the efficiency if we avoid operating in the range above 80-90% SOC.

[image: pylons]

I didn't even know that this report existed, but I have just run it for the last 6 months and I'm getting 78% efficiency. My setup is 4 x Pylontech US5000 modules, all with separate battery cables feeding into a Lynx Distributor. I also have a pair of MPPT 250/60 charge controllers, and my inverter is a Multiplus II 48/5000/70-50.

1 Like

Great graphic, Alex - says it all.

1 Like

Interesting - I'd never used this report. My figures are ridiculous - my batteries make power!


I am AC-coupled (micros / 3 feed lines) plus DC-AC, with feed-in before and after a Multi, on 4 x Pylontech US3000C.
Maybe it's the comms, as I have a Slovenian US3000C in the mix and various firmware versions, which according to Pylontech I should not touch.

Question about the reported discharge amount - does anyone know if this number includes the amount consumed by the inverter?

I don't think so, as the power is measured at the output to the loads, and the inverter's own consumption happens before that.

4 x US3000C, 14.4 kWh, ~80% efficiency.

As several users report efficiencies 10% or more better than my 68%, I thought about the reasons for the differences with the very same hardware (US5000, MP2-5000, RS450).

  One thing to examine separately: the VRM download only works on one of three PCs. They all use Firefox 130.0. One W10 system is OK; another W10 system and an Ubuntu system only display a waiting message, although other file downloads work without problems on these systems. If anybody would like to dig deeper into this issue, drop me a message and I will assist with examining more details.

In the meantime, I intentionally kept the SOC around 30% ±20% for the last 2 weeks. In the long-term numbers, I am still stuck at exactly 67-68% efficiency. I do not charge from the grid, and my MP2 array (6 pcs) has a proven average efficiency of about 93%.

  1. Charge and discharge current is typically below 0.1C.

  2. The intrinsic resistance of the battery, including bus wiring, seems to be about 1 mOhm. To measure this value I used the following procedure (a small script after this list redoes the arithmetic):
    a) at any given SOC, charge the battery with a reasonable current, e.g. 100 or 300 amps
    b) note the battery voltage displayed on the dashboard
    c) switch from charging to discharging with the same current as in a)
    d) note the battery voltage displayed on the dashboard, which is typically lower than in b)
    e) calculate the difference between b) and d), which should be several 0.1 volts
    f) calculate R = U/I, where U is the voltage from e) and I is 2x the current from a)

All in all, the losses caused by a 1 mOhm resistance are only about 10 watts per 100 amps, which is not the reason for a deviation of more than 10% in efficiency.
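A minimal sketch of steps e) and f); the example voltages are assumptions chosen to land on the ~1 mOhm figure above:

```python
# Internal-resistance estimate from the charge/discharge voltage step above.
# The voltages are assumed example readings, not actual measurements.
i_test_a = 100.0          # same current used for charging and discharging
v_charging = 52.25        # dashboard voltage while charging (assumed)
v_discharging = 52.05     # dashboard voltage while discharging (assumed)

delta_u = v_charging - v_discharging       # step e): several 0.1 V
r_ohm = delta_u / (2 * i_test_a)           # step f): I is 2x the test current
p_loss_w = i_test_a**2 * r_ohm             # I^2 * R loss at the test current

print(f"R = {r_ohm * 1000:.2f} mOhm, loss at {i_test_a:.0f} A = {p_loss_w:.0f} W")
# -> R = 1.00 mOhm, loss at 100 A = 10 W: far too small to explain >10%.
```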

  3. During the last weeks without top balancing of the Pylontechs, my SOC shifted to a value that is obviously too low. Probably Pylontech does not provide the accuracy of a SmartShunt, and possibly the Peukert exponent is not taken into account.

Sorry guys, but this isn't an effective way to see the round-trip efficiency of your batteries. Just for fun I grabbed those figures for my fairly fresh 2 x US5000B's, and they bobbed around at 61%. Ridiculous.

System losses are the first reason. And you can plainly see the inverter standby losses if you set up the VRM dashboard to show the Multiplus as a battery monitor and remove the grid and charge sources. With, say, a 100 W AC load, my Multi shows ~40 W less than the batteries say they're discharging (100 W out for ~140 W drawn is already ~71% before any battery losses). The Multi's figure is obviously based on a conversion from its AC output, and that can only get worse with inverting losses under bigger loads. Then of course there's the true RTE of the batteries in there somewhere. It all adds up.

But the real killer is actual DC loads, which, even if measured by a shunt and allocated as a load, aren't counted by VRM as consumption.
Take care what you're reading when you see figures like this; most aren't documented either.