MultiPlus Compact 24/1600/40 charging current limited to ~30-32A (despite 40A settings)

Hardware:

• Unit: MultiPlus Compact 24/1600/40-16
• Firmware: v556
• Battery: DIY LiFePO4 8S (EVE 105Ah) with Daly BMS (no CAN/communication with Victron).
• AC Input: Stable grid (Zubr relay protection), voltage ~220-230V.
• Cabling: 25mm² DC cables, short length (<1.5m).

Problem:
The maximum charge current never exceeds 30-32A (approx. 75-80% of the configured value), even though it is configured to 40A.
The AC input current is far below the limit (drawing ~4-5A from grid with a 16A limit set), so PowerControl should not be active.

Settings Checked (VE.Configure & Remote Console):

  1. Charge Current: Set to 40A in VE.Configure (Charger tab).
  2. AC Input Current Limit: Set to 16A (both switch and software).
  3. Weak AC input: OFF (tried ON, no change).
  4. Dynamic Current Limiter: OFF.
  5. DVCC: Tried both OFF and ON (with limit set to 40A) — result is the same (~31A max).
  6. Temperature: External sensor connected, showing ~19°C (normal).
  7. DIP Switches: This is a Compact model, so no dedicated “Charge Current” DIP switch (per manual).
  8. BMS: Daly BMS settings checked — no charge current limitations active.

Observations:

• VRM shows charge current flatlining around 30-31A during bulk.
• No alarms (Temp, Ripple, Overload) are active.
• DC Ripple is low/normal.

Question:
Is there a hidden hardware limitation or a specific DIP switch combination for the Compact model that forces a 75% current limit? Or is this a known behavior with v556 firmware? Or what could be the cause of this issue?

Thank you in advance,

You need to look at the CCL (charge current limit) imposed by your BMS as the system is charging. This should be visible on the advanced widgets page of VRM. If this shows 40A but you are still only charging in the mid-30s, then also have a look at the CVL (charge voltage limit); if that is close to the battery voltage, then the charge current is probably being limited by the size of the battery cables. Check that all connections from the inverter to the battery are clean and tight, and that the voltage drops are minimal.


Thanks a lot for your answer.

I had similar thoughts, and your suggestion confirms them.
Regarding the BMS: my Daly BMS does not broadcast any CCL/CVL data to the Victron system (no communication cable), so the MultiPlus is charging based on its own configuration. The target absorption voltage is set to 28.2V, while the BMS cutoff is typically around 29.2V, so I am still in the Bulk phase, where current should be maxed out.

This leads me to believe the bottleneck is indeed in the physical setup (voltage drop).
Since the issue persisted across two different builds (initial temporary setup with native 1.5m cables vs. current cabinet build with much shorter cables), I suspect the cables themselves are fine (25mm²). The issue likely lies in the components between the inverter and the battery, which remained the same in both setups:

• Battery Switch: 12/24V 100A rated
• Fuse: ANL Fuse 150A + Holder
• Shunt: Victron SmartShunt 300A
• Terminal Posts: 150A rated (2 pcs, pass-through for box connections)

My plan is to measure the voltage drop across each component under load (while charging at ~30A) to identify the weak link with high resistance. I will report back if I find a significant drop on any of these nodes.

Thanks again for pointing me in the right direction!

I did a full voltage drop test under load (~30A charge current) to rule out cabling issues.

Total Voltage Drop:

• Inverter Terminals: 27.34V
• Battery Terminals: 27.20V
• Delta: 0.14V (negligible)

Individual Component Drops:

  1. Positive (+):
    • Battery → Terminal Block: 7.4 mV
    • Terminal Block → Fuse Input: 5.7 mV
    • ANL Fuse (across holder + fuse): 28 mV
    • Fuse Output → Switch Input: 11.7 mV
    • Battery Switch (switch itself): 73.5 mV (0.074V)

  2. Negative (-):
    • Battery → BMS (internal + wire) → Terminal Block: 44.7 mV (0.045V)
    • Terminal Block → Shunt Input: 4.9 mV
    • SmartShunt (shunt itself): 5.5 mV
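For reference, a quick Python sanity check of these readings — summing the measured drops and converting each to an effective resistance at the ~30A test current (the dictionary labels are just shorthand for the components listed above):

```python
# Sanity check: sum the per-component voltage drops measured at ~30 A
# charge current and convert each to an effective resistance (R = V / I).
test_current_a = 30.0

drops_mv = {
    "battery -> terminal block (+)": 7.4,
    "terminal block -> fuse input": 5.7,
    "ANL fuse + holder": 28.0,
    "fuse output -> switch input": 11.7,
    "battery switch": 73.5,
    "battery -> BMS -> terminal block (-)": 44.7,
    "terminal block -> shunt input": 4.9,
    "SmartShunt": 5.5,
}

total_mv = sum(drops_mv.values())
print(f"total itemized drop: {total_mv / 1000:.3f} V")  # ~0.18 V, still negligible

for name, mv in drops_mv.items():
    r_mohm = mv / test_current_a  # mV / A = milliohm
    print(f"{name}: {mv} mV -> {r_mohm:.2f} mOhm")
```

Even the worst component (the battery switch at ~2.5 mOhm) is nowhere near enough resistance to explain a 25% current cut.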

Conclusion:
Total drop is <0.15V. The inverter sees nearly the exact battery voltage, so this cannot be the reason for premature charging limitation.
The battery switch has the highest drop (74mV), but even that is within acceptable limits and doesn’t explain the 30A current cap.
The BMS drop (45mV) confirms it is fully open and not throttling current via PWM.

Is there any other hardware reason besides cables/connections that could force this limit?

Also, one thing that really bothers me is this excerpt from the manual (screenshot attached).

It clearly states:

“The default charge current setting is 75% of the maximum charge current.”

My math:
40A (Rated) * 75% = 30A
This matches my actual charge current (30-31A) almost perfectly!
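The arithmetic behind that suspicion, as a one-line sketch:

```python
rated_charge_current_a = 40.0
default_fraction = 0.75  # per the manual: "default charge current setting is 75% of the maximum"
print(rated_charge_current_a * default_fraction)  # 30.0 — matches the observed cap
```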

The problem is:
The manual mentions this default limit but does not specify which DIP switch controls it for the Compact model (since DS-6 is documented as “Search Mode”, not “Charge Current”).
And in VE.Configure, I have explicitly set 40A, but the unit seems to ignore it and stick to this 75% hardware default.

Is there a hidden way to disable this 75% default on the Compact 1600 via software or a specific DIP combination? Or is my unit simply ignoring the software override?

Have you checked the temperature of the unit's internals? (Ambient is often misunderstood as the air we feel; here it refers to the air around the components, which can be much hotter and cause derating.) Those Compacts run hot inside. I have one, but the 1600/70.

I have only ever once or twice seen it charge at about 3A less than max.

The default refers to the factory setting; if you have since programmed it with VE.Configure, it should be overridden.

Thanks for the temperature explanation. I understand internal components run hotter. However, the 30A limit appears immediately on a cold start (after the unit has been off for hours), so thermal derating seems unlikely to be the instant cause. And yes, I have reprogrammed it with VE.Configure multiple times to 40A, but it stubbornly sticks to ~30-31A.

Very puzzling.
Your investigations have been pretty thorough.


This is essential for correct operation of lithium batteries. If you have DVCC turned on, it needs a source for the CVL and CCL values. The "Limit charge current" and "Limit managed battery voltage" settings are there to restrict charging below the BMS-requested values. While the system may default to the programmed values, the behavior may also be unpredictable.

Additionally, you can wire up the inverter's remote voltage sensing. I see that you have 28.2V programmed for the absorption voltage, but are only getting to 27.34V. One other reason for the low charge current may be the AC input voltage at full charge.


Thanks for the critical suggestion about DVCC needing a source. It makes total sense.

I followed your advice:

  1. Turned OFF DVCC completely in the Remote Console.
  2. Executed “Redetect VE.Bus system”.
  3. Executed “Restart VE.Bus system” (via Advanced menu) to clear any cached limits.
  4. Even power-cycled the inverter physically.

Result:
Unfortunately, the behavior is exactly the same: charging current is stuck at ~31A.
(Please see the attached video showing the settings and the immediate current limit upon charging).

Can you do the same test while watching the AC input voltage? How much does it sag when that ~1300W load is applied?


Interesting how you mentioned this.

I did an experiment a while ago, and the different extension cords used to charge my little mobile application affected the charge amps even though there were no setting changes. And technically each extension was 'capable' of the full 16A pass-through.

The theory is that input impedance (and voltage) have an effect on charging as well.

I simply repeated what Mike said, as it seemed like a possibility that I haven't tested myself, but it would be relatively easy to check by adding a temporary resistor into the input line.

I see your point, but I draw less than 6A from the grid. Also, I don’t use any extension cables; my system is plugged directly into the wall socket.

It's not the current, it's the impedance, and the 'noise' in the connection.
If you are able, check the wall plug and socket connections.

While I don't know how to properly check the socket internals, I tested charging from another socket in a different room using a 30m extension cord. The current dropped by ~1A compared to previous tests (~30+A). That drop might be related to the extension cord, but it doesn't explain the missing 8-9A while charging.

I believe I’ve found a data inconsistency in the MultiPlus ‘Overview’ menu. While charging, the reported values are:

DC Power: 1076 W
DC Voltage: 26.97 V
DC Current: 32.6 A

The calculated current (1076W / 26.97V) is 39.89A. It seems the internal logic is using this inflated power value to enforce the 40A limit, effectively capping the real output at ~32 A.
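The inconsistency can be reproduced directly from the Overview figures quoted above (a small sketch, nothing assumed beyond those three readings):

```python
# Values reported by the MultiPlus 'Overview' menu during charging.
dc_power_w = 1076.0
dc_voltage_v = 26.97
dc_current_reported_a = 32.6  # also what the SmartShunt measures

# Current implied by the unit's own power and voltage readings.
implied_current_a = dc_power_w / dc_voltage_v
print(f"implied current from P/V: {implied_current_a:.2f} A")  # ~39.9 A, right at the 40 A setpoint
print(f"mismatch vs reported current: {implied_current_a - dc_current_reported_a:.2f} A")
```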

I disabled SCS (Shared Current Sense) to verify the internal sensor reading, and the mismatch between Power/Voltage and Current persists. Is this a known hardware sensing or calibration issue?

It is calculated, not measured. By definition it will not be as accurate as a shunt.

Interesting how it isn’t using the shunt for information.

Exactly. My latest data confirms this. The MultiPlus is limiting its charge because it calculates that its output has reached the 40A setpoint (based on its internal power reading: 1076W / 26.97V ≈ 40A).

However, at the same moment, the SmartShunt measures the actual current at only 32.6A.

This leads to my main question: is it possible that the MultiPlus’s charge algorithm prioritizes its own internal power calculation over the measured current from the SmartShunt (via SCS)?