I have detected different measurements when metering the DC voltage in the MultiPlus-II. In this example there is no DC current, so the three voltages should be the same:
The DC voltage detected in the BMS is approximately equal to the DC voltage detected in the "Lynx Shunt", but it is not equal to the one detected in the MultiPlus-II.
For the moment I have worked around the issue by applying a 0.3 V increase to the limits, for example RCV: from 55.2 V to 55.5 V. But I do not like this solution. Is there a way to calibrate the voltmeter in the MultiPlus-II?
Indeed, normally they are pretty well calibrated.
For example, on my system, the voltage differences between the Multi RS, the BMS and a pretty good multimeter are all within 20mV.
Also, cable sizes and the various connections, switches and fuses play a big role in voltage drops.
The ordering of the voltages for the given current flow (charging) is correct: 55.52 V (MP2) → 55.28 V (Shunt) → 55.18 V (BMS).
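If some charge current is in fact flowing, the gradient above is exactly what Ohm's law predicts: each device measures at a different point along the DC path, and the cabling, fuses and busbars between those points drop voltage. A minimal sketch, where the current and resistances are purely illustrative assumptions (not measured values from this installation):

```python
# Each reading is taken at a different point along the DC path; the
# resistance between the points drops voltage while current flows.
# All resistances below are assumed for illustration only.
i_charge = 30.0          # A, assumed charging current
r_mp2_to_shunt = 0.008   # ohm, cabling + connections, MP2 -> Lynx Shunt (assumed)
r_shunt_to_bms = 0.0033  # ohm, shunt -> battery/BMS terminals (assumed)

v_mp2 = 55.52                                # reading at the MultiPlus-II
v_shunt = v_mp2 - i_charge * r_mp2_to_shunt  # ~55.28 V at the Lynx Shunt
v_bms = v_shunt - i_charge * r_shunt_to_bms  # ~55.18 V at the BMS
print(f"{v_shunt:.2f} V, {v_bms:.2f} V")
```

With the assumed numbers this reproduces the quoted readings, which is why the responders below keep asking about wiring, not calibration.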
There can be many reasons for it.
A 0.25 V difference, yes, which is less than 1%. And again: to get power to flow in a given direction, you need a potential difference.
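For reference, the "less than 1%" figure checks out against the readings quoted above, even taking the worst-case spread (MP2 vs BMS) rather than MP2 vs shunt:

```python
# Readings quoted earlier in the thread
v_mp2, v_shunt, v_bms = 55.52, 55.28, 55.18

worst_diff = v_mp2 - v_bms            # 0.34 V end to end
pct = worst_diff / v_bms * 100        # relative difference in percent
print(f"{worst_diff:.2f} V -> {pct:.2f} %")  # 0.34 V -> 0.62 %
```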
A simple example: when you set the system to feed back to the grid at 100%, the MP terminals will rise about 0.4 V above battery voltage to do so.
As a further thought, the Victron is not a calibrated measuring instrument. The shunt is a bit different, and you can see that it reads closer to what you are expecting.
It is within acceptable deviation between meters, even compared with the readings from the two different digital meters you used (which are supposedly calibrated devices).
I have had the same problem since installation 2 years ago. I now have a second MPPT installed which gives correct readings of battery voltage, but the main system reads around 350 mV too low. I have had to add 0.35 V to all the levels to get correct battery voltage readings at the batteries. It is not down to loads and connections as some suggest; as in your case, a calibrated meter proves the voltage is reading incorrectly. If this is adjustable in software, why won't Victron allow us to change this calibration?
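The workaround described above amounts to adding a constant offset to every voltage setpoint so the unit's low reading lands on the intended value. A trivial sketch, where the setpoint names and base values are purely illustrative, not anyone's actual configuration:

```python
# Compensate a consistently-low voltage reading by shifting all setpoints.
# Names and base values below are examples only.
offset = 0.35  # V, measured gap between the unit's reading and a calibrated DMM

setpoints = {"absorption": 55.2, "float": 54.0}
adjusted = {name: round(v + offset, 2) for name, v in setpoints.items()}
print(adjusted)  # {'absorption': 55.55, 'float': 54.35}
```

The obvious drawback, as noted below, is that this shifts every protection threshold by the same amount rather than fixing the measurement itself.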
I also see roughly a 0.2–0.3 V offset between the MP2 and a calibrated Fluke DMM, with the MP2 reading lower. This is an issue because my battery refuses to set 100% SOC unless the voltage rises high enough, and because of DVCC the cell voltage never reaches what the battery requests.
This forces me to miscalibrate the battery side so that it sees about 0.2–0.3 V more than is actually there. It does not tickle my engineer brain in the right way.
I just purchased and installed a MultiPlus 12/3000/120-50 inverter/charger in my cruising boat and noticed the same sort of rather large voltage discrepancy. I have measured at the main output terminals and I have added a voltage sense connection to my battery bank. In all cases the voltage reported by the MultiPlus through VictronConnect is about 0.25 volts higher than the reading at the MP output terminals, at any charging current. I am using a well-calibrated Fluke meter that I believe is within a couple of mV of being completely correct.
Clearly this means that the careful charging algorithms, including "safe mode", are completely bonkers. The various voltages at the battery bank are now about 0.25 volts below the battery manufacturer's specifications for Absorption, Float, etc. (AGM batteries).
Obviously I can jack up the settings to accommodate this large discrepancy, but it would be preferable to somehow make the MP work correctly. Is there any method other than buying a bunch more Victron gadgets to make the Multiplus read more closely to the actual voltage?
Frankly this problem is rather shocking. I have used Xantrex inverter/chargers in the past, and there was never any such voltage error beyond a few mV.
My MPPT is right on, within a few mV. That is why I was surprised by the Multiplus deviation. Voltage measurement to a few mV is easy and cheap. Very surprising that Victron does not deem it important.