VM-3P75CT Meter disconnecting frequently

Every couple of hours or so (I haven't spotted an exact schedule yet), I notice my ESS (Cerbo GX, MultiPlus-II, MPPT controller with solar panels, other-brand batteries behind a SmartShunt, and a grid meter) doesn't look like it's generating solar and says it's in passthrough mode. The issue seems to be a loss of connectivity with the grid meter.
The grid meter is connected via Ethernet. I work in networking professionally, so I'm pretty sure the network between them is solid.

If I restart the Cerbo, I usually get connectivity back.
If I power cycle the meter, I may get connectivity back.
If I go to Modbus TCP/UDP devices and scan again, I may get connectivity back, or the Cerbo may get sluggish, which prompts a Cerbo restart, which usually fixes it.

Any ideas on what is happening to cause the connectivity issue?

How have you gone with this?

I've got exactly the same issue — seemingly intermittent randomness with no obvious cause. It's irritating because, as you point out, it puts the system in Pass-Thru mode, throttles down my Fronius inverters, and really hurts useful production.

The VM-3P75CT is running firmware 1.08 and I’m running the latest beta of Venus OS on the Cerbo (as of writing that’s 3.60~58), but this issue has persisted for many versions of Venus OS.

I too seem to get some benefit from rebooting the Cerbo. One of the jobs for the next week or so is to examine the logs more closely and try to get to the bottom of it. My gut feeling is that the Modbus connection is flaky somehow (even though the link layer at both ends seems fine — my network monitoring isn't showing either device dropping off the network).

I've just been watching it and rebooting the Cerbo when it happens.
It seems to only happen when the battery is somewhere in the 80-89% SOC range.

Super frustrating, since I had cheaper options and went this route because Victron is high quality. The hardware is great, but this behaviour is very frustrating. Often it happens just as peak production hours are approaching, and I miss a few of them.

I haven’t tried beta software, but the devices are all up-to-date on the stable software channel.

The Cerbo is connected wirelessly, and at a greater distance than I'd prefer, but it still has a solid connection and goes for days without missing a ping unless this happens. Yes, I sometimes go days without this happening.

I should probably open a Victron support case.

I have a lead! I think this is a subtle network issue (i.e., WiFi is crap) interacting badly with the communication protocol Victron have chosen for the VM-3P75CT and a bug in dbus-modbus-client.

I caught this in the act today and I’ve got an idea why it’s misbehaving. From /var/log/dbus-modbus-client/current, I can see the following repeating errors in the small amount of logging that VenusOS seems to retain:

  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 147, in update_device
    self.del_device(dev)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 324, in del_device
    s.del_tree('/Devices/' + dev.get_ident())
                             ^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 371, in get_ident
    return '%s_%s' % (self.vendor_id, self.get_unique())
                                      ^^^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 368, in get_unique
    return self.info['/Serial']
           ~~~~~~~~~^^^^^^^^^^^
KeyError: '/Serial'
INFO     [udp:172.16.4.247:502:1] Device failed: pack expected 1 items for packing (got 2)
ERROR    Uncaught exception in update
Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 139, in update_device
    dev.update()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 486, in update
    self.reinit()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 454, in reinit
    self.init(self.settings_dbus, self.enabled)
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 463, in init
    self.device_init()
  File "/opt/victronenergy/dbus-modbus-client/victron_em.py", line 68, in device_init
    phase_cfg = self.read_register(self.data_regs[0])
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 121, in read_register
    reg.decode(rr.registers)
  File "/opt/victronenergy/dbus-modbus-client/register.py", line 78, in decode
    v = struct.unpack(self.coding[0], struct.pack(self.coding[1], *values))
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
struct.error: pack expected 1 items for packing (got 2)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 256, in update_timer
    self.update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 290, in update
    super().update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 239, in update
    self.update_device(d)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 147, in update_device
    self.del_device(dev)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 324, in del_device
    s.del_tree('/Devices/' + dev.get_ident())
                             ^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 371, in get_ident
    return '%s_%s' % (self.vendor_id, self.get_unique())
                                      ^^^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 368, in get_unique
    return self.info['/Serial']
           ~~~~~~~~~^^^^^^^^^^^
KeyError: '/Serial'
INFO     [udp:172.16.4.247:502:1] Device failed: pack expected 8 items for packing (got 1)
ERROR    Uncaught exception in update
Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 139, in update_device
    dev.update()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 486, in update
    self.reinit()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 454, in reinit
    self.init(self.settings_dbus, self.enabled)
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 464, in init
    self.read_info()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 170, in read_info
    self.read_info_regs(self.info)
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 136, in read_info_regs
    self.read_register(reg)
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 121, in read_register
    reg.decode(rr.registers)
  File "/opt/victronenergy/dbus-modbus-client/register.py", line 157, in decode
    newval = struct.pack(self.pfmt, *values).rstrip(b'\0')
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
struct.error: pack expected 8 items for packing (got 1)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 256, in update_timer
    self.update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 290, in update
    super().update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 239, in update
    self.update_device(d)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 147, in update_device
    self.del_device(dev)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 324, in del_device
    s.del_tree('/Devices/' + dev.get_ident())
                             ^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 371, in get_ident
    return '%s_%s' % (self.vendor_id, self.get_unique())
                                      ^^^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 368, in get_unique
    return self.info['/Serial']
           ~~~~~~~~~^^^^^^^^^^^
KeyError: '/Serial'
INFO     [udp:172.16.4.247:502:1] Device failed: pack expected 2 items for packing (got 1)
INFO     [udp:172.16.4.247:502:1] Found Energy meter: Victron Energy VM-3P75CT
INFO     registered ourselves on D-Bus as com.victronenergy.grid.ve_HQ2331JZ4GT
INFO     [udp:172.16.4.247:502:1] Device failed: pack expected 2 items for packing (got 1)
ERROR    Uncaught exception in update
Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 139, in update_device
    dev.update()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 486, in update
    self.reinit()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 454, in reinit
    self.init(self.settings_dbus, self.enabled)
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 463, in init
    self.device_init()
  File "/opt/victronenergy/dbus-modbus-client/victron_em.py", line 76, in device_init
    self.fwver = self.read_register(self.info_regs[1])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 121, in read_register
    reg.decode(rr.registers)
  File "/opt/victronenergy/dbus-modbus-client/victron_regs.py", line 17, in decode
    return self.update(struct.unpack('4B', struct.pack('>2H', *values)))
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
struct.error: pack expected 2 items for packing (got 1)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 256, in update_timer
    self.update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 290, in update
    super().update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 239, in update
    self.update_device(d)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 147, in update_device
    self.del_device(dev)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 324, in del_device
    s.del_tree('/Devices/' + dev.get_ident())
                             ^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 371, in get_ident
    return '%s_%s' % (self.vendor_id, self.get_unique())
                                      ^^^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 368, in get_unique
    return self.info['/Serial']
           ~~~~~~~~~^^^^^^^^^^^
KeyError: '/Serial'
INFO     [udp:172.16.4.247:502:1] Device failed: pack expected 1 items for packing (got 2)
ERROR    Uncaught exception in update
Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 139, in update_device
    dev.update()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 486, in update
    self.reinit()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 454, in reinit
    self.init(self.settings_dbus, self.enabled)
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 463, in init
    self.device_init()
  File "/opt/victronenergy/dbus-modbus-client/victron_em.py", line 68, in device_init
    phase_cfg = self.read_register(self.data_regs[0])
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 121, in read_register
    reg.decode(rr.registers)
  File "/opt/victronenergy/dbus-modbus-client/register.py", line 78, in decode
    v = struct.unpack(self.coding[0], struct.pack(self.coding[1], *values))
                                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
struct.error: pack expected 1 items for packing (got 2)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 256, in update_timer
    self.update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 290, in update
    super().update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 239, in update
    self.update_device(d)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 147, in update_device
    self.del_device(dev)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 324, in del_device
    s.del_tree('/Devices/' + dev.get_ident())
                             ^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 371, in get_ident
    return '%s_%s' % (self.vendor_id, self.get_unique())
                                      ^^^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 368, in get_unique
    return self.info['/Serial']
           ~~~~~~~~~^^^^^^^^^^^
KeyError: '/Serial'
INFO     [udp:172.16.4.247:502:1] Device failed: pack expected 8 items for packing (got 1)
ERROR    Uncaught exception in update
Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 139, in update_device
    dev.update()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 486, in update
    self.reinit()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 454, in reinit
    self.init(self.settings_dbus, self.enabled)
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 464, in init
    self.read_info()
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 170, in read_info
    self.read_info_regs(self.info)
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 136, in read_info_regs
    self.read_register(reg)
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 121, in read_register
    reg.decode(rr.registers)
  File "/opt/victronenergy/dbus-modbus-client/register.py", line 157, in decode
    newval = struct.pack(self.pfmt, *values).rstrip(b'\0')
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
struct.error: pack expected 8 items for packing (got 1)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 256, in update_timer
    self.update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 290, in update
    super().update()
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 239, in update
    self.update_device(d)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 147, in update_device
    self.del_device(dev)
  File "/opt/victronenergy/dbus-modbus-client/dbus-modbus-client.py", line 324, in del_device
    s.del_tree('/Devices/' + dev.get_ident())
                             ^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 371, in get_ident
    return '%s_%s' % (self.vendor_id, self.get_unique())
                                      ^^^^^^^^^^^^^^^^^
  File "/opt/victronenergy/dbus-modbus-client/device.py", line 368, in get_unique
    return self.info['/Serial']
           ~~~~~~~~~^^^^^^^^^^^
KeyError: '/Serial'
INFO     [udp:172.16.4.247:502:1] Device failed: pack expected 2 items for packing (got 1)
INFO     [udp:172.16.4.247:502:1] Found Energy meter: Victron Energy VM-3P75CT
INFO     registered ourselves on D-Bus as com.victronenergy.grid.ve_HQ2331JZ4GT

My working hypothesis is that malformed data is coming in and dbus-modbus-client isn't handling the error terribly gracefully. I'm assuming the malformed data comes from a combination of a sometimes-flaky WiFi connection (even though it doesn't drop packets according to my network monitoring) and the UDP transport, which, unlike TCP, has no retransmission or recovery.
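
To illustrate the mechanism (the coding strings below are made-up stand-ins, not the real VM-3P75CT register map), this is roughly what register.py is doing when a response comes back with fewer 16-bit registers than the coding expects:

    import struct

    # Illustrative only: a 32-bit value that should arrive as two 16-bit
    # Modbus registers. The real coding strings live in the dbus-modbus-client
    # register definitions; these are just stand-ins.
    coding = ('>i', '>2H')

    complete = [0x0001, 0x86A0]   # full response: two registers
    truncated = [0x0001]          # short/garbled response: only one register

    # Normal case: repack the two 16-bit words and decode a signed 32-bit int
    value = struct.unpack(coding[0], struct.pack(coding[1], *complete))[0]
    print(value)  # 100000

    # Truncated case: raises
    #   struct.error: pack expected 2 items for packing (got 1)
    # which is exactly the error seen in /var/log/dbus-modbus-client/current
    struct.unpack(coding[0], struct.pack(coding[1], *truncated))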

There seems to be little or no validation of the incoming data, hence the unhandled exception. Note that the KeyError: '/Serial' is a secondary failure: the struct.error makes update_device() call del_device(), which calls get_ident(), which raises because the /Serial entry isn't in the device's info (presumably it was never successfully read, or was cleared during re-init). I'm assuming all this causes dbus-modbus-client to crash or otherwise misbehave, possibly be respawned, and end up in the "disconnect-reconnect" loop that appears in the GUI as the energy meter dropping on and off (and ESS flicking in and out of Pass-Thru mode).
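
As a sketch of the kind of guard I have in mind (the function name and parameters are mine, not anything in dbus-modbus-client, and not necessarily how Victron would fix it), you could check the register count before repacking and treat a mismatch as a transient read failure, so the polling loop simply retries on the next cycle:

    import struct

    def safe_decode(values, unpack_fmt, pack_fmt, expected_count):
        # Defensive decode of a Modbus register list: return None on a short
        # or malformed response instead of raising, so the caller can retry
        # on the next poll rather than tearing the whole device down.
        if values is None or len(values) != expected_count:
            return None
        try:
            return struct.unpack(unpack_fmt, struct.pack(pack_fmt, *values))[0]
        except struct.error:
            return None

    print(safe_decode([0x0001, 0x86A0], '>i', '>2H', 2))  # 100000
    print(safe_decode([0x0001], '>i', '>2H', 2))          # None -> skip this cycle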

Unfortunately, Victron haven't enabled issue reporting in the GitHub repository for dbus-modbus-client, so I'm hoping the only committer I can find on here, @ptrenz, might be able to take this on notice or pass it on to someone who can.

I’ll have a play with the code myself in coming days, but this ultimately feels like a bug combined with a good reason not to use WiFi for your energy infrastructure. I’ll know for sure once I change the Cerbo over to a wired connection tomorrow.


Interesting. Network trouble crossed my mind, but I can't make the Cerbo's network connection fail in any way, not even a missed ping, which leads me to believe it has a solid connection. Other devices in the same area as the Cerbo also work fine.

Also, it only ever happens when charging the batteries at at least a couple of hundred watts in full sun and the battery is in the 80%-85% SOC range. It could be coincidence, but it hasn't done this once when charging very slowly (PV energy going to the grid, or low-solar hours) or when the battery is below 70% or above 90% SOC.

That points me to something other than network trouble, but who knows.
Any new insight? Were you able to wire the Cerbo and see if there's any change?

Just went looking online at the release notes for the meter and the Cerbo, and you wouldn't guess what I found.

I found it here.

Specifically this part:
v3.60~61 till v3.60~65
General

  • Fix VM-3P75CT on LAN connection issues

Decided to get all crazy and install the beta (v3.60~66) and we’ll see what happens.


Well, I’m mixed.

Overall, the meter response time and general performance appear to be better, but it has still done the same thing once. I don't think I can say it's fixed, but I do see improvement in how reliably the Cerbo sees the meter. Maybe we'll get meter firmware that, in conjunction with the Cerbo, will do better?

Hey all, thanks for your reports! We've made improvements to the comms stack, and expect that the issue reported here has been solved.

Note that using WiFi anywhere in the path between the meter and the Cerbo will always remain suboptimal. Depending on signal quality it can be, and in most cases will be, good enough - but still suboptimal compared to a proper wired network.

As you've looked quite deep into it, here are the changes we made:


That’s great! Thanks for your work on this.

I just installed the beta and we’ll see how it goes.

Obviously wired connections are better, but as long as this bug gets fixed, I think the Cerbo works well enough on Wi-Fi in this case to be worth using, versus having to run an Ethernet cable to it at this time.

Greatly appreciate the update, @mpvader! I've installed ~71 and am looking forward to seeing how it goes… before I change to Ethernet! 🙂

Well, it doesn't seem to have fully fixed it, @mpvader.

It happened again today. The biggest difference I noticed is that restarting the Cerbo typically fixed it, but this time I restarted the Cerbo three times and it was not fixed. I then cut power to the meter, gave it a minute, then power cycled the Cerbo, and it began working again. Previously the meter did not usually have to be restarted.

If there’s any other detail I can provide to help, let me know what would be helpful.

Update: while something is still wrong, I think the initial issue may be fixed. If I don't log into VRM, it seems able to go for some time without issue, longer than before.
If I log into VRM and watch the status constantly, it will often cause the Cerbo to become unresponsive.
This has happened several times, but only when I am logged in and watching it.

Hi all, we have found some further issues and fixes for this, all related to unreliable connections, or something else very specific in certain LAN/WiFi networks.

We'll make these fixes available for public beta testing in the v3.70 beta versions; we're not adding them to v3.60.


Hello,

No grid meter VRM warning using VM-3P75CT

Please check this as well.

Been a while, and I thought I should come back with an update.

Both the latest stable release (3.62 and 3.60 as well) and the latest beta (3.70~6) are so much better. Thank you.

I have still seen it flip over to passthrough, likely due to not receiving good data from the meter, but before, it took anywhere from many minutes to a couple of hours to recover, and it sometimes froze the Cerbo so badly I had to pull the power cord for a moment to reboot it. Now, when the issue occurs, it recovers in single-digit seconds as far as I can tell. That's totally acceptable when using Wi-Fi versus a cable.


Hey @resilience thanks for that, good to have that confirmed.

Have a good weekend, Matthijs

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.