pracman asked

Settings changed when a fault occurred: MultiPlus-II 5000VA

I have two Victron MultiPlus-IIs operating in parallel. Last week, on 2020-07-24 at 07:20:23, I got "VE.Bus Error 17: Phase master missing". We had had no issues prior to that; the Victrons had been working almost perfectly for several months. I have grid and solar, and use the system mostly as a UPS for my business, in particular for some vaccine fridges.

With both Victrons stopped, grid power worked fine. Restarts failed, with several reboots tripping the entire building's grid power as the Victrons tried to start up. This occurred even after splitting them out of parallel. Starting one or both units caused the same building power failure despite there being no load on the Victrons; they would only have been trying to charge the batteries.


I first assumed it was just one unit, as turning it on replicated the issue within 10 minutes of start-up.


An electrician with dozens of Victrons in his care was on site and confirmed the faults after he tried to update the settings. He later found the second unit also killed the building power.


Testing by a knowledgeable and helpful Victron-authorised repair shop found the same issue. After another factory reset, everything worked perfectly, first solo and then in parallel, using a backup of the settings I had on site.


My concern is that everything had been working perfectly. Victron suggested it is not possible for an error to change settings such as the grid code, which is a very unwanted change and the one suspected in the faults I had on a system that had been working.


Has anyone else found changes to software settings in a MultiPlus-II, in parallel or standalone mode, after a fault or error?


MultiPlus-II
4 comments


pracman commented:

Testing by Victron found no fault with the hardware. They tested standalone and in parallel using my settings. Reinstalled, a failure was noted quickly. Tried again after hours: everything worked for 5 hours, then three huge charge/discharge swings. The first two restarted the Victrons; the third tripped the entire building's breakers. Full grid power was available, and there appears to be no reason for the Victrons to change from charging a battery at 3% SOC to trying to invert power. The minimum SOC set on my assistant/controller was 50%. The fault crashing the Victrons is still not known.

I will find the issue by eliminating the easy part first: I am adding borrowed Victron batteries and a BMS to the system. A fault then may prove it is the MultiPlus-IIs; no fault will point the finger at the batteries. As usual I have to work overtime, as I cannot risk testing while we are working; my system is almost entirely a UPS to keep the business running. All the wiring has been checked by three separate electricians, and as no faults occur at all with the Victrons off and the relay in grid-only, I doubt I have a wiring issue.

It's almost funny that the biggest electrical risk I have at present is my backup system, except that I chose and paid for it, of course.

I have to add that the help from Victron and my battery supplier has been fantastic.


If you see any fault with my method, please feel free to comment.

pracman commented:

About 2 am I woke up remembering that my MultiPluses have a three-position switch: on/off/charger-only. As the fault occurred with load changing while the Victrons were charging a battery at essentially 0% SOC (not lithium), my assumption is that the shutdowns I have had may be related to a zero-SOC battery being asked to supply load, despite the grid being present and everything working when grid-only is used.

I switched to "charger only", and several hours later I now have a nicely charged 10 kWh of power.

19:00 MultiPluses turned to ON; they are paralleled and appear to be working perfectly now. I held them isolated from my load with our ATS relay off, meaning that while power flows to the DC side and the Victrons, they have not supplied my load at all for 3 hours.

Load has been consistent at 3500 W, rising or falling a little as the AC ramped up or down.

19:50 Turned on the ATS relay with grid on, bypassing the UPS. No surprises, as expected.

20:00 'Color Controller' relay engaged. Grid reports a lower 1690 W, which is correct for the UPS circuits, as one big AC unit in the building is not on the same circuit.

Waiting for the cleaners to go before turning off the grid; watching-paint-dry time.

20:20 Grid off. A flicker and the UPS is up, with a critical load of only 809 W. To be mean, I just turned on 4 AC units; load is 2680 W. Pushing the elevator call button took it to 2703 W. I think the AC units are backing off.

20:28 Victrons shut off.

Grid turned back on. Load when the grid came on was still a light 2000 W odd.


20:35 Grid off again, without my extra AC units running; I might have been a bit cheeky with the AC units. Load on battery only is now 1800-2000 W, dropping to 1565 W at 20:45.

No issues at 20:50. Turned the grid on and isolated the UPS, Victrons and batteries. Moved the battery offline and left the MultiPluses on as wallflowers for the night.

The load has been at or under 50% of the capability of one of my MultiPluses, and mostly around 20% to 30% of the sustained 10,000 W (plus peak) capability of the two units together.

The battery is not misbehaving, though an AC start-up ramp may have bitten the first test, as the battery current shows. Note a report of high DC ripple at 20:27 as well.



Still very confused as to why the system, which had worked perfectly for 6 months with no issues, often under much higher loads, is now misbehaving. I am more than a little surprised the Victrons behaved this way regardless of the DC side, as the grid was not off when the primary issues occurred. I am going to double the DC cable sizes and switch the circuit breakers to the biggest they are allowed to be (the electrician is really doing this, not me), leaving everything in hibernation until the DC-side wiring upgrades are done. I am not going to change the battery to enable a test now, as I feel the battery is in the clear.

pracman commented:

Turned everything on this morning, as no one is at work so I can. No issues at all; honestly, I would almost prefer there were. The system worked perfectly until recently and has been running on the same AC- and DC-side wiring for a long time. I am making several changes next week, then more testing, before I make the system live again during working hours.

There is one item in the building which in the past caused issues for me at about 8 am, when everything is first turned on and load peaks: our hot water system. Years ago, if it kicked in at about 8 am, it gave me a few problems, so it was put on its own circuit with a timer so it only turns on at about midnight, the time of the issue last week. It seems to be the only item I have that draws a lot of power when it turns on. Given it is again the likely culprit affecting the stability of my grid, it is on the chopping block now, along with the other recommended upgrades to the DC wiring.

pracman commented:

Today I split the two MultiPlus-IIs and ran them one at a time after configuring each identically. Both behaved, yet one drew 2000 W plus when on and the other under 1000 W. I had the minimum SOC set to 0%.

Both handled a grid-off test with no problems.


I noted another item which may be of interest. Before and after the above, I could not use the button on the Color controller or the online system to access the menu items. I first noticed that two days ago.


I powered the ESS down and back up, and the button and online access worked again. As the assistant loaded onto both individual MultiPlus-IIs was the same, I wonder if an ESS fault could be the cause of some or all of my errors. The controller is powered by its own little UPS, so it is never off unless I unplug it from the UPS, which I did tonight.





1 Answer
rickp answered:

I read through this, and can’t imagine your building shutting down like that! It must have been a madhouse. Sorry to hear about it.

I just had one thought, which may be nothing, but I thought I should throw it out there.

Is there any possibility someone else accessed your system and changed a setting that sent it crashing? No default passwords in use, etc.? It's the only thing that makes sense for a setting changing mid-stream that way.

Again, it might be nothing, but it’s possible.

1 comment


pracman commented:

Yes, a few other people can: two helpful electricians and the Redflow Batteries gurus. They are a lot smarter than I am, Rick, and fix issues for me. The systems, including my network firewalls, log access, and nothing of that type occurred.

Business is always a madhouse :)

Extra capacity in the breakers, heavier DC wiring (it was already to spec; it is now over spec).

The system has been stable. Still unsure of the faults. I knew of a battery issue before the fault and failures; it was offline for a week before the events. My batteries do cycle to 0% SOC, which just might have caused the Victrons to get grumpy. Honestly, with all the testing, we really do not know why.


I have a theory based on a known issue with computer-system stability: all my servers and computers are restarted at least monthly. I wonder if Victron systems might benefit from a weekly or monthly restart. The access fault I noted with the ESS controller needed a simple power off/on to fix; a restart of that one component resolved the fault.
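
A minimal sketch of how that "needs a power cycle" state could be watched for automatically, assuming the Color Control/GX has its Modbus-TCP service enabled on the standard port 502. The register address (843, system battery SOC on unit ID 100, value scaled by 10) is taken from Victron's published CCGX Modbus-TCP register list and should be verified against your firmware; the IP address is a placeholder:

```python
# Reachability watchdog for a Victron GX / Color Control over Modbus TCP.
# Assumptions: the Modbus-TCP service is enabled on port 502, and register
# 843 (system battery SOC, unit ID 100, scaled by 10) matches your firmware's
# register list. GX_HOST is a placeholder for your controller's address.
import socket
import struct
import time

GX_HOST = "192.168.1.50"   # placeholder: your Color Control / GX address
GX_PORT = 502              # standard Modbus-TCP port
UNIT_ID = 100              # the com.victronenergy.system service
SOC_REGISTER = 843         # battery state of charge, value scaled by 10

def read_soc(timeout=5.0):
    """Read SOC with one Modbus 'read holding registers' request.

    Returns SOC in percent, or None if the controller did not answer;
    None is the 'may need a power cycle' signal this sketch watches for.
    """
    # MBAP header + PDU: transaction id, protocol id, length, unit id,
    # function 3 (read holding registers), start address, register count.
    request = struct.pack(">HHHBBHH", 1, 0, 6, UNIT_ID, 3, SOC_REGISTER, 1)
    try:
        with socket.create_connection((GX_HOST, GX_PORT), timeout=timeout) as s:
            s.sendall(request)
            response = s.recv(256)
    except OSError:
        return None
    # A valid reply is at least 11 bytes with function code 3 at offset 7;
    # anything else (short reply, Modbus exception 0x83) counts as a failure.
    if len(response) < 11 or response[7] != 3:
        return None
    (raw,) = struct.unpack(">H", response[9:11])
    return raw / 10.0

if __name__ == "__main__":
    while True:
        soc = read_soc()
        if soc is None:
            print("GX not answering Modbus; may need a power cycle")
        else:
            print(f"GX alive, battery SOC {soc:.1f}%")
        time.sleep(60)   # poll once a minute
```

Run it on a machine that is not powered through the UPS and feed it into whatever alerting the servers already use; a string of failed readings is the cue that the controller wants its off/on.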


On software, the experts put it better than I can:

"Some degree of a priori suspiciousness of degeneracy makes sense. It’s hard to imagine how it would benefit a business or software system. It sounds both more expensive and more complex than redundancy. Let’s examine these assumptions with some “straw man” examples.

Consider a static HTML web site served via Apache on a single server. Here are two ways we could introduce degeneracy:

  1. Install Apache, Nginx and Lighttpd on the same server, each on a different port, all serving the same HTML content, and each activating depending on the “context” (the port is the context here).
  2. Take a second server which is used for some other purpose (how about Microsoft Exchange) and stick a copy of the web site on there running via IIS. This time the “context” is the IP address. The same “function” (serving the HTML) can be accomplished via two different mechanisms depending on the context."
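
To make the quoted example concrete, here is a minimal sketch of the same idea in Python rather than Apache/Nginx/IIS: one fixed HTML page served by two structurally different mechanisms, with the TCP port as the "context". The ports and page content are arbitrary choices for illustration:

```python
# Degeneracy sketch: the same function (serving one HTML page) provided by
# two different mechanisms, selected by context (the TCP port).
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"<html><body><h1>Same content, different mechanism</h1></body></html>"

# Mechanism 1: the stdlib HTTP server on port 8080.
class PageHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

def serve_stdlib():
    HTTPServer(("127.0.0.1", 8080), PageHandler).serve_forever()

# Mechanism 2: a hand-rolled socket loop on port 8081 returning the
# identical bytes. Different implementation, same observable function.
def serve_raw():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", 8081))
    srv.listen(5)
    while True:
        conn, _ = srv.accept()
        conn.recv(4096)  # read (and ignore) the request
        conn.sendall(
            b"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n"
            b"Content-Length: " + str(len(PAGE)).encode() + b"\r\n\r\n" + PAGE
        )
        conn.close()

if __name__ == "__main__":
    threading.Thread(target=serve_stdlib, daemon=True).start()
    serve_raw()
```

Requesting 127.0.0.1:8080 and 127.0.0.1:8081 returns the same page through different code paths, which is the degeneracy; running two identical stdlib servers on the two ports would instead be plain redundancy.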


I am adding a power-down of "everything" to my server reboot schedule. My emergency exit signs will apparently appreciate that as well.





