I would very much like the Smart Battery Sense to share its temperature reading with the SmartShunt for capacity compensation, not just with the SmartSolar.
My SmartShunt has no temperature sensor because I need its auxiliary input for the engine battery voltage reading. Is this possible? If not, it would be very convenient if Victron Energy implemented this via a firmware update in the future.
I am concerned that my BlueSolar 150/70 MPPT will not work efficiently. I have set the parameters without any problems, but the controller will not let me set the temperature compensation.
Firstly, the default of -2.7 is way out for my batteries, in fact for most batteries. Rolls flooded batteries recommend 5 mV/°C per cell (0.005 V), and my new Trojan AGMs say exactly the same. I can select the setting and change it like any other parameter, but when I press Set to save it, it does not save like the others; it returns to -2.7.
I have always known temperature compensation to be a value that is added to or subtracted from the charge voltage; I have never seen one with a "-" in front of it, which makes no sense to me, as colder weather increases the voltage.
The temperature here ranges between 2 °C and 40 °C over the year, and can change by as much as 8 °C during daylight hours. On my 24 V system that is 0.06 V per degree, so from minus 1.9 V to plus 0.9 V. Compensation is necessary; without it my batteries would be ruined very quickly.
Does anyone have knowledge of these controllers? Is this a common fault, and is it worth sending the unit off to be repaired? It is an expensive bit of equipment, but batteries cost far more.
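The figures quoted above can be sketched numerically. A minimal example (not Victron firmware), assuming a 25 °C reference temperature and 12 series cells for a 24 V bank:

```python
# A minimal sketch of the compensation arithmetic above: a 24 V lead-acid
# bank has 12 series cells, so the Rolls/Trojan figure of 5 mV/degC per
# cell gives 0.06 V/degC for the whole bank.

CELLS = 12                 # series cells in a 24 V lead-acid bank
COEFF_PER_CELL = -0.005    # V/degC/cell; negative = more volts when colder
T_REF = 25.0               # assumed reference temperature (degC)

def compensated_voltage(v_set: float, t_batt: float) -> float:
    """Charge-voltage setpoint after temperature compensation."""
    return v_set + (t_batt - T_REF) * COEFF_PER_CELL * CELLS

# Over the 2..40 degC range mentioned above, with a 28.8 V absorption target:
for t in (2, 25, 40):
    print(t, round(compensated_voltage(28.8, t), 2))
```

With a 25 °C reference this works out to about +1.38 V at 2 °C and -0.9 V at 40 °C relative to the setpoint.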
Hello Victron users, I would like to know whether it is normal that the maximum temperature-compensation value on my MPPT 150/100 is -30 mV. I cannot enter a higher figure, and at the moment this is causing my battery bank to undercharge in winter. With an absorption voltage of 14.5 V, the charge voltage will not rise above 15.2 V at -15 °C, which is the same charge voltage I get at +2 °C. Is this some kind of protection in the MPPT to prevent overcharging? (N.B. MPPT v3.08, used with a BMV-712 v4.08 and a CCGX v2.71.) Thanks for helping me figure it out!
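A quick sketch of the arithmetic in the question above, assuming (my assumption, not a documented Victron value) a 25 °C reference temperature and the reported -30 mV/°C bank-level maximum:

```python
# Illustrative arithmetic only: what a -30 mV/degC bank-level coefficient
# would produce at the two temperatures mentioned in the post.

T_REF = 25.0      # assumed reference temperature (degC)
COEFF = -0.030    # V/degC for the whole 12 V bank (the reported maximum)

def compensated(v_set: float, t_batt: float) -> float:
    return v_set + (t_batt - T_REF) * COEFF

print(round(compensated(14.5, 2), 2))    # at +2 degC
print(round(compensated(14.5, -15), 2))  # at -15 degC
```

Under these assumptions the coefficient alone gives about 15.19 V at +2 °C, close to the reported 15.2 V, but 15.7 V at -15 °C; so a 15.2 V ceiling at -15 °C would look more like a separate maximum-voltage clamp than the coefficient itself running out.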
There seems to be a bug with temperature compensation. I created a new preset and edited it, setting the compensation to 5.16. When I went back and chose that preset, it showed a different value. This version does not seem to load the saved value properly.
When checking battery temperature with VictronConnect, the batteries show a 38-43 °F offset. I reset it to the default (0 degrees offset) and they quickly go back to the 38-43 °F offset. The BMS allowed the batteries to charge when the actual battery temperature was 22 °F. I have the latest VictronConnect update and have reinstalled the app, but nothing fixes the problem.
I am getting a bit confused with the amount of information I am trying to take in so I thought I would join the community for some guidance, so thank you in advance.
I am on an EasySolar 1600 24 V system with 2x Rolls 290 Ah batteries and a 50 A charge controller.
Using the formula, I calculated an absorption time of 46 minutes (rounded up to 1 hour).
Does this sound right?
T = 0.38 × C / I
0.475 h = 0.38 × 260 Ah / 208 A
If this is incorrect then where have I gone wrong? Also, if I am using adaptive charge then does this figure matter?
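The rule-of-thumb formula above can be checked numerically. A minimal sketch using the poster's own numbers (260 Ah and 208 A, both taken from the post; whether they are the right inputs for this bank is a separate question):

```python
# Rule-of-thumb absorption time from the post: T = 0.38 * C / I, in hours.

def absorption_hours(capacity_ah: float, charge_current_a: float) -> float:
    return 0.38 * capacity_ah / charge_current_a

print(absorption_hours(260, 208))  # the poster's numbers
print(absorption_hours(260, 50))   # with the 50 A controller mentioned above
```

Note that 0.475 h is about 28.5 minutes, not 46; and with the 50 A controller mentioned above the same rule gives roughly 2 hours.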
I was also confused about temperature compensation. I saw that the compensation should be set at -4 mV/°C/cell, so I have set this at -32 mV/°C - is this correct?
The system has a temperature sensor, how do I know that it is even working?
Could you also advise on the absorption and float voltages please?
After looking through the manuals I settled for -
Absorption Voltage - 29.40V
Float Voltage - 27.60V
Max Charge Current - 50A
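On the -4 mV/°C/cell point above: a minimal sketch of the per-cell to per-bank conversion, assuming 12 series cells for a 24 V bank and a 25 °C reference temperature (the reference temperature is my assumption):

```python
# Per-cell to per-bank conversion: a 24 V lead-acid bank has 12 series
# cells, so -4 mV/degC/cell becomes -48 mV/degC for the whole bank.

CELLS = 12
COEFF_BANK = -0.004 * CELLS   # V/degC for the whole 24 V bank
T_REF = 25.0                  # assumed reference temperature (degC)

def compensated(v_set: float, t_batt: float) -> float:
    return v_set + (t_batt - T_REF) * COEFF_BANK

print(round(COEFF_BANK * 1000))          # bank coefficient in mV/degC
print(round(compensated(29.40, 10), 2))  # absorption setpoint at 10 degC
```

Under these assumptions the bank figure would be -48 mV/°C rather than -32 mV/°C.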
Also, I set the system to 'charge only' when I go to bed. In the morning, despite the batteries reading 24.8 V or more, the system won't turn on until there is a bit of sun to top up the charge, or I have to run the generator to allow a top-up. It can run for just 5 minutes and the system turns back on again, and doesn't seem to be lacking charge at all. Is there something I accidentally misconfigured in VEConfigure? I've noticed the VictronConnect app is not as comprehensive as VEConfigure - or am I missing something by plugging into the charge controller?
Please excuse the basic questions, but I have already fried one set of these batteries in 3 years and it was an expensive mistake I can't let happen again... trying my best to get studying again : )
I came across this blog entry which basically says that, for lead acid batteries, lower temperatures require higher charging voltage. Can anyone please explain why this is true, or point to an authoritative source?
I did ask this question on Chemistry SE a while ago, and those guys seem to be of the opposite opinion: higher temps need higher voltage.
Intuitively I disagree with them and agree with the blog post: lower temps need more power to move molecules around, remove sulphate crystals etc. But I seek a definitive authoritative explanation.
I am new here and have a question for you...
I have an MPPT SmartSolar 100/50 and connected it today to four 12 V 140 Ah AGM batteries (Electronics GmbH),
wired two in series and two strings in parallel -> 280 Ah at 24 V.
When commissioning the charge controller I entered my batteries' data (really just the end-of-charge voltage of 28.8 V; the datasheet specifies charge profile AGM1 with an end-of-charge voltage of 14.4 V).
A Smart Battery Sense is also installed, which reports voltage and temperature to the charge controller.
However, I have a problem entering the temperature-correction factor... here I can only go down to -60 mV.
But the battery datasheet says (max. recommended charge voltage 14.5 V):
below 20 °C: +0.018 V/cell/°C; above 20 °C: -0.018 V/cell/°C
That would be 216 mV for the two series-connected batteries (6 cells each)... or am I making an error in my reasoning?
And where can I set the maximum recommended charge voltage from the datasheet?
Grateful for any help/info!
Kind regards, Richard.
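The datasheet arithmetic in the post above can be sketched as follows (a 24 V bank of two 6-cell batteries in series, i.e. 12 cells):

```python
# Datasheet figure from the post: 18 mV/degC per cell, applied across all
# series cells of the 24 V bank.

CELLS = 2 * 6        # two 12 V batteries in series, 6 cells each
COEFF_CELL_MV = 18   # mV/degC per cell, from the datasheet
coeff_bank_mv = COEFF_CELL_MV * CELLS
print(coeff_bank_mv) # bank-level coefficient in mV/degC
```

That is 216 mV/°C, well beyond the -60 mV limit in the app. For what it's worth, 18 mV/cell/°C is also unusually large compared with the 3-5 mV/cell/°C typically quoted for lead-acid, so it may be worth double-checking the datasheet's units.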
With a simple system (all DC, no inverter, but with a GX device + MPPT + SmartShunt with temperature sensor, and GEL batteries), what is the correct way to compensate for temperature?
a) Enable a VE.Smart Network over Bluetooth and join the MPPT and SmartShunt; or
b) Enable DVCC on the GX device and select the SmartShunt as the battery monitor / temperature sensor.
Or do both work (though not at the same time)?
I recently installed 28 kWh of lithium batteries in a bus conversion, and we'll hit colder temperatures here in central MN, USA.
My understanding is that it's not so much that lithium absolutely cannot be charged below freezing, but that charging must be derated or compensated below 45 °F or so, because lithium can plate onto the anode and damage the cells if charged too quickly.
I have an idea... what if a temperature-compensation value intended for lead-acid batteries were reversed? For example, a traditional value might be -36 mV/°C; what if that were flipped to +36 mV/°C or higher? Wouldn't this invert the compensation curve, producing a higher voltage as it gets warmer and a lower voltage as it gets colder?
I'm just looking for a way to derate the charge curve gradually instead of a hard on/off switch. I'm also aware of the undocumented "Low temperature charge current" setting in the VE.Direct HEX protocol. It would be nice to see that exposed behind a password or something.
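The sign-flip idea above is easy to sketch. This is illustrative arithmetic only, not a charging recommendation and not how Victron firmware handles low-temperature lithium protection; the 14.2 V absorption target and the 25 °C reference are assumptions:

```python
# With a positive coefficient, the computed charge voltage falls as the
# battery gets colder, instead of rising as it would for lead-acid.

T_REF = 25.0
COEFF = +0.036   # V/degC, sign flipped from the usual lead-acid -36 mV/degC

def compensated(v_set: float, t_batt: float) -> float:
    return v_set + (t_batt - T_REF) * COEFF

for t in (25, 10, 0, -10):
    print(t, round(compensated(14.2, t), 2))
```

One caveat: because lithium has a very flat voltage curve, a modest reduction in charge voltage may not reduce charge current by much, so this would be a crude derating at best.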
Any chance of the Orion Smart TR DC-DC chargers working with the VE.Smart remote battery sensor yet?
I need to set up temperature compensation. I have a 48 V lithium bank built from Leaf/Volt batteries; it is currently at -36.00 mV/°F. I don't want to hurt or blow up anything, so please help. Right now I have the switch turned off.
I found that the temperature compensation does not work correctly: it is too strong by a factor of 6. After some experimenting, I found that I need to enter the compensation value per cell and NOT per battery bank, even though the app asks for it per bank.
The app explicitly asks for the temperature-compensation coefficient per bank, not per cell.
So I multiplied the -4 mV/K per cell by 6 and entered -24.00 mV/K per bank in the app.
Today the battery temperature was measured at 19 °C, i.e. 1 °C below 20 °C (at 20 °C there is no compensation).
Expected: 13.36 V + (19 °C - 20 °C) × -0.024 V/°C = 13.384 V
Measured: 13.50 V (obviously far too high for a lousy delta of 1 K)
Then I entered the coefficient as per cell (-4.00 mV/K) and got 13.39 V, which is what I wanted all along but never got.
For testing, I also switched compensation off and got exactly the configured 13.36 V.
It looks like the controller internally multiplies the entered value by 6, since it knows I am running it at 12 V with lead-acid cells (which can only mean 6 cells, as 12 V / 2 V = 6).
Because of this I killed my new bank last winter, as the voltage was too high all the time!
At 0 °C in winter this means (0 °C - 20 °C) × (-0.024 V × 6 [6 because of the bug]) of compensation
= -20 × -0.144 V = +2.88 V (WTF?!)
Float voltage: 13.36 V + 2.88 V = 16.24 V
Absorption voltage: 14.40 V + 2.88 V = 17.28 V
Equalization voltage: 15.10 V + 2.88 V = 17.98 V
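The arithmetic above can be reproduced in a few lines (a sketch of the suspected behavior, not of the actual firmware code):

```python
# The value entered per bank vs. what the controller appears to apply
# (the entered per-bank value multiplied by 6 cells a second time).

T_REF = 20.0       # the post uses 20 degC as the no-compensation point
ENTERED = -0.024   # V/degC entered in the app: -4 mV/degC/cell * 6 cells

def expected(v_set: float, t: float) -> float:
    """What entering a per-bank coefficient should produce."""
    return v_set + (t - T_REF) * ENTERED

def observed(v_set: float, t: float) -> float:
    """Apparent behavior: the entered value multiplied by 6 again."""
    return v_set + (t - T_REF) * ENTERED * 6

print(round(expected(13.36, 19), 3))  # float setpoint at 19 degC
print(round(observed(13.36, 19), 3))  # matches the measured ~13.50 V
print(round(observed(13.36, 0), 2))   # float at 0 degC: the reported 16.24 V
```

The doubled-up multiplication reproduces both the 13.50 V measured at 19 °C and the 16.24 V float reported for 0 °C.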
I updated the MPPT to firmware v1.50 today, which still has the same bug. :-(
Has anyone else noticed the same?