Why would lower temperatures lower the maximum voltage?
Sure, diode forward voltages change a little, but it seems like something else is going on…
It’s not that the cold lowers the maximum voltage the charge controller/inverter can handle. It’s that the panels become MORE efficient at cold temperatures, resulting in a (potentially unconsidered by the end user) increase in voltage, overwhelming the downstream BoS (balance-of-system) components.
Solar panel voltage goes up as temperature decreases. The chart is misleading: it's stating something like an equivalent max voltage, treating the solar cell voltage as staying the same (and presuming your temperature coefficient matches).
I think it works this way, but I could be wrong:

1. The forward voltage across a diode depends on the current, and the I-V curve has a sharp knee (r-shaped). For a certain diode, 0.01 A might put 0.4 V across it, 0.1 A might be 0.6 V, 1 A might be 0.65 V.

2. The forward voltage also depends on the temperature of the diode. For a certain diode, there could be a 0.002 V increase per °C decrease.
Let's say with a certain current there is 0.687 V across a diode. If two diodes are connected in series, i.e. (point a) → diode 1 → diode 2 → (point b), and each has 0.687 V across it, that's 0.687 + 0.687 = 1.374 V between point a and point b.
For the solar diodes, the "current" depends on how "strong" the sun is. If, at a certain "current", the voltage across each diode is 0.687 V, you'd need 216 diodes in series (between two points) to get 0.687×216 ≈ 148.4 V.
If there is a 0.002 V increase per diode per 1 °C decrease, with 216 diodes that's a 0.002×216 = 0.432 V increase per °C decrease, so a 4 °C decrease (148.4 + 4×0.432 ≈ 150.1 V) exceeds a 150 V MPPT input limit.
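The arithmetic above can be sketched in a few lines of Python. The cell count, per-cell voltage, temperature coefficient, and 150 V limit are the hypothetical numbers from this comment (with 25 °C assumed as the reference temperature), not values from any real datasheet:

```python
# String open-circuit voltage vs temperature, using the comment's
# hypothetical numbers (not a real panel datasheet).

CELLS_IN_SERIES = 216   # cells across the whole string
V_CELL_REF = 0.687      # volts per cell at the reference temperature
REF_TEMP_C = 25.0       # assumed reference temperature, °C
TEMP_COEFF = 0.002      # volts of increase per cell per °C of cooling
MPPT_MAX_V = 150.0      # hypothetical controller input limit

def string_voltage(temp_c):
    """Open-circuit string voltage at a given cell temperature."""
    per_cell = V_CELL_REF + TEMP_COEFF * (REF_TEMP_C - temp_c)
    return CELLS_IN_SERIES * per_cell

for t in (25, 21, 0, -10):
    v = string_voltage(t)
    flag = "  <-- exceeds MPPT limit" if v > MPPT_MAX_V else ""
    print(f"{t:>4} °C: {v:6.1f} V{flag}")
```

With these numbers the string sits just under the limit at 25 °C and crosses it after only a few degrees of cooling, which is the whole point of the comment.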
Another thing about solar that differs from how diodes are "normally" used in circuits is that the "true" voltage depends on the max current achievable with how bright the sun is right now, instead of the "true" current. When the "true" current is 0 A, the voltage across each diode might be 0.687 V. At 0.5 A, maybe 0.65 V. At 1 A, maybe 0.6 V. At 2 A, maybe 0.3 V. Try to get more "true" current and the "true" voltage drops; try to get more "true" voltage and the "true" current drops.

Power is voltage × current, so when charging at full speed, the MPPT uses an algorithm to find the (possibly) best minimum (not maximum) input voltage based on the temperature etc., and trades voltage for current at the output. If there were no minimum input voltage restriction, the solar side would follow the battery's terminal voltage + cable drop instead, and instead of something like 111 V there could for example be a ~4× less powerful 25.5 V (if the battery is a "24 V" one) with just 10% more current.
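The voltage-for-current tradeoff can be sketched the same way, using the hypothetical per-cell operating points above; an MPPT controller effectively searches this curve for the peak of voltage × current:

```python
# Power curve from the comment's hypothetical per-cell operating points
# (not real panel data), scaled up to the 216-cell string.

CELLS = 216  # series cells, as in the earlier example

# (string current in A, per-cell voltage in V) pairs from the comment
operating_points = [
    (0.0, 0.687),
    (0.5, 0.65),
    (1.0, 0.6),
    (2.0, 0.3),
]

def power(point):
    """String power in watts at one operating point."""
    current, v_cell = point
    return current * v_cell * CELLS

# With points this coarse two of them happen to tie for the maximum;
# a real MPPT sweeps a continuous curve and settles near the true peak.
best = max(operating_points, key=power)
for point in operating_points:
    i, v = point
    print(f"{i:4.1f} A @ {v * CELLS:6.1f} V -> {power(point):6.1f} W")
print(f"best of these points: {best[0]} A at {best[1] * CELLS:.1f} V")
```

Note how the 0 A point delivers zero power despite having the highest voltage, which is exactly the open-circuit situation described below.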
At the MPPT'd minimum input voltage for full-speed charging, maybe 111 V, all might seem well even at low temperatures. But when the battery is full and approximately nothing is using electricity from solar, the real input current will be ~0 A, so there will be no voltage "sag", and the solar will realize the full open-circuit voltage corresponding to the temperature and illumination, potentially >150 V…
Lower temperatures increase PV max voltage output; they don't lower it. Conversely, when solar panel temperature increases, voltage decreases. The headline specs/outputs are assumed to be valid at a particular temperature (the standard test condition rating is 25 °C). As the temperature of the panels changes, the realized performance changes.
It's not that little, given these diodes run between sun-baked summer and cold-soaked winter dawn.